Animating and rigging the robot in Valley of the Ancient

Jeremiah Grant, Technical Product Manager
Hello, I am Jeremiah Grant, Technical Product Manager for in-engine animation at Epic Games. This blog post will walk through the mentality, implementation, and tools used to rig and animate ‘The Ancient One,’ the massive robot encountered at the end of the UE5 Early Access project, Valley of the Ancient. Created in close collaboration with Aaron Sims Creative, The Ancient One was designed to push the limits of some of our animation and rigging systems by leveraging Nanite and animating 100% in Unreal Engine 5 Early Access.

Let’s dive in.
 

Constructing the Blueprint

When concepting The Ancient One, we knew we were going to use Nanite to push the geometric detail on the creature. This introduced several design and workflow considerations. Rather than relying on Skeletal Meshes and traditional skinning, the robot was designed to be rigidly articulated with high-resolution Nanite meshes attached to the skeleton via a Blueprint. This meant every articulating element needed to be a unique mesh, requiring well-defined organization if we were to avoid madness.

We approached the organization problem much as we would approach building a level, with clear naming schemes and folder structures. Each mesh element followed a specific naming pattern that was paired with the bone names in the skeleton. Geometry and materials were organized into folders by body region (for example, head, clavicles, and legs) to keep the project easier to navigate. By taking this approach, we were able to quickly derive how the geometry should be attached to the skeleton using a small function in BP_AncientOne. It also meant the same function would automatically attach new geometry and bones as they were added.

The AttachMeshesToRig function simply iterates through all children of the BodyParts component in the Blueprint and uses Attach Component to Component. To add new geometry to the rig, the geometry was imported into UE and then dragged under the BodyParts component.
Attach Meshes to Rig function in BP_AncientOne.
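To make this concrete, here is a hedged C++ sketch of the equivalent attachment logic. The project implements it as a Blueprint graph, and the "_geo" naming suffix, function signature, and parameter names below are assumptions for illustration.

```cpp
#include "CoreMinimal.h"
#include "Components/SceneComponent.h"
#include "Components/SkeletalMeshComponent.h"

// Hypothetical C++ equivalent of the AttachMeshesToRig Blueprint function:
// every mesh parented under BodyParts is attached to the skeleton bone that
// shares its name (the "_geo" suffix convention here is an assumption).
static void AttachMeshesToRig(USceneComponent* BodyParts, USkeletalMeshComponent* BodyMesh)
{
    for (USceneComponent* Child : BodyParts->GetAttachChildren())
    {
        // Derive the target bone from the mesh component name, e.g. "head_geo" -> "head".
        const FString BoneName = Child->GetName().Replace(TEXT("_geo"), TEXT(""));

        // Equivalent of the "Attach Component to Component" node: snap the rigid
        // Nanite mesh onto the matching bone (socket) of the driving Skeletal Mesh.
        Child->AttachToComponent(
            BodyMesh,
            FAttachmentTransformRules::SnapToTargetNotIncludingScale,
            FName(*BoneName));
    }
}
```

Because the attachment is derived purely from names, importing a new mesh and dragging it under BodyParts is all that is needed for it to follow the rig.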

Skeleton

The skeleton itself was derived from the MetaHuman skeleton, leveraging the same base hierarchy with fewer fingers and with corrective bones like twists removed (AncientOne_skeleton_Skeleton). The skeleton was modified to the appropriate proportions in Maya and skinned to a simple cube to generate a Skeletal Mesh in Unreal to which we could attach the Nanite meshes. Using the MetaHuman skeleton as a base meant we were able to immediately use the MetaHuman Control Rig to start animating in UE. To allow for easier iteration on the character without greatly affecting the animation or skeleton, the robot was built to the same proportions as a MetaHuman and then scaled in-engine. This provided a great deal of flexibility when feeling out the scale of the environment and character interactions.
The Ancient One’s skeleton to which all meshes were attached.

Control Rig

In UE5 Early Access, we’re currently unable to visualize static or Nanite meshes in the Control Rig editor. Since Control Rig is always evaluating, we tested the rig by placing it in the level and making changes in the Control Rig editor; compiling updates and propagates the rig changes, enabling us to visualize and test the performance of the rig on a 50-million-triangle mesh in real time.
The Level Editor is on the left. The Control Rig editor is on the right.
To create the Control Rig for the robot (AncientOne_Body_CtrlRig), we started with the Control Rig from the MetaHuman project and made a couple of modifications. First, we updated the Rig Hierarchy and the Preview Mesh to the new geometry. The Rig Hierarchy can be updated by right-clicking and selecting Refresh Hierarchy, then choosing the correct Skeletal Mesh. This replaces the skeletal hierarchy with the updated one while keeping the rest of the rig hierarchy, such as controls, spaces, and custom bones, intact where possible.

Control Rig has three modes: Setup Event, Forward Solve, and Backward Solve. In the MetaHuman rig, the Setup Event adapts all the controls to the imported Rig Hierarchy, enabling us to reuse this rig. It does this by reading the initial values of the bones and setting the Control Offset Transform, placing the controls in the appropriate locations for the skeleton. Since this is a user-generated graph, we can do complex actions like defining where a pole vector is located, placing both FK and IK controls, or setting initial values that we can use later in the graph.
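As a conceptual illustration only (the real rig does this with Control Rig graph nodes, and this simplified sketch ignores control parenting), setting the Control Offset Transform from the bone’s initial transform means the keyed control value becomes a delta on top of the bone’s rest placement:

```cpp
#include "CoreMinimal.h"

// The control's resolved transform is its keyed value composed onto the offset that the
// Setup Event derived from the bone's initial transform. With an identity keyed value,
// the control sits exactly on the bone.
static FTransform GetControlGlobal(const FTransform& KeyedLocalValue,       // what the animator keys
                                   const FTransform& OffsetFromBoneInitial) // set during the Setup Event
{
    // In UE, ChildLocal * ParentGlobal yields the child's global transform.
    return KeyedLocalValue * OffsetFromBoneInitial;
}
```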

The MetaHuman Control Rig assumes a symmetrical character, so a few adjustments were made to change this for the asymmetrical robot. In the Setup Event graph, we duplicated the Setup Left Arm section, populated the fields with the appropriate information, and connected them into the graph. We also changed the item collections referencing finger bones in the Forward and Backward Solve graphs to reference variables set at the beginning of the Setup Event; this enabled us to easily change the number of fingers being referenced as we were developing the character without having to make changes in multiple places.
Setup Event graph for the left leg, placing IK, FK and Pole Vector controls.
In addition to the changes made to the base MetaHuman Control Rig, controls were also added for armor plates that we might want to animate directly. One specific request from the animation team was to have the arms default to IK rather than FK. This was easily accomplished by changing the Initial Bool values on the IK/FK bool controls, such as arm_l_fk_ik_switch and leg_l_fk_ik_switch.

The Ancient One has a lot of internal mechanisms and motion that we wanted to drive procedurally, decreasing the amount of work on the animators and enabling us to test out procedural animation workflows in a cinematic-to-gameplay animation context. Rather than extending the MetaHuman rig to add the auto-driven elements of the character, we broke out all procedurally driven mechanics into a separate Control Rig: AncientOne_Mechanics_CtrlRig. This Control Rig gets added to a Post-Process Animation Blueprint (AncientOne_Post_AnimBP) that is assigned to the “Post Process Anim BP” setting in the Skeletal Mesh Asset Details. Post Process Animation Blueprints evaluate after everything else has been processed, including Level Sequences or State Machines. This means that whether the rig was being animated in Sequencer or playing a firing animation, all the procedural animation would still evaluate.
 
Arm pistons, gears, hip armor, and more are procedurally animated.

Unreal Engine 5 Early Access introduces functions in Control Rig. This made it very easy to compartmentalize and reuse behavior on different parts of the rig, for example, the pistons on the shoulders and chest of the robot.

Looking at the function clavicle_pistons, we can see how this is structured. Two item collections are passed in, one for the start bones and one for the end bones of the pistons. Inside the function, the start and end bones are matched up by name and aimed at each other. After all items have been looped over, the function returns the final poses before moving on to the next node in the Rig Graph. By reusing the function for different piston groups, we’re able to keep the graph smaller and more legible, and fix any issues in a single location.
The Piston function in AncientOne_Mechanics_CtrlRig is reused for each grouping of pistons in the rig.
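The per-piston math is essentially a pair of aim constraints. Here is a hedged C++ sketch of what the graph computes for one start/end pair; the +X aim axis and function name are assumptions, and the real implementation loops Control Rig nodes over the two item collections.

```cpp
#include "CoreMinimal.h"

// Aim a matched start/end bone pair at each other so the piston stays connected
// as the shoulder or chest moves (the aim axis choice is an assumption).
static void AimPistonPair(FTransform& StartBone, FTransform& EndBone)
{
    const FVector StartToEnd = (EndBone.GetLocation() - StartBone.GetLocation()).GetSafeNormal();

    // Point the start bone's +X axis at the end bone, and the end bone's +X axis back at the start.
    StartBone.SetRotation(FRotationMatrix::MakeFromX(StartToEnd).ToQuat());
    EndBone.SetRotation(FRotationMatrix::MakeFromX(-StartToEnd).ToQuat());
}
```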
The piston function is a great example of a graph that adjusts a pose based on the incoming performance; in other words, it automatically poses the pistons based on how the arms are animated. Looking at another function, core_gears, we can see another use case for Control Rig graphs. Here, we have procedural motion to create clock-like oscillation of the core gearing and discs. The Accumulated Time node creates a time value for us to feed into other nodes, driving rotation directly or feeding a sine node to create oscillation. To provide some higher-level control over this behavior, we’ve exposed a gear_weight variable that we feed into the Speed value of the Accumulated Time node, enabling us to disable or blend off the behavior through Blueprints. In this particular case, we use this variable to blend off the gear motion once The Ancient One is defeated.
The core_gears function provides art-directable control over the procedural gear motion.
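As a rough illustration of the core_gears behavior (the project builds this from the Accumulated Time, sine, and rotation nodes in Control Rig; the rates and amplitudes below are made up), the exposed gear_weight feeding the Speed pin is what lets Blueprints freeze or blend off the motion:

```cpp
#include "CoreMinimal.h"

// Accumulate time scaled by gear_weight, then derive a clock-like rotation for the gears
// and a sine-based oscillation for the discs.
static void TickCoreGears(float DeltaTime, float GearWeight,
                          float& AccumulatedTime, float& GearAngleDeg, float& DiscOffset)
{
    // Equivalent of the Accumulated Time node with gear_weight plugged into Speed:
    // a weight of 0 stops the accumulation and freezes the motion.
    AccumulatedTime += DeltaTime * GearWeight;

    GearAngleDeg = FMath::Fmod(AccumulatedTime * 90.f, 360.f); // assumed 90 deg/sec gear rate
    DiscOffset   = FMath::Sin(AccumulatedTime * 2.f) * 5.f;    // assumed disc frequency/amplitude
}
```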

Animating in Sequencer

When developing Valley of the Ancient, the goal was always to animate as much of The Ancient One’s performance as possible in UE5 Early Access. A great many improvements have recently been made to Sequencer, Unreal Engine’s nonlinear animation tool, and this project gave us the opportunity to see how far we could push it.
 
Interact with the animation rig in the Level Editor, leveraging a growing suite of utilities.

To start creating animations for the robot, we began by creating a Level Sequence in the level and adding BP_AncientOne to it. An important thing to note is that the robot is placed and scaled in the environment via the Actor Transform in the World Details panel, not via Control Rig. This transform is then keyed in Sequencer so we can ensure the robot is always placed appropriately. This means we can play with the scale of the character without affecting the animation performance. It also means the animations are always relative to the actor’s location, maintaining a relatively origin-centered location for gameplay animations and cinematic bookends. This ensures blending back and forth between Level Sequences and Animation Blueprints is seamless.

Now that the Level Sequence is set up, we are able to start animating. The Control Rig body animation rig can be added to Sequencer by clicking +Track on BP_AncientOne and navigating to Control Rig -> Asset-Based Control Rig -> AncientOne_Body_CtrlRig. Adding the Control Rig track will automatically switch the Level Editor to Animation Mode. This can also be done manually by clicking the running-man icon in the toolbar above the viewport. Expanding the track in the Level Sequence reveals a list of controls that can be keyed, and the controls now also appear in the viewport, where they can be manipulated and animated.
 
How to set up the robot in Sequencer.

With that understanding of how a basic Level Sequence is set up for the robot, we can take a look at some of the gameplay animation Level Sequences. These sequences can be found in the following folder: /AncientBattle/Characters/AncientOne/Animations/SourceSequences/. Opening the SEQ_Robot_Fire Level Sequence loads the robot in the appropriate location in the world, and we’re able to tweak this Level Sequence to adjust our firing animation.

Note: Opening any of the Source Sequences in a new level will load the robot where it spawns in the main level, not at the origin. This offset Transform is keyed in the Level Sequence.

New to UE5 EA are some animation utilities found in the tool shelf in the Animation Mode window. From here you can quickly change the viewport mode to Select Only Controls, making it much easier to select controls in the Level Editor while avoiding other world objects. Additionally, this is where you can access the new Pose Library, Tween tool, and Snapper tool. More information can be found in the UE5 EA documentation.

The Pose Library, in particular, came in handy during the development of the project. The animators were able to create Control Rig poses for later use, such as hand poses and full-body poses like the idle pose. Since poses also store which controls they affect, they can also be used as selection sets, enabling you to quickly select controls you frequently need to manipulate. These poses are saved as assets, enabling you to apply them in different Level Sequences. When adjusting the idle pose in the intro sequence, we were able to quickly update the idle poses in the other animation states by storing the pose and pasting it in the other Level Sequences.
The Pose Library is one of the new animation utilities added to UE5.
Level Sequences have a feature, introduced in 4.26, called Linked Animation Sequences. Linked Animation Sequences enable you to automatically update a skeletal Animation Sequence every time the Level Sequence is saved. This means that by animating in Sequencer, you can automatically bake out your animation for use in Animation Blueprints, with zero round-tripping. This was the main method used for creating all the gameplay Animation Sequences for the robot.

To create a Linked Animation Sequence, right-click on the Actor in Sequencer, BP_AncientOne, and choose Create Linked Animation Sequence. This will prompt you for a location to save the Animation Sequence. After creating this link, be sure to save the Level Sequence and the new Animation Sequence.
Linked Animation Sequences can be created from Sequencer Actor tracks.
You can navigate to the linked Animation Sequence by right-clicking on the Actor in Sequencer and selecting Open Linked Animation Sequence. Likewise, you can get back to the originating Level Sequence from the Animation Sequence by clicking the Edit in Sequencer button in the Animation Sequence Editor and choosing Open Level Sequence.

Tip: To create additional Level Sequences for different animation states, we just duplicated the Level Sequence assets and deleted the keyframes to start fresh. If the duplicated Level Sequence was linked to an Animation Sequence, be sure to create a new linked Animation Sequence so you don’t accidentally overwrite the wrong Animation Sequence.

A great benefit of using Control Rig in Sequencer combined with Linked Animation Sequences is how easy it is to update animations once the skeleton is modified. Since Sequencer is driving controls, not bones, it will respond to any changes in the Control Rig. Simply opening the Level Sequence and re-saving it will update the animation asset, rebaking it against the latest skeleton changes.

Full-Body IK


Last but not least, we’ll take a look at the Full-Body IK layer added to extend and adjust the posture of The Ancient One as it aims at our hero. The effect is designed to be additive, enhancing the animator’s intent, not replacing it.
 
Full-Body IK enhances the animation, enabling the animation to react to the player.

In UE5 EA, the Full-Body IK solver has been improved to be much faster and easier to manipulate. Full-Body IK can be accessed as a node in Control Rig, with this particular effect found in CR_AncientOne_ArmAim_CtrlRig.

Before we look at the Control Rig asset that is creating the effect, let’s take a look at how it’s triggered. The robot’s Animation Blueprint, AncientOne_AnimBP, controls the behavior with a simple state machine, then passes the pose into a Control Rig node before outputting the final pose. We also gather a few variables to pass into the Control Rig node, and drive the weight used to blend the Control Rig node itself on and off via the AlphaCR float variable.
The Arm Aiming is controlled by the Animation Blueprint, after the state machine has evaluated.
With the AlphaCR variable, we can control when the full-body effect is applied, and how quickly it blends on or off with the Interp properties of the Control Rig anim node. Selecting the node and looking in the Details panel, we can see the Alpha Scale Bias Clamp section. A great trick we like to use is having different speeds for the blend-on versus the blend-off behavior. Adjusting Interp Speed Increasing and Interp Speed Decreasing enables us to do just that: the Control Rig blends on quickly but blends off slowly. This is reflected in the node weight pin description: FInterp(Alpha, (10:2)).
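The asymmetric blend can be sketched in a few lines. The engine performs this inside the anim node’s Alpha Scale Bias Clamp settings rather than in user code; the 10/2 speeds below mirror the FInterp(Alpha, (10:2)) readout on the weight pin.

```cpp
#include "CoreMinimal.h"

// Blend the Control Rig node's weight toward its target using a fast speed when increasing
// (blend on) and a slow speed when decreasing (blend off).
static float UpdateControlRigAlpha(float CurrentAlpha, float TargetAlpha, float DeltaTime)
{
    const float InterpSpeedIncreasing = 10.f; // quick blend on
    const float InterpSpeedDecreasing = 2.f;  // slow blend off
    const float Speed = (TargetAlpha > CurrentAlpha) ? InterpSpeedIncreasing : InterpSpeedDecreasing;
    return FMath::FInterpTo(CurrentAlpha, TargetAlpha, DeltaTime, Speed);
}
```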

The Reach Amount variable provides a quick 0-1 float value we can set to adjust how much the robot reaches toward our protagonist, Echo, when firing. It’s not driven by gameplay; it’s just a value we set in the AnimBP for extra art-directable control. The Laser Location variable represents the position in the world where the laser is hitting the ground. This value is set directly from BP_AncientOne and tells the FBIK in Control Rig what to reach toward.

We can open up the Control Rig by double-clicking the node in the AnimBP. The graph is very simple at first glance. The LaserLocation is converted from world space into the robot’s local space, then used to drive a control called actor_transform. This control represents where the robot is firing. By bypassing the Set Translation node, we can move the control manually to test and debug the robot directly in the Control Rig editor.
The Arm Aim Control Rig uses functions and branching, allowing for quick testing.
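The space conversion itself is a one-liner in transform math. A minimal sketch, assuming the robot’s actor transform is what the rig treats as its local space:

```cpp
#include "CoreMinimal.h"

// Take the world-space laser hit location (set from BP_AncientOne) into the robot's local
// space before it drives the actor_transform control.
static FVector LaserLocationToRigSpace(const FTransform& RobotActorTransform, const FVector& LaserWorldLocation)
{
    return RobotActorTransform.InverseTransformPosition(LaserWorldLocation);
}
```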
By default, the Control Rig is going to use Full-Body IK, but we did include a BasicIK variant to represent a simplified approach. Double-clicking the AimWithFBIK node takes us into the function creating the full-body IK effect.
Full-Body IK Control Rig function with comments for each logical section.
The AimWithFBIK function can be broken down into several pieces: gathering data, adjusting the hips, calculating the reach amount, applying the Full-Body IK, and finally adjusting the hand orientation.

A quick preface: Creating this setup was a journey of discovery and a few pieces were left in but are unused.

The first node we encounter in the function is the Entry node, followed by a Sequence node. We often use Sequence nodes to help break up functionality into logical steps, improving readability and ease of debugging. The ‘A’ pin of the Sequence node handles gathering data and adjusting the hips. To gather the bone transform data, we use a Get Transform node and feed the Transform value into a Set Transform node on a control. Creating controls and setting their transforms to the bone transforms is a great way to store the incoming animated pose from the Animation Blueprint: what is occurring is very visual, and it’s easy to debug when something goes wrong. We store several bone transforms on controls, but the ones we end up using are the foot_l, foot_r, and hand_r bones and their controls.

Following the Get/Set Transform pairings to store our bones on controls is an Offset Transform node and a little bit of logic to ease the pelvis forward. This essentially fakes the robot leaning forward as the laser fires. As the laser blast fires away from the robot, a small amount of that movement is multiplied into that Offset Transform.
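A rough sketch of that lean, assuming the offset is applied in the pelvis’ local frame and using a made-up LeanScale factor:

```cpp
#include "CoreMinimal.h"

// Multiply a small fraction of the laser's outward travel into an offset applied on top of
// the animated pelvis, faking the robot leaning into the blast.
static FTransform LeanPelvisForward(const FTransform& AnimatedPelvis, const FVector& LaserTravelDirection,
                                    float LaserTravelDistance, float LeanScale /* e.g. 0.02, assumed */)
{
    FTransform Offset = FTransform::Identity;
    Offset.SetLocation(LaserTravelDirection * LaserTravelDistance * LeanScale);

    // Equivalent of the Offset Transform node: compose the small offset onto the animated pose.
    return Offset * AnimatedPelvis;
}
```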

Moving down the Sequence node to the B pin, we start refining the data for the full-body IK. The theory behind this is that we want to pull the right arm toward Echo when the laser is firing, but without pulling the hand down toward the ground. To accomplish this, we combine the Z position of hand_r_ctrl with the X and Y positions of actor_transform. We can then interpolate between the incoming animated hand transform, stored on hand_r_ctrl, and the new transform we just calculated. The output transform from the Interpolate represents our new goal pose for the right hand, which gets fed right into the Full-Body IK solver.
Calculating the reach pose transform, clamping to a safe pose with a Remap.
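Put as math, the goal transform fed into the solver looks roughly like the sketch below; ReachAmount is the AnimBP variable described earlier, and the clamping/Remap shown in the figure is omitted for brevity.

```cpp
#include "CoreMinimal.h"

// Keep the animated hand height (Z) but take the target's X/Y, then blend from the animated
// hand transform toward that goal by ReachAmount.
static FTransform ComputeReachGoal(const FTransform& AnimatedHand,    // stored on hand_r_ctrl
                                   const FTransform& ActorTransform,  // laser hit, in rig space
                                   float ReachAmount)                 // 0-1, set in the AnimBP
{
    FVector GoalLocation = ActorTransform.GetLocation();
    GoalLocation.Z = AnimatedHand.GetLocation().Z; // don't pull the hand down toward the ground

    FTransform Goal = AnimatedHand;
    Goal.SetLocation(FMath::Lerp(AnimatedHand.GetLocation(), GoalLocation, ReachAmount));
    return Goal; // fed into the Full-Body IK solver as the right-hand goal
}
```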
Now that all the data has been gathered and refined, we finally feed it all into the solver. It’s important to take the time to ensure any data gathered is clean and providing expected results before feeding it into any solver.
The Full-Body IK solver uses the stored animated transforms of the feet and our calculated hand pose transform to adjust the entire body pose.
Rather than explaining each of the solver settings here, head over to our recently recorded livestream: Motion Warping and Full-Body IK | Inside Unreal. In this video, we deep-dive into the inner workings of the solver and how it is used on The Ancient One. At a high level, all we want to do is keep the feet planted and tweak the rest of the body to lean toward Echo, while reaching forward with the right hand. By adjusting some bone settings on the solver, we’re able to refine which bones can twist and move as the solver tries to reach the transforms we’ve provided.

The last piece of this function is aiming the hand itself at the laser hit location, stored as the actor_transform control. Rather than relying on the solver to get the hand orientation just right, we let the solver do the rough posing pass and refine the pose after the solver’s finished. In this way, we use the solver for its strengths and maintain art-directability without a lot of frustration. For the Aim setup, we’re just identifying the Axis for the palm of the hand and providing the actor_transform as the target. 
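The hand refinement boils down to a single aim rotation. Here is a hedged sketch of the equivalent math; the project uses a Control Rig Aim setup, and the palm axis passed in is assumed to be whatever local axis points out of the palm.

```cpp
#include "CoreMinimal.h"

// Rotate the post-solve hand so its palm axis points at the laser hit location
// (stored on the actor_transform control).
static FQuat AimPalmAtTarget(const FTransform& HandGlobal, const FVector& TargetLocation, const FVector& PalmAxisLocal)
{
    const FVector CurrentAim = HandGlobal.TransformVectorNoScale(PalmAxisLocal).GetSafeNormal();
    const FVector DesiredAim = (TargetLocation - HandGlobal.GetLocation()).GetSafeNormal();

    // Minimal rotation taking the current palm direction onto the desired direction,
    // applied on top of the solver's rotation.
    return FQuat::FindBetweenNormals(CurrentAim, DesiredAim) * HandGlobal.GetRotation();
}
```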

Tell your stories

Animation is about telling a story and at Epic, we’re working hard to build tools for storytellers. We’re just now seeing what Sequencer can do with character animation, whether hand-keyed or with motion capture. With tools like Control Rig and Full-Body IK, combined with a runtime engine, the stories we tell will be more dynamic, immersive, and impactful. 

Although the character rig and runtime setup can be complex, by building in layers and compartmentalizing the logic, we were able to build a character that was easier to debug, more flexible and procedurally art-directable. Every part we built was designed to empower the artists and designers; Control Rig provided the dials and switches and was just one tool in a well-stocked toolbox. 

We go into more detail about the Full-Body IK solver and the graph implementation (along with my blunders along the way) in our recent Inside Unreal livestream.

UE5 is still in Early Access, and rigging and animation are capabilities we’re dedicated to getting right. We welcome your feedback to help us make Unreal Engine 5 the best it can be. Join us on the forums or even ping me on Twitter.

To learn more about the Valley of the Ancient animation features, along with the rest of the amazing work that went into the project, you can read the guided tour on our Content and Samples page.


Additional Links:
Meerkat Sample Project
MetaHumans
Animating with Control Rig in 4.26 | Inside Unreal
Evolving Animating with Unreal Engine
