New MetaHuman Animator feature set to bring easy high-fidelity performance capture to MetaHumans

We’re very excited to announce a new MetaHuman feature set coming soon.

MetaHuman Animator will enable you to use your iPhone or stereo helmet-mounted camera (HMC) to reproduce any facial performance as high-fidelity animation on MetaHuman characters. With it, you’ll be able to capture the individuality, realism, and fidelity of your actor’s performance, and transfer every detail and nuance onto any MetaHuman to bring them to life in Unreal Engine.

Photorealistic digital humans need high-quality animation to give truly believable performances, but creating it demands expertise and time that challenge even the most skilled creators. Existing performance capture solutions struggle to faithfully recreate every nuance of the actor’s performance on your character. For truly realistic animation, skilled animators often have to painstakingly tweak the results in what is typically an arduous and time-consuming process.

But that’s about to change!

Planned for release in the next few months, MetaHuman Animator will produce the quality of facial animation required by AAA game developers and Hollywood filmmakers, while at the same time being accessible to indie studios and even hobbyists. With the iPhone you may already have in your pocket and a standard tripod, you can create believable animation for your MetaHumans that will build emotional connections with your audiences—even if you wouldn’t consider yourself an animator.

MetaHuman Animator also works with any professional vertical stereo HMC capture solution, including those from Technoprops, delivering even higher-quality results. And if you also have a mocap system for body capture, MetaHuman Animator’s support for timecode means that the facial performance animation can be easily aligned with body motion capture and audio to deliver a full character performance. It can even use the audio to produce convincing tongue animation.
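To give a feel for what timecode alignment means in practice, here is a purely illustrative sketch; it is not the MetaHuman Animator API (which isn’t public in this announcement), just the general principle of converting HH:MM:SS:FF timecodes to frame counts so a facial track can be dropped onto a body-capture timeline at the right spot. The function names and frame rate are assumptions for the example.

```python
# Illustrative only: NOT the MetaHuman API. A minimal sketch of how
# timecode lets separately recorded tracks (face, body, audio) line up.

def timecode_to_frames(tc: str, fps: int) -> int:
    """Convert an HH:MM:SS:FF timecode string to an absolute frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def align_offset(face_start_tc: str, body_start_tc: str, fps: int = 30) -> int:
    """Frames to shift the facial track so it lines up with the body track."""
    return timecode_to_frames(face_start_tc, fps) - timecode_to_frames(body_start_tc, fps)

# Example: facial capture started 2 seconds and 12 frames after body capture,
# so the facial clip begins 72 frames into the body timeline at 30 fps.
offset = align_offset("01:00:02:12", "01:00:00:00", fps=30)
print(offset)  # 72
```

Because every device stamps its recording with the same clock, the offset is computed once rather than eyeballed, which is what makes assembling a full character performance from separate face, body, and audio recordings straightforward.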

So how does it work?

The key is in ensuring the system understands the unique anatomy of the actor’s face, and how that should relate to your target character. MetaHuman Animator starts by creating a MetaHuman Identity of your performer: a mesh that has the same topology and shares the same standard rig as all MetaHumans.

If you’ve used Mesh to MetaHuman, you’ll know that part of that process creates a MetaHuman Identity from a 3D scan or sculpt; MetaHuman Animator builds on that technology to enable MetaHuman Identities to be created from a small amount of captured footage. The process takes just a few minutes and only needs to be done once for each actor.

With that step complete, the MetaHuman Identity is used to interpret the performance by tracking and solving the positions of the MetaHuman facial rig. The result is that every subtle expression is accurately recreated on your MetaHuman target character, regardless of the differences between the actor and the MetaHuman’s features. In Unreal Engine, you can visually confirm the performance is coming through and compare the animated MetaHuman Identity with the footage of the actor, frame by frame.

Another benefit is that the animation data is clean. The control curves are semantically correct, meaning they sit where you would expect them to be on the facial rig controls, just as they would if a human animator had created them, making them easy to adjust for artistic purposes if required.

Since we’ve used a MetaHuman Identity from the outset, you can be confident that the animation you create today will continue to work if you make changes to your MetaHuman in the future, or if you wish to use it on any other MetaHuman.

MetaHuman Animator’s processing capabilities will be part of the MetaHuman Plugin for Unreal Engine, which, like the engine itself, is free to download. If you want to use an iPhone—MetaHuman Animator will work with an iPhone 11 or later—you’ll also need the free Live Link Face app for iOS, which will be updated with some additional capture modes to support this workflow.

Can’t wait to get started? Make sure you’re signed up for our newsletter, and we’ll let you know when MetaHuman Animator is available.
