July 31, 2019
Virtual production: performance capture for everyone
The goal of performance capture, as the name suggests, is to transfer the actor’s unique acting performance to the character, bringing a level of realism and personality that would be extremely time-consuming (and perhaps even impossible!) to achieve with keyframing. Real-time performance capture adds the element of immediacy, where the actor’s entire performance is instantly transferred to the CG character to create a real-time animated sequence.
From the neck down, real-time motion capture and retargeting has already made its mark in projects ranging from the film Welcome to Marwen to the Fortnite cinematic trailer. While this technique opened new doors for creativity and collaboration on set, the final versions of the characters still needed more work after the mocap session: facial animation had to be done in a separate pass and combined with the body motion afterward.
Adding real-time facial capture and retargeting to the mix opens an entirely new door: as the actor performs, the filmmaker can see the final-pixel version of the entire CG character in real time. Companies like Cubic Motion, 3Lateral, and Digital Domain have been deep-diving into facial capture R&D over the past few years, and they also develop custom performance capture solutions for their individual clients.
But what if you want to have your own system in-house, or you simply want to try it out?
If hearing about real-time performance capture has you itching to give it a go, you’re in luck. Here are a couple of facial capture systems you can incorporate into your motion capture pipeline to create a performance capture solution that works anywhere you like.
Apple iPhone X

At the 2018 SIGGRAPH conference, the winner of the Real-Time Live! competition was Kite & Lightning with their presentation Democratizing MoCap: Real-Time Full-Performance Motion Capture with an iPhone X, Xsens, IKINEMA, and Unreal Engine. This project showed a real-time UE4 character, “Beby,” driven by a combination of body and facial capture.
The facial capture device was an iPhone X. Cory Strassburger, Co-Founder of Kite & Lightning, performed with the iPhone suspended from a paintball helmet as a head-mounted camera (HMC). For body capture, Strassburger chose an Xsens MVN mocap suit.
This approach to facial capture has since been repeated by others and is now supported in UE4: a facial AR sample is currently included as a free example under the Learn tab in the Epic Games Launcher, using the character asset from the demo project A Boy and His Kite.
Apple’s ARKit, the iPhone’s built-in TrueDepth camera, and visual-inertial odometry work together to stabilize the footage and produce more than 50 blend shape coefficients that can drive a virtual face. As a result, people can simply hand-hold their iPhones and drive faces in real time in UE4 without attaching the phone to a helmet at all. Apple uses the same facial capture approach to power live FaceTime chats with Animoji and Memoji characters.
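To make the retargeting step concrete, here is a minimal Python sketch of how per-frame blend shape coefficients might be remapped onto a character rig. The capture-side keys mirror real ARKit blendShapes names (such as jawOpen and eyeBlinkLeft); the rig-side morph target names and gain values are purely illustrative assumptions, not part of any particular pipeline.

```python
# Sketch: remapping ARKit-style blend shape coefficients (0.0-1.0)
# onto a character rig's morph targets. Keys like "jawOpen" mirror
# real ARKit blendShapes names; the rig-side names and gains below
# are hypothetical.

def retarget(coefficients, mapping, gains=None):
    """Map capture-side blend shape weights to rig-side morph targets."""
    gains = gains or {}
    rig_weights = {}
    for src, value in coefficients.items():
        target = mapping.get(src)
        if target is None:
            continue  # this rig has no corresponding morph target
        gain = gains.get(src, 1.0)
        # Clamp so an over-driven coefficient can't break the rig
        rig_weights[target] = max(0.0, min(1.0, value * gain))
    return rig_weights

# One frame of capture data (ARKit-style keys, values in 0..1)
frame = {"jawOpen": 0.42, "eyeBlinkLeft": 0.9, "browInnerUp": 0.15}

# Hypothetical rig morph target names
mapping = {"jawOpen": "Jaw_Open", "eyeBlinkLeft": "Blink_L",
           "browInnerUp": "Brow_Up_Inner"}

print(retarget(frame, mapping, gains={"jawOpen": 1.2}))
```

Per-source gains are one simple way to tune a solve to an individual actor: an actor with a subtle jaw can get a gain above 1.0 so the character still reads clearly.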
A handheld iPhone X, or one strapped to an ordinary helmet, is a low-cost and accessible facial capture solution. Couple it with body mocap, and you have a performance capture system you can use pretty much anywhere.
Cubic Motion Persona

Cubic Motion has been a part of some of the most significant advances in UE4 performance capture. From the breakout Hellblade: Senua’s Sacrifice SIGGRAPH demo to the incredibly lifelike Siren, Cubic Motion has provided the real-time markerless tracking and solving that drives these animation rigs.
The company regularly produces hours of solved facial animation for its game clients, and is also hired as an outside specialist on VFX projects. Its computer-vision and facial-animation experts know how to track and solve faces quickly and extremely well.
Now Cubic Motion is offering these commercial services as a turnkey package called Persona. The system provides hardware, software, and custom solvers as a complete high-end solution for facial capture and real-time performance.
While the Apple iPhone approach is a great facial capture solution, its results are not as accurate as those achievable with dedicated hardware. The stability of Persona’s head-mounted camera means facial expressions can be tracked more precisely, and its customized solvers make it more responsive to each actor’s individual facial movements.
To develop Persona, Cubic Motion took their facial-tracking service, one of the most advanced in the world, and packaged it with new custom hardware. The system supports multiple actors, each wearing a high-quality 3D-printed HMC, with the on-body animation system enabling a completely untethered performance that lets the actors move freely around the stage or volume.
Cubic Motion also provides the same service they give to their AAA game clients—at the start of each project, you send a range of motion (ROM) clip to them along with your target animation rig. The team then produces a custom solver for each actor.
Each actor wears a small on-body computer in addition to the HMC. The computer performs a full track and solve, streaming the data back wirelessly on set. Persona also includes a management system that runs on a main computer at the side of the stage, from which the crew handles all configuration and monitoring. The data streams are fed into the Unreal Engine plugin to drive the real-time facial animation.
A key design aspect is the reduced data that flows from each actor’s Persona computers to the main Unreal plugin. As the full video footage is not being sent, the system is robust and able to work without special cabling or transmitters.
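As a rough illustration of why sending solved values beats sending video, the Python sketch below packs one frame of animation data into a few hundred bytes and fires it over UDP. The packet layout (uint32 frame id, uint16 count, then little-endian float32 weights) and the address are assumed for illustration, not Cubic Motion’s actual protocol.

```python
# Sketch: streaming solved animation values instead of raw video.
# Each frame, the on-body computer sends only a small set of float
# weights. The packet layout here is an assumed format.
import socket
import struct

def pack_frame(frame_id, weights):
    """Serialize one frame of solver output into a compact packet."""
    header = struct.pack("<IH", frame_id, len(weights))
    body = struct.pack(f"<{len(weights)}f", *weights)
    return header + body

# 150 solved rig-control values for one frame
packet = pack_frame(1, [0.0] * 150)
print(len(packet), "bytes")  # 606 bytes, versus megabytes for raw video

# Fire-and-forget UDP send to the stage-side machine (address assumed)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("127.0.0.1", 9000))

# Engine-plugin side: unpack the same packet
frame_id, count = struct.unpack_from("<IH", packet)
weights = struct.unpack_from(f"<{count}f", packet, struct.calcsize("<IH"))
```

At a few hundred bytes per frame, even several actors streaming at 60 fps stay comfortably within ordinary Wi-Fi bandwidth, which is what makes the untethered, no-special-cabling design practical.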
Persona is not for everyone, but it does give VFX and games companies the means to achieve exceptionally high-end and advanced real-time facial animation in-house. While Cubic Motion does not exclusively use UE4, they are quick to point out that nearly all their high-end customers choose UE4 for this style of work.
Check out these solutions for yourself, then download Unreal Engine and see what it can do for your projects. And be sure to visit our Virtual Production page for podcasts, videos, and articles on virtual production!