February 13, 2019
Digital humans: 3Lateral cracks the code for real-time facial performance
Listen to the full podcast below or read on for an overview, then visit our virtual production hub for more podcasts, videos, articles, and insights.
There’s a good chance you’ve already seen 3Lateral’s work in action. The company was behind the facial capture and animation for AAA games like Marvel's Spider-Man, Activision’s Call of Duty, and Rockstar Games’ Grand Theft Auto and Red Dead Redemption. With Epic Games, 3Lateral has worked on real-time digital humans in projects such as Siren, Hellblade, and Osiris Black with Andy Serkis.
3Lateral founder Vladimir Mastilovic has been fascinated by facial animation since he saw the groundbreaking 1989 film The Abyss, in which director James Cameron's team created film history's first computer-generated face, formed entirely of water.
“I was completely mesmerized with that,” says Mastilovic, “and I wanted to know more about how that was done.”
Fifteen years later, Mastilovic would go on to found 3Lateral in his native Serbia, and forever change the way facial motion capture and performance are approached in the industry.
From micro expressions to true performance
Underlying 3Lateral’s work is the Facial Action Coding System (FACS), which categorizes facial muscle movements. FACS was developed by researcher Dr. Paul Ekman, whose studies in the 1960s of cultures untouched by modernization revealed that all human beings use the same facial expressions to express emotion.
Ekman’s research led him to develop the science of micro expressions: brief, involuntary facial contractions common to all peoples of the world. “I love the fact that there is a nonverbal universal language between every human being,” says Mastilovic.
While FACS has long been used by the VFX community to inform manually keyframed facial animation, 3Lateral has taken things a step further. Part of 3Lateral’s secret sauce is “rig logic,” which goes beyond traditional facial capture systems to retain the essence of the live actor’s performance while targeting it realistically to a digital character. As an actor performs live on camera, 3Lateral’s system translates the actor’s facial movements to FACS in real time, which then tells the target facial rig what to do.
To inform the transfer of human performance to a completely different facial structure, 3Lateral introduced the notion of digital DNA. “DNA is a set of measurements that we are observing on a particular face,” explains Mastilovic. “The DNA file contains the offset from the general model that is specific to a person. And that DNA file then becomes the key for translating geometric data into semantic data.”
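The pipeline described above can be sketched in code. This is a hypothetical minimal illustration, not 3Lateral's actual implementation: it assumes each identity's "DNA" is a neutral shape plus per-action-unit deformation offsets, solves captured geometry into FACS-style activation weights, and replays those same weights on a different face. All names, shapes, and the least-squares solve are assumptions for the sake of the example.

```python
from dataclasses import dataclass
import numpy as np

N_AUS = 5    # toy number of FACS action units (real systems track many more)
N_VERTS = 4  # toy face mesh with 4 vertices

@dataclass
class FaceDNA:
    """Hypothetical per-identity data: neutral geometry plus per-AU offsets."""
    neutral: np.ndarray    # (N_VERTS, 3) resting face shape
    au_deltas: np.ndarray  # (N_AUS, N_VERTS, 3) offsets at full AU activation

def solve_au_weights(frame_verts: np.ndarray, dna: FaceDNA) -> np.ndarray:
    """Solve which AU activations best explain a captured frame (geometry -> semantics)."""
    A = dna.au_deltas.reshape(N_AUS, -1).T        # (3*N_VERTS, N_AUS) basis matrix
    b = (frame_verts - dna.neutral).ravel()       # observed offset from neutral
    w, *_ = np.linalg.lstsq(A, b, rcond=None)     # least-squares fit of activations
    return np.clip(w, 0.0, 1.0)                   # activations live in [0, 1]

def apply_au_weights(weights: np.ndarray, dna: FaceDNA) -> np.ndarray:
    """Drive a (possibly very different) face with the same semantic AU code."""
    return dna.neutral + np.tensordot(weights, dna.au_deltas, axes=1)
```

Retargeting a performance from an actor to an alien would then be two calls: `solve_au_weights(frame, actor_dna)` followed by `apply_au_weights(weights, alien_dna)`. Because the weights are semantic (per action unit) rather than geometric, the two faces never need to share a topology.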
3Lateral recently showcased this technology in the Osiris Black project, where actor Andy Serkis’s performance of a Shakespearean monologue was targeted to an alien character in real time. The video demonstrates the power of this virtual production technique: although Serkis’s facial features are very different from the target character’s, the realism, emotion, and power of the original performance remain. “That's a technology that translates the performance into this nonverbal universal language,” says Mastilovic. “We basically just transferred his [Serkis’s] performance into FACS code and then just loaded it on the alien.”
The result stunned the industry, and opened up a new world for real-time facial performance capture in virtual production.
Beyond the rigs: The future with Epic Games
Epic acquired 3Lateral last month. Mastilovic, with his lifelong passion for realizing lifelike human performance in CG, looks forward to expanding 3Lateral’s R&D, and especially wants to develop tools that can be made publicly available.
He laments that due to increasing demand for facial capture and animation, 3Lateral has had to turn away more than 95% of the projects offered to it in the last year. Mastilovic hopes that once these tools are publicly available, no one will have to walk away disappointed.
Mastilovic sees Unreal Engine as a tool for much more than gaming, with future applications in machine learning, communication, personal self-improvement, social research, and more. Within filmmaking, he sees immediate uses in virtual production beyond simple facial capture. For example, parts of an actor’s performance can be remixed in real time to try out different emotional pacing for a scene.
“The applications are too long to list,” he says. “It creates a wonderful new world of opportunities where the users of this tech will inspire us back and show us the ways that this can be used, that we didn’t even imagine.”
This podcast interview with Mastilovic is part of our Visual Disruptors series. Visit our Virtual Production page to get more great podcasts, videos, and articles on virtual production!