New Live Link Face iOS app now available for real-time facial capture with Unreal Engine

Ryan Mayeda | July 8, 2020

With each release of Unreal Engine, we aim to bring features that feel like the future of virtual production into the present for filmmakers to easily pick up and use. In the 4.25 release, this meant building something that ships alongside the engine rather than inside it: a new iOS app we are proud to unveil, Live Link Face for Unreal Engine, available starting today on the App Store.
Live Link Face streams high-quality facial animation in real time from your iPhone directly onto characters in Unreal Engine. The app leverages Apple’s ARKit and the iPhone’s TrueDepth front-facing camera to interactively track a performer’s face, transmitting that data directly to Unreal Engine via Live Link over the network. Designed to excel both on professional capture stages with multiple actors in full motion capture suits and at a single artist’s desk, the app delivers expressive and emotive facial performances in any production situation.
Live Link Face is ready for use in professional performance capture pipelines.
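For a sense of the data involved, here is a minimal sketch of where that facial data originates on the iPhone: ARKit’s face tracking reports per-frame blendshape coefficients from the TrueDepth camera, which is the kind of data Live Link Face streams to the engine. This is an illustration only, not the app’s actual implementation, and the Live Link networking to Unreal Engine is not shown.

```swift
import ARKit

// Sketch: ARKit face tracking reports a dictionary of blendshape
// coefficients (each 0.0–1.0) for every frame from the TrueDepth
// camera. Live Link Face streams values like these to Unreal Engine
// via Live Link; that networking layer is omitted here.
final class FaceCaptureSketch: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth-equipped device.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // One of the ~50 ARKit curves: how far the jaw is open this frame.
            let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0
            print("jawOpen:", jawOpen)
        }
    }
}
```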
Collaborative virtual production is a particular emphasis of the app: multicast networking streams Live Link data to every machine in a Multi-User Editor session simultaneously to minimize latency. Robust timecode support and precise frame accuracy enable seamless synchronization with other stage components such as cameras and body motion capture. Live Link Face also integrates with Tentacle Sync, connecting to the stage master clock over Bluetooth so that its recordings line up perfectly in editorial with every other device recording from the shoot. Sophisticated productions can also take advantage of the app’s support for the OSC (Open Sound Control) protocol, which lets external applications control the app remotely, for example initiating recording on multiple iPhones with a single click or tap.
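As an illustration of that remote-control path, the sketch below sends a bare OSC message over UDP from an external controller to the phone. The "/RecordStart" address, the host 10.0.0.42, and port 8000 are assumptions made for this example; the actual OSC commands and listen port are configured in the app and covered in Epic’s documentation.

```swift
import Foundation
import Network

// OSC strings are null-terminated and padded to a 4-byte boundary.
func oscPadded(_ string: String) -> Data {
    var data = Data(string.utf8)
    data.append(0)
    while data.count % 4 != 0 { data.append(0) }
    return data
}

// Build and send a minimal OSC message (no arguments) over UDP.
// The address, host, and port passed in below are illustrative
// assumptions, not documented Live Link Face values.
func sendOSC(address: String, to host: String, port: UInt16) {
    var packet = oscPadded(address)
    packet.append(oscPadded(","))   // empty OSC type-tag string
    let connection = NWConnection(host: NWEndpoint.Host(host),
                                  port: NWEndpoint.Port(rawValue: port)!,
                                  using: .udp)
    connection.start(queue: .global())
    connection.send(content: packet, completion: .contentProcessed { _ in
        connection.cancel()
    })
}

// Start recording on one phone; looping the same call over a list of
// phone addresses is how one click could trigger a whole stage.
sendOSC(address: "/RecordStart", to: "10.0.0.42", port: 8000)
```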
Live Link Face’s feature set goes beyond the stage and offers additional flexibility for other key use cases. Streamers benefit from the app’s ability to adapt when a performer is sitting at a desk rather than wearing a head-mounted rig and mocap suit: Live Link Face can include head and neck rotation data as part of the facial tracking stream, giving digital avatars more freedom of movement with just the iPhone. Animators can also choose to record both the raw blendshape data (CSV) and a front-facing video (MOV), each striped with timecode, to use as reference material for the performance if further adjustments need to be made in-engine.
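For that animator workflow, here is a small sketch of reading a recorded blendshape CSV back for reference, assuming a header row with a timecode column followed by one column per blendshape curve. The file name and column names are hypothetical; use whatever the app actually writes for your take.

```swift
import Foundation

// Hypothetical file name; substitute the CSV the app recorded for your take.
let url = URL(fileURLWithPath: "MySlate_1_iPhone.csv")
let lines = try String(contentsOf: url, encoding: .utf8)
    .split(whereSeparator: \.isNewline)

// Assumed layout: a header row naming a timecode column plus one column
// per blendshape curve.
let header = lines[0].split(separator: ",").map(String.init)

for row in lines.dropFirst() {
    let fields = row.split(separator: ",").map(String.init)
    let frame = Dictionary(uniqueKeysWithValues: zip(header, fields))
    // Inspect a single curve per frame before adjusting it in-engine.
    print(frame["Timecode"] ?? "?", frame["JawOpen"] ?? "?")
}
```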
Addy Ghani, Verizon Media’s RYOT
"The Live Link Face app harnesses the amazing facial capture quality of iPhone ARKit and turns it into a streamlined production tool,” said Addy Ghani, Director of Animation Technology at Verizon Media's RYOT. “At RYOT we believe in the democratization of capture technology and real-time content, and this solution is perfect for a creator at home or a professional studio team like ours."
Live Link Face for Unreal Engine app
Continued democratization of real-time tools for virtual production is one of the primary goals of Unreal Engine development. With Live Link Face, we aim to make facial capture easier and more accessible to creators going forward and are excited to see what you make! To get started, download the app today and learn more about it in our documentation.
