September 13, 2018

Multi-user collaboration and Unreal Engine real-time production

By Dana Cowley

At SIGGRAPH 2018, real-time filmmaking with Unreal Engine took another giant leap forward. As part of the Real-Time Live! curated showcase, Epic and ILMxLAB presented “Ray-Traced, Collaborative Virtual Production in Unreal Engine.” The workflow incorporated performance capture, virtual reality, multi-user collaborative editing and final pixels through real-time ray tracing to introduce a new pipeline that yields photorealistic, feature-quality results. To top it off, this feat in real-time production ran silky smooth on a single Quadro RTX 6000 GPU, the latest professional graphics card from NVIDIA.
 
Epic and ILMxLAB present at Real-Time Live! (presentation starts at 37:33)

Set in the Star Wars™ universe, the cinematic shows two highly reflective stormtroopers and a dazzling Captain Phasma interacting in the interior of a First Order ship. The collaboration is an extension of the “Reflections” real-time ray tracing demo, which ran on an NVIDIA DGX Station equipped with four NVIDIA Tesla V100 GPUs when it was first revealed just five months ago; it now hits the same frame rate on a single Quadro RTX 6000 GPU that costs a fraction of the price. With hardware improving at such a fast pace, we’re able to advance real-time technology more rapidly than we ever thought possible.


Our Real-Time Live! presentation walked through how to integrate the various components of a cinematic shoot: sets, props, lights, cameras, motion-captured characters and finalized animations. It also showed how to direct and produce a final shot live onstage.

During the demonstration, ILMxLAB Director of Immersive Content Mohen Leo directed an actor, operated a camera and adjusted lighting in the shot, while Epic Games Creative Cinematic Director Gavin Moran and Senior Cinematic Designer Grayson Edge served as mocap actor and technician, respectively, performing the production steps in real time.

“The nice thing with multi-user functionality is that a whole team of people can work together in the same scene at the same time, just like we do on a film set,” Leo said.

Moran wore a Vive Pro and an Xsens MVN suit to step into virtual reality and act against the other CG characters in the “Reflections” scene. As his performance was captured, IKINEMA LiveAction retargeted the body data onto the stormtrooper directly in Unreal Engine.
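To make the retargeting step concrete, here is a minimal sketch of the general idea: copy per-joint local rotations across a bone-name map and scale root motion by hip height. All types and names below are hypothetical illustrations; this is not IKINEMA LiveAction’s solver, which additionally handles differing bind poses, bone roll and full-body IK constraints.

```cpp
// Conceptual sketch of naive motion retargeting: copy per-joint local
// rotations from a captured skeleton onto a target character whose
// joints are mapped by name. All types/names are hypothetical; a real
// solver (e.g. IKINEMA LiveAction) does far more than this.
#include <iostream>
#include <map>
#include <string>

struct Quat { float x = 0, y = 0, z = 0, w = 1; };   // joint rotation
struct Vec3 { float x = 0, y = 0, z = 0; };          // root position

struct Pose {
    Vec3 rootTranslation;                  // pelvis position in meters
    std::map<std::string, Quat> joints;    // local rotation per joint
};

// Map from capture-suit joint names to the target rig's bone names.
using BoneMap = std::map<std::string, std::string>;

Pose Retarget(const Pose& source, const BoneMap& boneMap,
              float sourceHipHeight, float targetHipHeight)
{
    Pose out;

    // Scale root motion by the hip-height ratio so a tall performer
    // doesn't make a shorter character glide across the floor.
    const float scale = targetHipHeight / sourceHipHeight;
    out.rootTranslation = { source.rootTranslation.x * scale,
                            source.rootTranslation.y * scale,
                            source.rootTranslation.z * scale };

    // Copy local joint rotations across the name map. A production
    // solver would also compensate for differing bind poses and apply
    // IK constraints for feet and hands; this sketch ignores all that.
    for (const auto& [srcBone, dstBone] : boneMap) {
        auto it = source.joints.find(srcBone);
        if (it != source.joints.end())
            out.joints[dstBone] = it->second;
    }
    return out;
}

int main() {
    BoneMap map = { {"Hips", "pelvis"}, {"LeftArm", "upperarm_l"} };

    Pose captured;
    captured.rootTranslation = { 0.0f, 0.0f, 1.0f };
    captured.joints["LeftArm"] = { 0.0f, 0.707f, 0.0f, 0.707f };

    Pose result = Retarget(captured, map, /*source*/ 1.0f, /*target*/ 0.9f);
    std::cout << "Retargeted joints: " << result.joints.size() << "\n";
}
```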

Edge recorded the performance using Sequence Recorder and then edited the shot and dropped it into the master sequence using the Sequencer cinematic tool. He and Leo also worked collaboratively to adjust lighting and position cameras, showing how the ray-traced reflections update instantly as the light is moved and adjusted. Edge then played back the pixel-perfect, modified scene in real time in front of the entire crowd.  
 
The original “Reflections” cinematic created with experimental real-time ray tracing

“With multi-user collaboration and editing, you can have set decorators, camera operators, actors, DPs and directors all working together just like they would on a film set,” said Epic Games CTO Kim Libreri. “Whether in VR or on a workstation, everyone is working together in the same environment, all in real time.”

Because creative teams can now instantly produce images that are near-final or final within a live production environment, they can validate their work without waiting for offline renders. Unreal Engine turns virtual production into real-time production.

This points to how Unreal Engine could become the cornerstone of a virtual pipeline where all production roles—director, set designer, lighting designer, actor, mocap tech, camera operator, SFX artist—can work in parallel in real time to produce finished pixels right out of the engine.

A number of factors have converged to make ray tracing practical for real-time production. Microsoft’s DirectX Raytracing (DXR) API brings ray-tracing capabilities to DirectX 12; denoising algorithms, which are essential for efficient ray tracing, are faster and more sophisticated than ever; and hardware has advanced to the point where the NVIDIA Quadro RTX 6000, for example, can handle 10 billion rays per second.
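To put that throughput in perspective, here is a back-of-the-envelope budget showing why denoising remains essential even on such hardware. The frame rate and resolution below are assumptions for illustration, not figures from the demo.

```cpp
// Rough ray budget, assuming the ~10 billion rays/sec figure quoted
// above. Real budgets vary with scene and shader cost, but the
// arithmetic shows why denoisers matter: only a modest number of rays
// per pixel fit into a single cinematic frame.
#include <cstdio>

int main() {
    const double raysPerSecond = 10e9;            // ~10 Gigarays/sec
    const double fps           = 24.0;            // cinematic frame rate
    const double pixels        = 1920.0 * 1080.0; // 1080p frame

    const double raysPerFrame = raysPerSecond / fps;
    const double raysPerPixel = raysPerFrame / pixels;

    std::printf("Rays per frame: %.0f\n", raysPerFrame); // ~417 million
    std::printf("Rays per pixel: %.0f\n", raysPerPixel); // ~200

    // Offline film renderers may trace thousands of samples per pixel;
    // denoising lets a real-time renderer get away with far fewer.
}
```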

Although the ray-traced features shown to date are still experimental, we’re working to integrate them into Unreal Engine’s feature set, and we’ll also ship the multi-user functionality in a future release. In keeping with tradition, all of this will be available to the entire Unreal Engine development community, for free in the main branch, with source code access.

At SIGGRAPH, we also debuted “The Speed of Light,” a collaboration across Epic, Porsche and NVIDIA, featuring the gorgeous Porsche 911 Speedster Concept. The short film and interactive lighting configurator demonstrate real-time ray tracing, including a very early implementation of ray-traced diffuse global illumination, to show how Unreal Engine can produce final pixels for film, television and more.

Special thanks to NVIDIA for their engineering contributions to our real-time ray tracing demonstrations. Our plan is to release code for real-time ray tracing features on GitHub by the end of this year, and to ship support in a binary version next year.

If you’d like to learn more about the making of “Reflections” and “Ray-Traced, Collaborative Virtual Production in Unreal Engine,” check out this SIGGRAPH interview with ILMxLAB’s Mohen Leo and Epic’s Kim Libreri and Gavin Moran.

Try out real-time production for yourself and download Unreal Engine for free. We’d love to see what you create, and you can share your work with us on social media by tagging @UnrealEngine and #UE4.