As our studio partners around the world adjust to working from home, we are seeing exciting new use cases for virtual production—with key features in Unreal Engine facilitating unprecedented productivity and collaboration, even with dispersed teams.
Introduced in Unreal Engine 4.22 and continuously extended and refined since then, our virtual production toolset was originally developed for on-set use, enabling different departments to collaborate simultaneously in both physical and digital environments. Virtual production workflows enable the director, cinematographer, production designer, visual effects supervisor, and other key creative stakeholders to work together in real time on lighting, virtual location design, and other aspects of a scene to achieve the final look on set, and even to capture accurate lighting and visual effects in camera. Disney’s The Mandalorian and HBO’s Westworld and Run are just a few of the recent productions that have used Unreal Engine for virtual production.
Over the past few months, the extended production shutdown presented a unique opportunity for us to reimagine the possibilities of these existing features and of virtual production workflows in general. Even without a set or an LED wall as the final output, creative teams can still leverage tools like Multi-User Editing and Virtual Scouting to collaboratively develop a scene from the safety of their respective shelter-in-place locations.
This goes far beyond remote direction via video conference—studios such as Netflix and BRON Digital are using this new process to let department leads collaborate from home and previsualize content in real time. The result is an innovative form of remote production that was not previously possible with traditional digital creative tools, and one that is solving key problems today amid shuttered stages and travel restrictions.
Now, creators can visually establish as much as possible ahead of time so that crews can hit the ground running for live-action production down the line.
"The partnership between Epic and our virtual production initiative, NLAB, unlocked the ability to connect a DP in New York with VFX artists in London, a director in Los Angeles with an art department in Japan, and performance capture talent with in-house animation supervisors," said Girish Balakrishnan, Director of Virtual Production at Netflix. "The power to previsualize in real-time within the same scene – even across continents – allows our productions, and the industry as a whole, to continue moving forward while talent is safely working from home.”
In this video of a multi-user production session, you can see a creative team of eight (including two motion capture performers) work together to move props and other set elements, experiment with different camera angles and lighting, and block out character performance, all in real time. For details on how to set this up, see our Remote Multi-User Editing guide.
On this virtual production project, we used Multi-User Editing so that all participants could work on the same scene simultaneously and make changes as a group, and Virtual Scouting for art direction, scene layout, blocking, directing, and lighting. Live Link streamed motion capture data live into the Multi-User session, while Take Recorder recorded the live performance capture (using suits and the MVN Animate motion capture system from Xsens) along with any per-take changes. Sequencer handled editing and playback of the recorded performance capture, as well as keyframing some of the camera animation; in addition, virtual cameras were driven inside Unreal Engine using an iPad and Live Link Face.
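At the command-line level, a Multi-User session like the one described above boils down to one reachable machine running the Multi-User server and each participant's editor connecting to it. The sketch below is illustrative only: the server name, session name, and project path are placeholders, and the Concert flags shown are those documented for Unreal Engine 4.25—consult the Remote Multi-User Editing guide for the exact options for your version and platform.

```shell
# Hypothetical sketch of launching a Multi-User Editing session (UE 4.25).
# "VPStageServer", "SceneReview", and the project path are placeholders.

# On one machine that all participants can reach, start the Multi-User server:
UnrealMultiUserServer -CONCERTSERVER=VPStageServer

# Each participant then launches the editor against that server and
# automatically joins the shared session:
UE4Editor MyProject.uproject -CONCERTSERVER=VPStageServer -CONCERTSESSION=SceneReview -CONCERTAUTOCONNECT
```

Because every participant's editor applies the same transactions from the session, changes to props, lighting, and blocking made by any one collaborator appear for everyone in real time.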
All the digital characters, environments, and assets used to create this video are available as free downloads from the Unreal Engine Marketplace. All virtual production features are freely available to all users in the current Unreal Engine 4.25 release.