February 20, 2020
Forging new paths for filmmakers on "The Mandalorian"
When I began to learn more about filmmaking, I was surprised to find that the critical departments on a traditional visual effects-heavy production often work in a decentralized way. Weeks or months can pass between the on-set work of key creatives and the post-production work that fully realizes their vision. That gap seemed like an opportunity where real-time game engine technology could make a real difference.
Fortunately, Jon Favreau is way ahead of the curve. His pioneering vision for filming The Mandalorian presented an opportunity to turn the conventional filmmaking paradigm on its head.
When we first met with Jon, he was excited to bring more real-time interactivity and collaboration back into the production process. It was clear he was willing to experiment with new workflows and take risks to achieve that goal. Ultimately, these early talks evolved into a groundbreaking virtual production methodology: shooting the series on a stage surrounded by massive LED walls displaying dynamic digital sets, with the ability to react to and manipulate this digital content in real time during live production. Working together with ILM, we drew up plans for how the pieces would fit together. The result was an ambitious new system and a suite of technologies, deployed at a scale that had never been attempted before, and at the fast pace of episodic television production.
By the time shooting began, Unreal Engine was running on four synchronized PCs, driving the pixels on the LED walls in real time. Three Unreal operators could simultaneously manipulate the virtual scene, lighting, and effects on the walls, and the crew inside the LED volume could also control the scene remotely from an iPad, working side by side with the director and DP. This virtual production workflow was used to film more than half of The Mandalorian Season 1, enabling the filmmakers to eliminate location shoots, capture a significant number of complex VFX shots with accurate lighting and reflections in-camera, and iterate on scenes together in real time while on set. The combination of Unreal Engine’s real-time capabilities and the immersive LED screens enabled a creative flexibility that was previously unimaginable.
Ultimately, being part of The Mandalorian Season 1 was one of the highlights of my career – the scope of what we were able to achieve with real-time technology was unlike anything else I’ve worked on. Giving filmmakers like Jon Favreau, executive producer and director Dave Filoni, visual effects supervisor Richard Bluff, cinematographers Greig Fraser and Barry "Baz" Idoine, and the episodic directors the freedom to make creative decisions on the fly, fostering live collaboration across all departments, and letting everyone see their full creative vision realized in mere seconds was a truly gratifying experience. ILM, Golem Creations, Lux Machina, Fuse, Profile, ARRI, and all of the amazing collaborators on this project were deeply inspiring to work with, and I'm proud to have been a part of it. But what's even more exciting is that the techniques and technology we developed on The Mandalorian are only the tip of the iceberg – I can’t wait to see what the future has in store.

The Mandalorian was not only an inspiring challenge, but also a powerful test bed for developing production-proven tools that benefit all Unreal Engine users. Our multi-user collaboration tools were a big part of this, along with the nDisplay system, which allows a cluster of machines to synchronously co-render massive images in real time, and our live compositing system, which let the filmmakers see real-time previews on set. We also focused on the ability to interface with the engine from external sources, such as recording take data into Sequencer or manipulating the LED wall environment from the iPad. All of these features are available now in 4.24 or coming soon in 4.25.
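To make that last point a little more concrete, here is a minimal sketch of the kind of external control this enables: driving a property in a running Unreal session over HTTP, the same general idea as adjusting the LED wall environment from an iPad. It assumes the Remote Control plugin is enabled and serving on its default port, and the object path and property name are placeholders chosen for illustration – this is not the configuration used on the show.

```python
# Minimal sketch: nudge a light's intensity in a running Unreal session over HTTP.
# Assumes the Remote Control plugin is enabled and listening on its default port;
# the object path and property below are placeholders, not values from the show.
import requests

UNREAL_HOST = "http://localhost:30010"  # default Remote Control HTTP port

payload = {
    # Placeholder path to a light component in the currently loaded level
    "objectPath": "/Game/Stage/Volume.Volume:PersistentLevel.KeyLight.LightComponent",
    "propertyName": "Intensity",
    "propertyValue": {"Intensity": 12000.0},
    "access": "WRITE_ACCESS",
}

# PUT /remote/object/property writes the property on the target object
response = requests.put(f"{UNREAL_HOST}/remote/object/property", json=payload, timeout=5)
response.raise_for_status()
print("Property updated:", response.status_code)
```

A tablet interface like the one used inside the volume would sit on top of calls like this one, sending small property updates as operators drag sliders, while nDisplay keeps every render node on the wall in sync with the updated scene.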