February 6, 2019
Unreal Engine helps unlock virtual production for Robert Zemeckis’ “Welcome to Marwen”
Kevin Baillie and Robert Zemeckis began brainstorming how to achieve the visual style and fidelity they wanted back in 2013. Their main goal was for viewers to feel the full weight of each actor’s performance through his or her doll counterpart, without losing any quality through a traditional motion-capture approach. To achieve this effect, Baillie started with a CG doll model and then added live-action facial performances captured with motion picture cameras on a mocap stage. Lighting was critical to making this technique work: integrating the actors’ live-action facial footage with their CG doll counterparts in the digital world of Marwen relied on a complete lighting match.
“This was a very non-traditional motion capture process because we’re mocapping the actors and the cameras, but we also have to light the actors as if that footage is in the final movie,” Baillie says. “We worked with a really talented team of Unreal developers who designed an iPad control system that allowed our Director of Photography, C. Kim Miles, to use intuitive controls to dial sun height and direction and how much fill light there was. He was actually able to go through and pre-light the entire Marwen section of this movie before we ever filmed a single frame of it. That allowed us to walk away from the motion-capture stage having everything we needed, knowing that our compositions were going to work at the end of the day.”

Baillie’s team built out a full version of Marwen in Unreal Engine that could be populated in real time on set by the actors’ performances as their doll counterparts. The Unreal assets were specifically built to be movable so that adjustments could be made on the fly during shooting. This virtual production process enabled the entire crew to engage in a deeply collaborative creative experience on set.
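To make the idea concrete, here is a minimal, hypothetical sketch of how controls like these might be wired up inside the engine: an Unreal Engine C++ actor that exposes sun height, sun direction, and fill-light intensity as callable functions. The class and function names are illustrative assumptions, not the production’s actual tooling, and the tablet-to-engine transport layer is omitted.

```cpp
// Hypothetical sketch (not the actual Marwen tooling): an actor exposing
// the lighting parameters described above so a remote interface, such as
// an iPad app, could dial sun height, sun direction, and fill level.

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Engine/DirectionalLight.h"
#include "Engine/SkyLight.h"
#include "Components/SkyLightComponent.h"
#include "SunFillLightRig.generated.h"

UCLASS()
class ASunFillLightRig : public AActor
{
    GENERATED_BODY()

public:
    // Directional light standing in for the sun, assigned in the editor.
    UPROPERTY(EditAnywhere, Category = "Lighting")
    ADirectionalLight* SunLight = nullptr;

    // Sky light providing ambient fill, assigned in the editor.
    UPROPERTY(EditAnywhere, Category = "Lighting")
    ASkyLight* FillLight = nullptr;

    // HeightDeg: sun elevation above the horizon; DirectionDeg: compass yaw.
    UFUNCTION(BlueprintCallable, Category = "Lighting")
    void SetSun(float HeightDeg, float DirectionDeg)
    {
        if (SunLight)
        {
            // A directional light shines along its forward vector, so a
            // negative pitch aims the light down toward the ground.
            SunLight->SetActorRotation(FRotator(-HeightDeg, DirectionDeg, 0.0f));
        }
    }

    // Scales the amount of ambient fill in the scene.
    UFUNCTION(BlueprintCallable, Category = "Lighting")
    void SetFillIntensity(float Intensity)
    {
        if (FillLight)
        {
            FillLight->GetLightComponent()->SetIntensity(Intensity);
        }
    }
};
```

Once a lighting rig is parameterized this way, a control surface can reach these functions over the network and a cinematographer can iterate on a look in seconds rather than waiting on offline renders, which is the workflow benefit Baillie describes.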
“One of the real benefits of having pre-lit the whole movie in Unreal is that once we got to the mocap stage, we were able to look at one monitor that showed what the production camera was capturing, and then another monitor that showed us Marwen, with the dolls in their proper costumes and performing in this beautiful environment,” Baillie says. “That environment was all built out in Unreal with wavy grass, and it was really visually spectacular.”

The ability to immediately see how their dolls would look on screen prompted the actors to adjust certain physical movements, because they could get a realistic sense of how those movements would read at the dolls’ proportions. Zemeckis and the actors were highly engaged, knowing that the performances would ultimately read exactly the way they intended.
“The virtual production capability was hugely important not only for the camera team and Bob Zemeckis as the director, but it also really inspired the actors on stage,” Baillie says. “The more that we can bring a high level of quality to the look of real-time feedback on set, the greater the intangible benefits, right then and there, for the actors and for the other creatives on the film. In a way, this virtual mirror was a great rehearsal tool.”
Beyond the production phase, the efficiencies of a real-time workflow had a huge impact on visual effects as well. With virtual production, Zemeckis and Baillie were able to make more creative decisions on set, saving time for VFX artists because looks were locked in up front. This also reduced the need for VFX artists to rush to complete temporary VFX shots for dailies or early cuts; the Unreal footage from set more than sufficed for many scenes. These workflow improvements delivered significant efficiency and cost savings. “As a result of embracing technology and the value that these tools bring to the process, there’s a lot of satisfaction knowing that we were able to make this movie for double the efficiency of a normal film, and spend less money doing it,” Baillie says.
Looking ahead, Baillie sees real-time technology as a game changer for visual effects. “The visual effects industry is maturing in what we’re able to deliver, so for me, the next frontier in our field is efficiency,” he says. “Efficient tools are going to open up tremendous possibilities for filmmakers who only have limited budgets to get their stories on screen. Real-time technology is one of those things that is going to help people iterate on their ideas and get them up on the screen quickly, and at budget levels that would have been unattainable before. I think it’s going to liberate a new generation of storytellers to bring their imagination to the screen.”
Want to bring your own stories to life? Download Unreal Engine now to start creating an in-house virtual production pipeline with real-time feedback and more! Also, you can listen to our podcast on virtual production with Robert Zemeckis as part of our Visual Disruptors series.