Unreal Engine helps unlock virtual production for Robert Zemeckis’ “Welcome to Marwen”
Kevin Baillie and Robert Zemeckis began brainstorming how to achieve the visual style and fidelity they wanted back in 2013. Their main goal was for viewers to feel the full weight of each actor’s performance through his or her doll counterpart, without the quality loss of a traditional motion-capture approach.
“This was a very non-traditional motion capture process because we’re mocapping the actors and the cameras, but we also have to light the actors as if that footage is in the final movie,” Baillie says. “We worked with a really talented team of Unreal developers who designed an iPad control system that allowed our Director of Photography, C. Kim Miles, to use intuitive controls to dial sun height and direction and how much fill light there was. He was actually able to go through and pre-light the entire Marwen section of this movie before we ever filmed a single frame of it. That allowed us to walk away from the motion-capture stage having everything we needed, knowing that our compositions were going to work at the end of the day.”
“One of the real benefits of having pre-lit the whole movie in Unreal is that once we got to the mocap stage, we were able to look at one monitor that showed what the production camera was capturing, and then another monitor that showed us Marwen, with the dolls in their proper costumes and performing in this beautiful environment,” Baillie says. “That environment was all built out in Unreal with wavy grass, and it was really visually spectacular.”
“The virtual production capability was hugely important not only for the camera team and Bob Zemeckis as the director, but it also really inspired the actors on stage,” Baillie says. “The more that we can bring a high level of quality to the look of real-time feedback on set, the greater the intangible benefits, right then and there, for the actors and for the other creatives on the film. In a way this virtual mirror was a great rehearsal tool.”
Beyond the production phase, the efficiencies of a real-time workflow had a huge impact on visual effects as well. With virtual production, Zemeckis and Baillie were able to make more creative decisions on set, saving VFX artists time because the looks were locked in up front. This also reduced the need for VFX artists to rush out temporary shots for dailies or early cuts; the Unreal footage captured on set more than sufficed for many scenes.
Looking ahead, Baillie sees real-time technology as a game changer for visual effects. “The visual effects industry is maturing in what we’re able to deliver, so for me, the next frontier in our field is efficiency,” he says. “Efficient tools are going to open up tremendous possibilities for filmmakers who only have limited budgets to get their stories on screen. Real-time technology is one of those things that is going to help people iterate on their ideas and get them up on the screen quickly, and at budget levels that would have been unattainable before. I think it’s going to liberate a new generation of storytellers to bring their imagination to the screen.”
Want to bring your own stories to life? Download Unreal Engine now to start building an in-house virtual production pipeline with real-time feedback and more! You can also listen to our podcast on virtual production with Robert Zemeckis as part of our Visual Disruptors series.