Love, Death + Robots | Courtesy of Netflix

How Sony Pictures Imageworks created a real-time thriller for Netflix’s Love, Death + Robots

What would you do if you came face to face with Cthulhu? For the heroes in one of the goriest shorts of Love, Death + Robots yet, the answer is simple: Anything to survive. Called In Vaulted Halls Entombed, volume three’s eighth episode centers on a squad of soldiers that has the dangerous job of recovering a hostage held by terrorists—only to discover the ancient Lovecraftian horror as they venture deep underground.

It was a story that instantly attracted the team at Sony Pictures Imageworks, who had previously delivered VFX on everything from Jumanji to Spider-Man: Far From Home. “What was really exciting about Love, Death + Robots was that animators had a sandbox to explore creatively,” begins Jerome Chen, Director of In Vaulted Halls Entombed. “[They could do] whatever they wanted, without trying to fit into a franchise or a particular audience.”

For Chen’s team, that freedom meant embarking on their most ambitious experiment yet: Transforming a decades-old pipeline into one built for real-time rendering in Unreal Engine. “We learned that if you try to push traditional workflows into a real-time environment, it's just not going to work,” says Doug Oddy, Senior VFX Producer at Sony Pictures Imageworks. “Our traditional pipeline was extremely stable, robust, and reliable. It had a huge code base, our rigs, our shaders, and our rendering setups. Every little detail that we had worked to improve on and all of the proprietary tools that we had written—we literally stepped away from all of it.”

By the time the short was complete, it was clear the gamble had paid off. In just four months of production, and with no previous real-time experience, a team of 45 artists had created six photoreal human characters, complete with animated fights, dialogue, and mutilation FX; four distinct environments ranging from the Afghan mountains to an evil lair; and even a Cthulhu monster. It is a project that has changed the way the Sony Pictures Imageworks team works for good.
 

The road to real time

The first step to creating the short was assembling the right crew. A poll was sent to 900 Sony Pictures Imageworks artists asking who was interested in real-time rendering. Those who threw their hats in the ring were then interviewed and briefed on the project. “[We were clear it was] going to be hard,” adds Chen. “There was no assurance that we were going to complete the project successfully. It was a leap of faith.”

Once the crew of 45 was in place, they began making what Oddy refers to as the Nerf level: A basic build of the short’s world and camera layout. They then blocked each scene by storyboarding the 15-minute film entirely in engine, using assets imported from ZBrush and Maya. Three actors were scanned for the three hero characters, while MetaHuman Creator was used to generate three redshirt characters in minutes. Notably, this was the first time a MetaHuman had been integrated into a real-time linear content production, a first that not only paid off on screen, but also helped propel MetaHuman Creator forward, thanks to Sony Pictures Imageworks’ feedback during early development.

“I think MetaHumans will be a real game-changer,” says Chen. “The fact that [your character is] already rigged and the rig is set up perfectly for Unreal Engine, and it works in the real-time world makes it really powerful. Creatives [can] sit and bring in characters and see them acting…without spending all that time doing the asset development, scanning a character, texturing, and rigging. You just drop right in.”

A streamlined production

After layout and blocking were complete, the team moved on to animation, capturing the first animation pass with performers in Xsens motion capture suits. Thanks to Take Recorder, performers on stage could see their movements driving the characters in the Unreal Engine environment, so there was no need to imagine the final shot on set. This meant production could be approached more like a live-action film, with decisions made on the fly.

“Most times, we didn’t go into a shoot having our sequences completely defined,” confirms Jeremy Sikorski, Asset and Real-Time Supervisor on the short. “We didn’t know exactly how we wanted to do it, but we knew the general action that needed to take place. It was then very easy for me to use Unreal Engine’s Sequencer to basically pick different takes. That meant that if the performance wasn’t the right one, I could easily look through and find another option, rather than request and wait for a re-rendered shot.”
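
Much of that take-picking can be scripted. As a rough illustration only (the folder path and take naming below are hypothetical, not Imageworks’ actual setup), here is a minimal sketch that uses Unreal Engine’s Python editor scripting to gather the Level Sequences recorded by Take Recorder and open a candidate take in Sequencer for review:

    import unreal

    # Hypothetical folder where Take Recorder saved the mocap takes.
    TAKES_ROOT = "/Game/Cinematics/Takes"

    # Collect every LevelSequence asset recorded under that folder.
    takes = []
    for path in unreal.EditorAssetLibrary.list_assets(TAKES_ROOT, recursive=True):
        asset = unreal.load_asset(path)
        if isinstance(asset, unreal.LevelSequence):
            takes.append(asset)

    for take in takes:
        print(take.get_name())

    # Open one candidate in the Sequencer UI to review the performance
    # (requires the Level Sequence Editor plugin, enabled by default).
    if takes:
        unreal.LevelSequenceEditorBlueprintLibrary.open_level_sequence(takes[0])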

For Oddy, this ability to make changes meant the team could work in a much faster, more collaborative way—without being slowed down by potential reworks. “With traditional pipelines, as you get further into production, things become more galvanized, and you're less likely to start opening up the shots without having to go all the way back to the beginning of the pipeline,” he explains. “With real-time, you don't go backward. You're just working live—in the moment. Even with fewer crew, you're able to continuously create brand-new ideas all along the way. You can recut the movie in the 11th hour, no problem.”

Once the mocap animation pass was complete, results were cleaned up in Maya before the FBX rig data was sent back into Unreal Engine. All final-pixel rendering was then done in Unreal Engine, with minimal post-production work.
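
The article doesn’t detail how that hand-off was automated, but as a sketch of one common recipe (the file paths, folder names, and skeleton asset below are hypothetical), an animation-only FBX re-import into Unreal Engine can be scripted through its Python API like this:

    import unreal

    # Hypothetical paths: the cleaned-up Maya export and the existing skeleton asset.
    FBX_FILE = "D:/mocap/cleanup/soldier_fight_v02.fbx"
    SKELETON = "/Game/Characters/Soldier/Soldier_Skeleton"

    # Configure the FBX importer to bring in animation only, bound to the
    # existing skeleton rather than creating a new skeletal mesh.
    options = unreal.FbxImportUI()
    options.import_mesh = False
    options.import_animations = True
    options.skeleton = unreal.load_asset(SKELETON)
    options.mesh_type_to_import = unreal.FBXImportType.FBXIT_ANIMATION

    task = unreal.AssetImportTask()
    task.filename = FBX_FILE
    task.destination_path = "/Game/Characters/Soldier/Animations"
    task.options = options
    task.automated = True  # suppress import dialogs
    task.save = True

    unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
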
“Unreal Engine lets us visualize our ideas in a complete form, not just play-blasted gray renders. Ultimately, that meant our final rendered frames through Movie Render Queue look just like what we were seeing in the viewport, with accurate motion blur,” adds Sikorski. “Our post process was very straightforward; we essentially added film grain and lens flares to certain shots. Plus, when it came to things like the bugs in the tunnel or exploding creature effects, that was all in the render. We didn’t have to add a thing.”
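
Queuing a final-pixel render through Movie Render Queue can itself be driven from Python. The sketch below is illustrative only: the sequence, map, and output paths are hypothetical, and exact class names can vary slightly between engine versions.

    import unreal

    # Hypothetical sequence and map for a single shot.
    SEQUENCE = "/Game/Cinematics/Shots/sh0420/sh0420_seq"
    LEVEL = "/Game/Maps/Tunnel_Main"

    subsystem = unreal.get_editor_subsystem(unreal.MoviePipelineQueueSubsystem)
    queue = subsystem.get_queue()

    # Create a job pointing at the shot's Level Sequence and map.
    job = queue.allocate_new_job(unreal.MoviePipelineExecutorJob)
    job.sequence = unreal.SoftObjectPath(SEQUENCE)
    job.map = unreal.SoftObjectPath(LEVEL)

    # Render the standard deferred pass and write an EXR image sequence.
    config = job.get_configuration()
    config.find_or_add_setting_by_class(unreal.MoviePipelineDeferredPassBase)
    config.find_or_add_setting_by_class(unreal.MoviePipelineImageSequenceOutput_EXR)
    output = config.find_or_add_setting_by_class(unreal.MoviePipelineOutputSetting)
    output.output_directory = unreal.DirectoryPath("D:/renders/sh0420")

    # Kick off the render in-editor using the PIE executor.
    subsystem.render_queue_with_executor(unreal.MoviePipelinePIEExecutor)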

Ending silos

As well as streamlining on-set production and post, the real-time workflow promoted artist-driven creativity for the In Vaulted Halls Entombed team. Right off the bat, the silos between Sony Pictures Imageworks departments were broken down. Artists could see sequences play out in front of them without waiting hours for a render, and could make adjustments without pulling in another department. This meant individuals could wear many hats, refining everything from animation to lighting from a single platform.

“I would encourage anybody to jump right into Unreal Engine. With it, a high-fidelity story could come together in a matter of days or weeks,” reveals Sikorski. “We are used to waiting months or years to see our first visual, so we see this as a huge shift in the way basically everything will be produced in the future. It made us rethink a pipeline that’s been with us for over 25 years, as we’re sure Unreal Engine and real-time rendering will be a big part of wherever we go. With it, we see the dawn of an exciting moment in our history as filmmakers and CG artists, where the speed and creativity involved in making a shot democratize filmmaking in the digital space.”

In Vaulted Halls Entombed, an episode of Love, Death + Robots: Vol 3, is now streaming on Netflix.
