February 27, 2019
Virtual production: ILMxLAB and NVIDIA on Star Wars, immersive entertainment, and real-time ray tracing
Scene from “Star Wars: Secrets of the Empire”
Mohen Leo, Creative Director and Visual Effects Supervisor at ILMxLAB, explains that ILM's interest in virtual production is what led his group to develop experiences like Star Wars: Secrets of the Empire. "ILM as a whole realized that if you can use these techniques to let directors interact with the world of Star Wars, then audiences would want to interact with the world of Star Wars," he says. "We're trying to basically have the end result be something that's actually available to consumers."
Mohen Leo of ILMxLAB
While embracing real-time rendering, ILMxLAB is ever-mindful of maintaining the visual standards ILM is known for. They chose location-based entertainment, as opposed to home VR, as the vehicle for their VR experiences because they felt this would provide the highest-quality sensory experience. With location-based entertainment, participants visit a fixed location set up specifically for the VR experience, such as an amusement park or museum, where the hardware can include enough computing power to ensure the visual fidelity meets the ILMxLAB standard.
Scene from “Star Wars: Secrets of the Empire”
As a result of this exploration into virtual production and real-time rendering, ILMxLAB teamed up with The VOID to produce the immersive VR experience Star Wars: Secrets of the Empire, where up to four players can enter VR together as avatars to meet a challenge, interacting with physical objects and even feeling wind on their skin. "Unreal Engine and NVIDIA have been helping us because they get the tools that we use," he says. "You don't have to apply a whole different mindset to trying to get a realistic-looking image in real time anymore."
Together, ILMxLAB and The VOID also produced Ralph Breaks VR, an immersive world where players, as cartoon avatars, join familiar characters from the Wreck-It Ralph film series in an animated adventure. ILMxLAB worked toward making sure each of these experiences "isn't just something you watch, but you can actually step into and become part of," says Leo. And they've got a whole lot more coming, including a VR experience based on a Marvel series.
In the second part of the podcast, Rick Grandy, Senior Solutions Architect at NVIDIA, explains the RTX technology that makes real-time ray tracing possible.
NVIDIA stunned the VFX community at GDC 2018 when, in conjunction with Epic Games and ILMxLAB, they used RTX to showcase real-time ray tracing in Unreal Engine on some very shiny Star Wars characters in the Reflections cinematic. After that, "pretty much every studio was calling us asking how they get their hands on it—now!" laughs Grandy. He explains that the RTX architecture introduces two new core types, Tensor and RT, for distributing rendering tasks, with RT cores devoted to accelerating ray tracing. Grandy also explains how NVIDIA's denoising techniques reduce render time for reflections and shadows while keeping visual quality sharp.
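The intuition behind denoising can be shown with a toy sketch (assumptions, not NVIDIA's actual pipeline): ray-traced images are Monte Carlo estimates, so shooting only a few rays per pixel is fast but noisy, and a denoising filter can recover much of the quality of a far more expensive render. Here the one-dimensional "scene", the noise level, and the simple box filter are all hypothetical stand-ins for real renderers and NVIDIA's far more sophisticated AI and temporal denoisers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical scene: the "true" radiance along one 64-pixel scanline.
true_radiance = np.sin(np.linspace(0, np.pi, 64)) ** 2

def render(samples_per_pixel):
    """Monte Carlo estimate: average several noisy ray samples per pixel."""
    noise = rng.normal(0.0, 0.3, size=(samples_per_pixel, true_radiance.size))
    return (true_radiance + noise).mean(axis=0)

def box_denoise(image, radius=2):
    """Crude spatial denoiser: a sliding-window mean over neighboring pixels."""
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    return np.convolve(image, kernel, mode="same")

def rmse(a, b):
    """Root-mean-square error against the ground-truth radiance."""
    return np.sqrt(np.mean((a - b) ** 2))

noisy_1spp  = render(1)              # cheap: one ray per pixel, very noisy
denoised    = box_denoise(noisy_1spp)
brute_64spp = render(64)             # expensive: 64 rays per pixel

print(f"1 spp RMSE:           {rmse(noisy_1spp, true_radiance):.3f}")
print(f"1 spp + denoise RMSE: {rmse(denoised, true_radiance):.3f}")
print(f"64 spp RMSE:          {rmse(brute_64spp, true_radiance):.3f}")
```

The denoised one-sample render lands much closer to the ground truth than the raw noisy one, which is the trade Grandy describes: spend a fraction of the rays, then let a filter close most of the quality gap.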
Ray tracing with RTX isn’t yet part of Unreal Engine, but NVIDIA continues to work with Epic to push the envelope and advance this technology, as evidenced by The Speed of Light demo at SIGGRAPH 2018.
"You have to present an entire platform in order for you to get widespread adoption," Grandy says. "Everyone wants RTX. Now we're mainly supporting the software vendors to help them get the software out to market, so people can actually utilize this within their production workflows."
There's a whole lot more packed into this podcast, so be sure to listen to all of it! And visit our Virtual Production hub to get more videos, podcasts, and insights into this emerging field.