In one of the most memorable sequences from The Matrix, Morpheus teaches Neo a valuable lesson about his hidden abilities in a Japanese-style dojo. As Morpheus explains, this is a sparring program built to mimic the physics of the Matrix—and where Neo will learn to question his preconceptions about what’s possible in an unreal world.
When DNEG was tasked with reprising this iconic scene for last year’s The Matrix Resurrections movie, it felt apt to use a game engine to build the environment. “Just from a story point of view, it's an artificial construct that Neo is entering,” explains Huw Evans, VFX Supervisor on the film. “So we could create our own Matrix within a Matrix of visual effects—we can use this game engine within our VFX pipeline, which is quite exciting to us.”
Long used to create stunning worlds and environments for video games, game engine technology has advanced to the point where it can produce CG visuals that are indistinguishable from real life. In essence, you can create a fully interactive virtual environment that looks just like the physical world—a bona fide Matrix.
We caught up with some of the VFX experts who worked on the film for DNEG to hear why real-time technology is changing the game, and how the dojo scene you see on screen is final-pixel imagery coming straight from Unreal Engine.
Realistic simulated environments
Dan Glass is a senior VFX supervisor at Warner Bros. His team was tasked with paying homage to the dojo scene from the original Matrix film—but this time, there was a twist. “Lana [Wachowski] wanted to have the dojo itself on a small lake that sits in a wooded environment,” he explains.
Writer and Director Lana Wachowski worked with Epic to help develop the setting. “It was structured as three separate lakes with two bridges joining them together, surrounded by a forest of beautiful autumnal trees,” says Evans.
As Roel Coucke, a CG Supervisor who worked on the film, points out, this concept immediately presented a test of the engine’s capabilities. “It's in the middle of a lake,” he says. “So we knew we would have to make sure that we are able to achieve the reflections exactly how we wanted—and to make sure we sell the water, because that was such an essential element of the environment.”
Water is notoriously difficult to simulate. It has specific physical and reflective properties, and the human eye is highly adept at spotting a fake. “From a pure technical rendering standpoint, it’s fairly difficult to do—especially with a game engine,” explains Quentin Marmier, Lead Technical Artist at Epic Games.
Thanks to Unreal Engine’s real-time ray tracing capabilities, the team were able to produce incredibly realistic water visuals that satisfied even the exacting standards of the movie’s key stakeholders. “And all of this is just happening in real time,” says Marmier. “Once you create your environment, it’s done—it just works.”
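At its core, ray-traced reflection rests on simple vector math applied at enormous scale: at every point where a camera ray strikes the water, the renderer computes a mirror direction and continues tracing along it. A minimal sketch of that single reflection step (an illustration of the standard formula, not Unreal Engine's actual renderer code):

```python
import numpy as np

def reflect(d, n):
    """Reflect incident direction d about unit surface normal n.

    This is the core formula a ray tracer evaluates at every
    reflective surface hit: r = d - 2(d . n)n.
    """
    d = np.asarray(d, dtype=float)
    n = np.asarray(n, dtype=float)
    n = n / np.linalg.norm(n)  # normalize defensively
    return d - 2.0 * np.dot(d, n) * n

# A ray travelling down at 45 degrees hits a flat water surface
# whose normal points straight up (+Y):
incoming = np.array([1.0, -1.0, 0.0])
bounced = reflect(incoming, [0.0, 1.0, 0.0])
# The Y component flips: the ray now travels up at 45 degrees.
print(bounced)  # [1. 1. 0.]
```

Doing this per pixel, per frame, at interactive rates (and shading whatever the bounced ray hits) is what makes convincing real-time water reflections possible.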
Photorealistic real-time ray tracing is just one of a number of technological leaps forward that game engines have made in recent years. Combined with real-time compositing, film-quality post-process effects, and advanced particles, physics, and destruction, creators now have everything they need to create final-pixel output for both live-action and animated content at their fingertips.
“What you see in the final image for the sequence is mostly all coming directly from Unreal,” says Coucke. “That is quite a unique thing to say. Traditionally, we would use Unreal for previs and postvis, but this is the first time where what you see in the final film is coming from Unreal Engine.”
Real-time rendering for visual effects
When offline rendering was the only way to achieve final-pixel quality, the VFX industry had a notorious reputation for long hours spent waiting for shots to render.
The speed and efficiency afforded by game engine technology has flipped the script. “In rendered VFX, we're so used to seeing things take hours, sometimes days to render,” says Evans. “But having that in real time and being able to see it at a really decent quality still blows my mind.”
Real-time technology provides the ability to see your changes instantly. You can iterate at the speed of your imagination and make creative choices when it matters most. “You want to tweak a light in the scene? You want to change the sun angle for a specific shot? You get that instant feedback,” says James Tomlinson, Environment Generalist Supervisor, DNEG. “You don't have to press a button to hit render and wait for 25 minutes. Instant results—everyone's happy.”
As Coucke explains, time is the enemy in the visual effects industry. “Render time is very expensive,” he says. “Real time allows you to bring the usual day turnaround for a note all the way down to mere minutes or hours.”
For artists, that means more time spent using their artistry rather than being constrained by the technical side of things. “I think that is a huge benefit to Unreal and real-time workflows,” says Evans.
Game engines for film production
Unreal Engine is fast becoming a mainstay for VFX studios looking to deliver high-quality 3D content for film. The engine includes a host of features and functionality that have been developed specifically with video production in mind.
Movie Render Queue is a case in point. Previously, one lighting artist or environment artist might have been able to submit two or three shots per day—if they were lucky. “With Unreal Engine and the Movie Render Queue integrated into our pipeline, we had the ability to submit over a dozen on a daily basis,” says Tomlinson.
Movie Render Queue is the feature that enables you to create high-quality, film-standard media directly from Unreal Engine without post-processing, thanks to the ability to render movies and stills with accumulated anti-aliasing and motion blur.
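The accumulation idea can be sketched with a toy one-dimensional example: instead of sampling each pixel once, render many sub-pixel-jittered samples and average them, so hard edges resolve to smooth coverage values rather than staircase artifacts. (This is an illustrative simplification, not Movie Render Queue's implementation; the same averaging applied across sub-frame times is what produces accumulated motion blur.)

```python
import random

def accumulated_pixel(scene, x, samples, jitter=0.5):
    """Average many jittered samples of `scene` around pixel centre x.

    Accumulated anti-aliasing works on this principle: evaluate the
    image several times with small sub-pixel offsets and accumulate
    the results into one refined frame.
    """
    total = 0.0
    for _ in range(samples):
        offset = random.uniform(-jitter, jitter)  # sub-pixel jitter
        total += scene(x + offset)
    return total / samples

# Toy "scene": a hard vertical edge at x = 10 (bright to the right).
edge = lambda x: 1.0 if x >= 10.0 else 0.0

random.seed(42)
one_sample = accumulated_pixel(edge, 10.2, samples=1)     # aliased: 0.0 or 1.0
many_samples = accumulated_pixel(edge, 10.2, samples=256) # converges near 0.7
```

With a single sample the pixel snaps to pure black or white; with 256 accumulated samples it settles on the fraction of the pixel the edge actually covers.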
Since Unreal Engine 4.27, you can use Movie Render Queue to render from multiple cameras as a batch process, without having to go through complicated Sequencer setups. This makes it easy to repeatedly create a series of large stills from different viewpoints, as you work through variations or iterations.
The DNEG team were surprised to discover they could adapt the feature to leverage the engine in ways they had not realized were possible. In one case, they needed to create a separate render pass for the volumetric fog, something that was not provided as a standard preset. They were able to use Blueprint logic to create a preset that modified the scene in such a way that all the assets turned black. “That gave us purely the volumetric that was written into an EXR for the compositing department, so they can fully control how much or how little fog they want within their shot,” says Coucke. “And this was all seamless for the artist.”
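The payoff of that fog-over-black pass is downstream flexibility: once the volumetric is isolated in its own EXR, the compositor can dial its density with a single gain control instead of requesting a re-render. A toy NumPy illustration of the idea (the array values and the simple additive merge are assumptions for clarity, not DNEG's actual comp setup):

```python
import numpy as np

# Tiny stand-ins for the two renders: the clean scene, and the
# fog-only pass produced by turning every asset black so that only
# the volumetric in-scatter survives (values are an illustrative
# 1x4 "image"; fog thickens with distance).
clean_scene = np.array([0.80, 0.40, 0.10, 0.05])
fog_pass    = np.array([0.02, 0.05, 0.12, 0.20])

def comp_with_fog(scene, fog, gain):
    """Merge the isolated fog element back over the clean scene.

    Because the fog was rendered over black, a simple additive merge
    lets the compositor adjust fog density with one gain knob. (Real
    comps may also attenuate the scene behind thick fog; this
    additive version is the simplest case.)
    """
    return np.clip(scene + gain * fog, 0.0, 1.0)

light_fog = comp_with_fog(clean_scene, fog_pass, gain=0.5)
heavy_fog = comp_with_fog(clean_scene, fog_pass, gain=2.0)
```

A gain of zero reproduces the clean render exactly, and any value in between is available to the compositing department without another trip through the renderer.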
Real-time filmmaking workflows
Working in collaboration with industry partners like DNEG, Epic is continually adding new and improved features to Unreal Engine to better facilitate filmmaking workflows. Recent releases have brought a raft of functionality, including next-generation in-camera VFX, an enhanced Virtual Camera system, significantly easier ways to modify virtual set lighting, and more. “It’s exciting to be part of helping shift and develop the Unreal tool towards production moviemaking,” says Glass.
The visual quality it’s now possible to achieve in game engines points to a future in which more and more film and TV content is created using the technology. At many forward-thinking studios, it’s changing the way artists think about VFX. “That idea of taking a real-time engine and pushing out final-quality renders has been super exciting,” says Evans.