February 6, 2020
Real-time ray tracing in Unreal Engine - Part 4: media and entertainment
As we’ve previously discussed, the key benefits of ray tracing over rasterization lie in more accurate dynamic shadows whose softness varies with distance from the casting object, and in richer, more accurate reflections that correctly capture off-screen and moving elements. Together with other ray-traced effects like global illumination, ambient occlusion, and translucency, these offer a much more photorealistic result that helps viewers suspend disbelief and fully buy into virtual worlds.
A further benefit of ray tracing is that setting up shadows and reflections is significantly faster and easier, eliminating the need for techniques like light baking and the manual placement of reflection probes and planes.
Surprisingly, ray-traced shadows can actually be faster to compute than rasterized Cascaded Shadow Maps, improving performance in cases where shadows are required for moving objects, such as trees with leaves that sway in the wind.
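As a concrete starting point, here is a minimal project configuration for enabling these effects in Unreal Engine 4.2x. The exact settings and console variable names can vary between engine versions, so treat this as a sketch rather than a definitive setup:

```ini
; DefaultEngine.ini — enable the ray tracing pipeline for the project.
; Requires a DXR-capable GPU and the DirectX 12 RHI on Windows.
[/Script/Engine.RendererSettings]
r.RayTracing=True
r.SkinCache.CompileShaders=True

; Individual effects can be toggled at runtime from the console
; (e.g. "r.RayTracing.Shadows 1"), or pinned here:
[SystemSettings]
r.RayTracing.Shadows=1
r.RayTracing.Reflections=1
```

With the pipeline enabled, individual lights can still opt out of ray-traced shadows via their Cast Ray Traced Shadows property, so rasterized and ray-traced techniques can be mixed within a scene as performance demands.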
So let’s take a look at how real-time ray tracing is already benefiting media and entertainment projects specifically.
More believable real-time entertainment
From AAA games to live broadcast, real-time ray tracing is just starting to move from technology demo to commercial offering, and we can expect to see many more examples over the coming months.

One of the first titles to offer players in-game real-time ray tracing is Deliver Us The Moon, a sci-fi thriller from Dutch indie developer KeokeN Interactive set in an apocalyptic near future. Despite having almost completed the game when real-time ray tracing became available, the team was able to retroactively enable ray tracing for shadows and reflections in Unreal Engine, tweaking materials and lighting setups to get the desired look. The RTX-enabled update to the original game, which also supports NVIDIA’s performance-boosting DLSS, launched in December 2019 and has garnered many positive reviews.
Live broadcast is another area where real-time ray tracing is making rapid inroads. In September 2019, The Future Group announced that RIOT Games had used its Unreal Engine-based AR solution Pixotope—along with Cubic Motion for real-time facial animation, Animatrik for motion capture, and Stype for camera tracking—to deliver the first live broadcast containing real-time ray tracing and real-time facial animation.
The event, which took place at The League of Legends LPL Regional Finals in Shanghai, featured one of the game’s characters, Akali, dancing with live performers and being interviewed by a real-life presenter. Thanks to ray tracing, the quality of the shadows and subtle reflections on the character differentiates this live-to-air broadcast from previous live shows of a similar nature.
At IBC, also in September, Dutch broadcast services and media solutions provider NEP The Netherlands demonstrated a real-time ray-traced virtual studio powered by Zero Density’s Reality Engine, another software offering that uses Unreal Engine’s renderer. The studio showed how reflections from offscreen objects, made possible by ray tracing, significantly increase the set’s credibility.
Final-pixel output in less time
While real-time performance is not a necessity for linear content—such as episodic television, commercials, and game trailers—the ability to render at blazingly fast speeds and achieve results previously only possible from offline renderers is a boon.

Creative agency Capacity recently delivered its third installment of an intro cinematic for the Rocket League Championship Series (RLCS), the official competitive league of the hit sports-action game, Rocket League. As with the previous two intros, the team used the original game assets provided by developer Psyonix, increasing their resolution and retexturing them for cinematic fidelity and added realism. This time, however, due to the advent of ray tracing in Unreal Engine, the team was able to move the project to real-time rendering instead of using an offline rendering solution.
As well as being able to achieve the requisite high-quality lighting and accurate reflections at incredible speeds, the team identifies an enhanced creative process as a key benefit.
“We’re thrilled about what the real-time workflow has done for our creative process in terms of iteration and being able to quickly adapt to feedback,” says Ellerey Gave, Creative Director at Capacity. “And with the quality that we’re able to achieve through real-time ray tracing and practically instantaneous renders (compared to traditional pre-rendered pipelines), it’s quickly become our preferred way to work.”
Capacity is not alone in employing real-time ray tracing in this way. Part of what makes Unreal Engine different from other real-time engines is that we “eat our own dog food,” production-testing features on our own internal projects and portfolio of published titles. The Fortnite team at Epic Games has been putting Unreal Engine’s ray tracing implementation through its paces since its inception, and is currently using it for trailers and cinematics. The Fortnite Chapter 2 (Season 11) trailer is a great illustration of this. It’s also an interesting example of the fact that, even for stylized content that doesn’t attempt to be photorealistic, ray tracing lends an extra layer of richness that draws the viewer into the world.

However, when photorealism is the aim, it’s fascinating to see what’s possible. In one example, artist Sertac Tasdemir experimented with recreating the Batmobile from The Dark Knight, aiming to emulate the film’s lighting as closely as possible. As Arti Sergeev, Senior Editor of community website 80 Level, says: “The results are kind of amazing, and it makes you think about the future of the industry, and the way engines like Unreal Engine could be used in film and animation projects.”
Accelerated lighting and look development
It’s not just in rendering the finished frames that real-time ray tracing speeds things up. When used in the lighting and look development process, Unreal Engine enables lighters to see their end results directly in the editor, and make adjustments on the fly to refine their look in real time.

With real-time ray tracing, the requirement to bake lights—a time-consuming, iterative process that requires many adjustments to UVs and settings to achieve a high-quality result—can, in some cases, be completely eliminated. A real-time ray tracing solution provides an immediacy that enables lighters to make the most of their creative talents.
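Because nothing needs to be baked, lighting quality can be dialed in interactively with console variables while the scene updates live in the editor viewport. The variables below existed around UE 4.23–4.24; their names and defaults may differ in your engine version, so this is an illustrative sketch:

```ini
; Set from the editor console (without the "="), or pinned in
; DefaultEngine.ini under [SystemSettings] as shown here.
[SystemSettings]
; Enable ray-traced global illumination and ambient occlusion.
r.RayTracing.GlobalIllumination=1
r.RayTracing.AmbientOcclusion=1
; Trade reflection depth for speed during iteration.
r.RayTracing.Reflections.MaxBounces=2
; Fall back to cheaper techniques on rough surfaces.
r.RayTracing.Reflections.MaxRoughness=0.6
```

Changes to these values take effect immediately, so a lighter can, for example, compare a scene with GI on and off, or tune reflection cost against quality, without ever waiting on a rebuild.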
More accurate techvis
Real-time ray tracing can also be used to good effect during shot planning, as we discovered when working on a project with Lux Machina, Magnopus, Profile Studios, Quixel, ARRI, and DP Matt Workman to test-drive our new virtual production toolset. Revealed at SIGGRAPH 2019, one of the things the project demonstrated was how LED walls can not only provide environments for real-world actors and props, but also light them and cast reflections on them, making it possible to capture final pixels in camera.

Set in a desert environment at sunset, the shot featured an actor approaching and mounting a motorcycle, and donning a pair of reflective sunglasses. The aim was to achieve the best-looking shot with accurate reflections from the LED walls.
As CG Supervisor on the project, I used real-time ray tracing to plan where exactly the motorcycle and actor needed to be placed in relation to the LED wall and the camera. This enabled me to quickly see how I could get the best reflections on the subjects, and to judge how much light I was going to get from the LED walls. I was able to validate that the layout of the LED panels was giving me the most reflection coverage on the motorcycle and the actor’s glasses.
You can get a more in-depth behind-the-scenes look at the entire project in this blog post.
This is just the beginning!
As we embark on a new decade, there’s little question that technologies like real-time ray tracing will continue to evolve at an incredible pace and that virtual worlds will become completely indistinguishable from physical worlds. It’s exciting to imagine the new and innovative ways in which creators will harness this power to change the way we live, work, and play, and we’re thrilled to be part of that journey.

Ready to see what real-time ray tracing can bring to your media and entertainment projects right now? Download Unreal Engine today.
This article is part of a series on real-time ray tracing in Unreal Engine. You can also read Part 1: the evolution, Part 2: architectural visualization, and Part 3: automotive design and visualization.