September 5, 2019
Real-time ray tracing in Unreal Engine - Part 1: the evolution
What is ray tracing, anyway?
But first, let’s quickly define ray tracing, and review at a high level how it differs from rasterization, the rendering method traditionally used in game engines prior to the advent of real-time ray tracing.

Ray tracing is a method of determining the color of each pixel in a final render by tracing the path of light from the camera as it bounces around a scene, collecting and depositing color as it goes. Because it mimics the physical behavior of light, it delivers much more photorealistic results than rasterization, like soft, detailed shadows and reflections from off-screen objects.
Rasterization, on the other hand, works by projecting the 3D objects in a scene onto the 2D image plane via transformation matrices, using depth testing to determine which surface is visible at each pixel. It determines the color of each pixel from information (color, texture, normals) stored on the mesh, combined with the lighting in the scene. It is generally much faster than ray tracing, but cannot portray effects that rely on bounced light, such as true reflections, translucency, and ambient occlusion.
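The difference comes down to where the loop starts: a ray tracer loops over pixels and asks "what does this ray hit?" The toy C++ sketch below illustrates that per-ray formulation; it is not Unreal Engine code, and the scene is just a single hypothetical sphere.

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Toy illustration of the ray-tracing idea described above: for each
// pixel, a ray is cast from the camera and tested against scene
// geometry. The "scene" here is one hypothetical sphere; real renderers
// trace against full acceleration structures and shade recursively.
struct Vec3 { double x, y, z; };

static Vec3   sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Returns true if a ray (origin o, unit direction d) hits a sphere of
// radius r centered at c; writes the nearest positive hit distance to t.
bool RaySphere(Vec3 o, Vec3 d, Vec3 c, double r, double* t) {
    Vec3 oc = sub(o, c);
    double b    = dot(oc, d);                    // half of the quadratic's b term
    double disc = b * b - (dot(oc, oc) - r * r); // discriminant of the quadratic
    if (disc < 0.0) return false;                // ray misses the sphere entirely
    *t = -b - std::sqrt(disc);                   // nearer of the two roots
    return *t > 0.0;                             // hit must be in front of the camera
}
```

A rasterizer would instead loop over the sphere's triangles, project them with the camera matrices, and depth-test the covered pixels; it is the ray tracer's per-ray formulation that makes off-screen reflections and physically accurate shadows natural to express.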
The technology catalyst
Ray tracing has been used in feature films for decades by those striving for the most realistic imagery. But until very recently, ray tracing a single frame at film resolution could take several hours, even days. The idea of rendering ray-traced frames at 24 frames per second (fps) or higher was hard to imagine.

Then came two concurrent game-changing technology advances: Microsoft’s DXR (DirectX Raytracing) framework, and NVIDIA’s RTX platform. Working with NVIDIA and ILMxLAB, Epic Games was able to show real-time ray tracing for the first time at GDC 2018 with our Reflections tech demo, which featured characters from Star Wars: The Last Jedi.
Image from “Reflections”
The piece demonstrated textured area lights, ray-traced area light shadows, ray-traced reflections, ray-traced ambient occlusion, cinematic depth of field, and NVIDIA GameWorks ray tracing denoising—all running in real time. Because of Unreal Engine’s hybrid approach, passes that don’t benefit from ray tracing continued to be rasterized, boosting overall performance.
Ray-traced area light shadows
These were early days. The prototype Unreal Engine code was not yet available to users; the hardware required to run the demo in real time consisted of an NVIDIA DGX Station with four Tesla V100 GPUs, with a price tag somewhere in the region of $100,000—so neither the software nor the hardware was exactly what you could call accessible to the average user.
Democratizing real-time ray tracing
Five months later, at SIGGRAPH 2018, again working with NVIDIA and joined this time by Porsche, we launched “The Speed of Light” Porsche 911 Speedster Concept, a real-time cinematic experience. With Moore’s Law in full effect, this demo only required two NVIDIA Quadro RTX cards—still quite an investment at over $6,000 each at the time, but significantly less expensive than the earlier setup.

Image from “The Speed of Light”
In addition to the previously demonstrated features, The Speed of Light showed off ray-traced translucent and clear coat shading models, plus diffuse global illumination. As well as producing the cinematic, the team created an interactive demo to illustrate the real-time nature of the scene.
Fast forward to GDC 2019 in March of this year, and we were ready to make the ray tracing functionality available as Beta features in Unreal Engine 4.22. To show how far the technology had come within 12 months, our partners Goodbye Kansas and Deep Forest Films created Troll, a real-time cinematic featuring a digital human.
Image from “Troll”
Troll is an order of magnitude more complex than Reflections. While both pieces run at 24 fps, Troll has 62M triangles compared to Reflections’ 5M, 16 lights compared to four, and the ray-traced passes were rendered at full 1080p resolution, as opposed to Reflections, where they were rendered at half resolution and scaled up. Troll also required the ray tracing of particles, and interoperability with other fundamental components of the Unreal Engine cinematographic pipeline including Alembic geometry cache, hair, and skin. Despite all this, Troll was rendered on a single NVIDIA RTX 2080 Ti, with a current list price of $1,199.

Comparison between screen-space reflections (L) and ray-traced reflections (R) in “Troll”
Today, with the software available for free in Unreal Engine, and graphics cards requirements that won’t break the bank, real-time ray tracing is accessible to everyone.
Ray tracing in Unreal Engine 4.23
When we were working on the initial implementation of ray tracing in Unreal Engine, we were writing our code at the same time as Microsoft was putting the finishing touches on the DXR framework and NVIDIA was working on new hardware and drivers. In addition, supporting ray tracing meant undertaking a major refactor of our rendering pipeline. In Unreal Engine 4.23, previewed at SIGGRAPH 2019, we’ve been able to focus on addressing the instabilities introduced by both of these factors.

Performance has also improved in many areas, including the sky pass, global illumination, and denoising. Denoising is critical to real-time ray tracing because the budget for the number of rays we can trace per frame is very limited.
Global illumination was put to good use in “emergence”, a short film by Evil Eye Pictures
We’ve also implemented intersection shaders, enabling you to support your own primitive types and render complex geometry such as hair or fractals. We’ve begun work on supporting multiple GPUs; the low-level support added in 4.23 offers control over which GPU will render a certain pass. And we’re continuing to support additional materials and geometry types, including landscape geometry, instanced static meshes, procedural meshes, and Niagara sprite particles.
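To give a sense of what an intersection shader does: in DXR, a procedural primitive is declared to the acceleration structure as an axis-aligned bounding box, and the intersection shader runs for rays that enter that box to report the actual hit against your custom surface. The sketch below is a CPU-side C++ analogue of the box entry test (the classic slab method), not actual shader code.

```cpp
#include <algorithm>
#include <cassert>

struct Vec3 { double x, y, z; };

// Slab method: returns true if the ray (origin o, direction d) enters the
// axis-aligned box [lo, hi]; writes the entry distance to tHit. This is
// the kind of test a ray tracer performs before handing the ray to a
// per-primitive intersection routine.
bool RayBox(Vec3 o, Vec3 d, Vec3 lo, Vec3 hi, double* tHit) {
    double tmin = 0.0, tmax = 1e30;
    const double O[3] = {o.x, o.y, o.z}, D[3] = {d.x, d.y, d.z};
    const double L[3] = {lo.x, lo.y, lo.z}, H[3] = {hi.x, hi.y, hi.z};
    for (int i = 0; i < 3; ++i) {
        double inv = 1.0 / D[i];        // may be +/-infinity; IEEE math handles it
        double t0 = (L[i] - O[i]) * inv;
        double t1 = (H[i] - O[i]) * inv;
        if (t0 > t1) std::swap(t0, t1); // order the slab entry/exit distances
        tmin = std::max(tmin, t0);      // latest entry across slabs so far
        tmax = std::min(tmax, t1);      // earliest exit across slabs so far
        if (tmin > tmax) return false;  // slab intervals don't overlap: miss
    }
    *tHit = tmin;
    return true;
}
```

Inside a real DXR intersection shader, once the ray is known to be in the box you would evaluate your custom primitive (a hair segment, a signed-distance fractal, and so on) and call ReportHit() with the computed distance.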
The road ahead for real-time ray tracing in UE4
We’re not done yet! The team at Epic is still hard at work improving our implementation. While we’ve made big strides in performance and stability with 4.23, there’s still more work to do in those areas before the feature gets the official “production ready” stamp. Try telling that, though, to the many users who are already happily putting the current Beta code into production. We love that they’re stress-testing the code for us, along with our internal Fortnite cinematics team, who are certainly keeping us on our toes.

As we mentioned earlier, we added low-level multi-GPU support in 4.23, but we’re looking to make that much more “plug and play.” And we’re always looking to support more geometry types. One of the most challenging examples is foliage with world position offset, which is required for effects such as leaves blowing in the wind, and we’re developing some ideas there.
If you’d like to get a more in-depth view of our work so far, check out this Tech Talk from SIGGRAPH 2019.
We’d like to thank everyone who’s already tried out real-time ray tracing in Unreal Engine and is actively sending us their valuable feedback. If you haven’t taken it for a test drive yet, why not download the latest Unreal Engine today? We look forward to hearing what you think.