The Human Race
May 4, 2017

The Human Race: an inside look at the technology behind the groundbreaking real-time film from Epic Games, The Mill, and Chevrolet

By Barbara Marshall

Like the cars themselves, automotive advertising comes in all styles and categories—flashy, safety-conscious, rugged, and elegant. But for marketers, there are universal problems that plague the ad creation process: not having access to the newest car models, and, if they are available, reluctance to put the new models on the road due to potential leaks to competitors and the public.

A year ago, that all changed with The Mill’s BLACKBIRD, a groundbreaking, fully adjustable car rig that can quickly transform its chassis to match almost any car for a live-action shoot. This allows VFX artists to later render a completely photoreal CG version of the new model in question, eliminating issues of availability and risk. Though BLACKBIRD was a major development for marketers, one nagging problem remained: the inability to see the actual car in the context of its filmed environment. That context is key for creatives to get a real-world sense of how the light and surroundings are reflected in the car’s glossy exterior. Traditionally, rendering a single high-res frame can take hours or days, meaning any questions about design or lighting arise long after the shoot has wrapped.


“BLACKBIRD was an incredible innovation; it gives directors greater flexibility. But at the end of the day, when you’re on set, you’re still not looking at the final car. So we started thinking about how our real-time engineers could work with The Mill to make BLACKBIRD even better,” explained Kim Libreri, CTO of Epic Games. “We proposed combining our knowledge to enable an augmented reality experience, where directors and cinematographers can see the car on set as it’s meant to be.”

Unreal Engine’s high-fidelity real-time rendering capabilities proved to be the key to building this augmented reality process, letting creatives instantly visualize a CG car model within live-action shots and adjust accordingly. A project for Chevrolet celebrating the 50th anniversary of the Camaro was the perfect opportunity to test the technology in action. The project, a short called “The Human Race,” presented both of the traditional problems: The Mill needed to incorporate the top-secret Chevrolet Camaro ZL1 as well as the Chevrolet FNR, an autonomous concept car that doesn’t yet exist in the real world.


“For this project we knew we would use BLACKBIRD for at least the Camaro, but we were charting new territory a bit with incorporating the FNR. The proposal from Epic was interesting because the gaming world is accustomed to an environment where CG assets respond to you in real time and can change as you go—so it was exciting to apply that technology to this traditional live action VFX space. Working with Unreal, we were able to leverage existing technology to create a real-time vision of both cars on set, to make sure that everyone was completely confident in the look and feel of these vehicles early enough that adjustments could still be made while shooting,” said Alistair Thompson, EVP International, The Mill.

Epic’s engineers began building the solution on the media framework functionality that allows users to bring MPEG or QuickTime video into Unreal Engine 4; they pushed it one step further by implementing support for industry-standard uncompressed EXR files. This new capability set the bar for professional-grade augmented reality, enabling real-time rendering with the final high-res footage.
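Uncompressed EXR is a far heavier payload than the MPEG or QuickTime streams the media framework originally handled. A back-of-envelope calculation shows why multi-gigabyte-per-second throughput matters; the resolution, channel layout, and frame rate below are illustrative assumptions, not figures from the project:

```python
# Rough data rate for one stream of uncompressed EXR frames.
# The specific resolution, channel count, bit depth, and frame
# rate here are assumptions chosen for illustration only.

def exr_stream_rate_gbps(width, height, channels, bytes_per_channel, fps):
    """Return the raw pixel-data rate in GB/s (ignoring EXR headers)."""
    frame_bytes = width * height * channels * bytes_per_channel
    return frame_bytes * fps / 1e9

# A 4K UHD, half-float (2-byte) RGBA stream at 24 fps:
rate = exr_stream_rate_gbps(3840, 2160, 4, 2, 24)
print(f"{rate:.2f} GB/s per stream")  # → 1.59 GB/s per stream
```

Under these assumptions, a single stream already approaches 1.6 GB/s of raw pixel data before headers, so several simultaneous streams quickly reach the combined rate quoted in the toolset list at the end of this article.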

Tracking and managing the data to seamlessly augment the digital car was another challenge. This was powered by Mill Cyclops, The Mill’s proprietary virtual production toolkit, which combines The Mill BLACKBIRD, its four mounted RED cameras, ARRAIY's tracking software, and the Unreal Engine. Altogether, this enabled real-time, photoreal lighting and reflection composites that were not previously possible with BLACKBIRD, meaning the director and cinematographer could adjust where to point the camera based on what they saw on the monitor.
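At the bottom of any such composite sits the standard "over" operation for layering a premultiplied-alpha CG render onto a live-action plate. The sketch below is only that textbook operation, not The Mill's actual Cyclops pipeline, which additionally handles tracking, edge wrap, and separate foreground/background motion blur:

```python
# Per-pixel Porter-Duff "over" composite: a CG foreground whose RGB
# is premultiplied by its alpha, layered onto a background plate.
# This is the generic compositing formula, not Cyclops itself.

def over(fg_rgb, fg_alpha, bg_rgb):
    """fg_rgb is premultiplied by fg_alpha; returns composited RGB."""
    return tuple(f + (1.0 - fg_alpha) * b for f, b in zip(fg_rgb, bg_rgb))

# A half-transparent red CG pixel over a grey plate pixel:
print(over((0.5, 0.0, 0.0), 0.5, (0.4, 0.4, 0.4)))  # → (0.7, 0.2, 0.2)
```

Doing this live, with the CG lighting derived from the same on-set camera feeds, is what lets the monitor show the finished-looking car during the shoot.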


Drawing on the responsive nature of gaming, the solution marks an unprecedented shift in the boundaries between production and post, where creatives can render and integrate photoreal assets into live action footage in real time. For Chevy, the result is a futuristic film that features the two cars in a heated race along a winding mountain road.

“This really pushed virtual production to the next level,” said Libreri. “With the data being fed into Unreal, we could produce realistic, nuanced, and high-quality imagery of both the 2017 Camaro ZL1 and the Chevrolet FNR instantly and make changes on the fly. All the light and reflections that would normally be carefully crafted during compositing were being generated live. It’s a much more complex process than simply creating an AR shadow—this is sophisticated, CG image-based lighting that looks indistinguishable from reality.”

Thompson added: “Through Unreal technology, the real-time renders incorporate the feed from the cameras on set, ensuring that the CG car looks its best because you have the whole environment and all the correct lighting and framing applied. The director can look ahead and see the BLACKBIRD car, and then look on the monitor and see a completely realistic portrayal of the Chevy ZL1 and FNR. You can do the whole shoot as if the cars were actually there. Directors, cinematographers and actors can now take the guesswork out of CG-intensive scenes.”


The potential for this hybridization of film and gaming goes far beyond car commercials—with implications for all kinds of fields. Audiences can have more control over the creative storytelling in a range of entertainment properties, changing the path of their favorite TV show or film. Consumers can have more flexibility when shopping for a new car or other products, tailoring what they want to see instantly.

“‘The Human Race’ blends cinematic storytelling and real-time visual effects to define a new era of narrative possibilities,” said Angus Kneale, Chief Creative Officer at The Mill in New York. “This is a pivotal moment for film VFX and the coming era of augmented reality production. Using Mill Cyclops, paired with Unreal’s cutting-edge game engine technology, filmmakers are able to see their photoreal digital assets on location in real time. It also means the audience can effect change in films in ways previously unimagined, giving interactive control over vehicles, characters and environments within a live action cinema experience.”

With Unreal Engine, professionals are now poised to create high-end augmented reality content for any application imaginable. Libreri concluded: “What this means for Unreal users is that the engine does not limit you. The only limitations are your skills and your imagination.”

To maximize the efficiency of Unreal Engine within this new environment, several new toolsets were developed, including:

  • Multiple streams of uncompressed EXR images (1.8 GB/s)
  • Dynamic Skylight IBL (image-based lighting) for lighting the cars
  • Multi-element compositing, with edge wrap, separate BG and FG motion blur
  • Fast Fourier Transform (FFT) blooms for that extra bling
  • PCSS shadows with directional blur settings
  • Bent Normal AO with reflection occlusion
  • Prototype for next-generation Niagara particles and FX
  • Compatibility with NVIDIA Quadro graphics cards
  • Support for Google Tango-enabled devices (currently Lenovo Phab 2 Pro)
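As one concrete piece of the skylight IBL item above: image-based lighting ultimately amounts to looking up an environment image by direction. The sketch below shows that lookup for an equirectangular (lat-long) map, a common IBL representation chosen here purely for illustration (Unreal's skylight actually captures to cubemaps):

```python
import math

# Map a 3D view or reflection direction to (u, v) coordinates in an
# equirectangular (lat-long) environment map. The -z-forward, y-up
# convention here is an illustrative assumption, not Unreal's layout.

def direction_to_latlong_uv(x, y, z):
    """Unit direction (x, y, z) -> texture coordinates in [0, 1]."""
    u = 0.5 + math.atan2(x, -z) / (2.0 * math.pi)      # longitude
    v = math.acos(max(-1.0, min(1.0, y))) / math.pi     # latitude
    return u, v

# A direction along +x lands three-quarters across the map, on the horizon:
print(direction_to_latlong_uv(1.0, 0.0, 0.0))  # → (0.75, 0.5)
```

Sampling the environment this way per reflection ray (and convolving it for diffuse response) is what lets the live camera feed light the CG car with the set's own surroundings.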