Image courtesy of Codemasters/Electronic Arts

REALTIME delivers cinematics in record time on Codemasters’ new F1 2021 game

Ben Lumsden
Working on media for a major sports game franchise poses a complicated set of challenges. Not least of these is the weight of expectations from the fans. From soccer aficionados to race car enthusiasts, devotees expect the content around sports games to be as up-to-date as possible with the current season’s players, kits, and sponsors. 

Kudos, then, to the studios tasked with delivering the highest-quality supporting content for these franchises, often getting the information they need just weeks before the deadline. 

With reference material coming in hot just before the start of the racing season, REALTIME faced exactly this challenge on a recent project for Codemasters when creating content for the ‘Braking Point’ story mode as part of the new F1® 2021 game: turn around two and a half hours of cinematics in just a few weeks.

Workflow efficiencies with Unreal Engine 

With its background in producing immersive and lifelike car configurators for the automotive industry, REALTIME is a creative studio that knows a thing or two about crafting realistic CG vehicles. 

That’s one of the reasons the team there was chosen to create five versions of 35 minutes of cinematic cutscenes for the ‘Braking Point’ story mode in racing game F1 2021. Each version showed a different team, with different branding, colors, and uniforms for the players to select from.
“We needed to build in flexibility and be prepared to make fundamental changes to the look of our assets during the final stages of the project,” says Ian Jones, Project Director at REALTIME. “After all, these cutscenes were intended to reflect an F1 season that wouldn't start for many months. Until two months before delivery, we don’t even know what the cars are going to look like or the suits—everything is top secret!”

Roughly two thirds of the work was character-based drama with CG protagonists and a third was track action—cars racing and collisions.
Jones was on the project from pre-production in March 2020 until delivery in May 2021. “The creative process really stepped up with the arrival of the script, which is the point I got involved full-time on the project,” he recalls. 

Racing game developer Codemasters provided bios for each character with directions on their characteristics and what they wanted them to look like, so the first task was casting. REALTIME cast the roles of the key characters and hired a live-action director for the actual shoot, while Jones focused on the technical side of the project.
After a smooth four-and-a-half-day shoot, Jones had to recreate blockouts and previs based on what was captured. The team did a layout with the data loaded onto the proxy characters, and that edit was then created in 3D with all the cameras updated. That was the jumping-off point for everyone else to get involved: the characters were in the right place, the cameras were correct, and they had the edit. At this point, the team was working solely in 3ds Max and Premiere. 

These scenes went to the animators, and when they started to do their fixes and clean up, REALTIME rolled the data into Unreal Engine. “We had the published characters with skeletal meshes inside the engine which could receive the mocap for the animators, with both the body and face shapes,” explains Jones. 

The modelers started out using a traditional pipeline, which meant modeling the whole project in 3ds Max and ZBrush and then importing the entire layout into Unreal Engine as one big scene. “They very quickly realized that Unreal Engine was a lot quicker for propagating a scene and for set dressing,” says Jones. 

As a result, the modelers ended up building the bare bones of the scene in 3ds Max and then using Unreal Engine to do all the set dressing, placing props including tables, chairs, and umbrellas around the various scenes. “Unreal Engine was a very useful tool for them,” says Jones.
The team also discovered efficiencies in using a game engine when it came to templating for the characters. “The structure of Unreal Engine and that it’s based on game design meant that we were able to build a library of characters that all shared the same skeletal mesh,” says Jones. “We could then pick a character, drop it into a scene, and load a performance onto it. That made it so easy to deal with the Unreal Engine process and the scene development.”

It also meant that once the team had completed the first version with one racing team, they could swap them out relatively easily. 

“It was literally a few clicks to swap a character from one team to another with the same animation once all the alternative assets had been created and published in UE4,” says Jones. “The only difficulty was that the teams went from having a bright white garage to having a dark grey or black garage in the background, so we had to re-light a lot of the scenes to get the light on the characters to work—but again, it all happened in real time.”

Performance capture and real-time rendering 

While the project entailed many processes that REALTIME was already comfortable with, producing over thirty minutes of animated cutscenes was a step change in terms of scale. 

“Just the amount of scene development, lighting, character work, and rendering—we couldn’t have done it without Unreal Engine,” says Jones. “We could not have turned around that amount of rendering.”
The team had to have one version six weeks from delivery and then the other four versions at delivery—each version at 35 minutes. For Jones, doing it any other way than with real-time rendering was unthinkable. “We literally rendered two and a half hours of additional footage in the last few weeks,” he says. 

“It also meant that we didn't need to manipulate our renders, and we avoided a complex compositing process. If we’d used CPU rendering, we’d have had to render it out in layers and do a lot of it in comp. It would have meant cheating a lot of things. With UE4, we could just render the whole thing out as beauty passes.”

The other critical requirement on the project was the ability to produce the highest-quality performance capture possible. “Part of our pitch to Codemasters was that for the best possible result, you want the absolute most singular performance capture,” explains Jones. 

The team knew they wanted to cast someone who would perform the body and facial mocap at the same time. They also knew that world-leading facial performance capture company DI4D had a head-mounted camera (HMC) system, so they caught up with the company to see how far along it was. 

“Luckily, they had a new system, PURE4D, that they wanted to test out, so it was the perfect opportunity for us to collaborate,” says Jones. 

Instead of an actor sitting in a chair and miming along to actions that they’d previously performed, they did the whole performance at once. “We used the DI4D HMC, which is a helmet fitted to the head to reduce motion, with a boom that comes out in front of the head with two cameras and a light,” says Jones. “Those two cameras are medical-grade, high-resolution digital cameras. With mocap suits, you usually have markers on different points of the body to describe the motion, but these cameras can pick up skin pores.”

The output from these cameras is uncompressed 2K raw files. Between the two cameras, the system triangulates the face with all those points and completes a scan at 60 fps. 

“Traditionally, their system outputs a point cache, which is nice for pre-rendered data, but it’s a bit hefty for Unreal,” explains Jones. “DI4D had a new piece of tech that they wanted to test out with us, which was a hybrid—a combination of scan data and then facial recognition, driving morph targets.”

This morph-target-based system was compatible with Unreal Engine, so rather than having to move huge Alembic cache files out of 3ds Max and into the engine, the team was able to import animation data to drive and combine any number of blend shapes—as morph targets are also known—within Unreal Engine to achieve the required expression for each frame. “This fundamental shift from Alembic to keyed blend shapes made the entire process more manageable, and the whole Unreal pipeline worked because of it,” says Jones. 
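The idea behind driving morph targets with keyed weights can be sketched in a few lines of plain Python. This is only an illustration of the general technique, not the DI4D or Unreal Engine implementation; vertex positions are simplified to one number per vertex, and the target names are made up.

```python
# Illustrative sketch of morph-target (blend-shape) blending: each frame
# supplies a weight per target, and the final expression is the neutral
# mesh plus the weighted sum of each target's per-vertex deltas.

def apply_morph_targets(neutral, targets, weights):
    """Blend a neutral mesh with weighted morph-target deltas.

    neutral: list of vertex positions for the neutral face
    targets: dict of name -> per-vertex deltas (target minus neutral)
    weights: dict of name -> weight for this frame, typically in [0, 1]
    """
    result = list(neutral)
    for name, delta in targets.items():
        w = weights.get(name, 0.0)
        if w == 0.0:
            continue  # inactive targets contribute nothing
        for i, d in enumerate(delta):
            result[i] += w * d
    return result

# Hypothetical example: a three-vertex "face" with two targets.
neutral = [0.0, 0.0, 0.0]
targets = {
    "smile": [0.0, 1.0, 0.0],     # moves vertex 1
    "jaw_open": [0.0, 0.0, 2.0],  # moves vertex 2
}
frame_weights = {"smile": 0.5, "jaw_open": 0.25}
print(apply_morph_targets(neutral, targets, frame_weights))  # [0.0, 0.5, 0.5]
```

Because only the per-frame weights need to be streamed, the animation data stays tiny compared with an Alembic cache that stores every vertex position on every frame.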

Fast and flexible workflows 

Key information for the 2021 Formula One season would not be available until the last few months before this project was due for release. REALTIME had always anticipated coming under pressure in these final months, as they knew from the very beginning to expect to deliver additional versions of their scenes. Unlike many of their other projects, they wouldn't be able to lock designs or visuals early in the project. 

It was essential that the team had the flexibility to swap in the correct assets at the last minute. All the race action was developed using hybrid generic cars, because they didn’t know what the final cars were going to look like until just before delivery. 
“For the suits, Codemasters gave us their best guess at what they may look like, but we didn’t know until March 2021 when the manufacturers revealed the cars and branding,” recalls Jones. “We couldn’t make physically accurate versions of things until then. This meant that we were on the project for 10 months before we had the final reference material.”

The speed and flexibility of real-time rendering technology became crucial to getting the job done on time. “The UE4 toolset is designed to do this kind of job,” says Jones. “Having an asset library and an asset pool that we can dip into, pull characters in, load animation onto them, and get real-time feedback was incredibly helpful. The fact that we could build the project in UE4 and use that for the scene development and for the lighting and get that instant feedback made the job doable.”
The team made extensive use of Sequencer, which they set up for each shot using Python. Python was used to communicate with ShotGrid to get the information needed to build each sequence and to check that the latest animations were in use. All shots were rendered using an early version of the Deadline plugin for Unreal Engine. “We also used ray tracing heavily, because these are all shiny cars and we wanted environments to be realistic,” says Jones. 
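The "check the latest animations are in use" step is the kind of glue code this describes. The sketch below is hypothetical: a real pipeline would query ShotGrid through its Python client and then build the sequence with Unreal's Python API, but here the query result is mocked so the version-resolution logic stands on its own. Shot names, field names, and the function are all illustrative.

```python
# Hypothetical sketch: resolve the newest published version of each
# animation per shot, as a Sequencer-building script would before it
# assembles the shot. The list mimics a ShotGrid-style query result.

def latest_animations(published_files):
    """Return the highest-version entry for each (shot, animation) pair."""
    latest = {}
    for pf in published_files:
        key = (pf["shot"], pf["name"])
        if key not in latest or pf["version"] > latest[key]["version"]:
            latest[key] = pf
    return latest

# Mocked query result, in place of a real ShotGrid call.
mock_query = [
    {"shot": "bp_0010", "name": "hero_anim", "version": 3},
    {"shot": "bp_0010", "name": "hero_anim", "version": 5},
    {"shot": "bp_0020", "name": "rival_anim", "version": 2},
]
latest = latest_animations(mock_query)
print(latest[("bp_0010", "hero_anim")]["version"])  # 5
```

With the latest versions resolved, a per-shot script can warn when a sequence references a stale animation instead of relying on artists to spot it.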
The team used the Blueprint visual scripting system for everything from component-based character assembly with livery swaps and car assembly to crowd creation and management. “Not having many coders on our team meant that the ability for an artist to set up each character to follow a set of universal rules defined in Blueprints was invaluable,” says Jones.

For example, Blueprint provided a way for the team to set up characters in advance so that liveries and clothing could be swapped as required by the artist responsible for the shot, without fear of things getting forgotten or misplaced. 
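The benefit of that data-driven setup can be sketched outside Blueprint, too. The snippet below is a simplified, hypothetical illustration of the swap the article describes: because every character shares the same skeletal mesh, retargeting a shot to another team is just a lookup that changes which assets are referenced, while the animation stays untouched. All asset paths and team names are invented for the example.

```python
# Illustrative sketch of a data-driven livery/suit swap. Because the
# characters share one skeletal mesh, the same performance plays back
# regardless of which team's assets are assigned.

TEAM_ASSETS = {
    "team_a": {"suit": "/Game/Suits/TeamA", "livery": "/Game/Liveries/TeamA"},
    "team_b": {"suit": "/Game/Suits/TeamB", "livery": "/Game/Liveries/TeamB"},
}

def retarget_shot(shot, team):
    """Return a copy of a shot description pointing at another team's assets.

    The 'animation' reference is deliberately left alone: swapping teams
    never touches the captured performance.
    """
    assets = TEAM_ASSETS[team]
    return {**shot, "suit": assets["suit"], "livery": assets["livery"]}

shot = {
    "name": "bp_0010",
    "animation": "/Game/Anim/bp_0010",
    "suit": "/Game/Suits/TeamA",
    "livery": "/Game/Liveries/TeamA",
}
swapped = retarget_shot(shot, "team_b")
print(swapped["livery"])     # /Game/Liveries/TeamB
print(swapped["animation"])  # /Game/Anim/bp_0010 (unchanged)
```

Keeping the mapping in one table is what makes the "few clicks" swap safe: nothing about an individual shot has to be edited by hand, so nothing gets forgotten or misplaced.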

“This was especially useful at the end of the project when a very fast turnaround of renders in various liveries was required,” says Jones. “Also, we would not have been able to contemplate this project without the hair and groom system in UE4, which was (when we first began the project) only in the Experimental phase but was thankfully developed during the course of the project.”

You can see just how far the engine’s hair and fur system, which is now production-ready, has come in this short film by Weta Digital. 

The sky's the limit 

REALTIME has always prided itself on its high-quality rendering. The team first started using Unreal Engine when creating visuals that were entirely representative of a game. For example, a lot of their work for RARE was required to sit seamlessly alongside the game developer’s own game, so the choice of Unreal Engine was more about matching the client’s technology. 

“F1 2021 was our first major cinematic project where UE4 was used to replace a CPU renderer, and we were able to make that decision because it can now do hair and ray tracing, and the image quality is incredible,” says Jones. “Even though it's a game, the sky was the limit as to how good Codemasters wanted it to look.”

    Get Unreal Engine today!

    Get the world’s most open and advanced creation tool. 
    With every feature and full source code access included, Unreal Engine comes fully loaded out of the box.