
FOX Sports kicks off the NFL season with a groundbreaking multicam virtual production studio

You’ve chilled your favorite beverage, ordered pizza, and invited your friends over. After months of anticipation, you finally turn on the TV. It’s the start of a new NFL season, and like millions of Americans, you’ll probably be watching FOX NFL Sunday. FOX Sports’ pregame show has been an integral part of game day for almost three decades, with previews, analysis, and panel discussions that have won four Emmy Awards.

This year, however, the show’s impressive broadcast coverage had something different going on behind the scenes. For the first time, FOX NFL Sunday was produced in the network’s brand-new LED virtual production facility powered by Unreal Engine. Featuring groundbreaking multicam capabilities, the facility took more than six months and 30,000 combined staff hours to complete.
 

The start of a new era

For Gary Hartley, Executive Vice President and Creative Director at FOX Sports, the move to the new facility is just another example of FOX’s innate ability to combine storytelling with unique visuals. “When the sports division started, it was run by a guy named David Hill and his approach was innovative,” Hartley explains. “He would always question things. He’d ask ‘why don’t I know what the score is?’ and get broadcast graphics to help add it to the screen. Back then, our mission statement was ‘same game, new attitude.’ That has been the underlying ethos at FOX Sports for almost 30 years: To not only show the game, but push the envelope of what a sports broadcast can be.”

This ethos was evident in 2019, when FOX Sports became one of the first broadcasters to make significant inroads into game-engine graphics after seeing Unreal Engine technology demonstrated at NAB. The resulting virtual NASCAR set in Charlotte, NC, gave the team the perfect opportunity to experiment with real-time technology. The only problem? The studio was constructed with green screens rather than LEDs. That meant issues with green spill and no way for talent to see or interact with the visuals on set.

“Before long, we noticed there had been a lot of noise made around LEDs and XR,” remembers Senior Vice President of Graphic Technology and Integration at FOX Sports, Zac Fields. “We knew people were using LEDs in the film and episodic world, but no one was doing it live yet. We saw a couple of early demos and knew we had to try this new technology out. By using LEDs, our team could easily bring talent to any environment and get photoreal lighting effects on set, without having to worry about keying a green screen or pre-rendering complex elements like player graphics ahead of time.”
 

A multicam production 

Built on the Fox lot in Los Angeles, FOX Sports’ new virtual production studio spans two stories and features 5,130 sq. ft. of LED wall and floor panels powered by 25 Unreal Engine instances for clustered rendering, so visuals can be displayed in 4K. There are seven AR-ready cameras on set, including four that can handle live AR/XR within the LED volume. All tracking, AR compositing, and color matching are handled in StypeLand, while Erizos is used for on-set studio control. Outside the volume, 11 Vizrt engines power the additional LED screens.
“The LED wall at the new studio is 10,000 pixels across without even counting the floor. It's just a massive thing to output to 4K,” says Alex Seflinger, Lead Technical Artist on the project. “It was already a complex setup, but we wanted to do even more. We wanted to make the volume work with multicam.”
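To get a feel for the scale Seflinger describes, here is a back-of-the-envelope sketch in Python. The 10,000-pixel wall width comes from his quote above; the 3,840-pixel-wide output per render node is an assumption for illustration, not the studio’s actual output slicing.

```python
# Rough check of why the wall needs clustered rendering: a single UHD-wide
# output (assumed 3,840 px) is far narrower than the wall itself.
import math

WALL_WIDTH_PX = 10_000          # LED wall width quoted above (excluding the floor)
NODE_OUTPUT_WIDTH_PX = 3_840    # assumed UHD-wide slice per render node

columns_needed = math.ceil(WALL_WIDTH_PX / NODE_OUTPUT_WIDTH_PX)
print(columns_needed)  # -> 3: at least three side-by-side 4K-wide outputs just to span the width
```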
Building multicam capabilities into an LED volume, however, was something that had never been attempted before in live broadcast. “In past iterations of LED, the director does not typically see the live preview before it is displayed on the LED wall,” Seflinger continues. To make the virtual stage behave like a broadcast, the team needed multiple camera feeds that could be previewed and controlled, each looking correct from its own angle on the LED wall, so the director could cut between cameras while the show was broadcasting live.
To do this, the team used GhostFrame from MegaPixel: a new technology that can interleave four different 60 Hz video streams onto a single LED wall running at a 240 Hz refresh rate. Once the engineering team has assigned each frame, a director can see four separate camera feeds, each with its own unique perspective, all appearing in real time.
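Conceptually, this kind of multiplexing amounts to giving each camera its own slice of the wall’s refresh cycle, with each camera’s shutter synced to its assigned slots so it only ever photographs its own background. Below is a minimal Python sketch of that scheduling idea; the feed names and round-robin slot rotation are illustrative assumptions, not GhostFrame’s actual protocol.

```python
# Toy model of time-multiplexing four 60 Hz camera feeds onto a 240 Hz LED wall.
# The slot rotation and feed names are simplified illustrations, not GhostFrame's
# actual implementation.
WALL_REFRESH_HZ = 240
FEEDS = ["camera_1", "camera_2", "camera_3", "camera_4"]  # hypothetical feed names

SLOT_DURATION_MS = 1000 / WALL_REFRESH_HZ      # ~4.17 ms per wall refresh
FEED_RATE_HZ = WALL_REFRESH_HZ / len(FEEDS)    # each feed still lands at 60 Hz

def feed_for_slot(slot_index: int) -> str:
    """Return which feed's background the wall shows during a given refresh slot."""
    return FEEDS[slot_index % len(FEEDS)]

for slot in range(8):  # two full cycles of the rotation
    print(f"slot {slot} ({slot * SLOT_DURATION_MS:.2f} ms): {feed_for_slot(slot)}")
```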

“When we first thought of using multicam on a virtual production, we thought it would be impossible,” says Fields. “Luckily, we had GhostFrame technology. It was the secret sauce to making the live broadcast happen.”

More dynamic sets

Once the team had GhostFrame working, the next step was to extend the set so that augmented reality (AR) elements could be layered on top of each shot. “That meant that in addition to the 16 Unreal Engines running the LED walls, we had another seven compositing engines that layered AR elements onto the ceiling of each scene, and another two applied to the monitors, so we could display different scenes with parallax,” adds Daryl Moore, Vice President of Systems Engineering at FOX Sports.
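Putting Moore’s numbers together gives a rough picture of how the cluster is divided up. The role labels below are illustrative groupings based on his description, not FOX’s actual node configuration.

```python
# Rough inventory of the render cluster, grouped by the roles described above.
# The labels are illustrative, not FOX's actual node configuration.
engine_roles = {
    "led_walls_and_floor": 16,  # nodes driving the in-camera LED pixels
    "ar_compositing": 7,        # nodes layering AR elements (e.g., ceiling extensions) onto each scene
    "monitor_parallax": 2,      # nodes feeding the on-set monitors with parallax scenes
}

print(sum(engine_roles.values()))  # -> 25 Unreal Engine instances in total
```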
To minimize latency and reduce the points of failure in such a large setup, the FOX Sports team built every Unreal Engine machine to be identical, with the exact same internals and graphics cards. The only difference was which part of the volume each engine fed, whether that was AR elements or the LED walls and floor.

“A big thing I’ve noticed with Unreal Engine is that it is a very stable product day in and day out,” Moore adds. “We could build rich, layered scenes that had virtual graphics everywhere from the floor to the sky above the talent. That meant we could tell better, more dynamic stories, all while knowing we could trust Unreal Engine to render it all without crashing live on air.” 

Changing the future of sports broadcasting

According to Seflinger, building FOX’s new virtual production studio was a risk that has massively paid off. “There's a lot that goes into a sports broadcast show, especially one like FOX NFL Sunday,” he explains. “Everything is very timed, very methodical. It's a show that’s been running for 30 years and here we were, trying to introduce this brand new virtual element to it that hadn't been done before—on any show.”
Less than a month after the show’s debut, the stage has been such a success that the team plans to use it for other sports like baseball and basketball, and is already preparing to experiment with more interactive AR. “We see ourselves as storytellers and Unreal Engine is giving us the tools to tell our stories in brand new ways,” Fields concludes. “Five years ago we saw the potential of real-time technology with our NASCAR stage. Now, with our new LED volume, we can do just about anything.”
