Image courtesy of The Famous Group

Mixed reality puts the purrr-fect twist on Carolina Panthers games

Craig Laliberte |
December 16, 2021
The first Sunday of the NFL season is always full of exciting surprises, but this year one in particular had more teeth than most. As the Carolina Panthers took on the New York Jets, 70,000 fans at Bank of America Stadium were wowed by a giant mixed reality (MR) panther that leaped onto the Jumbotron and pounced on a Jets flag before bounding away with a roar. A video of the spectacle went viral, racking up 5.9 million views on Twitter alone.
 

“Right now, mixed reality has a big wow factor,” says Greg Harvey, CIO and co-founder of The Famous Group, one of the key partners behind the impressive display. “Fans have that ‘WTF just happened!?’ moment. People don’t really get how this is happening and how these digital creatures interact with the physical world.”

Well-executed mixed reality experiences are like the T-Rex in the original Jurassic Park, he explains—they light up the imagination and blur the lines of reality. Panthers fans love the MR panther appearing right before the game starts. “They get super fired up now and start yelling and screaming as soon as they hear its roar,” says Harvey. “The impact is much more powerful than running traditional content on the video boards.” 

For Jake Burns, the fans' response has been even better than expected. “We’re always looking for new and innovative experiences for our fans on game day and we felt strongly that the mixed-reality panther would be well received, but the reaction was even better than we hoped,” says the Chief Revenue Officer for Tepper Sports & Entertainment, the Carolina Panthers’ parent company. “It has been amazing.”

Mixed reality events are here

Harvey believes we’re only just starting to scratch the surface of what’s possible when real-time technology and live events collide. “There is a seismic shift happening right now,” he says. “Immersive is the future of the live events industry and we will see over time what form that will take—whether it is augmented reality, virtual reality, mixed reality, a combination, or something completely new.” 

The Famous Group had the foresight to see this shift coming back in 2019. Their first Unreal Engine project was a mixed reality live experience for the Baltimore Ravens, which saw a virtual raven swoop down from the sky and land on the physical goal post in the end zone of the Ravens’ M&T Bank Stadium. 

This led the company to a slew of MR projects, including one for Super Bowl LIV at Hard Rock Stadium in Miami and their first permanent mixed reality installation at NRG Stadium, home of the Houston Texans.

Interestingly, it was the prospect of new sponsorship revenue that drove the Texans’ decision to install the new MR system. “There is a fun Kroger shopping cart race on a Hot Wheels-style track that fills the entire stadium, a Texas Lottery activation in which a live lottery game is played out on the field, and a Waste Management experience with the trash truck on the field,” explains Harvey.

“These would have been done as more traditional sponsorship activations on the video display boards, but bringing them to life inside the physical venue through MR adds more impact in the venue and beyond, more immersion into the story of the visual experience, and more excitement.”

The eyeball-pulling power of immersive technologies was not lost on the Carolina Panthers, either.

“During our conversations, we got, ‘Could we use this technology to bring one of the panther statues outside the stadium to life and have it terrorize the opposing team? Wouldn’t that be cool? Something like the raven but next level?’” recalls Harvey. “And that was the birth of the MR panther that debuted at the first home game and melted down the internet!”
 

How is mixed reality used in events?

Unleashing the MR panther on the stadium relied on the traditional post-production concept of compositing animations onto video footage, but used the power of Unreal Engine and physical tracking technology to accomplish the process in real time.
Image courtesy of The Famous Group
To bring the panther to life, TruePoint Laser Scanning did a full 3D scan of the stadium and panther statue outside it. This provided a scale model of the panther and all the physical elements with which it would interact. 

Then, the animation team at Zoic Studios used that 3D model of the stadium as a guide for laying out the path of the panther and pre-visualizing how to direct the camera operators to capture the motion live. 

Zoic crafted the panther’s every movement—from jumping onto the scoreboard, grabbing the visiting team’s flag, and ripping it to shreds to releasing a ferocious roar and exiting the stadium. “The flag was a key element and could not have been done using traditional rendering techniques without dozens of iterations,” says Scott Rosekrans, CG Supervisor at Zoic Real Time Group.
Image courtesy of The Famous Group
“We developed a shader in Unreal to allow us—or in this case, the end user running the XR scene—to swap team logos, adjust the flag’s shape and size, and layer three different levels of masking to animate the tearing fabric to a realistic degree before geo-swapping to an actual torn flag.” 
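As a rough illustration of the three-level masking idea, here is a hypothetical Python sketch. The production effect is an Unreal material shader; the grids, staged progress windows, and thresholds below are invented purely to make the concept concrete.

```python
# Hypothetical sketch of layered-mask tearing, approximated on a tiny
# grayscale "flag" grid. In the real project this logic lives in a shader.

def tear_opacity(mask_layers, progress):
    """Combine stacked mask layers into per-pixel flag opacity.

    mask_layers: same-sized 2D grids of floats in [0, 1], each describing
    where the fabric tears at a different stage (three levels, per the article).
    progress: 0.0 (intact flag) .. 1.0 (fully torn, ready to geo-swap).
    Returns a 2D grid of opacities: 1.0 = fabric present, 0.0 = torn away.
    """
    h, w = len(mask_layers[0]), len(mask_layers[0][0])
    stages = len(mask_layers)
    out = [[1.0] * w for _ in range(h)]
    for i, layer in enumerate(mask_layers):
        # Each layer activates during its own slice of the progress range.
        start = i / stages
        local = min(max((progress - start) * stages, 0.0), 1.0)
        for y in range(h):
            for x in range(w):
                # A pixel tears once the animated threshold passes its mask value.
                if layer[y][x] < local:
                    out[y][x] = 0.0
    return out
```

Swapping team logos would then just be a texture-parameter change on the same material, with the masks untouched.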

After Zoic was done with the animation, a lighting artist at The Famous Group fine-tuned the lighting on the panther inside Unreal Engine to match the stadium’s lighting at the time of each activation, making the CG feline look natural inside the stadium on game day.
Image courtesy of The Famous Group
Meanwhile, the panther’s roar was created by the sound design team at Field Day Sound, who made sure the sound effect would work inside an echoey open-air stadium, blasting through PA speakers.

As part of the physical integration of equipment onsite, The Famous Group installed tracking and media server systems purpose-built to tie into the team’s broadcast control room. You can get a sense of the drama and atmosphere in the control room for a mixed-reality activation like this in this video.

Quince Imaging provided consultation and strategy on the permanent installation of this hardware and stYpe provided camera heads that included encoders for AR data output. These two companies were responsible for making all the components connect and talk to each other inside the stadium.

At each camera location, the tracking systems capture all the pan, tilt, focus, and zoom data in real time and send that information to the media servers. The media servers ingest the tracking data and update the virtual camera to mirror the same position as the physical camera. Then, the live camera feed is overlaid with the tracked animation to achieve the real-time composite of the panther animation.
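The loop described above can be sketched in simplified form. This hypothetical Python example mirrors tracking data onto a virtual camera and performs a standard alpha-over composite; the class and field names are illustrative, not the actual stYpe or media server APIs.

```python
# Illustrative sketch of the tracking-to-composite pipeline (not a real API).
from dataclasses import dataclass

@dataclass
class TrackingPacket:
    pan: float    # degrees
    tilt: float   # degrees
    focus: float  # normalized 0..1
    zoom: float   # focal length in mm

@dataclass
class VirtualCamera:
    pan: float = 0.0
    tilt: float = 0.0
    focus: float = 0.0
    zoom: float = 50.0

    def mirror(self, packet: TrackingPacket) -> None:
        # The media server updates the virtual camera every frame so the CG
        # render lines up with the physical broadcast camera.
        self.pan, self.tilt = packet.pan, packet.tilt
        self.focus, self.zoom = packet.focus, packet.zoom

def composite(live_pixel: float, cg_pixel: float, cg_alpha: float) -> float:
    """Standard 'over' operation: CG panther layered over the live feed."""
    return cg_pixel * cg_alpha + live_pixel * (1.0 - cg_alpha)
```

In practice this runs per frame for every camera position, with the render matching lens distortion and field of view, not just pan and tilt.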

By integrating Pixotope with Unreal Engine, the virtual stadium can be adjusted to fit precisely onto the real stadium, allowing for pinpoint precision in the panther’s movement and touch points during the activation.
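To give a flavor of that alignment step, here is a deliberately simplified 2D sketch: two surveyed reference points determine a similarity transform (scale, rotation, and translation) mapping virtual coordinates onto real ones, using complex-number arithmetic. The actual calibration is 3D and handled inside Pixotope; everything here is an invented toy.

```python
# Toy 2D illustration of fitting a virtual model onto real-world coordinates.
# Points are complex numbers: x + yj.

def fit_similarity(v1: complex, v2: complex, r1: complex, r2: complex):
    """Return a function mapping virtual points onto real-world points,
    given two virtual/real point correspondences."""
    s = (r2 - r1) / (v2 - v1)   # combined scale and rotation
    t = r1 - s * v1             # translation
    return lambda v: s * v + t

# Example: the virtual model is half-scale and offset from the real stadium.
to_real = fit_similarity(0 + 0j, 1 + 0j, 10 + 5j, 12 + 5j)
```

With the transform fixed, any point on the panther’s authored path lands on its intended physical touch point, such as the top of the scoreboard.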
Image courtesy of The Famous Group
The fans inside the stadium participate in the MR experience by watching on the large video boards. “There’s really no way to integrate MR into a live mobile device experience but this will be coming,” says Harvey. “We’re working on several concepts to gamify the experience. This will enable fans in the stadium to not only see the mixed reality but play with it in real time.”

Unreal Engine for mixed reality experiences

In events—and especially live sports—the same content is traditionally played at every single game. Using Unreal Engine, The Famous Group can make the content different for each event, influenced by a wide range of inputs. For example, the virtual panther tears up a different team’s flag at each home game and follows a variety of animation paths around the stadium.

“This could not be achieved as efficiently in a traditional VFX post-production pipeline,” says Harvey. “The game engine gives us a bigger sandbox to create content that is dynamic and unique each time it is experienced by the fans.”

Using a game engine also enables the team to create more immersive and interactive experiences by programming game logic into projects. “That opens up a completely new level of interaction and takes an experience from something that is linear and passive to something that is gamified and active,” explains Harvey.
Image courtesy of The Famous Group
Harvey also believes that the digital world is on the cusp of a fundamental change—and that game engine technology will be a core part of that evolution. “Unreal Engine, in our opinion, will be used to create the Metaverse in whatever form that takes place in the future,” he says. “In our industry, digital twins and virtual experiences are the future. Augmented reality, virtual reality, mixed reality, and the Metaverse are all powered, or will be powered, by game engines.”

Driven by real-time technology, those experiences are set to become more and more interactive. “Unreal Engine is leading the path forward,” says Harvey. “We have a vision of putting a real-time layer over the live event and having the content assets move from the display board to the mixed reality system to a mobile web AR experience to a web-based real-time digital twin. Unreal will make all this possible for us!”

Once fans can actively control and insert themselves into these live immersive experiences in real time, an entire universe of creative expression will open up. “We are only just in the first few minutes of the Ready Player One movie,” concludes Harvey. “Our world will be augmented in ways we can’t even imagine. And the rockets that are driving the new immersive universe forward are powered by game engines like Unreal.”
