Unreal Engine-powered visuals for Phish at Sphere.
Rich Fury / Sphere Entertainment
Spotlight
December 12, 2024

Moment Factory redefines live concerts with real-time visuals for Phish at Sphere


Beloved Vermont jam band Phish is famous for never repeating a setlist. Songs from the albums evolve into something fresh and new as the band improvises on them in live gigs.

So when the band landed a four-night run at Sphere in Las Vegas, true to form, the show they delivered was unlike anything ever seen before.

Sphere is a state-of-the-art venue dominated by a giant 250-foot-high LED screen that envelops the audience. Inspired by this epic music arena, the band and an intrepid crew of creative studios set out on a mission: to produce the world’s largest and most immersive digital experience.

In collaboration with Phish, Co-Creative and Show Director Abigail Rosen Holmes, and Lighting Designer Chris Kuroda, Moment Factory co-created the shows, contributing to stage design and content production. Working with the world’s largest screen, the team introduced a future-forward approach to real-time generative video content.

This was to be about way more than delivering great visuals on a very big screen, however.

Like the music, the graphics would organically grow and evolve in the moment. They would need to react in real time to whatever the musicians were playing—no mean feat when the band in question could veer off into a 20-minute jam at the drop of a hat.

Previsualizing the gig in VR

The man heading up the team in charge of the creative ideas and art direction for the Sphere shows was Manuel Galarneau, Multimedia Director at Moment Factory.

Moment Factory has worked on live event content for everyone from Billie Eilish to Kiss, but Galarneau had an inkling the Sphere gigs would be a different beast entirely. “We knew this was going to be an enormous challenge,” he says. “This is the biggest screen on Earth.”
 
Real-time graphics on the screen at Sphere.
Alive Coverage

As well as the sheer scale of the screen, the volume of content required—a quartet of four-hour shows, each with different visuals—was similarly epic in scope.

Zane Kozak, CG Supervisor at Moment Factory, points out that the team would ordinarily have a long run-up to deliver on large formats like this. “At least six months to a year to prep everything, get all our planning lined up, figure out how we’re going to transfer 400 TB of content,” he explains. “But in this case, we had three months.”

Pre-rendering everything for a 16K screen was out of the question—the data footprint would be huge and the process prohibitively time-consuming. “But also, this would defeat the purpose: to follow the band in what they do musically,” says Galarneau. “Unreal Engine was a key tool to generate visuals along with them in this gigantic venue.”

While the interactive nature of the visuals live on stage naturally required the use of a real-time engine, Galarneau notes that real-time was also a lifesaver when it came to exploring creative ideas and previsualizing how they would look at the venue.

Moment Factory had been using Unreal Engine for VR previsualization for a number of years. They’d create a digital double of a venue and use it to assess issues or challenges with a particular location and solve them before arriving on site.

VR previsualization for Phish at Sphere.
Courtesy of Moment Factory

That process became invaluable on the Sphere project, enabling the team to quickly spot any potential problems with scale or animation timing that might arise from projecting onto the Sphere’s gigantic screen and make the necessary adjustments.

“Making any sort of change—let alone big changes—to our pre-rendered content was not going to be a possibility,” says Kozak. “Unreal gave us the freedom to make tweaks and changes and adjust to the art director’s notes and work in a more contextual space.”

When the team had originally tried to pre-render different visuals for the shows, they’d very quickly found that the files were incredibly heavy—way beyond anything they had previously seen. “It takes hours, weeks, days to download anything,” says Galarneau.

In contrast, when working in Unreal Engine, the team could make changes in the build and, the minute after, see it on the Sphere’s screen.

“It was very plug-and-play, quick to iterate and see the variations,” says Galarneau. “From the basement of Sphere to the screen, you were there within minutes. It takes longer to walk than to update the scene.”

The ability to make changes on the fly meant the team was able to experiment with creative ideas right up until the last moment, adding finishing touches like detail blur or additional particles. “Those changes, maybe we wouldn’t have done if we were going with a pre-rendered classic route,” says Galarneau.

Building a game for live event visuals

Developing the visuals for Phish’s show started with a regular setlist, like it would for any band. 

But here’s where things got different: the team didn’t know where the jam was going to go—the band might extend songs or play different versions of them. “We needed to figure out ways to have pieces that could evolve and maybe be anywhere from a five-minute piece to almost 18 minutes,” says Kozak.

Pre-rendered visuals on a set linear timeline would be of little use. Instead, Moment Factory worked with branding and virtual production company Myreze alongside visual experience platform Disguise to come up with a novel idea: construct miniature video games that could be played live along to the music.

“We had a set of tools, analogous to video games, that we could move,” explains Galarneau. “So we’re actually exploring levels and environments like you would do in a video game, and those were the visuals that people were witnessing.”
 
Real-time graphics for Phish at Sphere.
Courtesy of Moment Factory
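To picture what “exploring levels and environments” means in engine terms, here is a minimal, hypothetical sketch in Unreal Engine C++ (not Moment Factory’s actual setup) of a scene that an operator plays rather than plays back: a camera rides a spline path at a live-controlled speed, so the same environment can fill five minutes or eighteen.

```cpp
// Hypothetical sketch only (not the production toolkit): a camera rides a spline
// path at a speed the graphics operator can change live, so the "level" is explored
// in step with the band rather than on a fixed timeline.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/SplineComponent.h"
#include "Camera/CameraComponent.h"
#include "PathExplorer.generated.h"

UCLASS()
class APathExplorer : public AActor
{
    GENERATED_BODY()

public:
    APathExplorer()
    {
        PrimaryActorTick.bCanEverTick = true;
        Path = CreateDefaultSubobject<USplineComponent>(TEXT("Path"));
        RootComponent = Path;
        Camera = CreateDefaultSubobject<UCameraComponent>(TEXT("Camera"));
        Camera->SetupAttachment(Path);
    }

    // Exposed so an operator fader can speed up, slow down, or pause the journey.
    UFUNCTION(BlueprintCallable, Category = "Show")
    void SetTravelSpeed(float UnitsPerSecond) { TravelSpeed = UnitsPerSecond; }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        const float PathLength = Path->GetSplineLength();
        if (PathLength <= 0.f)
        {
            return;
        }

        // Advance along the spline and wrap, so the piece can loop for as long as the jam lasts.
        DistanceAlongPath = FMath::Fmod(DistanceAlongPath + TravelSpeed * DeltaSeconds, PathLength);
        Camera->SetWorldLocation(
            Path->GetLocationAtDistanceAlongSpline(DistanceAlongPath, ESplineCoordinateSpace::World));
        Camera->SetWorldRotation(
            Path->GetRotationAtDistanceAlongSpline(DistanceAlongPath, ESplineCoordinateSpace::World));
    }

private:
    UPROPERTY(VisibleAnywhere) USplineComponent* Path = nullptr;
    UPROPERTY(VisibleAnywhere) UCameraComponent* Camera = nullptr;
    float TravelSpeed = 200.f;     // cm per second, driven live by the operator
    float DistanceAlongPath = 0.f; // current position along the spline
};
```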

As CEO and founder Björn Myreze explains, the in-house experts at Myreze refer to themselves as “virtual engineers.” The firm is driven in large part by a love of tinkering with the nuts and bolts of immersive technology.

It made them the perfect fit to figure out the technical jigsaw puzzle of putting the Phish show together.

Myreze set to work developing a toolkit that would enable an operator to improvise along with the band, symbiotically. But the fact they would not be the ones to use this toolkit posed an immediate challenge.

“Often when we build graphics, we’re the final operators,” says Håvard Hennøy Vikesland, Unreal Engine artist at Myreze. “On this project, we were not. We were going to hand off all of the scenes to someone who didn’t know the inner technical workings.”

Myreze would need to build a console that was simple to use and understand, and that could deliver visuals that blended together seamlessly regardless of the combinations triggered.

To develop this, the team embarked on a process of extensive R&D. “We needed to build systems, mathematics, and shader systems that can be triggered live in a way that’s never been done before,” says Björn Myreze.

After much experimentation, the team settled on ten parameters that could be adjusted to change different aspects of the visuals, including speed, colors, and other aesthetic elements like bloom in the scenes.

These were controlled by a lighting board with multiple sliders that was connected to Unreal Engine’s DMX plugin to direct the lighting and effects on the screen. “Essentially, we are creating a game for the graphics operator,” says Vikesland. “And the goal of the game is to jam along with the band.”
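As a hedged illustration of that mapping (the show itself ran through Unreal Engine’s DMX plugin and a much larger toolkit), the snippet below shows how a few normalized fader values between 0 and 1 might be pushed into material and post-process parameters such as speed, tint, and bloom. The parameter names are hypothetical.

```cpp
// Illustration only: in the real show the faders arrive via Unreal Engine's DMX plugin;
// here the receiving side is reduced to normalized 0..1 channel values that get mapped
// onto material and post-process parameters. "Speed" and "Tint" are made-up names.
#include "CoreMinimal.h"
#include "Materials/MaterialInstanceDynamic.h"
#include "Engine/PostProcessVolume.h"

void ApplyOperatorChannels(const TArray<float>& Channels,           // normalized 0..1 fader values
                           UMaterialInstanceDynamic* SceneMaterial, // the scene's master material
                           APostProcessVolume* PostProcess)         // venue-wide post effects
{
    if (Channels.Num() < 3 || !SceneMaterial || !PostProcess)
    {
        return;
    }

    // Channel 0: animation speed, remapped from 0..1 to a useful playback range.
    SceneMaterial->SetScalarParameterValue(TEXT("Speed"), FMath::Lerp(0.1f, 4.f, Channels[0]));

    // Channel 1: hue, converted into an RGB tint for the scene.
    const FLinearColor Tint =
        FLinearColor::MakeFromHSV8(static_cast<uint8>(Channels[1] * 255.f), 255, 255);
    SceneMaterial->SetVectorParameterValue(TEXT("Tint"), Tint);

    // Channel 2: bloom intensity on the post-process volume.
    PostProcess->Settings.bOverride_BloomIntensity = true;
    PostProcess->Settings.BloomIntensity = FMath::Lerp(0.f, 8.f, Channels[2]);
}
```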

Taking a game-oriented approach meant every scene of the project had to be built out in a far more mathematical and algorithmic way than would normally be the case. That’s because every visual needed to follow the band, and the band could go anywhere.

“If you have a tree growing, it needs to grow along with the band,” explains Vikesland. “You can’t just have a single growing animation, pre-baked. Because what if the band does something you don’t expect them to do? The graphics won’t follow along. We needed to build systems that are flexible enough to go wherever the band goes.”

That meant everything had to be built parametrically—that is, based on a predefined set of rules.

The team built everything from the ground up. There was to be no baking out of animations from other 3D software—they’d animate using math and shader offsets.
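As a simplified, assumed example of what that looks like in practice (not the production shader), growth can be written as a pure function of a live control value and a coordinate baked into the mesh, so it can pause, reverse, or stretch along with the band:

```cpp
// Assumed sketch of parametric growth: nothing is keyframed. Visibility is a pure
// function of a live "Growth" value (0 = seed, 1 = fully grown) and a coordinate
// stored in the mesh, so the effect follows whatever the operator does in the moment.
#include "CoreMinimal.h"

// LengthAlongBranch: 0 at the trunk, 1 at the branch tips (baked into a UV channel).
// Growth: the live control value. Feather: softness of the growing edge.
float GrowthMask(float LengthAlongBranch, float Growth, float Feather = 0.05f)
{
    // Anything closer to the trunk than the current growth front is visible;
    // SmoothStep softens the front so the growth reads as organic, not clipped.
    return FMath::SmoothStep(LengthAlongBranch - Feather, LengthAlongBranch + Feather, Growth);
}
```

Driving Growth up makes the tree sprout; easing it back down makes it retreat, with no baked animation consulted at any point.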

This novel approach led to some interesting and unique development methods. The bubbles in the show are a case in point. “You look at it and it’s just bubbles,” says Vikesland. “It’s plain. It’s simple. But what’s going on underneath is actually one of the most complicated shaders that we’ve ever built.”

CG bubbles for Phish at Sphere.
Alive Coverage

You might think the bubbles are three-dimensional spheres—but in fact, they’re flat planes.

Using this flat plane and a complicated mathematical setup, the team was able to fake how light would interact with a three-dimensional bubble that had realistic physical properties. “In a sense, it’s ray tracing in the material,” says Vikesland. “But it’s happening on a two-dimensional flat plane. And this ultimately leads to a scene that is extremely performant.”
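A stripped-down sketch of that principle, far simpler than the production material, reconstructs the 3D normal a sphere would have at each pixel of the flat card and then shades with it:

```cpp
// Minimal sketch of "ray tracing in the material" on a flat card: rebuild the normal
// a sphere would have at this point of the disc, then use it for a Fresnel-style rim.
// The production bubble shader is far more elaborate; this only shows the principle.
#include "CoreMinimal.h"

// UV is remapped to [-1, 1] across the card; returns false outside the bubble's outline.
bool FakeSphereShade(const FVector2D& UV, const FVector& ViewDir, float& OutFresnel)
{
    const float RadiusSq = UV.SizeSquared();
    if (RadiusSq > 1.f)
    {
        return false; // this pixel lies outside the circular silhouette
    }

    // The front-facing normal the implied sphere would have at this point.
    const FVector Normal(UV.X, UV.Y, FMath::Sqrt(1.f - RadiusSq));

    // Bright at grazing angles, faint face-on: the rim falloff that makes a soap
    // bubble read as a thin transparent shell even though the geometry is flat.
    const float Facing = FMath::Abs(FVector::DotProduct(Normal, ViewDir));
    OutFresnel = FMath::Pow(1.f - Facing, 3.f);
    return true;
}
```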

It was a similar story when it came to the trees.

“What you would think is, you find a tree asset and you populate the scene with that, and then you have a forest,” says Vikesland. “But a pre-built tree model would never be able to do what we need it to do.”

Houdini-built CG trees at the Phish show at Sphere.
Alive Coverage

Instead, the team used Houdini to build their own tool for creating trees; when exported to Unreal Engine, these trees carried a lot of useful attributes that the team wouldn’t normally have access to.

“You have UV maps describing the length and thickness of the branches and you can use this data to animate the tree in interesting ways,” says Vikesland. “You can, of course, grow it and shrink it. But you can also have patterns moving along the tree in a very psychedelic fashion—you can have fireworks going off along the branches of the tree.”
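A minimal, hypothetical example of how one such attribute could be used: a bright pulse sweeping from trunk to tip, in the spirit of the fireworks Vikesland describes. The names are illustrative only.

```cpp
// Hypothetical use of the per-branch data exported from the Houdini tool: an emissive
// pulse travels from the trunk (0) to the branch tips (1) as PulsePosition is advanced
// live, giving the "fireworks along the branches" effect described above.
#include "CoreMinimal.h"

float BranchPulseEmissive(float LengthAlongBranch, float PulsePosition, float PulseWidth = 0.08f)
{
    // Distance from the travelling front, squashed through a Gaussian-style falloff
    // so it becomes a narrow glowing band that sweeps along the tree.
    const float Dist = FMath::Abs(LengthAlongBranch - PulsePosition);
    return FMath::Exp(-FMath::Square(Dist / PulseWidth));
}
```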

Limitless creative possibilities

None of the tech used in producing Phish Live at Sphere is brand new—game engines have been around for decades and this wasn’t the first gig ever performed at Sphere. 

But with a little imagination and exploration, the teams behind the show produced something that was truly one of a kind. That speaks to the latent creative possibilities afforded by the technology available today. 

For Björn Myreze, reflecting on how his creative process has evolved over time, this is perhaps the most thrilling aspect of the project.

“The funny thing is that back in the day, I felt technology was limiting my imagination,” he says. “Now there are no borders—there is no limitation to where Unreal will go. And I think that is the most exciting thing about it.”
 
