Image courtesy of Epic Records

Sony Music's 'digital Madison Beer' sets the virtual concert world on fire

Craig Laliberte
For pop stars, staying in the public eye means constantly standing out—on social, on playlists, on stage, and more. But in a year when pandemic restrictions became the norm, many artists had to find new ways to reach their fans around the world—sometimes to a jaw-dropping extent.

Among the star-studded livestreams and innovative virtual performances of the last 12 months, Epic Records artist Madison Beer has released what might be the most photorealistic depiction of a musician yet. The Madison Beer Immersive Reality Concert Experience, a groundbreaking, effects-filled virtual performance that premiered on TikTok LIVE and is now coming broadly to YouTube, VR platforms, and more, shows just how far an idea can go when artists set real-time rendering and virtual production loose on their vision.


An ultra-realistic digital avatar of Madison is the centerpiece of a boundary-pushing concert that would be impossible to recreate in real life. Sony Music Entertainment and Verizon worked with Madison to develop a full-scale recreation of New York’s Sony Hall and present a medley of her hits with all the production value you’d expect from a major artist. Only it’s completely virtual—except for the music and performance driving the experience.
Image courtesy of Epic Records
For creatively adventurous artists seeking new and innovative ways to connect with audiences, that can be a good thing. While most concerts are limited by worldly constraints, a virtual concert can be whatever an artist wants it to be, giving them the power to shape fan experiences and realize fantastical concepts at a much higher level than is possible in real life. The Madison Beer Immersive Reality Concert Experience takes this idea and runs with it, turning one piece of content into the type of transmedia campaign that can thrill fans from YouTube to VR. 

Keeping it real 

For all the leeway afforded to them by 3D, the production team—led by Sony Immersive Music Studios, Magnopus, Gauge Theory Creative, and Hyperreal—still saw value in maintaining a measure of realism.

“When we started with a blank canvas, our creative goal was to construct a virtual concert through photoreal recreations of a real venue and a real artist, but which also layered in enough magic to reimagine the concert experience itself,” says Brad Spahr, Head of Sony Immersive Music Studios.

“You start with things that are totally plausible in a physical setting, because that’s what’s going to make your fans get into it and accept the experience,” says Alex Henning, Co-Founder of Magnopus. “Once you’ve got them hooked with that kernel of truth, you start to build on top of that with the fantastical. And the more you can pull off the former, the more ‘wow’ you get out of the latter.”

For Magnopus, this meant the venue and the VFX packages. For Hyperreal, it meant Madison herself.
Image courtesy of Hyperreal and Epic Records
Hyperreal started by capturing Madison’s face and body with two separate arrays of high-resolution camera systems in Los Angeles. The first array captured a volumetric scan of her face, neck, and shoulders, recording photometric data at the sub-pore level. By capturing the way she moved from every angle, Hyperreal gathered enough data to construct an ultra-realistic avatar, or “HyperModel,” that steers clear of the uncanny valley.
With the help of 200 cameras, Madison’s body, muscles, and shape were then recorded in a range of biomechanical positions to ensure deformation accuracy in Hyperreal’s real-time HyperRig system. After adding Madison’s preferred performance gear—outfit, hairstyle, earrings—Hyperreal brought the avatar into Unreal Engine to experiment with movement before the live capture session at PlayStation Studios in LA.
Image courtesy of Hyperreal and Epic Records
While this was happening, Magnopus was hard at work on the venue and VFX systems. Like the HyperModel, the goal was to stay as real as possible to ground the event, so when things like star fields started appearing above Madison, they would seem magical and surprising.
Image courtesy of Epic Records
After considering a full LiDAR scan, Sony Immersive Music Studios decided to construct the venue from scratch to allow them more control over the lighting. They started with the original CAD files, which were imported into Autodesk Maya and given the full artistic treatment, including all the nuances that make Sony Hall unique. Magnopus was then able to build upon that with lighting and VFX to achieve the overall goal of a reimagined concert experience.

“Sony Hall is an intimate venue with a lot of character, detail, and beauty, which made it an ideal environment for the experience,” says Spahr.

“It is also great for VR, because of the scale. It’s not a giant, cavernous arena or a tiny hole-in-the-wall club,” says Henning. “It’s got almost the perfect amount of dimension.”
Image courtesy of Epic Records
Since Unreal Engine would be used throughout the creation process, Magnopus made use of its built-in virtual scouting tools to get their cameras set up so they could test the lighting before diving into the special effects. But first, they needed the performance.

The benefits of virtual production for music 

Unlike most motion capture shoots, where everyone works together in the same space, The Madison Beer Immersive Reality Concert Experience was a remote affair driven by teams across the US. In LA, Madison Beer performed in a mocap suit and head-mounted camera. In Philadelphia, Hyperreal CEO Remington Scott directed her in real time, using a VR headset that let him view Madison’s avatar face-to-face live within the virtual Sony Hall while adhering to the COVID-19 restrictions that were keeping them apart.

Because Unreal Engine operates in real time, virtual productions can use its remote collaboration tools to stream 3D environments anywhere in the world, completely synced across locations. This allowed Madison’s performance to be recorded in one continuous take, with no cuts and no edits, which was important for a team that wanted the performance to feel as authentic as possible.

After the motion capture shoot was completed and the experience was polished, cameraman Tom Glynn was able to build out the shot selections for the final 9.5-minute performance.

“There are moments where you can’t believe this was done in a game engine,” says Tom Glynn, Managing Director at Gauge Theory Creative. “There’s a 3D world with a performance happening, and it’s happening at the same time that I’m moving a camera around. It’s hard to believe what I was seeing in the viewfinder while I was shooting it. I’m looking at an avatar of Madison Beer and it feels like a real person I’m standing in the room with. It kind of blows my mind.”
Image courtesy of Epic Records
In two days, they recorded hundreds of takes, ensuring that they could get any shot they wanted.

“We were hitting the play button, Madison was performing, and Tom was getting his shots in real time. He was instantaneously watching his shots on the monitor directly in front of him. Then we would stop, readjust, hit play, and the concert would go again and he’d get another shot,” says Spahr. “That real-time feedback was huge. If there was one thing about this that was a challenge, it was, ‘I have so many good shots, I don’t know which one to use!’ It was an embarrassment of riches.” 

Glynn was surprised by how easy a virtual production experience could be on a cameraman, especially for a “concert” shoot. Traditionally, a live performance would necessitate five to twelve cameramen set up in strategic parts of the venue with a variety of tripods, dollies, Steadicams, and so on. The team would prepare, shoot it once, and get what they got. In this case, Glynn was able to use all the same equipment, including a handheld rig, but film within a virtual world that allowed for repeated takes.
Image courtesy of Hyperreal and Epic Records
Using Unreal Engine, Glynn was also able to overcome some of the physical limitations of the real world with a few quick commands. For instance, sometimes the shot he wanted was a little above or below Madison’s eyeline. So the team simply increased his size by a factor of 1.2 or 1.5 within the environment, and he was suddenly “tall” enough to get it. Other times, he wanted to keep up with her quick moves without introducing bumpiness into the take. So they increased the translation scale in increments of 1.5–2x until one real-world step equaled seven in the virtual venue. Problem solved.
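The arithmetic behind these tricks is simple to sketch. The snippet below is a hypothetical Python illustration (not actual Unreal Engine code, and the heights and step lengths are assumed values): scaling the operator's size raises the camera's effective eyeline proportionally, and a translation multiplier stretches each physical step into a longer virtual move.

```python
def effective_eyeline(real_height_m: float, size_scale: float) -> float:
    """Scaling the operator up in the virtual venue raises the
    camera's eyeline by the same factor."""
    return real_height_m * size_scale


def virtual_step(real_step_m: float, translation_scale: float) -> float:
    """A translation multiplier stretches each physical step into a
    longer move through the virtual venue."""
    return real_step_m * translation_scale


# An assumed 1.75 m eyeline scaled by 1.2 lands around 2.1 m --
# enough to frame a shot slightly above the performer's eyeline.
print(effective_eyeline(1.75, 1.2))

# With a 7x translation scale, an assumed 0.75 m stride covers
# 5.25 m of the venue, so the operator keeps pace without rushing
# (and without the camera shake that running would introduce).
print(virtual_step(0.75, 7.0))
```

Because the operator's real movement stays slow and smooth, the scaled-up virtual motion inherits that smoothness rather than the bumpiness of a sprinting camera.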

Moment makers 

Once the footage was “in the can,” it was up to Magnopus to sweeten it with effects that would not only catch the eye, but would be impossible in real life. 

“There’s a sequence where there’s a ring of fire around Madison. There’s a moving moment where raindrops are falling around her. These are things that, due to safety issues, wouldn’t be allowed in a normal concert venue,” says Spahr. “So we are giving a fan the ability to see a concert in a new way, but then we dial it up, with cosmic star fields.”
Image courtesy of Epic Records
Magnopus created all the special and lighting effects within Unreal Engine, using real-time ray tracing and the timeline tools in Sequencer, the engine’s built-in multi-track editor, to jump around as they edited different sections of a song. And with Pixel Streaming at its disposal, Magnopus was able to overcome the hardware limitations that box artists in. Pixel Streaming enables you to run an Unreal Engine application on a server in the cloud, and stream it to any browser on any device, just like a YouTube video.

“In real time, you’ve always got a render budget and you can’t go over it. It’s always locked to whatever the target device’s power is: you’ve got a certain amount of things you can render, at a certain level of quality, in order to get the screen to refresh at the rate you need it to,” says Henning. “Being able to exceed that and go well beyond it, and not make choices of ‘this or that,’ but to choose both—as in, I want a photorealistic human render, and I want fire, and I want rain, and I want smoke, and I want atmospherics—is appealing for an artist.”
Image courtesy of Epic Records

One for all

While The Madison Beer Immersive Reality Concert Experience premiered on TikTok LIVE, it was also meant for bigger things. And because it was created in a game engine, the project could easily be adapted for different channels, removing the need for the production teams to recreate the experience for video, VR, and mobile applications.

“If you’re natively working in real time from the beginning, and that’s at the core of all of your deliverables, those can all be slices of the same pie, as opposed to needing to make an entirely new pie every time. It really opens up the door to unifying things in a more interesting way and doing a lot more with less total effort,” says Henning. “The more we can use the same setups to do the VR piece and the mobile AR/interactive version and the pixel stream version and the 2D video extract for YouTube or TikTok, the more you can focus all your energy and creativity on the world itself.”
Image courtesy of Epic Records
But even with incredible experiences like this filtering out into the world, the question remains: how far will virtual concerts go in the next few years? According to Spahr, really far; he sees plenty of opportunities for new shows and new ways to use digital avatars to reimagine music.

“Anything an artist can dream up can be brought to life, no matter how fantastical it might be. We don’t have to operate within constraints,” he says. “We don’t have laws of physics. We don’t have fire code and safety [protocols] that we have to abide by. To be able to sit down with an artist and say, ‘Dream up your fantasy experience for your fan. If you could do anything, what would you want to do?’ and to know that the tools and the technology exist to make that a reality is the most exciting thing for artists and the music industry.”
Image courtesy of Epic Records
