Magnopus brings Amazon’s Fallout series to life with virtual production powered by Unreal Engine

Courtesy of Magnopus/Kilter Films/Amazon MGM Studios
May 13, 2024
For over a decade, Magnopus has been a pioneering technology and creative studio building one-of-a-kind entertainment experiences. 

From immersive VR and AR experiences to all aspects of virtual production (spanning in-camera VFX, virtual art department, and LED volume operations), Magnopus, an Unreal Engine Service Partner, has been entrusted by today’s top filmmakers and brands to push the limits of technology in service of truly memorable content. 

Recently, the studio worked closely with Jonathan Nolan and Lisa Joy’s Kilter Films on virtual production for the blockbuster Amazon Prime Video series Fallout. We caught up with AJ Sciutto, Director of Virtual Production at Magnopus, to discuss the creative and technical process of bringing this dystopian world to life using ICVFX, the real-time flexibility afforded by Unreal Engine, making virtual production work with 35mm film, and much more.
When did Unreal Engine become a tool in the Magnopus arsenal?

We’ve been building virtual production tools in and around Unreal since 2018, before The Mandalorian started shooting. Working with the Epic team, we set out to make a world-class suite of tools that allowed creatives to enter their virtual worlds and tell incredible stories through the virtual lens. ICVFX is one of the tools in that arsenal. When paired with the phenomenal software engineering team at Magnopus, nDisplay and Unreal Engine are as powerful as it gets for putting incredible imagery on LED walls.

When did Magnopus become involved in Fallout, and what role did you play in the show’s virtual production?

We worked closely with Jonathan Nolan and the Kilter Films team on an R&D project in 2020 and collaborated again at the tail end of 2021 on Westworld Season 4. Sometime during the production of Westworld, we began preliminary conversations about Fallout. At the time, only the pilot had been written, and our involvement on the project was much like that of a creative department head. 

We provided creative input for leveraging virtual production across the entire show and worked with the filmmakers to break down the scripts into scenes and environments that would benefit the most from ICVFX. We landed on four environments – the farm and vault door scenes in Vault 33, the cafeteria setting in Vault 4, and the New California Republic’s base inside the Griffith Observatory. Further, anything happening in a Vertibird was a great candidate for LED process shots.
From there, we worked with Howard Cummings and Art Director Laura Ballinger to determine which pieces of the set should be built physically and which should be built virtually. These decisions had an enormous impact on lighting and VFX, so we used concept art and storyboards to coordinate on unifying the two worlds and figure out exactly where the physical/virtual transitions would be.

A big part of this decision is finding where the practical set ends and the virtual one begins. Beyond that, it’s finding where the 3D virtual world ends and the matte painting begins. Our matte painters, Frank Capezzuto III and Rocco Gioffre, came up with designs that allowed us to take full advantage of the LED wall in panoramic shots. Leaning on their work, we were able to use the “sheltered proscenium” approach that lends itself well to shooting with LED.
What was the process for building out the LED volume for Fallout?

We started by assisting in the initial design of an LED volume. Our intention was to design a stage that gave the filmmakers the lighting benefits of a curved volume, along with the flexibility for longer walk-and-talks offered by straight sections of wall. 

The horseshoe shape of our stage became a clear winner, and the final result was an LED wall measuring 75’ wide, 21’ tall, and almost 100’ long. Finding a stage that would house a volume that size became our next challenge. Very few stages were available back then, and that’s when Jeffrey Soderberg and Chris Cox from Manhattan Beach Studios (MBS) stepped up to the task. 

They worked with us and the group at Fuse Technical Group, led by Mitch Lathrop and Koby Kobylko, to design and build a volume specific to the production’s requirements. The result was a modular and flexible LED stage with the most advanced technology available on the market, built in Bethpage, Long Island.
On the compute side, Fuse Technical Group did a fantastic job designing the server room and the video/data distribution of the stage. They custom-built a server room inside a standard shipping container in LA and then shipped it to New York. The twelve 4K outputs for the main LED wall and the six additional 4K outputs for the various wild walls were all driven from that server room, which also housed the machines for the six operator workstations at mission control, connected over fiber. 

We used Roe BP2 v2s for the LED panels, Brompton Tessera SX40s for the processors, and both the Stype RedSpy for camera tracking and the Stype Follower System for object tracking. Our mechatronics team, led by Fabian Flores, designed and 3D printed mounting brackets for the LED beacons of the Follower System for our wild walls so that they acted as dynamic projection surfaces in Unreal that could be moved and reconfigured in real time.
Our engineering team led by Lily Pitts, Guillermo Quesada, and Ross Rosewarne commissioned the stage post-construction, getting all the software components working with the hardware before the first shoot. Once the hardware was in place, we set up a content management system to pull VAD updates from remote team members and push images to the LED wall. 

We used Unreal and nDisplay to render the scene, and Live Link to bring RedSpy data into the engine in real time. During a take, we used Take Recorder to capture tracking data and wrote custom software to wrangle that data into USD for easy export to the VFX teams supervised by Jay Worth and Andrea Knoll. 
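
As a rough illustration of that last wrangling step (not Magnopus’s actual tool), the sketch below uses the open-source USD Python bindings to write per-frame camera transforms into a .usda file that a VFX vendor could ingest. The frame numbers, values, and file name are hypothetical.

```python
# Minimal sketch: writing recorded camera tracking samples into a USD file.
# Assumes the open-source USD Python bindings (pip install usd-core).
from pxr import Usd, UsdGeom, Gf

# Hypothetical tracking samples: frame -> (translation in cm, rotation in degrees XYZ).
tracking_samples = {
    1001: ((0.0, 0.0, 150.0), (0.0, -5.0, 0.0)),
    1002: ((2.5, 0.0, 150.0), (0.0, -5.2, 0.0)),
    1003: ((5.0, 0.1, 150.1), (0.0, -5.4, 0.0)),
}

stage = Usd.Stage.CreateNew("shot_010_trackedCamera.usda")
stage.SetStartTimeCode(min(tracking_samples))
stage.SetEndTimeCode(max(tracking_samples))
stage.SetTimeCodesPerSecond(24)

camera = UsdGeom.Camera.Define(stage, "/World/TrackedCamera")
translate_op = camera.AddTranslateOp()
rotate_op = camera.AddRotateXYZOp()

# Key the camera transform per frame from the recorded tracking data.
for frame, (translation, rotation) in sorted(tracking_samples.items()):
    translate_op.Set(Gf.Vec3d(*translation), time=frame)
    rotate_op.Set(Gf.Vec3f(*rotation), time=frame)

stage.GetRootLayer().Save()
```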

What was the biggest challenge in approaching your work on this show?

The biggest challenge was shooting on film. Shooting on film undoubtedly brought a unique and beautiful aesthetic to the show by softening details, accentuating the artistic grit, and creating separation between the layers of the wasteland. The show itself would feel very different if it were shot digitally, so there was never a question of whether we were going to shoot on film in the volume; it was only a question of how.
Having shot on film a year earlier, we had experience working with Arricam sync boxes and pushing genlock to the camera. The lack of a full-resolution digital monitor on the camera meant that we had to be creative and very precise about dialing in the look of the images on the LED wall. 

Our solution was to apply the show’s film LUT to the output of a Sony Venice camera, using it as a reference to adjust lighting and match color with a combination of OpenColorIO and the color grading tools inside Unreal Engine. We measured the color temperature of the scenes on the LED volume and the foreground set lighting with a color spectrometer for each lighting change to ensure a match. 
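
For a sense of how OpenColorIO can fit into that kind of comparison, here is a small standalone sketch using the PyOpenColorIO bindings to push a reference value through a show look. The config path and color space names are placeholders, not the production setup.

```python
# Minimal sketch: running a reference value through an OCIO transform so the
# LED wall output can be compared against the show's film-emulation look.
# The config path and color space names are hypothetical placeholders.
import PyOpenColorIO as OCIO

config = OCIO.Config.CreateFromFile("show_config.ocio")

# Build a processor from the linear working space into the show's film look.
processor = config.getProcessor("scene_linear", "show_film_look")
cpu = processor.getDefaultCPUProcessor()

wall_patch_rgb = [0.18, 0.18, 0.18]        # hypothetical grey-card patch value
graded_rgb = cpu.applyRGB(wall_patch_rgb)  # the value as seen through the show look
print(graded_rgb)
```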

How did Unreal Engine help with virtual scouting on Fallout?

On Fallout, our virtual sets were built entirely inside Unreal Engine, so we could leverage all of Unreal’s virtual production tools throughout the creative process, including virtual scouting tools that allowed us to put the filmmakers into VR to perform tech scouts and eventually block out the action. 

The tools enabled them to place cameras and characters in precise locations and even lens the shot in real time. We could then save that information and use it to create a heatmap of the environment. That allowed us to identify which areas of the sets were most critical to the story and ensure that we focused our creative attention there, refining the lighting and adding more visual fidelity. 

Knowing this also helped with our optimization efforts further down the line – since we knew which directions we were shooting, we could carve the scene up in modular ways that allowed us to hide computationally expensive set pieces when they weren’t on camera.
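
To picture that heatmap step in the abstract, think of binning where the saved scouting cameras look onto a coarse grid over the set and ranking the cells. The standalone sketch below, with made-up data, is only an illustration of the idea, not the tool used on the show.

```python
# Minimal sketch: binning scouted camera aim points into a coarse 2D grid over
# the set footprint to see which regions are shot most and deserve the most fidelity.
from collections import Counter

GRID_CELL_CM = 500  # 5 m grid cells (hypothetical)

# Hypothetical scouting data: (x, y) points on the set that saved cameras look at.
aim_points_cm = [(1200, 300), (1250, 340), (4100, 900), (1220, 310), (90, 2500)]

heatmap = Counter(
    (int(x // GRID_CELL_CM), int(y // GRID_CELL_CM)) for x, y in aim_points_cm
)

# Hottest cells get lighting and detail passes; cold cells are candidates to hide
# or simplify when they are not on camera.
for cell, hits in heatmap.most_common():
    print(f"cell {cell}: {hits} shots")
```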

The visualization team, led by Kalan Ray and Katherine Harris, built 1:1 scale 3D versions of all the sets in Unreal – even the pieces that were only ever meant to be physical installations. This meant that when the filmmakers wanted to block out a scene, they weren’t limited to just the virtual assets; they could visualize the whole scene and compose meaningful shots that informed every department on the show.

From pre-production, to on-set production, to post – walk us through your workflow and how you kept creative options flexible and adaptable throughout production.

Pre-production was all about planning and visualization. Once sets were earmarked for the volume, our viz team got to work. While our creative team, led by Craig Barron, worked with visual effects and the art department to home in on exactly what the physical and virtual footprints of a given set would be, the visualization team was already working on the technical visualization aspects of the show.

To effectively understand the line between virtual imagery and on-set builds, we needed to work with the real-world dimensions of the stage. So we built a full-scale model of the actual warehouse in Bethpage, Long Island, and fit the true dimensions of the LED wall into it. This helped us plan not just the creative aspects of the show (which angles and lenses were “shooting off the wall”) but also the practical ones: where to place pick points for the stunt team, where to install overhead lighting, and even where the Unreal operator desks should be. 

This also saved us headaches in unexpected ways – for example, by using this process we discovered that the dimensions of the physical doors were too small to fit some pieces of the Vertibird through! Everyone was very happy to discover this with enough time to find a solution.
Getting into the creative visualization work, our Virtual Production Supervisors Kathryn Brillhart and Kalan Ray worked with the directors of each episode to visualize the blocking of some of the more complex scenes. Kat Harris, our Virtual Production Lead, set up the initial blocking and integrated multi-user systems to allow the filmmakers to dive in alongside our animators and start moving pieces around in real time. We also visualized the Filly set in this way, even though it was never intended for the volume, to help plan some of the more logistically complex shoot days.

We worked with episode directors Wayne Yip, Clare Kilner, and Frederick E. O. Toye to block out segments of the New California Republic base at Griffith Observatory and the Vault 4 dining hall and get a spatial context for what they would be shooting. Maximizing the effectiveness of the LED volume meant visualizing scenes you wouldn’t typically see in a previs pass, like dialog scenes, because it helped ensure we brought the fantastical elements of the story into frame. Lucy and Maximus aren’t just eating at a picnic table; they’re inside a massive underground nuclear-powered vault, and the imagery should support that narrative conceit.

During production days in the volume, our on-set producers Billy Bonifeld and Devon Hubner made sure we worked closely with every team on set. As we were feeding DMX lighting out of Unreal to the 150+ ARRI SkyPanels on stage, our creative technologist Addison Herr built and integrated that lighting output with the dimmer board operator on the gaffer’s team. 
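
To make the DMX handoff concrete, here is a small standalone sketch of sending one universe of lighting levels over Art-Net, one of the networked DMX protocols Unreal’s DMX tooling can output. The channel mapping and target IP are hypothetical; this is an illustration, not the on-set configuration.

```python
# Minimal sketch: pushing one DMX universe of lighting levels over Art-Net (UDP),
# the kind of data a real-time engine or console emits to networked fixtures.
# Fixture addressing and the target IP are hypothetical.
import socket
import struct

ARTNET_PORT = 6454

def artdmx_packet(universe: int, channels: bytes, sequence: int = 0) -> bytes:
    """Build an ArtDMX packet carrying up to 512 channel levels."""
    header = b"Art-Net\x00"
    header += struct.pack("<H", 0x5000)         # OpDmx opcode, little-endian
    header += struct.pack(">H", 14)             # protocol version 14
    header += bytes([sequence, 0])              # sequence, physical port
    header += struct.pack("<H", universe)       # 15-bit port-address, little-endian
    header += struct.pack(">H", len(channels))  # data length, big-endian
    return header + channels

# Hypothetical: first fixture at channel 1, intensity full, CCT channel at mid.
levels = bytearray(512)
levels[0] = 255   # channel 1: intensity
levels[1] = 128   # channel 2: colour temperature (fixture-mode dependent)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(artdmx_packet(universe=0, channels=bytes(levels)), ("10.0.0.50", ARTNET_PORT))
```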

We had fully articulated LED wild walls to fill in any gaps in the volume when shooting at extreme angles. Fabian Flores, our Virtual Production Mechatronics Engineer, and Julio Salcedo, our Virtual Production Hardware Engineer, worked with the grip department to dial in how and where to roll that equipment and keep everyone safe. We also worked with the construction teams to ensure the plexiglass used in the Vertibird cockpit wouldn’t create any distortion or artifacting when shot in front of emissive content.

We worked extremely closely with the VFX, Art, and Lighting departments to ensure our environments were of the highest possible fidelity. After reviewing content on the LED wall with our on-set VAD artists (Sarah Hudson Semple, Sidney Olluyn, Tony Kwok, Kellie Brewart, Liesbet Segaert, Hugh D. McCullom, and Lisa Barber), Virtual Production Producers Gabriel Blake and Andrés Martinez coordinated content reviews with the department heads and leads from Art, VFX, and Lighting, keeping everyone on the same page about the progress of each environment and where assistance was needed to reach that fidelity.
Each team contributed an enormous amount of support for the final images that appeared on the wall, with VFX sharing lidar and scan data of set pieces, environments, and assets. Ann Bartek, the dedicated Art Director for all in-volume sets, provided the exact materials, colors, and textures of the physical set builds so the virtual set builds could match.

On shoot days, our virtual production supervisors coordinated with the episode directors to prepare the LED wall for shooting in a given direction. Once cameras were set and tracked, Unreal operators helped determine which camera’s perspective would take priority when rendering on the wall, and worked with the camera department to find perspectives that minimized frustum overlap and maximized tracking fidelity. 

For example, we collaborated with the Steadicam operator to find a mounting position for the tracking device so it didn’t interfere with the shot, while making sure the data coming through was clean and accurate. Berto Mora and the great Unreal team at All Of It Now were embedded with the camera and grip departments to troubleshoot these issues as they arose.
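
The overlap problem is easy to reason about in the flat-wall case: project each tracked camera’s horizontal field of view onto the wall and check how much the resulting spans intersect. The toy sketch below does exactly that with hypothetical camera positions; the real setup deals with a curved wall and full 3D frustums.

```python
# Minimal sketch: estimating where each tracked camera's inner frustum lands on a
# flat wall section, to spot overlap between two cameras. Geometry is simplified
# to the horizontal plane; all values are hypothetical.
import math

WALL_Y_CM = 0.0  # the wall lies along the x-axis at y = 0 in this toy setup

def frustum_span_on_wall(cam_x, cam_y, yaw_deg, hfov_deg):
    """Return (x_min, x_max) where the camera's horizontal FOV hits the wall."""
    edges = []
    for edge in (-hfov_deg / 2.0, hfov_deg / 2.0):
        angle = math.radians(yaw_deg + edge)
        dy = WALL_Y_CM - cam_y
        t = dy / math.cos(angle)          # assumes the ray actually reaches the wall
        edges.append(cam_x + t * math.sin(angle))
    return min(edges), max(edges)

a = frustum_span_on_wall(cam_x=0,   cam_y=-600, yaw_deg=0,  hfov_deg=40)
b = frustum_span_on_wall(cam_x=300, cam_y=-600, yaw_deg=10, hfov_deg=40)

overlap = max(0.0, min(a[1], b[1]) - max(a[0], b[0]))
print(f"A: {a[0]:.0f}..{a[1]:.0f} cm, B: {b[0]:.0f}..{b[1]:.0f} cm, overlap: {overlap:.0f} cm")
```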

Our virtual gaffer, Devon Mathis, also worked with the dedicated volume DP, Bruce McCleery, as well as the episodic DPs to adjust the nit values, white balance, and color values of the LEDs to balance with the rest of the scene on a shot-by-shot basis. This built on work done in the pre-production stage, where our lighting team used Unreal to visualize what each lighting setup would look like in each environment and planned for live lighting transitions.

What was the benefit of having a dedicated volume DP to work in conjunction with each episode’s DP?

Bruce McCleery was our dedicated volume DP for the ICVFX work on the Vault 4, Griffith Observatory, and Vertibird sets. Having such a talented and dedicated DP to help dial the content before and after pre-light days, work with our Virtual Gaffer Devon Mathis, and drive the production was a profoundly positive experience that directly translated into beautiful results. We recommend a dedicated volume DP for every large ICVFX production moving forward, particularly episodic work that has faster schedules and sometimes more variety in crew and direction.
We were also very fortunate to have the incredible support of the amazing producer Margot Lulick running the second half of the episodes. She made sure that proper attention was given to the LED portions of the shoot. ICVFX was new territory for the majority of the production team, but Margot and her team spent time learning about the process to ensure that we were leveraging the best parts of the volume creatively and shooting efficiently within it.

Describe how the LED volume plays a unique role in this show, both as a VFX tool and as part of the storyline.

The story lent itself to the technology of the LED volume: in the series, a Telesonic projector simulates a Nebraska cornfield landscape within the Vault, so the LED volume is used both as a technical tool and as a story point. This allowed Jonathan Nolan and the showrunners to have some fun with the idea of playing with the time of day mid-shot.
During the wedding scene in the cornfield, the Overseer has a line that indicates the change from dusk to nighttime, animating a moonrise mid-shot. It also allowed us to play with beautiful FX elements on the wall during the battle between Vault 33 and the surface dwellers. When the projectors are hit by gunfire, our Houdini team, Justin Dykhouse and Daniel Naulin, built out a film-burn effect that eventually took over the entire screen. 

We captured references of film burning, researched the film stocks of the 1950s and their distinctive burning patterns, and created a captivating effect that gradually spread to all three walls of the vault. In building this effect, we realized the needs of an on-set environment were similar to those of a video game effect: we wanted the transition from one state to another to happen at a predetermined-but-creatively-flexible time. 

This way, the production was not backed into the rigid timing of a linear FX element; rather, it could react to a cue based on the actor’s performance, a unique opportunity afforded to us by working in Unreal.
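
In spirit, that is a cue-driven state machine rather than a fixed-length clip. The standalone sketch below shows the pattern with illustrative state names and timings; it is not the production setup, just the idea of idling until an operator cue and then letting the effect run.

```python
# Minimal sketch: a cue-driven effect state machine, as opposed to a fixed-length
# linear element. The effect idles until an operator cue, then advances on its own.
import time

class CueDrivenEffect:
    def __init__(self):
        self.state = "idle"
        self.cue_time = None

    def trigger(self):
        """Called on the operator's cue (e.g. when the performance reaches the beat)."""
        if self.state == "idle":
            self.state = "burning"
            self.cue_time = time.monotonic()

    def update(self):
        """Advance the effect; called every frame once the show is running."""
        if self.state == "burning" and time.monotonic() - self.cue_time > 4.0:
            self.state = "spread_to_all_walls"
        return self.state

effect = CueDrivenEffect()
effect.update()        # stays "idle" for as long as the take needs
effect.trigger()       # operator cue, timed to the actors
print(effect.update()) # the effect now plays out from the cue, not from a fixed timecode
```
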
Why was working in real-time 3D essential for your work on this production?

Unreal offered flexibility, allowing the team to creatively modify the set during the pre-production phase. Having worked with this production team before, we knew we needed to be flexible on set and be able to make a change to the lighting or set design at the last minute if required. 

In fact, while we were shooting the Vault Door set, the filmmakers brought our team the idea of having the lights turn on mid-shot, as Lucy exited the elevator, to ‘reveal the environment’. Our VAD Lead Devon Mathis and our Lookdev/Lighting Supervisor Jeremy Vickery quickly came up with a plan, built that solution into a new level before lunch, and loaded the level onto the set so it could be used to shoot the opening shot of the scene, complete with a dynamic lighting change mid-shot. That never would’ve been possible with pre-rendered content.
Similarly, on the last day of shooting, production approached us with an opportunity to shoot the post-credits teaser of the finale episode. We quickly created a simple level using the environment light mixer to light and shoot Overseer Hank in the Power Armor suit as he wanders the desert. Due to the flexibility of the system, we were able to build the scene and load it up onto the wall in under ten minutes.
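
The Environment Light Mixer assembles that kind of setup through the editor UI. Purely as an illustration of how few pieces such a bare-bones exterior level needs, here is a rough Unreal Editor Python sketch that spawns the equivalent actors; it is an assumption-laden example rather than the workflow used on set, and exact APIs and defaults vary by engine version.

```python
# Rough sketch (Unreal Editor Python): spawning the handful of actors a minimal
# exterior daylight setup needs. On set this was driven through the Environment
# Light Mixer UI; this is only an illustration and details vary by engine version.
import unreal

def spawn(actor_class):
    # Spawn an actor of the given class at the origin of the open level.
    return unreal.EditorLevelLibrary.spawn_actor_from_class(
        actor_class, unreal.Vector(0, 0, 0), unreal.Rotator(0, 0, 0)
    )

sun = spawn(unreal.DirectionalLight)       # key light / sun
sky_light = spawn(unreal.SkyLight)         # ambient fill from the sky
atmosphere = spawn(unreal.SkyAtmosphere)   # physically based sky dome
fog = spawn(unreal.ExponentialHeightFog)   # atmospheric depth for the desert

# Angle the sun low for a late-afternoon desert look (values illustrative).
sun.set_actor_rotation(unreal.Rotator(roll=0.0, pitch=-35.0, yaw=45.0), teleport_physics=False)
```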

What role did Unreal Engine play in helping to plan and schedule shoot days?

Before a shoot, we conducted as many prelight days as the schedule permitted. On those days, we dialed in the output of the LED wall against the set lighting and shot the result using the camera body and film stock of the show. We’d bracket these tests with the camera’s exposure, as well as with Unreal’s manual exposure tools. 

These tests let us measure the point at which our digital reference camera fell out of step with our film stock, and what adjustments we could make to keep everything looking great in a variety of lighting scenarios. For work like this, using Lumen in Unreal made the virtual lighting just as adjustable as the physical lighting on set. What would normally require multiple levels with different light bakes could instead be done in real time.
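
As a trivial illustration of how such brackets can be logged, the sketch below expresses each test level in photographic stops relative to a baseline, which is the scale on which a digital reference and a film stock are typically compared. The nit values are made up, not show settings.

```python
# Minimal sketch: expressing bracketed wall brightness levels in photographic stops
# relative to a baseline, to log where two capture paths start to diverge.
import math

baseline_nits = 100.0
bracket_nits = [25.0, 50.0, 100.0, 200.0, 400.0]  # hypothetical test levels

for nits in bracket_nits:
    stops = math.log2(nits / baseline_nits)   # each stop doubles or halves luminance
    print(f"{nits:6.1f} nits -> {stops:+.1f} stops from baseline")
```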

Furthermore, having real-time sets provided ample opportunities for the filmmakers to make creative decisions on framing, lighting, and action blocking very early in development. The real-time sets even helped the ADs schedule shoot days by visualizing how large pieces of equipment would enter the set and where they could be staged.

The Vault 33 farm set in the cornfield was designed to be symmetrical so that when we needed to ‘turn around’ to shoot the other direction, only a few set pieces had to move. We virtually rotated the environment 180 degrees to shoot back into the vault itself. During the entire LED wall production phase, we had 99% uptime, suffering only one nine-minute delay. This gave the showrunners and ADs confidence to schedule the volume shots aggressively, knowing that the Magnopus, Fuse, and All Of It Now (all Unreal Engine Service Partner companies) crews could handle it.

How big was the team at Magnopus that worked on this project and how many artists were working in Unreal Engine?

The Magnopus crew encompassed 37 people during the full season’s production. Nineteen artists built the virtual sets, and seven operators ran the Unreal nDisplay system on the LED wall, integrated camera tracking, and programmed DMX data for the lighting systems. Software engineers, creative direction, and production oversight made up the rest of the team, all of whom worked in Unreal.
Were there any specific Unreal Engine features that were particularly essential in bringing this show to life?

Lumen was the most critical piece of Unreal in making Fallout come together. We experienced a significant drop in on-set downtime after moving away from baked lighting, and we could iterate on lighting changes in real time with the DP, which compounded the effectiveness of our creative reviews.

