Universal Monsters © 2021 Universal Studios. All Rights Reserved.

Universal Monsters are back: How Plastic Wax made a Monsters-themed Fortnite mini-series

Many of the horror genre's most well-known conventions—the creaking staircase, the cobwebs, the swirling mist, and the crowds chasing creatures with torches—originated from one place: Universal Pictures. 

From Dracula to Frankenstein, the studio’s classic films gave birth to Monsters that continue to shape pop culture almost a century later. Now, they’re being introduced to a whole new generation through Fortnite’s virtual shorts festival, Shortnitemares.

Taking place in the run-up to Halloween, the festival premiered a collection of spooky shorts that could be watched directly within Fortnite. Animation and VFX house Plastic Wax was commissioned to create one of the seven pieces in the festival: the first in a series of four episodes that would feature a compelling new take on Universal’s iconic Monsters.

The first episode in the mini-series was especially novel because it was produced with Unreal Engine and MetaHuman Creator, which helped the Plastic Wax team create final pixels in real time—with no post-production required.

Dracula vs. Van Helsing

Van Helsing and Dracula have been arch enemies since the late 1890s. What would happen if The Bride of Frankenstein joined them for a fight? Viewers get to find out in We Will Be Monsters: Episode 1, which depicts their latest epic battle. The project puts viewers directly into the action, going from a crowded Hong Kong street filled with 3D cars, shops, and neon lights to a luxurious temple with none other than Dracula waiting inside. It’s here that Van Helsing almost defeats Dracula in a fast-paced sword fight—before The Bride of Frankenstein changes the course of the story in a last-minute twist.

“We aimed to create realistic characters, complemented with stylized, punched-up animation that was part Shaw Brothers kung fu movie, part heist film,” says Nathan Maddams, Founder and Creative Director of Plastic Wax.

Working with Director Rick Famuyiwa (The Mandalorian, Dope); Art Director & Production Designer Tyrone Maddams; and Universal Monsters concept artist and legendary creature designer Crash McCreery (Jurassic World, Terminator 2: Judgment Day, Edward Scissorhands), Plastic Wax turned to MetaHuman Creator for the character builds, using only actor reference photos as their guide.

Final facial details like each hero character’s nose, jaw, and brows were then sculpted in ZBrush, before the models were imported into Substance Painter and Marvelous Designer to refine hair, clothing, and skin. All characters were then reimported into Unreal Engine to generate turntables and facial range tests for final approval.

“MetaHuman Creator saved us a ton of time throughout the entire character creation process,” Maddams reveals. He explains that before, the Plastic Wax team would have to create up to 300 scans of an actor’s head to capture every expression, retopologize the resulting model, then sculpt and clean all final details into a high-resolution asset before rigging and animation could begin. “With MetaHuman Creator, we bypassed that whole scanning process,” he adds. “We had rigged, screen-ready characters straight out of the box that we could then reuse for our other episodes.”

Reinventing The Bride of Frankenstein

The pipeline’s increased efficiency meant the Plastic Wax team could iterate on character and costume design while the scripts were still being developed, then start testing animation and on-set lighting to see how characters would perform early in the production process. This parallel workflow was especially useful when building The Bride of Frankenstein.

The Plastic Wax team used MetaHuman Creator to add details to the Bride’s face, generating different skin patches, scars, and imperfections by combining six different Bride variants into one. “In the original film, the Bride of Frankenstein was only shown for three minutes. This was our opportunity to make her the ringleader and put her in charge,” says McCreery, explaining that working in real time enabled the team to produce a more creative final result.
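
Conceptually, combining several face variants into one can be thought of as a weighted blend of each variant's per-vertex offsets from a shared neutral base mesh. The sketch below illustrates that idea in plain Python with made-up vertex data; it is not the MetaHuman Creator API, just the underlying math.

```python
# Conceptual sketch: blend face "variants" as a weighted average of
# per-vertex offsets from a shared neutral base mesh.
# Hypothetical data layout -- not the actual MetaHuman Creator API.

def blend_variants(base, variants, weights):
    """base: list of (x, y, z) vertices; variants: list of same-shape
    vertex lists; weights: one float per variant (normalized here)."""
    total = sum(weights)
    norm = [w / total for w in weights]
    blended = []
    for i, (bx, by, bz) in enumerate(base):
        # Accumulate each variant's offset from the base, scaled by its weight.
        dx = sum(w * (v[i][0] - bx) for v, w in zip(variants, norm))
        dy = sum(w * (v[i][1] - by) for v, w in zip(variants, norm))
        dz = sum(w * (v[i][2] - bz) for v, w in zip(variants, norm))
        blended.append((bx + dx, by + dy, bz + dz))
    return blended

# Tiny example: one vertex, two variants pulling in opposite directions.
base = [(0.0, 0.0, 0.0)]
variants = [[(1.0, 0.0, 0.0)], [(-1.0, 0.0, 0.0)]]
print(blend_variants(base, variants, [3.0, 1.0]))  # -> [(0.5, 0.0, 0.0)]
```

Weighting one variant more heavily pulls the blended result toward it, which is how a mix of six sources can still read as one coherent face.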

“Real-time ramped everything up to a level that I hadn’t really experienced before. Your decision-making has to be quick and more from the gut, and I feel like you get a better result when it’s your first gut instinct that you react to,” he explains. “Using Unreal Engine, I was receiving MetaHuman models that I could work on top of. They were textured, lit, and just these beautiful base structures. Within literally a day or two, I had the final look of our design. It was just a game changer for me as a designer.”

Making Monsters with real-time animation

While the character models were being finalized, the project’s action-packed fight was created with the help of both real-time motion capture and more traditional hand keying to finesse each character’s final movements. Each performer’s body movements were first captured using Vicon Shōgun, before being retargeted onto the MetaHuman rig using MotionBuilder. The Plastic Wax team then recorded all facial animation using an iPhone 12 and Unreal Engine’s Live Link Face app, which streamed detailed movements—including realistic lip syncs—directly from the actors onto their virtual MetaHuman characters.
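
At its core, retargeting means mapping each joint of the capture skeleton onto its counterpart bone on the target rig and carrying the motion across. The sketch below shows that name-mapping step in plain Python with illustrative joint names; it is not the MotionBuilder or Unreal retargeting API.

```python
# Simplified retargeting sketch: copy per-joint rotations from a mocap
# skeleton onto a target rig via a name-mapping table.
# Joint and bone names here are illustrative, not any real rig's scheme.

JOINT_MAP = {
    "Hips": "pelvis",
    "Spine1": "spine_01",
    "LeftArm": "upperarm_l",
    "RightArm": "upperarm_r",
}

def retarget_frame(source_frame, joint_map):
    """source_frame: {joint: (rx, ry, rz)} Euler rotations in degrees.
    Returns the same rotations keyed by the target rig's bone names,
    silently skipping joints the target rig doesn't have."""
    return {
        joint_map[j]: rot
        for j, rot in source_frame.items()
        if j in joint_map
    }

mocap = {"Hips": (0.0, 90.0, 0.0), "LeftArm": (45.0, 0.0, 0.0), "Prop": (0, 0, 0)}
print(retarget_frame(mocap, JOINT_MAP))
# -> {'pelvis': (0.0, 90.0, 0.0), 'upperarm_l': (45.0, 0.0, 0.0)}
```

Real retargeting tools also compensate for differing bone lengths, rest poses, and rotation orders between the two skeletons, but the joint-mapping table is the part an artist typically configures by hand.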

“We tried to push MetaHuman Creator pretty hard to see where it would break, and were surprised to discover how fast and flexible it was,” says Maddams. “With it, we were able to test animation and lighting on near-final characters right from the start, using just the software and our iPhones. This was especially useful for our fight sequence, where we could firm up timing and shapes while still in previs, leaving us more space to refine every scowl and punch.”

When all the MetaHuman animation was complete, it was exported to Maya, where each take was refined by hand by the animation department. Once complete, the characters were brought back into Unreal Engine, where the Plastic Wax team had created real-time versions of the project’s street and temple sets that could be explored virtually using a VR headset. Cinematic lighting could then be created and tested directly on the virtual set. This was hugely important for close-ups, where even a small difference, like whether a character has their chin up or down in dramatic lighting, could change the mood of the scene and inform the cinematography.
“From start to finish, we were hugely impressed by the process of creating an animated project using real-time tools,” concludes Maddams. “We had to create everything from creatures to lavish sets and even crowds, but by using MetaHuman Creator and Unreal Engine, we have a faster path to our final pixels.”

You can watch Universal Monsters x Fortnite presents We Will Be Monsters now until October 31, 2021. And for more real-time animation stories, check out our Animation hub.

    Ready to create your own unique MetaHumans?

    Check out the MetaHuman Creator product page for details on how to sign up for the Early Access program, and to access the documentation and tutorials. Still have questions? Visit the FAQ.