Image courtesy of Treehouse Digital LTD

MetaHumans star in Treehouse Digital’s animated short horror film The Well

What’s in a name? Treehouse Digital’s choice of moniker is not arbitrary; based near the seaside town of Bournemouth on the UK’s south coast, the company is made up of six friends who spent their childhoods playing in a treehouse together, pretending to be their favorite film and TV characters.

Clearly, it was a fertile environment for the cohort’s creative development. They stayed in touch into adulthood, and in 2017, they founded Treehouse Digital, cutting their teeth on traditional live-action filmmaking with Treaters, the first in a planned series of short horror films they called ScaryTales.

Soon, the friends found themselves drawn to the emerging world of virtual production. They ran their first in-camera VFX test in 2019, and the company has since transitioned into offering full virtual production services to the film and television industries, while still continuing to produce original content.

In October 2021, they released The Well, a short horror film created entirely in Unreal Engine—the company’s first fully CG production.

From live action to full-CG animation

Treehouse Digital’s expansion into creating animated content is supported by the evolution of Unreal Engine’s filmmaking toolset, as The Well’s Co-Writer and Director, Peter Stanley-Ward, explains. 

“We have always been heavily influenced by animation,” he says. “In live action, we have developed a signature look with carefully chosen aspects of filmmaking like art direction, locations, and photographic choices. We’ve always dreamed of finding a meeting point, where both worlds could come together—this is why we’ve had one eye on what Unreal has been developing. We knew this would be where these different worlds would collide.”

In fact, The Well had started as an idea for a live-action short. “We love the spooky stories that kids tell each other,” says Writer Natalie Conway. “The campfire scenario is a tradition that goes back to when stories first began, and it’s not going anywhere any time soon. We’ve adopted the principle and added some visuals to our tales. Treaters was live action, but we wanted complete control over the look and style, so we shot the entire movie in a studio with digital backgrounds. It was a fun exercise, and was very much a first incarnation of doing a visually stylized short. The Well was the obvious choice for our next ScaryTale.”

The film’s Producer, Chris Musselwhite, takes up the story. “As we dug deeper, it became more and more apparent that the only way to make this was entirely in engine,” he says. “It just lent itself to the technology. It was a real win for us that we actually found something that deserved this technology to bring it to life, rather than trying to fit a narrative to a piece of tech.”

For Virtual Production Supervisor Paul Hamblin, the decision to go with Unreal Engine was an easy one. “Unreal was so user-friendly for our small team, that it was an absolute no-brainer for our choice as a main platform for our first venture into full-CG content,” he says. “The accessibility of the filmmaking tools in Unreal was the first thing that drew us to it. 

“We also love how much consideration has been given to filmmakers in the way it works—for example, once we got our head around Sequencer and how powerful it was, we were hooked,” he continues, referring to the engine’s built-in cinematic non-linear editor.

Nailing the narrative

Once the team was happy with the script, they began putting together some mood boards for location and clothing, bringing in concept artist Steve Trumble to flesh out the look of the environment and the well itself. They also cast the actors for the four characters in the piece, using performers they’d worked with before on live-action projects. 
At the same time, they began turning the script into an animatic, a crudely animated first pass of the movie.

“It really gave us an idea that the story was working visually,” says Peter. “We even made some changes to the script in that process, because we saw that there were certain areas which weren't working as well as others. So it's almost like another draft of the screenplay when you do that.”

Enter MetaHuman Creator

It was at this point that the free MetaHuman Creator Early Access program was launched, enabling anyone to quickly and easily make fully rigged photorealistic digital humans. Until then, the team had been exploring a few different existing character creation options.

“We got some nice results, and at that time we were going more stylized,” says Tom Saville, the film’s Animation Director / Editor. “But when MetaHumans landed, we couldn’t ignore how incredible they looked—not to mention that they were built for Unreal. We suddenly got confident that we could go more photoreal, something we never dreamed we'd be able to do so quickly.” 
“It was a game changer for us,” agrees Peter. “It couldn't have come at a better time, because we were just at this point where we were figuring out how we were going to bring these characters to life. How are they going to move? How are we going to animate them? And when MetaHumans landed, it was just a no-brainer that this is the way to go.”

MetaHuman Creator enables you to create new characters by sculpting and blending between a diverse series of presets, derived from high-quality scanned data. With its highly intuitive interface, anyone can produce a custom physically plausible character in a matter of minutes, and completely refine it to their requirements in a couple of hours. 

“The ability to see a character come to life so rapidly made the process almost like casting an actor,” says Tom. “Coming from primarily a live-action background, it felt like such a familiar process to us.” 

Refining the characters

After the four characters had been crafted to the team’s satisfaction in MetaHuman Creator, they were exported to Maya so that custom clothing and hair could be created for them.

The team designed and stitched the clothing in Marvelous Designer, simulating it over the top of the characters to achieve natural-looking draping, before sculpting fine details such as refined folds and shoulder pads in ZBrush. Next, the clothing had to be bound to the MetaHuman skeleton in Maya.
“We selected the specific joints that made sense because they were close to the clothing,” says CG Lead Doriana Re. “Then we copied the skin weights of the MetaHuman bodies onto the clothing, and did some tweaking with the weight paints.”
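The principle behind that weight transfer is to give each clothing vertex the skin weights of the nearest body vertex, so the garment deforms with the same joints as the body beneath it. A minimal plain-Python sketch of that idea follows; all geometry and joint data here are made up for illustration, and in a Maya pipeline this step is handled by the built-in copy-skin-weights tooling rather than hand-rolled code:

```python
# Toy illustration of closest-point skin-weight transfer: each clothing
# vertex inherits the per-joint weights of the nearest body vertex.
# Vertices are (x, y, z) tuples; weights map joint name -> influence.

def transfer_weights(body_verts, body_weights, cloth_verts):
    """Copy the weights of the closest body vertex onto each cloth vertex."""
    def dist2(a, b):
        # Squared distance is enough for a nearest-neighbor comparison
        return sum((x - y) ** 2 for x, y in zip(a, b))

    cloth_weights = []
    for cv in cloth_verts:
        nearest = min(range(len(body_verts)), key=lambda i: dist2(body_verts[i], cv))
        cloth_weights.append(dict(body_weights[nearest]))  # copy per-joint weights
    return cloth_weights

# Two body vertices, skinned to a shoulder and an elbow joint
body_verts = [(0.0, 1.0, 0.0), (0.0, 0.0, 0.0)]
body_weights = [{"shoulder": 0.8, "elbow": 0.2},
                {"shoulder": 0.1, "elbow": 0.9}]

# A clothing vertex sitting just outside the first body vertex
cloth_verts = [(0.05, 0.95, 0.0)]

print(transfer_weights(body_verts, body_weights, cloth_verts))
# -> [{'shoulder': 0.8, 'elbow': 0.2}]
```

The "tweaking with the weight paints" the team describes is the manual pass that follows this automatic copy, smoothing out areas where the closest body vertex is not actually the right influence.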

The clothing was then imported into Unreal Engine, where cloth simulation was added as needed using the engine’s built-in physics system.

When it came to texturing the clothes—together with hero props like the well itself, and also makeup and skin details—they chose Substance Painter, even using it to add dirt under Trish’s fingernails. Meanwhile, grooms for the characters’ hair, and even Trish’s woolen hat, were created in XGen, a plugin included with Maya.

Iterating in a real-time pipeline

Quite late in the process, after the characters had been completed, imported into Unreal Engine, and lit, Peter felt that there was something missing with Trish, the lead character. They decided to see how she looked with face paint, which they applied in Substance Painter.
“That face paint just made Trish, Trish,” says Peter. “So that's a great example of how you're able to constantly evolve because of that real-time pipeline. You're constantly able to make decisions creatively as you go, and the characters evolve and the story evolves.”
Chris elaborates. “The real-time process is very much in sync with our creative style,” he says. “We’re able to make proper creative storytelling decisions based on what the final picture will look like—and we can apply our live-action decision-making in a fully virtual world. 

“A good example of this is that we effectively rewrote the opening 30 seconds of the film, 24 hours before release. We had an opening in place already, but seen in context of the full film, we thought we could do better as storytellers…so we jumped back into Sequencer. It was such a quick process, and at that point we were so confident with it that it felt very much like grabbing quick takes with a physical camera. If it wasn’t for real-time, we wouldn't have attempted it.” 

Capturing and fine-tuning the motion

With the actors cast and the characters ready for animation, it was time to schedule the motion capture shoot. Xsens suits were used to capture the body performance, and Faceware for capturing the face. “All the suit data was recorded and backed up in their respective systems, but we also had all the data streaming live into Unreal so we could see a really good representation of how the actors were coming across in their characters,” explains Chris.

Using a combination of the live preview and the on-set witness cameras, the team then made their performance selects and began filtering them into their edit. They then refined the finger animation for those selects by using a pair of StretchSense gloves to record intricate movements, which were applied over the original animation in Sequencer.

After processing through the Xsens application, the team found that the motion capture data was relatively accurate and looked good straight from the capture volume, with just a few tweaks required to a limb or body position. It was the same case with the facial performance data.
“On the whole, it was an excellent base, but we needed a little extra push with keyframing to really nail the lip movements and expression,” says Chris, explaining that they used Unreal Engine’s Facial Control Board to achieve this. “Once we baked the take to the Control Board, we were able to go in and not just clean up the data, but really work on bringing out those subtleties for the actors’ performances.”

“It’s very liberating to be able to clean up and adjust animation in the engine, rather than in an external program,” says Tom. “We were constantly making micro-adjustments to everything to improve the shot. For example, re-targeting the eyeline was key to getting things to look natural.”

Adding the finishing touches

While the final performances and camera angles were being built, so were the environments. There were three main locations: the opening ‘trainline’, the above-ground well, and down the well. 
To create the set and environments, the team used kit-bashing, employing various Quixel Megascans—which are free for use with Unreal Engine—and Marketplace assets to lay down trees, roots, and other objects. For specific elements such as hero trees and roots, they either designed and modeled them from scratch, or manipulated other assets.  

For the water in the well, they used Waterline PRO, dropping a sphere in just off camera to create bespoke waves. The creepy, watery atmosphere was enhanced by a variety of modified lights providing fake caustic lighting on the walls and the main character.
Next, the team turned to Unreal Engine’s Volumetric Clouds and Sky Atmosphere to control the mood in each location. “This allowed us to really dial in the tone of the piece by controlling exactly the density of the clouds, how high you wanted them to be, the movement of them,” says Paul. “It gave more of an organic feel to filming because they were acting like real clouds, rather than being a skydome or a matte painting.”
Completing the picture, Unreal Engine’s built-in particle VFX system, Niagara, was used to add falling leaves and dust particles.

Final rendering in Unreal Engine

With everything ready to go, the speed of working in Unreal Engine meant the team could treat the final rendering process a bit like a shoot day.

“We’d record out a take of a shot, then we’d take a look at it in Adobe Premiere, then we’d see if we want to adjust any lights or improve the angle, then we’d fire out another version,” says Chris. “Once we were happy, we’d move on. It’s very much like how a live-action set works: you do the blocking, then check it out on the monitor, tweak the lighting, get the actors in and get the shot.”
The last stage was to finalize the look of the film in a color-grading session, using Lumetri in Adobe Premiere to add film grain, edge blurs, and natural bloom, plus a few post-process effects with Red Giant, such as lens distortion and a little diffusion. “It really cemented our final look, and because we were rendering lossless image sequences from Unreal, we had the latitude in the footage,” says Tom. 

What’s next for Treehouse Digital?

With their first fully CG production under their belts, the team at Treehouse is not planning on putting their feet up. They’re already in pre-production on the next ScaryTale, due to release Halloween 2022, and are actively looking for more projects.

“It’s obvious that this technology is opening up a whole new way of visual storytelling, a future where a small group of like-minded artists can come together and create beautiful, authentic stories that will resonate with audiences all over the world,” says Peter. “For us, we don’t want to turn back. We see so many new and exciting ways to bring worlds and characters to life with this technology—and it’s a methodology that really fits us at Treehouse.

“For us, The Well was a project unlike any other—exciting, fun, liberating, and challenging in all the right ways. It was a baptism of fire, but once Halloween was over, we couldn’t wait to dive back into Unreal and make more.” 
