
The Jim Henson Company’s Earth to Ned uses real-time digital puppetry for final-pixel TV

Rob DiFiglia
The Jim Henson Company has been delivering performance-driven family entertainment for over 60 years, and winning dozens of awards—over 50 Emmys and nine Grammys—in the process. Best known as the original creators of the Muppets, the company is widely recognized as an innovator, employing industry-leading tools and techniques ranging from traditional hand puppetry to animatronics and real-time digital animation. Among its huge portfolio of credits are such memorable productions as Fraggle Rock, The Dark Crystal, Labyrinth, and Farscape.

The company’s Chairman of the Board Brian Henson, son of its founder Jim, is keen to continue this culture of innovation. Case in point: he’s been embracing virtual production in one form or another for decades. Today, he’s breaking new ground with episodic show Earth to Ned, a format he describes as “alien science fiction meets American talk show.” 

Earth to Ned, which airs on Disney+, combines live action, animatronics, and real-time on-set performance-driven CG animation. The premise of the show is that aliens who were sent to Earth for an invasion become obsessed with pop culture and decide to create a talk show where they interview human celebrities.

The real-life celebrity guests are hosted by animatronic aliens Ned and Cornelius, along with a CG artificial intelligence named BETI, a particle-based entity rendered in real time on set in Unreal Engine.
The idea of performance-driven CG characters originally came from Jim Henson himself, but it took hold in 1994, as the company was nearing the end of production on Dinosaurs, a satirical family sitcom series. That show had been created using huge animatronic suits with dozens of motors—hot and uncomfortable for the performers, and prone to breaking down.

“We started thinking, if only we could do CG animation where we still had performers performing the bodies of the dinosaurs, but they didn't have to be in all these big skinned-up suits,” says Henson. “And if only we could perform the heads without 45 motors that could fail, but still capture that same energy.” 
To facilitate this then-groundbreaking concept, they needed to develop a system that would give the performers real-time feedback.

“We're not the types that would go out into a motion capture environment and pretend you're a dinosaur and then wait a week to see what you look like as a dinosaur—that would never work,” explains Henson. “So we need to be able to see screens all around us that show us instantly what we look like as a dinosaur.”

While the proprietary system they created has stood the test of time—they're still using it—it was never able to deliver sufficient visual quality to use its output directly; everything still needed to be rendered offline.

“It's built for previs,” explains Dan Ortega, who holds the title of Creative Supervisor (Digital) at the company. “It doesn't really have the ability to do any real-time effects or even high-res images. And that's on purpose. We built it with tools that would allow us to capture data and take it into a post pipeline.”

For Earth to Ned, the plan was to render BETI live on set at final broadcast quality, meaning the existing system would not work. Not only that, but they would need to handle some pretty special requirements for BETI’s character. Henson wanted her to appear to be made out of electricity and to be able to morph between a humanoid look and an alien face.
The person on the receiving end of these challenging requests was Ortega. “He [Henson] described it like a ball of energy, that's similar to a Tesla coil, but not really because it had to be made out of particles,” he says. “It had to have lightning bolts that were connecting it to a room. He also needed that to morph into a talking face.”

Ortega knew right away he would need to use Unreal Engine to achieve the real-time effects.

“Unreal allowed us to use our award-winning puppet controls to interact in a virtual world that's in stunning quality, in a high frame rate, without any latency,” he says. “Unreal gave us all those complex materials, the particles, all the effects that we needed to perform and animate in real time with that engine.”
To make BETI appear physically present on set, the plan was to create virtual 'rooms' for her to float in, displayed on screens inserted into the set. By tracking all the cameras in real time, it was possible to generate the correct parallax and create the illusion of real volumes behind the screens.
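The math behind that illusion is well established. Given the tracked camera position and the fixed corners of a screen, an off-axis (asymmetric) projection frustum can be recomputed every frame, a method often called generalized perspective projection. The sketch below illustrates the idea in plain C++; it is a generic illustration of the technique, not Henson's production code.

```cpp
// Generic sketch of "generalized perspective projection" (after Kooima),
// the standard math for camera-tracked screens. Illustrative only; this
// is not The Jim Henson Company's production code.
#include <cmath>

struct Vec3 {
    double x, y, z;
    Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
};

static double Dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3 Cross(const Vec3& a, const Vec3& b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}
static Vec3 Normalize(const Vec3& v) {
    const double len = std::sqrt(Dot(v, v));
    return {v.x / len, v.y / len, v.z / len};
}

// Computes off-axis frustum extents for a physical screen defined by three
// corners (lower-left pa, lower-right pb, upper-left pc) and the tracked
// camera position pe. Recomputing these every frame and feeding them into
// an asymmetric projection is what produces correct parallax, so the flat
// screen reads as a volume behind the set wall.
void OffAxisFrustum(const Vec3& pa, const Vec3& pb, const Vec3& pc,
                    const Vec3& pe, double nearClip,
                    double& left, double& right, double& bottom, double& top)
{
    const Vec3 vr = Normalize(pb - pa);       // screen-space right axis
    const Vec3 vu = Normalize(pc - pa);       // screen-space up axis
    const Vec3 vn = Normalize(Cross(vr, vu)); // screen normal, toward the viewer

    const Vec3 va = pa - pe;                  // camera to each screen corner
    const Vec3 vb = pb - pe;
    const Vec3 vc = pc - pe;

    const double d = -Dot(va, vn);            // camera-to-screen-plane distance
    left   = Dot(vr, va) * nearClip / d;
    right  = Dot(vr, vb) * nearClip / d;
    bottom = Dot(vu, va) * nearClip / d;
    top    = Dot(vu, vc) * nearClip / d;
}
```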

“The other challenge was that Brian wanted the face to be teleported between three different rooms on set and it all needed to happen in real time on camera with no post,” says Ortega. “In my head, when we were done with that meeting, I'm thinking, ‘Well, that's something we've never done before. I have never seen anything like that.’ ”

With Unreal Engine selected as the rendering engine, Ortega needed to integrate it into their existing pipeline, which includes Autodesk Maya—where the characters are rigged—and the Henson Digital Puppetry Studio, their proprietary suite of hardware and software tools comprising a performance capture system together with mechanical puppetry control devices.

To achieve this, Ortega's team created the Henson Nodeflow engine, a proprietary standalone real-time animation engine that acts as a gateway between the systems.

The Maya scenes are exported to the Nodeflow graph, which mimics the Maya Dependency Graph and runs the imported character by emulating Maya node functionality, including joints, constraints, curves, blend shapes, and more complex behaviors such as Set Driven Keys. Nodeflow then passes this data to Unreal Engine. In BETI's case, 4,000 nodes were evaluated and sent to Unreal Engine in 1.5 ms per frame.
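The article doesn't detail Nodeflow's internals, but a Maya-style dependency graph typically means pull-based evaluation with dirty-flag propagation: a changed puppeteer input dirties everything downstream, and nodes only recompute when their output is actually pulled. Here is a minimal sketch of that pattern, with every name hypothetical:

```cpp
// Minimal sketch of a Maya-style dependency graph with lazy, dirty-flag
// evaluation. Nodeflow itself is proprietary and far richer; this only
// illustrates the general pattern. All names are hypothetical.
#include <functional>
#include <utility>
#include <vector>

class Node {
public:
    explicit Node(std::function<double(const std::vector<double>&)> compute)
        : Compute(std::move(compute)) {}

    void AddInput(Node* upstream) {
        Inputs.push_back(upstream);
        upstream->Outputs.push_back(this);
    }

    // A new puppeteer input arrives: flag this node and everything
    // downstream as dirty, but do no work yet (evaluation is lazy).
    void MarkDirty() {
        if (bDirty) return;
        bDirty = true;
        for (Node* out : Outputs) out->MarkDirty();
    }

    // Pull-based evaluation: recompute only if dirty, first pulling clean
    // values from upstream. Joints, constraints, curves, blend shapes, and
    // Set Driven Keys would each be a different Compute function here.
    double Evaluate() {
        if (bDirty) {
            std::vector<double> inputValues;
            inputValues.reserve(Inputs.size());
            for (Node* in : Inputs) inputValues.push_back(in->Evaluate());
            CachedValue = Compute(inputValues);
            bDirty = false;
        }
        return CachedValue;
    }

private:
    std::function<double(const std::vector<double>&)> Compute;
    std::vector<Node*> Inputs;
    std::vector<Node*> Outputs;
    double CachedValue = 0.0;
    bool bDirty = true;
};
```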

To enable real-time skeletal animation of the character, the team created a custom streaming application to send the output from the Henson Digital Puppetry Studio to Unreal Engine via Live Link, Unreal Engine’s interface for streaming and consuming animation data from external sources.
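Live Link's push API is public, so while the Henson streaming application itself is proprietary, a source feeding it would plausibly look something like the sketch below. The subject name and bone list are invented for illustration:

```cpp
// Hypothetical sketch of pushing puppetry data into Unreal Engine through
// the public Live Link API. The Henson streaming application is
// proprietary; the subject name and bone list here are invented.
#include "ILiveLinkClient.h"
#include "LiveLinkTypes.h"
#include "Roles/LiveLinkAnimationRole.h"
#include "Roles/LiveLinkAnimationTypes.h"

// Pushed once when the source connects: the skeleton's bone names and
// parent indices (INDEX_NONE marks the root).
void PushPuppetSkeleton(ILiveLinkClient& Client, const FLiveLinkSubjectKey& Subject)
{
    FLiveLinkStaticDataStruct StaticData(FLiveLinkSkeletonStaticData::StaticStruct());
    FLiveLinkSkeletonStaticData* Skeleton = StaticData.Cast<FLiveLinkSkeletonStaticData>();
    Skeleton->SetBoneNames({ FName("root"), FName("head"), FName("jaw") });
    Skeleton->SetBoneParents({ INDEX_NONE, 0, 1 });
    Client.PushSubjectStaticData_AnyThread(
        Subject, ULiveLinkAnimationRole::StaticClass(), MoveTemp(StaticData));
}

// Pushed every capture frame: one transform per bone, as evaluated by the
// puppetry rig, plus a timestamp so Unreal can interpolate and sync.
void PushPuppetFrame(ILiveLinkClient& Client, const FLiveLinkSubjectKey& Subject,
                     const TArray<FTransform>& BoneTransforms, double WorldTime)
{
    FLiveLinkFrameDataStruct FrameData(FLiveLinkAnimationFrameData::StaticStruct());
    FLiveLinkAnimationFrameData* Anim = FrameData.Cast<FLiveLinkAnimationFrameData>();
    Anim->Transforms = BoneTransforms;
    Anim->WorldTime = WorldTime;
    Client.PushSubjectFrameData_AnyThread(Subject, MoveTemp(FrameData));
}
```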
Ortega’s team also used this same connection in conjunction with Blueprint, Unreal Engine’s visual scripting language, to enable puppeteers to trigger the real-time effects, such as the lightning, the facial morphing, and the teleportation between rooms. 

About a dozen Blueprint parameters are connected to character attributes programmed in the Henson Performance Control System. This lets the performers use the Academy Award-winning puppeteer interface to adjust things like brightness, frequency, and effect scaling via inputs such as hand controls, pedals, and sliders.
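For readers curious how such a hookup looks outside the Blueprint graph, the C++ equivalent would be to evaluate the subject through Live Link's basic role each tick and forward its custom float channels to the particle system. The subject name, channel order, and Niagara user parameters below are all assumptions for illustration:

```cpp
// Hypothetical C++ equivalent of the Blueprint hookup described above:
// evaluate a Live Link subject's custom float channels each tick and
// forward them to the particle system. Subject and parameter names are
// invented; exact Niagara setter names can vary between engine versions.
#include "Features/IModularFeatures.h"
#include "ILiveLinkClient.h"
#include "LiveLinkTypes.h"
#include "NiagaraComponent.h"
#include "Roles/LiveLinkBasicRole.h"

void UpdateBetiEffects(UNiagaraComponent* BetiParticles)
{
    IModularFeatures& Features = IModularFeatures::Get();
    if (!BetiParticles ||
        !Features.IsModularFeatureAvailable(ILiveLinkClient::ModularFeatureName))
    {
        return;
    }

    ILiveLinkClient& Client =
        Features.GetModularFeature<ILiveLinkClient>(ILiveLinkClient::ModularFeatureName);

    // Pull the latest evaluated frame for the control subject.
    FLiveLinkSubjectFrameData Frame;
    if (!Client.EvaluateFrame_AnyThread(FName("BETI_Controls"),
                                        ULiveLinkBasicRole::StaticClass(), Frame))
    {
        return;
    }

    const FLiveLinkBaseFrameData* Data = Frame.FrameData.Cast<FLiveLinkBaseFrameData>();
    if (!Data || Data->PropertyValues.Num() < 3)
    {
        return;
    }

    // Slider/pedal channels from the puppeteer rig, mapped by convention:
    // [0] brightness, [1] lightning frequency, [2] overall effect scale.
    BetiParticles->SetVariableFloat(FName("User.Brightness"), Data->PropertyValues[0]);
    BetiParticles->SetVariableFloat(FName("User.LightningFrequency"), Data->PropertyValues[1]);
    BetiParticles->SetVariableFloat(FName("User.EffectScale"), Data->PropertyValues[2]);
}
```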

As a result of all this effort, BETI is able to interact with the guests, something the celebrities love.

“The guests were really excited when they came on and realized, ‘Oh, she's really going to be there,’ because I think people assumed it was going to be a post effect,” says Henson. “That illusion was complete for the guests who were there. They couldn't see any puppeteers. They just saw Ned and they saw Cornelius and they saw BETI. And it's fabulous when that happens.”
The success of using real-time final-pixel rendering on Earth to Ned has left the team hungry to see how far they can push the concept, and further tests are already underway.
“If we can start delivering our full CG-animated TV shows with a higher and higher and higher percentage of the shots requiring no post-production pipeline, then obviously the efficiency of the production skyrockets,” says Henson. “And with Unreal, we can get a much higher-quality image, a finished image. We can get to a finished image in real time. And we couldn't do that without Unreal. We tried. We tried for a long time.

“By teaming up with Unreal, on a live-action set we can produce finished CG animation that can then go to editorial and be completed in the same way that you make a live-action production,” says Henson. “And that's very exciting.”
