April 17, 2019
Real-time mocap replaces HQ Trivia host with LEGO minifigure
“This project is all about essentially joining HQ Trivia with LEGO,” commented Aidan Sarsfield, Head of Production Technology at Animal Logic. “HQ Trivia is this amazing kind of game show where hundreds of thousands of people simultaneously log in at the same time and they play a game show. That, as it turns out, can be augmented by including LEGO minifigs.” Scott Rogowsky, who was hosting HQ Trivia in February 2019 when The LEGO Movie 2 was released, was well known to the game’s regular players. As part of his hosting duties, Rogowsky appeared on-screen during each game, energetically commenting in real time on the questions, answers, and players.
For a special episode of HQ Trivia just before the new LEGO film’s release, Animal Logic built a LEGO minifigure of Rogowsky, rigged it with repurposed assets from The LEGO Movie 2, and imported it into Unreal Engine. Rogowsky then donned a motion capture suit and, with his signature enthusiasm, drove the minifigure in real time to host the game.
The episode was a huge success, with more than 600,000 participants in the game. LEGO Scott interacted with players just as Rogowsky was known to do, while also commenting on his new plastic form and responding in real time to players’ requests for specific facial expressions and movements. “We had a lot of feedback,” commented Rogowsky. “They loved The LEGO Movie 2, they loved LEGO Scott. It was eye-opening to see what we can achieve here.”
The Rogowsky minifigure was driven by an iPhone X and an Xsens motion capture suit. Apple’s ARKit face tracking on the phone streamed Rogowsky’s facial expressions to Unreal Engine in real time, while the suit supplied his body movements.
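The article doesn’t detail how that data was wired into the engine, but conceptually the phone delivers a frame of named blendshape weights (each in the range 0 to 1) every tick, and the character rig exposes matching animation curves. The following is a minimal, engine-agnostic sketch of that data flow; `CharacterRig`, `SetCurve`, and `ApplyFaceFrame` are illustrative stand-ins rather than Animal Logic’s actual pipeline, though the blendshape names follow ARKit’s conventions.

```cpp
#include <cstdio>
#include <map>
#include <string>

// One frame of ARKit-style face tracking: each blendshape
// (e.g. "browInnerUp", "jawOpen") is a weight in [0, 1].
// The streaming transport from the phone is out of scope here.
using FaceFrame = std::map<std::string, float>;

// Hypothetical engine-side interface: the character rig exposes
// animation curves keyed by the same blendshape names.
struct CharacterRig {
    void SetCurve(const std::string& name, float weight) {
        std::printf("curve %-16s -> %.2f\n", name.c_str(), weight);
    }
};

// Apply an incoming face frame to the rig, once per tick.
void ApplyFaceFrame(CharacterRig& rig, const FaceFrame& frame) {
    for (const auto& [name, weight] : frame)
        rig.SetCurve(name, weight);
}

int main() {
    CharacterRig rig;
    FaceFrame frame{{"browInnerUp", 0.8f}, {"jawOpen", 0.35f}};
    ApplyFaceFrame(rig, frame);  // in practice, called every frame
}
```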
Unlike motion capture targeted at human characters, Rogowsky’s captured performance had to be interpreted into the much more limited range of LEGO minifigure motion. The real Rogowsky could move his arms and legs freely, but the minifigure toys don’t bend in all directions: a LEGO minifigure can’t bend its knees or close its fingers, for example. To make Rogowsky’s movements work on his LEGO counterpart, Animal Logic wrote a custom interpreter that took anything Rogowsky did and conformed it to a plastic figure’s range of movement.
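A simple way to picture that conforming step is per-joint rotation limits, where a locked axis (a knee, a finger curl) is just a zero-width range. The sketch below shows the idea under those assumptions; `ConformToMinifig` and the numeric limits are hypothetical, not Animal Logic’s actual interpreter.

```cpp
#include <algorithm>
#include <cstdio>

// Captured joint rotation in degrees (pitch = forward/back swing,
// yaw = twist, roll = sideways swing).
struct Rotation { float pitch, yaw, roll; };

// Per-joint limits for the plastic rig. A locked axis has
// min == max == 0. Values here are illustrative guesses.
struct JointLimits {
    float pitchMin, pitchMax;
    float yawMin,   yawMax;
    float rollMin,  rollMax;
};

float Clamp(float v, float lo, float hi) {
    return std::max(lo, std::min(v, hi));
}

// Conform a captured human rotation to what the minifigure joint
// can physically do: out-of-range motion is clamped, and locked
// axes are zeroed out by their limits.
Rotation ConformToMinifig(const Rotation& human, const JointLimits& lim) {
    return {
        Clamp(human.pitch, lim.pitchMin, lim.pitchMax),
        Clamp(human.yaw,   lim.yawMin,   lim.yawMax),
        Clamp(human.roll,  lim.rollMin,  lim.rollMax),
    };
}

int main() {
    // A minifigure leg only swings forward/back at the hip;
    // twist and sideways swing are locked.
    JointLimits leg{-45.f, 90.f, 0.f, 0.f, 0.f, 0.f};
    Rotation captured{120.f, 15.f, -30.f};  // from the mocap suit
    Rotation plastic = ConformToMinifig(captured, leg);
    std::printf("leg: pitch %.0f yaw %.0f roll %.0f\n",
                plastic.pitch, plastic.yaw, plastic.roll);
}
```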
In addition, Rogowsky’s minifigure needed to look as if its face were animated by hand. This meant that instead of smoothly transitioning from one expression to the next, the facial expressions needed to step between poses. Animal Logic developed a system that recognized specific facial expressions and triggered a corresponding face texture for each. For example, when Rogowsky raised an eyebrow, a texture showing a raised eyebrow was swapped onto LEGO Scott’s face in real time.
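One straightforward way to get that stepped behavior is to threshold the continuous tracking values into discrete expression states, and swap the face texture only when the state changes. The thresholds and texture names in this sketch are hypothetical.

```cpp
#include <cstdio>
#include <string>

// Simplified face reading: just the two signals we key off here.
struct FaceReading { float browUp; float jawOpen; };

// Pick a discrete face texture from continuous tracking values.
// Thresholds and texture names are illustrative only.
std::string ClassifyExpression(const FaceReading& f) {
    if (f.jawOpen > 0.5f) return "face_mouth_open";
    if (f.browUp  > 0.6f) return "face_brow_raised";
    return "face_neutral";
}

int main() {
    std::string current = "face_neutral";
    FaceReading frames[] = {
        {0.1f, 0.1f}, {0.7f, 0.2f}, {0.7f, 0.2f}, {0.2f, 0.8f}
    };

    for (const FaceReading& f : frames) {
        std::string next = ClassifyExpression(f);
        // Swap the texture only when the classification changes,
        // so the face steps between poses instead of blending
        // smoothly, which gives the hand-animated look.
        if (next != current) {
            current = next;
            std::printf("swap texture -> %s\n", current.c_str());
        }
    }
}
```

Holding the current texture until the classification changes is what produces the stepped, stop-motion feel described above.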
Animal Logic was also keen to make LEGO Scott’s plastic look accurate to a real, physical minifigure. A LEGO minifigure character always has yellow skin unless it’s based on a real person, so the skin color of Rogowsky’s minifigure was matched to his own, using one of the actual colors LEGO produces for real toys. Like the feature film characters, the plastic body of Rogowsky’s minifigure also carried realistic wear marks and fingerprints, just as if a real person had been playing with it.
The big difference was that with Unreal Engine, the real Rogowsky could make LEGO Scott talk and act in real time. Compare that with the feature film, where a single frame, with all its complexity, can take hours to render.
Commenting on the process, Sarsfield remarked, “After bringing all of these disparate technologies together and bringing all of these people into this completely uncharted territory, everybody that was involved really enjoyed the process. I think anything that involves exploratory possibilities is always going to be of benefit to all of us.”
To create your own great real-time experiences, download Unreal Engine and get started today!