August 29, 2018
Unreal Engine helps power Kite & Lightning’s winning performance at Real-Time Live!
Their Bebylon presentation, which won Best Real-Time Graphics and Interactivity, featured a live performance driving fully expressive, real-time rendered CG characters, proving how an ingenious combination of readily available technology can level the playing field creatively and produce astonishing results. While still only four people strong, Kite & Lightning have already come a long way from their humble beginnings in 2012. “The seed of K&L was planted thanks to a dancing bear my buddy Ikrima Elhassan made in augmented reality running on his iPad,” says Cory Strassburger, who went on to co-found Kite & Lightning with Ikrima. “It was the first time I saw AR and it melted my brain with excitement.”
The pair started making AR together, and Kite & Lightning was born in early 2013, pitching AR, which was virtually unknown back then, to just about every movie studio in LA. In the midst of that pursuit, Ikrima showed up to the office with an Oculus DK1 fresh off the Kickstarter. That was a turning point for the team: “Within about 20 minutes we abandoned AR for VR!” laughs Cory.
After creating some early experimental VR pieces and then getting serious work that included NBC’s The Voice and Lionsgate's Shatter Reality for the film Insurgent, the team began development on their current project, a party brawler VR game called Bebylon Battle Royale.
It was Bebylon that became the inspiration for the Real-Time Live! demonstration. “I was desperate for a way to bring our game’s immortal Beby characters to life, and facial capture was the big hurdle,” explains Cory. “I knew Apple had bought a company called Faceshift, which at the time was already democratizing face capture on the desktop, so I got curious how much of their tech made it into the iPhone X and ARKit. My first tests showed they managed to miniaturize the whole core of their face capture tech into the iPhone X, and that was very, very exciting.”
Digging deeper into that became Cory’s solo Sunday project, since the company was already swamped with game development work during the weekdays. Kite & Lightning had already acquired an Xsens MVN suit which handled the body capture, and together with the iPhone X and a “seriously uncomfortable $45 paintball helmet,” Cory eventually had what he needed to submit an entry to the Real-Time Live! competition.
But there was still one crucial element of the story missing: “The entry submission was the result of a series of non-real-time tests where I improved the face capture and added the body capture using our Xsens suit, all being assembled in Maya and rendered offline,” says Cory. “It wasn’t until we got accepted into Real-Time Live! that I got it working in real time within Unreal Engine.”
Cory had been itching to get the mocap setup working in Unreal, but it was getting selected for Real-Time Live! that accelerated that process. He credits Chris Adamson from Xsens for motivating him to submit the entry. “I’m so grateful he did because the results were much more exciting than I originally imagined, and the amount of time it’s going to save us (working this well in real time) is enormous!”
It was a plugin for Unreal Engine that provided the final piece of the puzzle. IKINEMA LiveAction took the streamed body data from the Xsens suit and retargeted it to the Beby character, while also protecting against foot slippage and inter-body penetration. “It’s really the key to tuning the body data to the character and virtual environment, so things you would normally have to do in a post process are happening in real time,” Cory enthuses.
Hosting the plugin was just one of many roles Unreal Engine played in the project, the most obvious of which was ingesting all the real-time capture data and rendering it beautifully in real time.
“It’s funny to think about it now, how Epic really did most of the hard work and I just plugged into it,” muses Cory. “For example, they integrated their LiveLink tech with ARKit which made streaming the iPhone X data into the engine as easy as typing in an IP address. And thanks to the Face AR sample project they posted (which contains a lot of valuable time-saving work like face calibration, data smoothing, etc.), I just needed to swap out their Kite boy character with my Beby character, build the iOS app with their included scene and voila, iPhone X was streaming perfectly.”
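For anyone wiring the same stream into their own character, the general shape of that character swap looks something like the sketch below. It assumes the Live Link Pose node in the character’s Animation Blueprint is already surfacing the iPhone X’s ARKit blendshape values as animation curves (Apple names them along the lines of jawOpen and eyeBlinkLeft), and the actor class, curve names and morph target names here are illustrative placeholders rather than anything taken from the Face AR sample.

// BebyFaceDriver.h -- illustrative sketch, not part of the Face AR sample project.
// Copies ARKit blendshape curves (already evaluated by the Live Link Pose node
// in the Anim Blueprint) onto morph targets that use different names on a
// custom character mesh.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/SkeletalMeshComponent.h"
#include "Animation/AnimInstance.h"
#include "BebyFaceDriver.generated.h"

UCLASS()
class ABebyFaceDriver : public AActor
{
    GENERATED_BODY()

public:
    ABebyFaceDriver()
    {
        PrimaryActorTick.bCanEverTick = true;
        Mesh = CreateDefaultSubobject<USkeletalMeshComponent>(TEXT("Mesh"));
        RootComponent = Mesh;

        // Hypothetical mapping from ARKit curve names to this character's morph targets.
        CurveToMorph.Add(TEXT("jawOpen"),       TEXT("Beby_MouthOpen"));
        CurveToMorph.Add(TEXT("eyeBlinkLeft"),  TEXT("Beby_BlinkL"));
        CurveToMorph.Add(TEXT("eyeBlinkRight"), TEXT("Beby_BlinkR"));
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        if (UAnimInstance* AnimInstance = Mesh->GetAnimInstance())
        {
            for (const TPair<FName, FName>& Pair : CurveToMorph)
            {
                // GetCurveValue returns 0 when a curve isn't present this frame.
                const float Value = AnimInstance->GetCurveValue(Pair.Key);
                Mesh->SetMorphTarget(Pair.Value, Value);
            }
        }
    }

private:
    UPROPERTY(VisibleAnywhere)
    USkeletalMeshComponent* Mesh;

    TMap<FName, FName> CurveToMorph;
};

In the sample itself, this kind of remapping (along with the calibration and smoothing Cory mentions) lives in the Animation Blueprint, so a C++ pass like this is just one possible way to hook a custom character up to the same stream.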
Cory also downloaded Epic’s latest digital human sample project, Meet Mike, featuring Mike Seymour. “I pirated skin shaders, eyeballs and a very cool lighting rig, and BOOM, I had some seriously good looking Bebies rendering in real time,” he tells us.
The benefits don’t end there. “I think what I’m most excited about with this setup being in Unreal is the ability to use their Sequence Recorder to record the whole performance and audio with a click of a button,” says Cory. “It literally can’t get any easier to capture performances and drop them right into our game!”
So what does the future hold for Kite & Lightning, armed with this new pipeline? “We’re planning to have a ton of fun using this setup and I’m excited to see what real performers can do with it to bring the Bebylonian characters to life,” Cory enthuses. “We also plan to put this virtual production idea to the test and livestream Beby characters on Twitch, possibly combining it with AR so we’ll be broadcasting live from our office in Santa Monica.”
But in Cory’s view, the possibilities are even more intriguing. “I can see how this combo of technology opens many doors, not only for bringing game or previs characters to life, but for virtual streaming of live character performances and virtual productions,” he explains.
“It’s so easy now to stream multiple performers into Unreal Engine and either capture that data or livestream the final render. Combine that with multiple virtual cinematographers and cool virtual sets, and you have a home-brewed virtual production setup in the same vein as the one Jim Cameron and Weta created for making Avatar! That’s mega-exciting because the tech is getting less cumbersome and you don’t need to wrangle tons of people to operate it, which means you can focus more on making great content without serious time pressures!”
Cory is still enjoying the afterglow of winning Real-Time Live!, although he initially had a little trouble believing it. “I honestly was a bit shocked to have won... It wasn’t until I was walking off stage and the proceedings were concluded that I realized this was the actual final award, and not just an honorable mention! There were some incredible real-time projects on that stage, so it's a huge honor to have won,” he tells us.
“I’d definitely like to high-five and express how grateful I am to the inspirational peeps I get to work with every day at K&L, all the peeps at Xsens, IKINEMA and Epic who directly helped me with this endeavor, and specifically to Epic for their philosophy and approach to their work and community. It’s great to be on the receiving end of so much high-quality awesomeness!”
Interested in creating your own winning real-time experiences? Download and explore Unreal Engine 4.20 today.