February 6, 2017

IKinema LiveAction and Unreal Engine: A Powerful Duo for Live Virtual Experiences

By Siobhán Hofma, IKinema

Successfully convincing and engaging your audience during a virtual experience is the ultimate high-five for creators in the virtual field. It is achieved through compelling storytelling, dynamic environments, and, importantly, characters that behave with convincing realism and believability – the essential elements that let an audience emotionally connect with the characters inside a story. Film and TV are the perfect modern-day examples.

So, when the project brief in mediums such as TV broadcast, virtual production, and the arts called for realistic and credible characters, many creators have relied on our VR and virtual production technology, LiveAction, in UE4 to produce a series of diverse and cutting-edge live virtual shows.

What is LiveAction?

At the core of IKinema’s technology is a powerful full-body IK solver that delivers natural and believable motion for 3D animated characters. With LiveAction, an actor's performance is captured live by motion capture cameras; the data is streamed instantly to Unreal Engine through LiveAction, auto-cleaned using ACP filters, then retargeted to the virtual character with near-zero latency. The final output is continuous, uninterrupted natural motion from actor to character, and an efficient real-time pipeline that saves animation teams significant time and frustration.
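The stream–clean–retarget loop described above can be sketched as follows. This is a purely hypothetical illustration, not IKinema's actual API: every function and name here is an assumption, and the "cleaning" step is reduced to a simple smoothing filter standing in for the ACP filters.

```python
# Hypothetical sketch of a live mocap-to-character pipeline:
# take a streamed frame, clean it, then retarget it to the game rig.
# None of these names come from the IKinema API.

def clean_frame(frame, prev_frame, alpha=0.8):
    """Suppress jitter with simple exponential smoothing
    (a stand-in for the ACP noise/jitter filters)."""
    if prev_frame is None:
        return dict(frame)
    return {joint: alpha * rot + (1 - alpha) * prev_frame[joint]
            for joint, rot in frame.items()}

def retarget_frame(frame, bone_map):
    """Map each source joint's value onto the target rig's bone."""
    return {bone_map[joint]: rot
            for joint, rot in frame.items() if joint in bone_map}

# One iteration of the live loop: mocap frame in, character pose out.
bone_map = {"LeftArm": "upperarm_l", "RightArm": "upperarm_r"}
frame = {"LeftArm": 30.0, "RightArm": -12.0}   # rotations in degrees
cleaned = clean_frame(frame, None)
pose = retarget_frame(cleaned, bone_map)
print(pose)   # {'upperarm_l': 30.0, 'upperarm_r': -12.0}
```

In a real pipeline this loop would run once per captured frame, with the cleaned pose handed to the full-body IK solver inside Unreal Engine.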


Animation Cleaning Pipeline (ACP): auto-corrects and adapts data to floor terrain, eliminating floor penetration and sliding for clean animation during live performances
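The floor-correction idea can be illustrated with a rough sketch. This is a hypothetical simplification, not ACP's actual algorithm: when a foot joint dips below the floor plane it is clamped back up, and while it is in contact its horizontal position is locked to the first contact point to stop sliding.

```python
# Hypothetical sketch of floor correction for one foot joint.
# pos is (x, y, z) with z up; floor_z is the floor height at (x, y).
# Not ACP's actual algorithm -- just the underlying idea.

def correct_foot(pos, floor_z, planted_at=None):
    """Clamp the foot above the floor; while planted, lock its
    horizontal position to the first contact point to stop sliding."""
    x, y, z = pos
    if z < floor_z:                      # foot penetrated the floor
        if planted_at is None:
            planted_at = (x, y)          # remember the contact point
        x, y = planted_at                # no horizontal sliding
        z = floor_z                      # no penetration
    else:
        planted_at = None                # foot lifted: release the lock
    return (x, y, z), planted_at

foot, plant = correct_foot((1.0, 2.0, -0.03), 0.0)         # penetrating
print(foot)    # (1.0, 2.0, 0.0)
foot, plant = correct_foot((1.1, 2.1, -0.01), 0.0, plant)  # still planted
print(foot)    # (1.0, 2.0, 0.0) -- locked to the contact point
```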

 


Accurate real-time retargeting is achieved from source actors to human and fantasy creatures

 


LiveAction accurately solves on disproportionate characters

 

By leveraging LiveAction with Unreal Engine, these teams have produced some of the world’s most interactive and realistic virtual experiences to date.


Image: Hasbro | Facebook

 

Hasbro, Facebook Live Broadcast: For the launch announcement of the Monopoly Ultimate Banking game, LiveAction acted as the bridge between motion captured from an actor performing in studio and their 3D animated Mr. Monopoly moving within a cartoon stage, retargeting the MoCap data directly to the character. Running alongside Unreal Engine and several other technologies, LiveAction helped bring Mr. Monopoly to life, producing realistic mannerisms for a convincing performance and giving the team a robust pipeline during the live Q&A session with their fans.
 


Globo TV, 2016 Rio Olympics: Here, LiveAction and UE4 provided the Globo TV team with a fully reliable real-time pipeline to stream and retarget athletes’ captured data to their animated avatars, for live broadcasts across Brazil throughout the Rio Olympics. Real presenters could analyse match replays, walk around, and interact with virtual athletes and their virtual stadium environments during real-time sequences.

 


Ninja Theory, Hellblade: Senua’s Sacrifice: LiveAction played a role in seamlessly delivering actor behaviour (on-set) perfectly matched to the virtual character Senua (on-screen), resulting in convincing humanoid motion through cutting-edge, first-of-their-kind VFX techniques. The performance unfolded directly in front of a live audience, winning the Best Real-Time Graphics & Interactivity Award at SIGGRAPH 2016.
 


The New York Metropolitan Museum, The Return: An interactive art installation featuring a digital recreation of Adam, Tullio Lombardo's profound sculpture. The piece combined digital puppetry, motion capture, and live action. Visitors could ask the on-screen Digital Adam questions, with the actor performing and answering from a separate volume on stage, visible to onlookers.
 

Under the hood


LiveAction Main features

  • Live motion capture streaming and solving with near-zero latency on multiple characters (for example, up to 12 on a medium-spec CPU, and 13 or more on more powerful processors)
  • Supports OptiTrack, Vicon, and Xsens motion capture systems
  • Supports joint retargeting and rigid body solving (Facility and Studio versions only)
  • Animation Cleaning Pipeline (ACP) – auto real-time correction of MoCap data such as noise, jitter, feet sliding, floor penetration
  • Dedicated UI for full pipeline setup in Unreal
  • Solve on similar or disproportionate characters
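One reason solving on disproportionate characters is non-trivial: joint positions cannot simply be copied, because limb lengths differ between source and target. A common approach, sketched hypothetically below (this is not IKinema's solver), is to transfer rotations directly and scale any positional offset by the ratio of target to source bone length.

```python
# Hypothetical illustration of proportion-aware retargeting:
# scale a source joint offset by the target/source limb-length ratio
# so a short-armed character doesn't overreach. Not IKinema's solver.

def retarget_translation(src_offset, src_bone_len, tgt_bone_len):
    """Scale a source-space joint offset to the target's proportions."""
    scale = tgt_bone_len / src_bone_len
    return tuple(c * scale for c in src_offset)

# A 0.6 m human forearm offset retargeted onto a 0.3 m cartoon forearm:
print(retarget_translation((0.6, 0.0, 0.0), 0.6, 0.3))  # (0.3, 0.0, 0.0)
```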

 

New to LiveAction 3.0

  • Automatic bone mapping
  • Template editor – custom name matching
  • Skeleton matching on target rig
  • Improved workflow for ACP
  • Supports custom MoCap systems through the MoCap Streaming Protocol (VRPN based)
  • Fast setup – see results within a few minutes
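To give a feel for what streaming skeleton data over a network involves, here is a minimal sender/receiver round trip. This is a purely hypothetical wire format invented for illustration; it is not the actual VRPN-based MoCap Streaming Protocol.

```python
import socket
import struct

# Hypothetical wire format: joint id + position + quaternion, packed
# into one UDP datagram. Invented for illustration only -- not the
# actual VRPN-based MoCap Streaming Protocol.
PACKET = struct.Struct("<I3f4f")   # joint_id, x, y, z, qx, qy, qz, qw

def pack_joint(joint_id, pos, quat):
    return PACKET.pack(joint_id, *pos, *quat)

def unpack_joint(data):
    joint_id, x, y, z, qx, qy, qz, qw = PACKET.unpack(data)
    return joint_id, (x, y, z), (qx, qy, qz, qw)

# Loopback round trip (a real setup would send from the mocap machine
# to the engine host once per joint per frame).
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", 0))
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
tx.sendto(pack_joint(7, (0.0, 1.0, 0.5), (0.0, 0.0, 0.0, 1.0)),
          rx.getsockname())
joint_id, pos, quat = unpack_joint(rx.recv(PACKET.size))
print(joint_id, pos)   # 7 (0.0, 1.0, 0.5)
```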

 

So, when your next brief reads something like this:


PROJECT:  Virtual Experience
DELIVERY DATE:  Ahh… yesterday
BRIEF: Mind-blowing character motion. On budget. On time. MSL (Minimal Stress Levels)
NOTE TO SELF: Download Unreal Engine and contact Support@IKinema so we can help!!


IKinema LiveAction 3.0 is available for the Windows platform and comes in both node-locked and floating licenses. Three license levels are on offer – Binary, Facility, and Source. Visit the IKinema website for more info.