New to Unreal Engine 4.20 is support for Apple’s ARKit face tracking system. Using the hardware of the iPhone X, this API enables the user to track the movements of their face and use that data in Unreal Engine. The tracking data can be used to drive digital characters, or repurposed in any way the user sees fit. Optionally, the UE4 ARKit implementation enables you to send facial tracking data, including the current facial expression and head rotation, directly into the engine via the LiveLink plugin. In this way, users can utilize their phones as motion capture devices to puppeteer an on-screen character.
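Conceptually, each frame of face tracking data is a set of named blend-shape coefficients (values from 0 to 1, such as how open the jaw is) that the receiving character maps onto morph targets with matching names. The sketch below illustrates that idea only; the curve names, function, and data layout are hypothetical and are not the actual ARKit or LiveLink format.

```python
# Hypothetical sketch of applying one frame of face tracking data.
# Curve names ("jawOpen", "eyeBlinkLeft") and the dict-based frame
# format are illustrative, not the real LiveLink wire format.

def apply_face_frame(frame, morph_targets):
    """Copy tracked blend-shape values onto matching morph targets,
    clamping to the valid 0..1 range and leaving unmatched targets alone."""
    for curve, value in frame.items():
        if curve in morph_targets:
            morph_targets[curve] = min(max(value, 0.0), 1.0)
    return morph_targets

# One tracked frame: mouth open halfway, left eye fully closed.
frame = {"jawOpen": 0.5, "eyeBlinkLeft": 1.2}   # 1.2 will be clamped to 1.0
targets = {"jawOpen": 0.0, "eyeBlinkLeft": 0.0, "browInnerUp": 0.0}
print(apply_face_frame(frame, targets))
```

Targets with no matching curve in the frame (here, the brow) keep their previous value, which is why a character only animates the shapes the tracker actually reports.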
For more information about face tracking with Apple’s ARKit, please see Apple’s official documentation.
Note: As of this writing, this mobile facial animation capture system is available only on iPhone X hardware. As the technology expands to other devices, this document will be updated.
Hardware Requirements: iPhone X
Maps: One low-res demo map for deployment to device (FaceTrackingMap_Simplified) and one high-res map for LiveLink puppeteering (FaceTrackingMap2)
Note: The app is intended to be deployed from Unreal Engine. This requires the user to have an Apple Developer account, available at developer.apple.com.