Face AR Sample

Epic Games - Jul 14, 2018

The FaceARSample showcases Apple’s ARKit facial tracking capabilities within UE4.

  • Supported Platforms
  • Supported Engine Versions
    4.20 - 4.27
  • Download Type
    Complete Project
    This product contains a full Unreal Engine project folder, complete with Config files, Content files and .uproject file, which can be used as a template to create a new project.
New to Unreal Engine 4.20 is support for Apple’s ARKit face tracking system. Using the TrueDepth camera of the iPhone X, this API lets the user track the movements of their face and use that data in Unreal Engine. The tracking data can be used to drive digital characters, or repurposed in any way the user sees fit. Optionally, the UE4 ARKit implementation can send facial tracking data, including the current facial expression and head rotation, directly into the Engine via the Live Link plugin. In this way, users can use their phones as motion capture devices to puppeteer an on-screen character.
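To make the data flow concrete: each frame, ARKit face tracking reports a set of named blend-shape coefficients (e.g. `jawOpen`, `eyeBlinkLeft`), each a weight between 0.0 and 1.0, which a receiver then maps onto animation curves on a character rig. The sketch below is illustrative only; the curve names and mapping are hypothetical, and it does not use the actual Live Link wire protocol or the UE4 API.

```python
# Illustrative sketch: models the kind of per-frame data ARKit face tracking
# produces (named blend-shape coefficients in the 0.0-1.0 range) and how a
# receiver might remap them onto a character rig's animation-curve names.
# All names here are hypothetical; this is not the actual Live Link protocol.

def clamp01(value: float) -> float:
    """Blend-shape coefficients are defined on [0.0, 1.0]; clamp defensively."""
    return max(0.0, min(1.0, value))

# Hypothetical mapping from ARKit-style blend-shape names to rig curve names.
CURVE_MAP = {
    "jawOpen": "MouthOpen",
    "eyeBlinkLeft": "BlinkL",
    "eyeBlinkRight": "BlinkR",
}

def remap_frame(blend_shapes: dict) -> dict:
    """Convert one frame of tracking coefficients into rig curve values."""
    return {
        CURVE_MAP[name]: clamp01(weight)
        for name, weight in blend_shapes.items()
        if name in CURVE_MAP
    }

frame = {"jawOpen": 0.42, "eyeBlinkLeft": 1.3, "browInnerUp": 0.1}
print(remap_frame(frame))  # {'MouthOpen': 0.42, 'BlinkL': 1.0}
```

In the real pipeline this remapping happens inside the engine: the Live Link plugin receives the coefficients as named curves, and the character's Animation Blueprint reads those curves to drive morph targets and bone rotations.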

For more information about face tracking with Apple’s ARKit, please see Apple’s official documentation.

Note: At the time of this writing, ARKit face tracking is available only on iPhone X hardware. As this technology expands to other devices, this document will be updated.


Hardware Requirements: iPhone X

Maps: One low-res demo map for deployment to the device (FaceTrackingMap_Simplified) and one high-res map for Live Link puppeteering (FaceTrackingMap2)

Note: The app is intended to be deployed from Unreal Engine. This requires the user to have an Apple Developer account, available at developer.apple.com.