To support the efforts of XR devs everywhere, we’ve been building a solution for compositing real-world video onto virtual world space in Unreal Engine 4. Mixed Reality Capture, available in Early Access as of Unreal Engine 4.20, equips you with the tools you need to project yourself (or any tracked object) into your virtual experience.
Here, I’m using a drone to fire a laser at a robot in Robo Recall!
Getting the capture space set up is pretty straightforward, though there are a few real-world things you’ll need to get started. For recording Robo Recall, we used a green screen draped between two tripods, a stationary mounted camera, a capture device to snag the camera feed, three Oculus sensors, and a Rift with Touch controllers to play.
Once the environment is set up, we use a calibration app that creates spatial mappings between objects in the virtual world and the real world.
After everything is properly calibrated, we can play through the game and record the composited video using the screen recording software of our choice (for Robo Recall, we used OBS).
A quick tip: while capturing with Mixed Reality Capture, experiment with what you render and how you render it to best represent your project. Notice that the guns in my back holsters are not rendered, and the hand models are hidden as well. We turned them off for this recording because their scale and location are tuned for the player’s experience, not the spectator’s, and they didn’t quite feel right out of the box. Like everything, test and iterate often!
To learn more about the Mixed Reality Capture feature, including how to calibrate supported devices and composite users into a virtual environment, read through our early access MRC Development documentation to get started with your own immersive experiences.
Is your video capture device not listed? We built the system in a modular way to make it easy to add new capture methods. If you’re interested in integrating a different video capture solution, have a look at the “Media” module in the engine, which contains a series of interfaces to implement. Currently, we’re using the WmfMedia module for playback.
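To give a feel for the plug-in pattern involved, here’s a minimal standalone C++ sketch. The names (`IVideoCaptureSource`, `VideoFrame`, `DummyCaptureSource`) are hypothetical illustrations, not the engine’s actual types; the real interfaces live in the “Media” module and differ in detail, so treat this as a rough shape of the idea rather than the implementation.

```cpp
#include <cstdint>
#include <string>
#include <vector>

// A captured frame in a device-independent form.
// (Hypothetical type for illustration only.)
struct VideoFrame {
    int width = 0;
    int height = 0;
    std::vector<uint8_t> bgra;  // packed 8-bit BGRA pixels, row-major
};

// The abstract capture interface: the compositor talks only to this,
// never to a specific device SDK, so new backends can be slotted in.
class IVideoCaptureSource {
public:
    virtual ~IVideoCaptureSource() = default;
    virtual bool Open(const std::string& deviceUrl) = 0;   // start the device
    virtual bool ReadFrame(VideoFrame& outFrame) = 0;      // grab latest frame
    virtual void Close() = 0;                              // release the device
};

// A stub backend standing in for a device-specific implementation
// (for example, a Windows Media Foundation-based one).
class DummyCaptureSource : public IVideoCaptureSource {
public:
    bool Open(const std::string&) override {
        opened = true;
        return true;
    }
    bool ReadFrame(VideoFrame& outFrame) override {
        if (!opened) return false;
        // Produce a solid 2x2 gray test frame.
        outFrame.width = 2;
        outFrame.height = 2;
        outFrame.bgra.assign(static_cast<size_t>(outFrame.width) *
                                 outFrame.height * 4,
                             0x80);
        return true;
    }
    void Close() override { opened = false; }

private:
    bool opened = false;
};
```

A new capture device would then mean writing one more `IVideoCaptureSource` subclass, with no changes to the code that consumes the frames.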
We’re eager to see how people leverage this feature to show off their projects in a new way and are looking for feedback on the tool. Drop us a line in the MRC feedback forum thread to share your thoughts!