February 21, 2019
Ncam helps deliver seamless integration of live action and real-time 3D graphics in UE4


Specialized hardware for accurate tracking
At its core, Ncam relies on a specialized piece of camera-mounted hardware. This small, lightweight sensor bar combines a suite of sensors. Most visible are the two stereo computer vision cameras. Less obvious are the 12 additional sensors inside the bar, including accelerometers and gyroscopes, which together with the stereo camera pair enable Ncam to see the set in spatial depth as a real-time 3D point cloud.
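Ncam has not published its sensor geometry, but the basic stereo principle is standard: a feature that appears shifted between the left and right images sits at a depth inversely proportional to that shift. A minimal sketch, with illustrative numbers only:

```cpp
#include <cstdio>

// Standard stereo relationship: depth is inversely proportional to the
// horizontal shift (disparity) of a feature between the two views.
// The numbers below are illustrative, not Ncam's actual geometry.
double depthFromDisparity(double focalPx, double baselineM, double disparityPx) {
    return focalPx * baselineM / disparityPx;  // metres
}

int main() {
    // e.g. a 700 px focal length, 20 cm between the two vision cameras,
    // and a feature shifted 35 px between views -> 4 m away.
    std::printf("%.2f m\n", depthFromDisparity(700.0, 0.20, 35.0));
}
```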

Gathering data for predictive movement
From the outset, Ncam has relied on a fusion of techniques, including visual tracking, odometry, and inertial navigation, to solve the problem of camera tracking. The software does more than gather data, however: it uses that data for predictive movement and robust redundancy. It knows where the camera was and where it thinks the camera is going, so it can ride out any loss of useful signal from the vision cameras. If an actor blocks one of the stereo lenses, or even both, the system continues uninterrupted on the remaining aggregate sensor data.
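Ncam's actual filter design is proprietary, but conceptually this kind of fusion resembles a predict/correct loop: inertial data propagates the camera state forward between optical measurements, and optical measurements pull it back when they arrive. A minimal constant-velocity sketch, with every name hypothetical:

```cpp
#include <array>

// Hypothetical camera state for a constant-velocity predict/correct
// loop, in the spirit of the fusion Ncam describes. Ncam's actual
// filter design is proprietary; every name here is illustrative.
struct CameraState {
    std::array<double, 3> position{};  // XYZ, metres
    std::array<double, 3> velocity{};  // metres per second
};

// Predict where the camera will be dt seconds from now, assuming it
// keeps its current velocity. This is what lets tracking coast through
// moments when the vision cameras are blocked.
CameraState predict(const CameraState& s, double dt) {
    CameraState next = s;
    for (int i = 0; i < 3; ++i)
        next.position[i] += s.velocity[i] * dt;
    return next;
}

// Blend an optical position measurement back in when one is available;
// 'gain' plays the role of a Kalman gain (0 = ignore, 1 = trust fully).
CameraState correct(const CameraState& s,
                    const std::array<double, 3>& measuredPos,
                    double dt, double gain) {
    CameraState next = s;
    for (int i = 0; i < 3; ++i) {
        double innovation = measuredPos[i] - s.position[i];
        next.position[i] += gain * innovation;
        next.velocity[i] += gain * innovation / dt;
    }
    return next;
}

int main() {
    CameraState cam{{0.0, 0.0, 0.0}, {0.5, 0.0, 0.0}};  // dollying at 0.5 m/s
    cam = predict(cam, 1.0 / 250.0);                    // inertial step, 250 fps
    cam = correct(cam, {0.003, 0.0, 0.0}, 1.0 / 250.0, 0.3);  // optical update
}
```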
The software integrates all of this data into one useful input to UE4. For example, while the computer vision cameras can run at up to 120 fps, the other sensors run at 250 fps, so the various data streams are retimed and normalized into one coherent, stable output that is clocked to the timecode of the primary production camera.
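The retiming pipeline itself is not documented, but the core operation, resampling a faster sensor stream at the production camera's frame times, can be sketched as simple interpolation (the sample layout here is assumed):

```cpp
#include <algorithm>
#include <vector>

// Hypothetical sensor sample: timestamp in seconds plus one value.
struct Sample { double t; double value; };

// Resample a high-rate stream (e.g. 250 fps inertial data) at an
// arbitrary production-camera frame time by interpolating between the
// two surrounding samples. Per-sensor latency compensation, which a
// real pipeline also needs, is omitted here.
double sampleAt(const std::vector<Sample>& stream, double t) {
    auto hi = std::lower_bound(stream.begin(), stream.end(), t,
        [](const Sample& s, double time) { return s.t < time; });
    if (hi == stream.begin()) return stream.front().value;
    if (hi == stream.end())   return stream.back().value;
    auto lo = hi - 1;
    double a = (t - lo->t) / (hi->t - lo->t);
    return lo->value * (1.0 - a) + hi->value * a;
}

int main() {
    std::vector<Sample> gyro;
    for (int i = 0; i <= 250; ++i)          // one second of 250 fps data
        gyro.push_back({i / 250.0, i * 0.01});
    double v = sampleAt(gyro, 10.0 / 24.0); // value at a 24 fps frame time
    (void)v;
}
```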
Some sets have very challenging lighting, so Ncam offers an option to run the cameras in infrared mode for strobing or flashing-light scenes. The system is also designed for low latency, so a camera operator can watch the composited output of the live action and the UE4 graphics as a combined shot, allowing much more accurate framing and blocking. It is much easier to line up the shot of the knight and the dragon if you can see the whole scene and not just a guy in armor alone on a green soundstage.

Precise lens calibration and matching
The camera tracking resolves to six degrees of freedom: XYZ position plus three axes of rotation. Added to this is the production camera’s lens data. In addition to focus, iris, and zoom, Ncam has to know the correct lens curvature, or distortion, across all possible zoom, focus, and iris adjustments for the UE4 graphics to match the live action perfectly. Any wide lens visibly bends the image, curving lines that are straight in the real world. The real-time graphics have to match this frame by frame, so lens properties are mapped on a per-serial-number basis. Every lens is different, so while a production may start with a template for, say, a Cooke 32mm S4/i lens, Ncam provides a lens calibration system that compensates for individual variations.
Ncam is compatible with systems such as ARRI’s Lens Data System (LDS), but those systems typically don’t provide image distortion data over the entire optical range of the lens. At the start of a project, productions can calibrate their own lenses with Ncam’s proprietary system of charts and tools to map the barrel and pincushion distortion of each lens, and then simply reference the results by serial number.
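Ncam's calibration file format is proprietary, but the concept of per-serial-number distortion maps can be illustrated with a standard radial (Brown-Conrady style) model; the coefficients and the serial-number key below are made up:

```cpp
#include <map>
#include <string>

// Brown-Conrady radial distortion coefficients for one zoom/focus
// setting. A real calibration stores a dense table over the whole
// zoom/focus/iris range; a single entry is shown for illustration.
struct RadialCoeffs { double k1, k2; };

// Hypothetical per-lens calibration database, keyed by serial number.
std::map<std::string, RadialCoeffs> lensDatabase = {
    {"COOKE-S4-32-0417", {-0.21, 0.05}},
};

// Apply radial distortion to a normalized image point so the CG render
// bends the same way the glass does.
void distort(const RadialCoeffs& c, double& x, double& y) {
    double r2 = x * x + y * y;
    double f = 1.0 + c.k1 * r2 + c.k2 * r2 * r2;
    x *= f;
    y *= f;
}

int main() {
    double x = 0.8, y = 0.4;  // a point near the edge of frame
    distort(lensDatabase.at("COOKE-S4-32-0417"), x, y);
}
```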
In the end, the system produces stable, smooth, accurate information that can perfectly align real-time graphics with live-action material. Ncam founder Nic Hatch explains, “We spent a lot of time working to fuse the various technologies of all those different sensors, I guess that’s sort of our secret sauce and why it works so well.”

Integrating CG elements with Real Depth
The other huge benefit of Ncam is depth understanding. When elements are combined in UE4, the engine knows where the live action sits relative to the UE4 camera, thanks to Ncam’s “Real Depth”. This allows someone to be filmed walking in front of or behind UE4 graphical elements or virtual sets. Without the depth information, any video can only sit like a flat card in UE4. With Ncam, as the actor walks forward on set, they walk forward in UE4, passing objects at the correct distance. This adds enormous production value and integrates the live action and real-time graphics in a dramatically more believable way. This one feature completely changes how Ncam can be used in motion graphics, explanatory news sequences, and narrative sequences.
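In effect this is per-pixel depth compositing: for each pixel, whichever source is closer to camera wins. A toy sketch (the pixel layout and values are illustrative, not Ncam's format):

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// One pixel from each source: packed color plus camera-space depth in
// metres. The layout is illustrative, not Ncam's actual format.
struct DepthPixel { std::uint32_t rgba; float depth; };

// Per-pixel depth composite: wherever the live-action subject is nearer
// to camera than the CG, the live-action pixel wins, which is what lets
// an actor pass in front of or behind virtual set pieces. A production
// compositor would also soften the boundary; this keeps the hard test.
std::vector<std::uint32_t> composite(const std::vector<DepthPixel>& liveAction,
                                     const std::vector<DepthPixel>& cg) {
    std::vector<std::uint32_t> out(liveAction.size());
    for (std::size_t i = 0; i < out.size(); ++i)
        out[i] = (liveAction[i].depth < cg[i].depth) ? liveAction[i].rgba
                                                     : cg[i].rgba;
    return out;
}

int main() {
    // Two pixels: the actor at 2 m covers CG at 5 m, but not CG at 1 m.
    std::vector<DepthPixel> live = {{0xFF0000FFu, 2.f}, {0xFF0000FFu, 2.f}};
    std::vector<DepthPixel> cg   = {{0x00FF00FFu, 5.f}, {0x0000FFFFu, 1.f}};
    auto frame = composite(live, cg);
    (void)frame;
}
```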

Advanced real-world lighting matching with Real Light
While the primary focus has been on Ncam understanding the space in front of the camera and what the camera is doing, the company also has an advanced tool for understanding the lighting of the scene. Its “Real Light” project allows a live light probe placed in the scene to inform the UE4 engine of changing light levels and directions.

Real Light is designed to solve the challenge of making virtual production assets look like they are part of the real-world scene. It captures real-world lighting in terms of direction, color, intensity, and HDR maps, allowing Unreal Engine to adapt to each and every lighting change. Importantly, it also understands the depth and position of the light sources in the scene, so the two worlds interact correctly. This means digital assets can fit technically and look correctly lit, which is a major advance in live-action game asset integration.
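Real Light's data format and engine hookup are not public, but the flow it describes, reading a probe each frame and driving a scene light's direction, color, intensity, and position from it, might be sketched like this (every name here is hypothetical; the blend simply keeps measurement flicker from popping in the render):

```cpp
#include <array>
#include <cmath>

// Hypothetical per-frame reading from the on-set light probe.
struct ProbeSample {
    std::array<float, 3> direction;  // unit vector toward the key light
    std::array<float, 3> color;      // linear RGB, 0..1
    float intensity;                 // arbitrary linear units
    std::array<float, 3> position;   // where the source sits on set
};

// Stand-in for an engine light; in UE4 this would be a light component
// whose rotation, color, and intensity get set from these fields.
struct SceneLight {
    std::array<float, 3> direction{0.f, -1.f, 0.f};
    std::array<float, 3> color{1.f, 1.f, 1.f};
    float intensity = 0.f;
    std::array<float, 3> position{};
};

// Called once per frame: blend toward the new probe reading rather than
// snapping, so measurement flicker doesn't pop in the render. An alpha
// near 1 tracks lighting changes quickly; near 0 smooths heavily.
void applyProbe(SceneLight& light, const ProbeSample& probe, float alpha) {
    for (int i = 0; i < 3; ++i) {
        light.direction[i] += alpha * (probe.direction[i] - light.direction[i]);
        light.color[i]     += alpha * (probe.color[i]     - light.color[i]);
        light.position[i]  += alpha * (probe.position[i]  - light.position[i]);
    }
    light.intensity += alpha * (probe.intensity - light.intensity);

    // Re-normalize the blended direction so it stays a unit vector.
    float len = std::sqrt(light.direction[0] * light.direction[0] +
                          light.direction[1] * light.direction[1] +
                          light.direction[2] * light.direction[2]);
    for (int i = 0; i < 3; ++i) light.direction[i] /= len;
}

int main() {
    SceneLight key;
    ProbeSample frame{{0.f, -0.707f, -0.707f}, {1.f, 0.95f, 0.9f},
                      12000.f, {1.5f, 3.f, 0.f}};
    applyProbe(key, frame, 0.25f);  // one frame's update
}
```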
Interested in finding out about more new technology, techniques, and best practices that are changing the game for on-set production? Head on over to our Virtual Production hub, or check out our other posts relating to broadcast.