Zero Density delivers live broadcast virtual production solutions
Traditional post-production artists find Reality Engine's user interface and approach familiar, not unlike Autodesk's Flame Batch tool. Unlike Flame, however, Zero Density's virtual production solution is designed to work on air in a broadcast environment, which means it must be frame-accurate and never drop a frame.
The system takes the sophistication and flexibility of a post-production workflow and provides it in a robust broadcast solution—something that has not previously been done. Traditionally, graphics packages have only supplied fill and key signals and assumed a basic composite in the vision switcher. Zero Density provides a much more powerful system that understands gamma, linear workflows, and LUTs. It adds back into a live environment the controls normally reserved for offline solutions.
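Why gamma awareness matters in a live composite can be shown with a few lines of arithmetic. The sketch below is illustrative only (the function names and the simple power-law gamma are assumptions, not Zero Density's actual pipeline): compositing gamma-encoded pixel values directly gives a different, darker-edged result than decoding to linear light first, which is the kind of error a fill-and-key switcher composite can introduce.

```python
def srgb_to_linear(v, gamma=2.2):
    """Approximate gamma decode (simple power law, not the exact
    piecewise sRGB curve). Illustrative helper, not a product API."""
    return v ** gamma

def linear_to_srgb(v, gamma=2.2):
    """Approximate gamma encode (inverse of the above)."""
    return v ** (1.0 / gamma)

def comp_over(fg, bg, alpha):
    """Straight 'over' composite for a single channel value."""
    return fg * alpha + bg * (1.0 - alpha)

fg, bg, alpha = 0.2, 0.9, 0.5

# Correct: decode to linear light, composite, then re-encode.
linear_result = linear_to_srgb(
    comp_over(srgb_to_linear(fg), srgb_to_linear(bg), alpha)
)

# Naive: composite the gamma-encoded values directly, as a basic
# vision-switcher keyer effectively does.
naive_result = comp_over(fg, bg, alpha)
```

For a 50% blend of a dark foreground over a bright background, the linear-light result comes out noticeably brighter than the naive one, which is why edge blends and transparency look wrong when gamma is ignored.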
After two years of intensive R&D with game industry technology, the team succeeded in integrating it with the rigors of live-to-air broadcasting. The company first released the resulting product, which enables multiple Unreal Engine cameras to be set up on the virtual set, at the National Association of Broadcasters (NAB) conference in 2016. The release marked something of a milestone: Zero Density became the first company to use Unreal Engine in broadcast. The product went on to win a set of major awards at the International Broadcasting Convention (IBC) in Amsterdam just months later.
Because Epic gives full access to the Unreal Engine source code, all of Zero Density's code can live inside Unreal Engine, making use of shader compilers, C++, and Unreal Engine's cross-platform functionality. For example, the company's central Reality Keyer is implemented as a shader running on the GPU.
In its keying approach, Reality Keyer is not dissimilar to Foundry's advanced Image Based Keyer (IBK), found in the company's Nuke compositing solution. Reality Keyer works with a clean plate and combines it with the system's tracking functionality to produce a mesh representation of any studio green screen cyclorama. This 3D representation is generated by the program, allowing for cleaner keying and dynamic garbage masking to produce a clean, realistic keyed final image, even through sweeping pans and tilts. Projection mapping of the clean plate assists the keying and makes the system far more advanced than a conventional chroma keyer. It is the first and only real-time image-based keyer with such advanced clean plate technology, yet it is implemented as a shader inside UE4.
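Zero Density has not published the Reality Keyer algorithm, but the core clean-plate idea can be sketched in a few lines. The NumPy snippet below is a minimal, assumed illustration of IBK-style keying (the function names and the "greenness" measure are this example's, not the product's): alpha is derived per pixel by comparing the live frame's green spill against a clean plate of the empty cyclorama.

```python
import numpy as np

def clean_plate_key(frame, clean_plate, eps=1e-6):
    """Illustrative image-based keyer: derives a per-pixel alpha by
    comparing the live frame against a clean plate of the empty green
    screen. All arrays are float RGB images in [0, 1]."""
    def greenness(img):
        # How "green screen" a pixel looks: green minus the stronger
        # of its red and blue components.
        return img[..., 1] - np.maximum(img[..., 0], img[..., 2])

    g_frame = greenness(frame)
    g_clean = greenness(clean_plate)

    # Where the frame is as green as the clean plate, alpha -> 0
    # (background); where no green remains, alpha -> 1 (foreground).
    return 1.0 - np.clip(g_frame / np.maximum(g_clean, eps), 0.0, 1.0)

# Toy example: a clean plate of uniform green, and a frame with one
# grey foreground pixel and one untouched green-screen pixel.
clean = np.tile(np.array([0.1, 0.9, 0.1]), (1, 2, 1))
frame = np.array([[[0.5, 0.5, 0.5],      # foreground pixel
                   [0.1, 0.9, 0.1]]])    # bare green screen
alpha = clean_plate_key(frame, clean)
```

Because the clean plate is projection-mapped onto the tracked cyclorama mesh, this per-pixel comparison stays valid as the camera pans and tilts, which is what lifts the approach above a single-reference-color chroma key.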
FOX Sports, which adopted Reality Engine in February 2019, makes extensive use of the keying technology in its NASCAR Race Hub. Reality Engine is designed to work with a wide range of industry tracking technology. Zero Density takes in tracking data from most of the major solutions, along with lens information if those systems provide it. If no lens information is available, the software starts with the zoom and focus information from the last calibration and then estimates lens curvature and other properties such as field of view, all in Unreal Engine.
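The article does not specify Zero Density's lens model, but the quantities such a calibration estimates are standard. As an assumed illustration, the sketch below uses a simple Brown-Conrady radial distortion model for "lens curvature" and the pinhole formula for horizontal field of view; the function names are this example's own.

```python
import math

def distort(x, y, k1, k2):
    """Apply Brown-Conrady radial distortion to a normalized
    image-plane point (x, y). k1 and k2 are the radial distortion
    coefficients a lens calibration would estimate."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

def horizontal_fov(sensor_width_mm, focal_length_mm):
    """Horizontal field of view in degrees from the sensor width and
    the focal length reported (or estimated) at the current zoom."""
    return math.degrees(
        2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm))
    )

# An 18 mm lens on a 36 mm-wide sensor yields a 90-degree horizontal FOV.
fov = horizontal_fov(36.0, 18.0)
```

Matching distortion and FOV between the real lens and the virtual Unreal Engine camera is what keeps rendered elements locked to the live image across a zoom.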
While the company is very well known for its work with broadcasters such as FOX Sports, Zero Density is now bringing in clients who are producing episodic TV such as children’s programming. These projects shoot on green screen and then use the recorded live output to immediately go to final editorial, without the need to engage in traditional offline rendering and post-production visual effects.
In this episodic workflow, the software can output not only the final composite, but also various key layers such as garbage masks. In the future, the team would like to move from this level to an even more advanced version that would support a real-time workflow that also outputs additional data to enable the whole final comp to be tweaked and re-edited in post.
One huge improvement in recent times has been the addition of real-time ray tracing using NVIDIA RTX graphics cards. At NAB in April 2019, Zero Density previewed version 2.8 of Reality Engine with ray tracing implemented, demonstrating how video screens on the virtual sets can now reflect onto other relevant parts of the set, together with improved shadows and many other optical enhancements. The company released version 2.8 in August. The latest version, 2.9, will be previewed at IBC in September 2019.
Reality Engine is a powerful, robust, high-end solution, and Zero Density provides not only the main system, but also the necessary automation, monitoring, and control interfaces. The system is designed to run on multiple Unreal Engine instances that are all controlled and managed as one. Case in point: Turkish arts and culture channel TRT2 broadcasts seven different shows from its virtual studio, which is powered by three Reality Engines. With the move to virtual production, Zero Density is also showing how broadcast workflows with tight on-air requirements can provide valuable tools to a variety of programs and deliver complex visuals to increasingly demanding live broadcasts.
This article is part of our Visual Disruptors series. Visit our Virtual Production hub for more interviews, articles, insights, and resources.