
Affordable aviation training in a compact package at I/ITSEC 2021

Sébastien Lozé
With their rich real-time visuals, game engines are quickly gaining ground as an integral part of low-cost, flexible, open-source solutions for simulation in a variety of sectors. But many solutions focus only on the visuals and don’t take advantage of the logic, programming, and physics that are built into game engines like Unreal Engine.

Meta Immersive Synthetics (MIS), a division of Meta Aerospace, aims to change this pattern with the development of a series of plug-and-play systems for defense training. Each platform uses Unreal Engine at its core, leveraging both its logic modules and game-quality graphics for military simulation needs.
In addition to providing detailed imagery at both close and far ranges, the platform takes a “visuals are simulation” approach, identifying the composition of each element in the environment to enable calculation of properties like reflectivity and heat signature. MIS also reduces the simulator’s physical footprint to a minimum by running the simulation in VR, which naturally leads to a lower overall cost.

The first platform to reach the mature prototype stage is NOR, a simulation platform based on Unreal Engine and built for scale. The first application of NOR will be the Air Tactics Trainer module, intended for compact VR-based solutions. With this modest setup, pilots can fly through a detailed and accurate landscape to practice combat scenarios, handle emergencies, and experience any condition imaginable, all without the expense and logistical challenges of using real aircraft.

NOR will be exhibited at the I/ITSEC 2021 conference in Orlando, FL. Attendees will have the opportunity to try out the system for themselves in the Meta Immersive Synthetics booth, flying an F-16 through a Nevada landscape, with no prior training or experience with simulators required.

A new direction for simulation

Meta Aerospace has long been in the business of building simulators, but MIS is developing an entirely new system—one that’s more flexible, scalable, and adaptable to future needs. MIS settled on Unreal Engine as the backbone of this new system; the engine provides game-quality rendering along with ease of use, programmable logic, and robust source code that can adapt to changing simulation needs.
“Obviously, there is no point in the military trying to keep up with the games industry in terms of rendering, and in terms of all the things that game technology provides,” says Niclas Colliander, Managing Director of Meta Immersive Synthetics. “If you want to have cool bullet-drop physics in a shoot-em-up game, that's essentially the same physics that you need in a simulation. It makes sense to just leverage everything that's there already to enhance the capabilities, and instead focus on developing whatever capabilities the game engine doesn’t have.”
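To make the bullet-drop example concrete, here is a plain C++ sketch (not MIS or Unreal Engine code) of the kind of kinematic integration a shoot-em-up and a training simulator would both perform; a real ballistic model would also account for drag, wind, and spin.

```cpp
// Minimal illustration: the "bullet drop" a game and a simulator both need
// is the same kinematic integration under gravity (drag omitted for brevity).
#include <cstdio>

struct Vec3 { double x, y, z; };

// Advance a projectile by one time step; the half-step term keeps the
// position exact for constant gravity.
void StepProjectile(Vec3& position, Vec3& velocity, double dt)
{
    const double g = 9.81; // m/s^2, acting along -z
    position.x += velocity.x * dt;
    position.y += velocity.y * dt;
    position.z += velocity.z * dt - 0.5 * g * dt * dt;
    velocity.z -= g * dt;
}

int main()
{
    Vec3 pos{0.0, 0.0, 0.0};
    Vec3 vel{850.0, 0.0, 0.0}; // ~850 m/s muzzle velocity along x

    // Integrate one second of flight in 1 ms steps and report the drop.
    for (int i = 0; i < 1000; ++i)
        StepProjectile(pos, vel, 0.001);

    std::printf("After %.0f m of travel the round has dropped %.2f m\n",
                pos.x, -pos.z);
    return 0;
}
```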

Colliander draws on his experience with the Swedish Air Force to inform this new direction—he flew fighter planes for 10 years, but was unimpressed with the simulators his squadron used for training. “We were flying aircraft that cost hundreds of millions of dollars, but we had simulators with visuals that looked older than the video games I play back home,” he observes. “Our simulators were among the best in the world, but everything was designed for single-core computing, with cumbersome, large, old code bases. And we also needed specially trained personnel to start the simulator, which shouldn't still be happening in 2021.”
Colliander adds that there was ordinarily one simulator per squadron, with each simulator costing $15 million or more. MIS’s goal, Colliander says, is to take advantage of Unreal Engine’s capabilities to make simulators more accessible.

Efficient deployment lowers cost and encourages adoption

“Why do we need these $15 million domes when we can actually just put a consumer headset on someone, buy a stick and a throttle, and then they can actually fly in VR?” asks Colliander. “Our goal would be to, at a fraction of that cost, have one simulator per pilot instead, and that's what we’ll actually be showing at I/ITSEC with the NOR system.”

The NOR demonstration at I/ITSEC is just the beginning for MIS—the company plans to develop more modules as use cases are added. For example, one of NOR’s upcoming modules, the JTAC Trainer, will be teased in the Immersive Display Systems Inc. booth at I/ITSEC.

“It's interoperability, making sure that the platform can talk all the military languages and interact with everybody,” says Colliander. “There are already plugins that do this, but it's not necessarily integrated enough to just be plug-and-play.”

Leveraging AI to augment environment data sources

While many companies are making use of GIS data, satellite data, 3D models, and photogrammetry to represent the world around them, these sources often hold up only at certain resolutions. “An inherent problem of photogrammetry applications is that if you get in close, you get a smeary, melted look of the world,” says Colliander.

To provide a realistic depiction of the world at any resolution, MIS uses a combination of satellite imagery, elevation data, road network maps, electrical grid maps, and other data, layering one over the other to interpret and enhance the visuals.
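As a rough illustration of what that layering might look like (all names are hypothetical; this is not MIS’s pipeline or an Unreal Engine API), each terrain location can be sampled across several co-registered layers and fused into a single record for downstream interpretation.

```cpp
#include <cstdint>
#include <functional>

// Hypothetical sketch: several co-registered data layers are sampled at the
// same coordinates and fused into one record for later interpretation.
struct TerrainSample
{
    float         elevationMeters;  // from an elevation layer
    std::uint32_t imageryRGBA;      // packed pixel from satellite imagery
    bool          onRoadNetwork;    // from a road network layer
    bool          nearPowerLine;    // from an electrical grid layer
};

struct GeoLayers
{
    std::function<float(double lat, double lon)>         elevation;
    std::function<std::uint32_t(double lat, double lon)> imagery;
    std::function<bool(double lat, double lon)>          roads;
    std::function<bool(double lat, double lon)>          powerGrid;

    // Query every layer at one location; downstream stages (such as the
    // material labeling described next) interpret the fused sample.
    TerrainSample Sample(double lat, double lon) const
    {
        return TerrainSample{elevation(lat, lon), imagery(lat, lon),
                             roads(lat, lon), powerGrid(lat, lon)};
    }
};
```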
Taking the interpretation one step further, the system uses neural networks to label each element of the world based on its composition—grass, gravel, rock, asphalt, and so on. These labels give the system the ability to perform sensor modeling and assign values such as reflectivity and heat retention to different parts of the world.
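A minimal sketch of that idea follows, with purely illustrative material classes and numbers rather than MIS data: once a surface has been labeled, a lookup supplies the physical properties a sensor model consumes.

```cpp
#include <unordered_map>

// Illustrative material classes and property values, not MIS data.
enum class Material { Grass, Gravel, Rock, Asphalt, Water };

struct SurfaceProperties
{
    float reflectivity;   // fraction of incident radiation reflected (0..1)
    float heatRetention;  // relative tendency to stay warm after sunset (0..1)
};

const std::unordered_map<Material, SurfaceProperties> kMaterialTable = {
    {Material::Grass,   {0.25f, 0.30f}},
    {Material::Gravel,  {0.30f, 0.45f}},
    {Material::Rock,    {0.35f, 0.60f}},
    {Material::Asphalt, {0.10f, 0.85f}},
    {Material::Water,   {0.06f, 0.95f}},
};

// A simple IR-style query: surfaces that retain more heat read warmer at
// night relative to the ambient temperature.
float ApparentNightTemperature(Material m, float daytimePeakCelsius,
                               float ambientNightCelsius)
{
    const SurfaceProperties& p = kMaterialTable.at(m);
    return ambientNightCelsius +
           p.heatRetention * (daytimePeakCelsius - ambientNightCelsius);
}
```

With a table like this, a thermal view could, for instance, show asphalt reading noticeably hotter than grass hours after sunset.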

Unreal Engine at the core

“We take an approach that's highly coupled with the environment,” says Colliander. “In our book, creating the visuals requires the exact same knowledge as doing the simulation. Separating the two is just old thinking.” He adds that MIS is applying the same approach to military vehicles, where the visuals are coupled to the real-world materials that make up the vehicle, providing the opportunity to simulate radar cross sections, visual signatures, and IR signatures based on the vehicle’s composition.
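As an illustration of the principle only (these are toy proxies, not MIS’s models), describing a vehicle as a list of material parts lets rough signature estimates fall out of the description itself rather than being hand-authored.

```cpp
#include <vector>

// Toy proxies only: a vehicle described as a list of material parts can
// drive rough signature estimates instead of a hand-authored value.
struct VehiclePart
{
    float areaSquareMeters;     // exposed surface area of this part
    float radarReflectivity;    // 0..1, how strongly the material returns radar
    float surfaceTemperatureC;  // current temperature for the IR model
};

// Very rough radar cross section proxy: reflectivity-weighted exposed area.
float EstimateRadarCrossSection(const std::vector<VehiclePart>& parts)
{
    float rcs = 0.0f;
    for (const VehiclePart& p : parts)
        rcs += p.areaSquareMeters * p.radarReflectivity;
    return rcs;
}

// Very rough IR proxy: area-weighted average surface temperature.
float EstimateInfraredSignature(const std::vector<VehiclePart>& parts)
{
    float totalArea = 0.0f, weighted = 0.0f;
    for (const VehiclePart& p : parts)
    {
        totalArea += p.areaSquareMeters;
        weighted  += p.areaSquareMeters * p.surfaceTemperatureC;
    }
    return totalArea > 0.0f ? weighted / totalArea : 0.0f;
}
```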

Accurate physics and a workable coordinate system are also integrated into the system. To illustrate the importance of these elements, Colliander draws on a gaming reference. “In a game, interaction is typically close by, or you can fake that it's far away,” he says. “But if you're training with fighter aircraft in an environment that mimics real life, you need to be able to interact over hundreds of kilometers while maintaining precision.” This, Colliander says, is why MIS has integrated 64-bit physics and an ellipsoidal world coordinate system into its turnkey solution.
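The precision argument is easy to quantify: Earth-centered coordinates are on the order of 6,400 km, at which magnitude 32-bit floats resolve only to roughly a meter, while 64-bit doubles resolve to fractions of a millimeter. The conversion below is the standard WGS84 geodetic-to-ECEF formula in double precision; it is textbook geodesy shown for illustration, not MIS’s implementation.

```cpp
#include <cmath>

// Standard WGS84 geodetic-to-ECEF conversion in double precision.
struct EcefPosition { double x, y, z; }; // meters, Earth-centered Earth-fixed

EcefPosition GeodeticToEcef(double latDeg, double lonDeg, double altMeters)
{
    const double a  = 6378137.0;            // WGS84 semi-major axis (m)
    const double f  = 1.0 / 298.257223563;  // WGS84 flattening
    const double e2 = f * (2.0 - f);        // first eccentricity squared
    const double kDegToRad = 0.017453292519943295;

    const double lat = latDeg * kDegToRad;
    const double lon = lonDeg * kDegToRad;

    // Prime vertical radius of curvature at this latitude.
    const double sinLat = std::sin(lat);
    const double N = a / std::sqrt(1.0 - e2 * sinLat * sinLat);

    return EcefPosition{(N + altMeters) * std::cos(lat) * std::cos(lon),
                        (N + altMeters) * std::cos(lat) * std::sin(lon),
                        (N * (1.0 - e2) + altMeters) * sinLat};
}
```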

Proving the technology

The NOR exhibition at I/ITSEC is MIS’s first showcase of this new direction for simulation, but it’s just the beginning. Through the NOR demonstration, the company hopes to inform and educate the simulation community about the possibilities of a highly accessible system based on Unreal Engine. 

The goal, says Colliander, is to add more use cases, each one linked to the previous, eventually enabling a full "Soldier to General" capability across all domains and missions. Applications could range from man-in-the-loop simulations, with people on the ground or in aircraft, all the way up to a general sitting in a planning room looking at what is essentially a strategy game.

Colliander reveals that the name “NOR” comes from the NOR logic gate, the combination of the NOT and OR operations, which engineers can configure to implement just about any logic circuit. “The idea is to make NOR an environment that could cater to any type of simulation,” he says. “With a functional physics engine and a realistic replica, you can just as easily simulate, say, a tank, or a person on the ground, as you can a fighter aircraft.”
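The property behind the name is the NOR gate’s functional completeness: NOT, OR, and AND, and therefore any logic circuit, can be built from NOR gates alone, as this small C++ check demonstrates.

```cpp
#include <cassert>

// The NOR gate is functionally complete: NOT, OR, and AND can all be built
// from NOR alone, which is the property the platform's name nods to.
bool NOR(bool a, bool b) { return !(a || b); }

bool NOT(bool a)         { return NOR(a, a); }
bool OR (bool a, bool b) { return NOR(NOR(a, b), NOR(a, b)); }
bool AND(bool a, bool b) { return NOR(NOR(a, a), NOR(b, b)); }

int main()
{
    // Exhaustively verify the constructions against the built-in operators.
    for (bool a : {false, true})
        for (bool b : {false, true})
        {
            assert(NOT(a)    == !a);
            assert(OR(a, b)  == (a || b));
            assert(AND(a, b) == (a && b));
        }
    return 0;
}
```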

He adds that the reason MIS chose a fighter aircraft as its first use case is that the company considers it one of the more complex applications for simulation-based training. “If we can do a fully synthetic fighter simulator with all these features,” he says, “we can pretty much do anything.”

Let’s talk!

Interested in finding out how you could unleash Unreal Engine’s potential for simulation? Get in touch to start that conversation.