Image courtesy of Mike Killian

Flying with AR: in-flight aviation training with Red 6

July 27, 2021
How do you train for a dogfight at 50,000 feet? The military has pondered this question since the first fighter pilots took to the skies more than a century ago. Traditionally, the U.S. Air Force has relied on ground-based virtual reality simulators and on in-the-air practice battles between friendly pilots, but neither can completely prepare a pilot for an actual encounter with the enemy. 

Enter an in-flight augmented reality system from Red 6, an AR platform company founded in 2018. With the system, a pilot wearing a customized AR helmet and visor takes flight in an aircraft equipped with a Windows 10 PC, a few sensors, and Unreal Engine. 
 

Through the AR visor, the pilot sees both the actual environment and one or more computer-generated enemy aircraft projected in stereoscopic 3D. The enemy plane moves just as a real one would, giving the pilot realistic experience with threats without any actual danger to the pilot or the aircraft. And with such a system, pilots can run the same scenario over and over again until their reactions are perfect.
Image courtesy of Mike Killian
Such a system was once thought to be impossible. While flight simulations aren’t new, the fact that this one runs in the air on a pilot’s helmet, with views and movements this realistic, is a breakthrough for fighter pilot training. 

According to Nick Bićanić, Founder and Chief Science Officer at Red 6, traditional military head-up displays can have an extremely narrow field of view, and sometimes display only a single color to a single eye. Such systems work well for basic symbology and avionics data, he adds, but they cannot be used to simulate reality.

“With the Red 6 system,” he says, “when pilots go up there and see a 105-or-more-degrees field of view, full color, daylight-visible binocular system, and the aircraft that they're seeing synthetically isn't actually there, but it feels like it is and it's behaving like it is, they come down and they go, ‘That was amazing. I want this yesterday. How soon can we have it?’”
 

The need for a solution

In addition to on-ground VR simulators, the Air Force has also used live, in-flight practice battles to train fighter pilots. But such training, while effective in some ways, has many drawbacks. For one, it’s extremely costly—each pilot needs hundreds of hours in flight, which means simulated adversaries need to get into the air for those hundreds of hours, too. This can bring expenses for fuel and logistics to $60,000–$90,000 per hour, eventually adding up to billions of dollars per year. Additionally, there are less tangible costs—all those extra hours in the air as an adversary can lead to pilot burnout, especially since those pilots aren’t getting any real training themselves during these sessions.

And then there’s the risk of collisions or other mishaps inherent in such an exercise. “It's a very dangerous business when you're operating at high speeds, high g-forces, and you're doing close passes,” says Daniel Robinson, Founder and CEO of Red 6. “Obviously, the risks are significant.” 

On top of it all, such training doesn’t achieve optimal results. “Seeing two pilots up there flying and fighting, you’d think they're both getting great training,” remarks Robinson. “The reality of it is, if someone is simulating being an adversary aircraft, they are very much constrained.” 

Between the risks, the expense, and the limited effectiveness, it was clear that a new solution was needed. An AR-based solution ticks all the boxes: it delivers complete and repeatable training, it costs less, and most importantly, it’s much safer. 
Image courtesy of Mike Killian
“The safety implications for what we're doing are tremendous,” says Robinson. “It isn't just the outright cost that we're seeking to solve—it’s the quality of training and the quality of the pilot that we physically produce at the end of a training pipeline.”
 

Developing the technology

When Red 6 set out to develop an in-flight AR platform, the first technical challenge was the tracking system. Through tracking, the platform would know at any given moment where the pilot’s head was in 3D space and which way it was facing, so the system could serve an appropriate series of images of the enemy aircraft to the AR visor when the pilot looked in that direction.
Image courtesy of Red 6
The problem was one of accurate tracking in mid-air versus on the ground, says Glenn Snyder, Founder and Chief Product Officer at Red 6. “Take someone that's not standing on the ground, but put them in the air and then let them move at hundreds of miles an hour so they can go up, down, left, right, do barrel rolls, do back flips, or whatever,” he explains. “So how do you figure out where that head is now? And how can we figure that out within, let's say, a centimeter of accuracy?”

The team couldn’t find an off-the-shelf solution that worked, so they invented their own—the aircraft is tracked through its absolute position in space, while the pilot’s head is tracked relative to the aircraft.
Image courtesy of Red 6
It was the combination of the two types of tracking, absolute and relative, that made the system possible. “You put those two together, you bind the camera to it in UE4, and you have a 300-mph augmented reality solution that you can use,” says Bićanić. “And that’s exactly what we’ve built.”
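Red 6 hasn’t published its tracking math, but the idea of chaining an absolute aircraft pose to a relative head pose can be sketched with homogeneous transforms. All the names and numbers below are illustrative, and real attitude tracking would use full roll/pitch/yaw rather than yaw alone:

```python
import numpy as np

def pose(yaw_deg, position):
    """Build a 4x4 homogeneous transform from a yaw rotation and a translation.
    (Full attitude would use roll, pitch, and yaw; yaw alone keeps the sketch short.)"""
    t = np.radians(yaw_deg)
    T = np.eye(4)
    T[:2, :2] = [[np.cos(t), -np.sin(t)],
                 [np.sin(t),  np.cos(t)]]
    T[:3, 3] = position
    return T

# Absolute pose of the aircraft in world space (e.g. from GPS/INS).
aircraft_world = pose(90.0, [1000.0, 2000.0, 15000.0])

# Pose of the pilot's head relative to the cockpit (e.g. from the helmet tracker).
head_in_aircraft = pose(10.0, [0.0, 0.5, 1.0])

# Composing the two gives the head's absolute pose -- the camera transform
# the engine needs in order to render the synthetic adversary in the right place.
head_world = aircraft_world @ head_in_aircraft
print(np.round(head_world[:3, 3], 2))  # head position in world coordinates
```

The key property is that each tracker only has to solve its own, simpler problem; the matrix product does the binding that Bićanić describes.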
Image courtesy of Red 6
The company also found that they could use nDisplay technology to serve different images from one PC to several displays—left and right stereo images to the pilot, for example, plus a radar scope display on the plane’s dashboard—keeping the system footprint small while still delivering to as many displays as needed.

Another challenge in creating an in-flight AR system is latency: the delay between head movement and the display of corresponding visuals. “How to get our latency down has been the number one problem we've been trying to solve since the beginning,” says Snyder.

Latency for an in-air, high-speed AR system affects the user experience far more profoundly than latency for a slower, on-ground system, Snyder explains. While a small ‘fudged error’ may be acceptable for an on-ground AR system, the same variation in the air would translate into positional errors of up to a quarter mile. Such inaccuracies would be unacceptable for in-flight pilot training.

The team had been using a different real-time tool at first, but when they switched to Unreal Engine, it instantly cut their latency to one-third of what it was. This, Snyder says, was among the major reasons the team decided to go with Unreal Engine for their AR solution. From there, they worked on reducing latency to an acceptable 20 milliseconds or less.
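To see why a target of 20 milliseconds or less matters at these speeds, a quick back-of-the-envelope calculation (illustrative only, not Red 6’s actual error budget) shows how uncorrected latency turns into positional error:

```python
# How far does a 300 mph aircraft travel during one frame of latency?
MPH_TO_MPS = 0.44704           # miles per hour -> meters per second

speed_mps = 300 * MPH_TO_MPS   # ~134 m/s

for latency_ms in (100, 50, 20):
    drift_m = speed_mps * latency_ms / 1000.0
    print(f"{latency_ms:>3} ms latency -> ~{drift_m:.1f} m of uncorrected drift")
```

Even at 20 ms, an aircraft covers roughly 2.7 meters, which is why prediction and low-latency rendering both matter for keeping the synthetic adversary locked to where it should be.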
Image courtesy of Red 6

Convincing the military

Being ex-military themselves, the team at Red 6 is in a good position to know what the Air Force and other branches are looking for. “The Air Force was very conscious of the fact that they had a gap to fill, but they had no idea that it was possible to fill it,” says Bićanić. “They told us that they'd been looking for a solution like this for 10 years. And they love it. They want it integrated into as many Air Force aircraft as possible, as quickly as possible.”

Robinson agrees. “What we've done with the augmented reality is seek to replace the need to put the adversary pilot, the adversary aircraft, and all of the associated costs up in the sky,” he says. “We don't need that anymore. We can do that through augmented reality and using Unreal to simulate any threat we want in the world. And that's a tremendously exciting value proposition.”
Image courtesy of Red 6
And what do pilots think of the system? “The reactions we've been getting from pilots have been just incredible,” says Snyder. “They hopped out, they just looked up and said, ‘How the heck did you guys do that? We want it now.’”

With the success of their AR-based fighter pilot training, the team at Red 6 sees a bright future for both themselves and Unreal Engine. “There are a number of companies that are building the future of spatial computing, and augmented reality is a fundamental part of that,” observes Bićanić. “It's fundamental to us to align ourselves with the most important players in that space, and there's no question that Epic Games is one of them.”
Image courtesy of Camron Hatef at PopSocial
