HTX Labs delivers immersive VR training simulations using Unreal Engine
HTX Labs started with a handful of Unreal Engine-based demos. Just 18 months later, they found themselves providing VR Emergency Procedure (EP) training to student pilots as part of the US Air Force’s Pilot Training Next (PTN) program. And this year, they’ve secured a Small Business Innovation Research (SBIR) Phase II award from the Air Force to expand EMPACT®, a next-generation VR training platform, to support authoring, managing, and distributing immersive content that delivers training to the point of impact—the worker and the warfighter.
The response from PTN has been nothing short of enthusiastic. “HTX has provided us an immersive emergency procedure trainer that filled an essential gap in our pilot training program,” says Paul “Slew” Vicars, Lead for the PTN Program, U.S. Air Force. With EMPACT, an instructor pilot can set up an emergency scenario in VR and use it to test and train new pilots. “For example, you are at 10,000 feet and suddenly a fire erupts in the engine, or there is smoke in the cockpit, or you have a hydraulic malfunction,” explains Schneider, CEO of HTX Labs. “The trainee has to react to that emergency with a very specific procedure, which he needs to know like the back of his hand. And he needs to do it, not just talk through it, just as he would have to do in the real situation.”
The Air Force has many such emergency procedures, most of which would be difficult or impossible to demonstrate in real life without endangering the trainee. Such scenarios are perfect candidates for VR training.
“The Air Force wanted to get the next best thing to actually having your plane set on fire, which you never want to see in real life,” explains Schneider.
“Our military customers, and actually all our customers, want our VR training to be as real and immersive as possible,” says Verret, CTO at HTX Labs. “Unreal Engine gives us the ability to deliver on that expectation.”
Bridging enterprise and military
In 2016, software engineers Schneider and Verret developed a deep interest in the potential of immersive technology to reduce time, costs, and risks in enterprise training. They began showcasing demos at conferences, where they eventually caught the eye of an Air Force Colonel from AFWERX, an Air Force organization that fosters innovation and collaboration with the private sector.
This led to HTX Labs’ initial work with Pilot Training Next, and ultimately to securing the SBIR Phase II to expand their EMPACT VR training solution. SBIR has the multi-faceted goal of providing the granting agency with innovative R&D, tools, and services, while also giving the recipients the means to develop a commercial product.
Verret credits the team’s experience in enterprise software with making them a good fit for SBIR. “We’re enterprise, we’ve always been enterprise, and that attitude is behind everything we do,” he says. He adds that while much of his team has the skill set to work at a studio, “the culture that we have instilled here is very focused on commercial software and the methodologies and the processes that go with that.”
Because of this approach, every aspect of EMPACT is designed to fit into a platform that can support multiple types of training across a variety of industries. “We don’t do anything with a one-and-done mindset,” explains Verret. “Architecture is critical for us. We focus on reusability, APIs, and layers.”
Scenarios for safety and preparedness training are just part of the overall package. HTX Labs aims to provide a full platform that includes data tracking and tools for post-training analysis in a variety of industries. “Whenever we’re building a new simulation, we think about what information we need to capture,” says Schneider. “What data do we want to be able to look at over time, to truly determine whether the student is progressing? Can we track enough things to compare this student to another student, or this student to an expert, or this student to the entire cohort?”
“The more data that we can compile—and not just compile, but expose in a meaningful way so someone can make sense of it and visualize it—the more beneficial VR becomes for training.”
Choosing Unreal Engine
HTX Labs chose Unreal Engine as the basis for their work not only for its rendering quality, but also for its broad capabilities in creating specific scenarios for emergency training.
“We didn’t want to build each and every emergency procedure from scratch,” says Schneider. “There are so many different emergency procedures—fire in flight, electrical fire, generator inoperative. There are scenarios in the air, scenarios on the ground.”
Using the Blueprint visual scripting language as a framework, HTX Labs created a reusable foundation for building emergency scenarios. “Unreal Engine gave us the means to reuse a lot of our content, and to build scenarios much more quickly,” says Schneider.
HTX Labs uses a variety of DCC packages to build environments and props, including Modo, Maya, and Houdini. “When we are going to virtualize someone's facility, whether it’s a hangar, a refinery, or an office building, we have a very defined, efficient process that we have developed over time,” says Verret. “Unreal Engine is at the center of it all.”
He adds that Unreal Engine's integration with the Perforce version control system is a huge boon to productivity. “It’s all about speed and iterations for our developers, which translates to faster building overall,” Verret says.
Looking to the future of VR-based training
HTX Labs sees a bright future for VR-based training across all heavy industries, and in the military in particular. “Excitement about the technology has proliferated to just about every major base,” says Verret. “They all have virtual reality capabilities now. The missing link is just a more coordinated effort. There’s no turning back from it now. It’s momentum that just continues to build.”
To find out what Unreal Engine can bring to simulation training in your industry, contact us and we’ll get that conversation started.