Image courtesy of JUMP by Limitless Flight

Meet JUMP: the world’s first hyperreal wingsuit simulator

Ask anyone which superpowers they’d most like to have, and the ability to fly is likely near the top of the list. Intrepid individuals throughout history have attempted to conquer the skies using everything from pedal-powered airplanes to rocket-powered jet packs.

Arguably the closest a human has ever come to genuine non-mechanically aided flight, however, is wingsuit BASE jumping (also known as wingsuit flying).

This highly dangerous and technically difficult sport involves BASE jumping from a high point while dressed in a webbing-sleeved jumpsuit that enables the wearer to glide rather than free fall.

Requiring years of skydiving and BASE jumping experience—and with a fatality rate of one per 500 jumps—wingsuit BASE jumping is a pursuit that has been beyond the reach of 99.9% of the population. Until now.
 

The world’s first wingsuit simulator

You may have seen exhilarating GoPro footage of wingsuit flyers plunging down the side of a mountain. JUMP is the next step on from that—it’s the world’s first hyperreal wingsuit simulator.

The simulator combines a real wingsuit, a custom VR helmet, and a mix of suspension, wind effects, and hyperreal multi-sensory stimulation. In short, it’s the closest you can get to physically flying without taking up BASE jumping yourself.

JUMP is the brainchild of CEO and Founder James Jensen, a man with a long history of creating spectacular location-based entertainment experiences.

He was part of the team that set up The VOID, one of the first-ever walking virtual reality simulation companies. That experience took kids and adults into the worlds of Star Wars and Ghostbusters via top-of-the-range VR and graphics technology far beyond the reach of any home setup.
 
Unreal Engine is just flat-out leading the industry in high-resolution real-time simulations
- James Jensen, JUMP CEO and Founder
The idea for JUMP was born around a campfire in 2016 on a trip with Jensen’s buddy Marshall Miller, a professional wingsuit pilot. Jensen was talking about walking virtual reality simulations when Miller pulled out videos of his wingsuit BASE jumping in Sweden.

“I think the words came out of my mouth: ‘I wish I could do that,’ and immediately, my wife sitting next to me said, ‘you're never going to do that,’ ” says Jensen. “That night, I drew a napkin sketch of what the simulator would do in order to be able to mimic a real-world wingsuit BASE jumping experience, parachute, and landing.”

He assembled a team, and between 2019 and 2021, they built a prototype simulator. That led to a working facility in Bluffdale, Utah, which has now been operating for over four months and has flown over 5,000 people. “I've never skydived or BASE jumped,” says Jensen. “I rely on my professional athletes to tell me this is real—they've said it's about 85% there. We're pushing for 100%.”

Photogrammetry for realistic 3D models

The JUMP experience fuses a number of cutting-edge technologies. Let’s start with the visuals. JUMP takes the flier into hyper-detailed 3D landscapes of some of the world’s most breathtaking BASE jumps, including the notorious Notch Peak.
 
To achieve this, the JUMP team flew a custom helicopter rig kitted out with top-of-the-range cameras, and spent two days capturing thousands of ultra-high-resolution images of the landscape below.

Those ultra-high-resolution images were processed using the latest version of RealityCapture, a state-of-the-art photogrammetry tool that enables you to create ultra-realistic 3D models from sets of images and/or laser scans.

Reconstructing the 58,000 captured images required five supercomputers. The team also used precise data from gyroscopes, an IMU (inertial measurement unit), and other sensors to create a high-precision custom flight log.
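The details of that flight log aren't public, but conceptually it is a time series recording where the camera rig was and how it was oriented for every exposure, which a photogrammetry pipeline can use as prior camera poses during alignment. A single record might look something like this (the field layout is an assumption for illustration, not JUMP's actual log format):

```cpp
#include <cstdint>
#include <vector>

// Illustrative flight-log sample: one record per captured frame.
// Field layout is an assumption, not JUMP's actual log format.
struct FlightLogSample
{
    uint64_t TimestampMicros;   // capture time
    double   LatitudeDeg;       // GNSS position of the camera rig
    double   LongitudeDeg;
    double   AltitudeMeters;
    float    YawDeg;            // orientation from the gyroscopes/IMU
    float    PitchDeg;
    float    RollDeg;
};

using FlightLog = std::vector<FlightLogSample>;
```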

The result was an incredibly detailed digital model of the environment, comprising over 8 billion polygons across 10 square miles.

The next step was to bring the huge dataset into Unreal Engine 5. “It took some support from the RealityCapture team, but in the end, we developed some new tools that helped chop up these massive data sets and assign the appropriate textures and materials,” says Jensen.
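The exact tools the team built are proprietary, but the general idea of chopping a massive scan into manageable parts can be sketched in a few lines: bin every triangle into a square ground tile by its centroid, so each tile becomes a separate mesh that can be textured and imported on its own. The types and tile size below are illustrative assumptions, not JUMP's actual pipeline.

```cpp
#include <array>
#include <cmath>
#include <cstdint>
#include <map>
#include <utility>
#include <vector>

// Illustrative stand-ins for a scanned mesh; not JUMP's actual data types.
struct Vec3 { double x, y, z; };
struct Triangle { std::array<Vec3, 3> v; };

using TileKey = std::pair<int64_t, int64_t>;

// Split one enormous triangle soup into square ground tiles (e.g. 250 m)
// so each tile can be exported, textured, and imported as its own part.
std::map<TileKey, std::vector<Triangle>> SplitIntoTiles(
    const std::vector<Triangle>& scan, double tileSizeMeters = 250.0)
{
    std::map<TileKey, std::vector<Triangle>> tiles;
    for (const Triangle& tri : scan)
    {
        // Assign the triangle to a tile by the ground position of its centroid.
        const double cx = (tri.v[0].x + tri.v[1].x + tri.v[2].x) / 3.0;
        const double cy = (tri.v[0].y + tri.v[1].y + tri.v[2].y) / 3.0;
        const TileKey key{
            static_cast<int64_t>(std::floor(cx / tileSizeMeters)),
            static_cast<int64_t>(std::floor(cy / tileSizeMeters))};
        tiles[key].push_back(tri);
    }
    return tiles;
}
```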
 

The team leveraged Nanite, UE5’s virtualized micropolygon geometry system, to handle the import and replication of the multi-million-polygon mesh while maintaining a real-time frame rate without any noticeable loss of fidelity.
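As a rough illustration of what that step involves, the editor-only snippet below flags a batch of imported terrain tiles as Nanite meshes. It assumes UE5’s FMeshNaniteSettings API and is a sketch rather than JUMP’s actual import code.

```cpp
#include "Engine/StaticMesh.h"

#if WITH_EDITOR
// Sketch: flag a batch of freshly imported terrain tiles as Nanite meshes.
// Editor-only code; assumes UE5's FMeshNaniteSettings, not JUMP's actual tooling.
void EnableNaniteOnTiles(const TArray<UStaticMesh*>& TileMeshes)
{
    for (UStaticMesh* Mesh : TileMeshes)
    {
        if (!Mesh)
        {
            continue;
        }
        Mesh->NaniteSettings.bEnabled = true; // build this mesh as Nanite geometry
        Mesh->PostEditChange();               // rebuild render data with the new settings
        Mesh->MarkPackageDirty();             // make sure the change gets saved
    }
}
#endif
```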

For the ultra-realistic lighting and shadows, the team harnessed the power of Lumen, a fully dynamic global illumination system in Unreal Engine 5 that enables indirect lighting to adapt on the fly to changes to direct lighting or geometry.
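Lumen is switched on per project rather than per scene. As a generic UE5 configuration sketch (not anything specific to JUMP), the snippet below sets the global illumination and reflection methods to Lumen (value 1) via their console variables, mirroring what the Project Settings panel writes to DefaultEngine.ini:

```cpp
#include "HAL/IConsoleManager.h"

// Sketch: select Lumen for dynamic GI and reflections at startup.
// This is a generic UE5 setup step, normally done in Project Settings,
// not code taken from JUMP.
void EnableLumen()
{
    if (IConsoleVariable* GI =
        IConsoleManager::Get().FindConsoleVariable(TEXT("r.DynamicGlobalIlluminationMethod")))
    {
        GI->Set(1, ECVF_SetByGameSetting); // 1 = Lumen
    }
    if (IConsoleVariable* Reflections =
        IConsoleManager::Get().FindConsoleVariable(TEXT("r.ReflectionMethod")))
    {
        Reflections->Set(1, ECVF_SetByGameSetting); // 1 = Lumen reflections
    }
}
```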

“Because we are looking for total photorealism, we are leaning heavily into Nanite and Lumen to make our scenes come to life,” says Jensen. “We currently have the largest dataset in Nanite at eight billion polygons—over 700 parts and 16k textures per part.”

Jensen explains that features like these are the reason JUMP opted to use Unreal Engine to create the experience. “Unreal Engine is just flat-out leading the industry in high-resolution real-time simulations,” he says.
“Seeing the things that I used to do in video production that would take days, even weeks, and months to render now all happen in real time is unbelievable. Polygon count has always been a bottleneck, and global illumination with Lumen—it's just mindblowing to see in real time.”

The JUMP team filled out the virtual environment with realistic shrubs, trees, grass, and other objects from Quixel Megascans, a library of photorealistic 3D-scanned tileable surfaces, textures, vegetation, and other high-fidelity CG assets that is included with Unreal Engine 5.

They also developed their own physics engine, FLIGHT, which handles all of the configurations and physics for both the physical and digital worlds, and they used Blender and Maya for 3D art.
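FLIGHT itself is proprietary, but the core problem it solves can be illustrated with a toy glide model: derive lift and drag from the current airspeed and angle of attack, then integrate velocity and position each frame. Every coefficient below is a made-up illustrative value, not JUMP's tuning.

```cpp
#include <cmath>

// Toy 2D wingsuit glide step (longitudinal plane only): illustrative, not FLIGHT.
struct GlideState { double x = 0, z = 1200;    // position (m); z is altitude
                    double vx = 0, vz = 0; };  // velocity (m/s)

void StepGlide(GlideState& s, double pitchRad, double dt)
{
    const double g = 9.81, mass = 90.0, rho = 1.2, area = 1.4; // illustrative values
    const double speed = std::sqrt(s.vx * s.vx + s.vz * s.vz) + 1e-6;

    // Angle of attack = pitch of the suit minus the flight-path angle.
    const double pathAngle = std::atan2(s.vz, s.vx);
    const double aoa = pitchRad - pathAngle;

    // Very crude lift/drag coefficients as linear/quadratic functions of AoA.
    const double cl = 2.0 * aoa;
    const double cd = 0.08 + 1.2 * aoa * aoa;
    const double q  = 0.5 * rho * speed * speed * area;

    // Lift acts perpendicular to the velocity; drag opposes it.
    const double lift = q * cl, drag = q * cd;
    const double ax = (-drag * std::cos(pathAngle) - lift * std::sin(pathAngle)) / mass;
    const double az = (-drag * std::sin(pathAngle) + lift * std::cos(pathAngle)) / mass - g;

    s.vx += ax * dt;  s.vz += az * dt;
    s.x  += s.vx * dt; s.z  += s.vz * dt;
}
```

A real flight model would of course run in three dimensions and be tuned against feedback from professional pilots, which is exactly the role Jensen's advisors played.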

The result is an awe-inspiring virtual world realistic enough to trick your eyes into believing you’re really standing on the precipice of a 4,000-foot drop. But where professional BASE jumpers must risk life and limb, you’re guaranteed to walk away from this experience in one piece.
 

Haptics and physical effects

The visuals are just half of what makes JUMP unique. To get the fully immersive sensation of real flight, you need to combine what you see in the VR headset with a real wingsuit, suspension system, wind effects, and multi-sensory stimulation.
“Physical effects are essential in being able to mimic reality,” says Jensen. “When you can synchronize physical sensation with visuals and audio, you go to a whole other dimension in virtual reality simulations.”

The simulation’s haptics are triggered by events in the virtual environment. “We’ve written custom code inside Unreal Engine specifically for moments inside of the wingsuit BASE jumping experience that initiates signals for scent, wind speed, haptic stage effects, sound effects, and physical objects,” explains Jensen.
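Jensen doesn't go into the details of that code, but the pattern he describes (gameplay events in the engine driving external stage hardware) might look something like the sketch below: a component that reacts to simulation moments such as the cliff exit or canopy deployment by sending commands to hypothetical wind, scent, and haptic controllers. All of the interfaces, event names, and values here are assumptions for illustration.

```cpp
#include "Components/ActorComponent.h"
#include "StageEffectsComponent.generated.h"

// Hypothetical interface to the physical stage hardware (fans, scent, haptics).
// JUMP's real device layer is not public; this is purely illustrative.
class IStageRig
{
public:
    virtual ~IStageRig() = default;
    virtual void FireScentBurst(FName ScentId) = 0;
    virtual void PulseHarness(float Intensity) = 0;           // 0..1
    virtual void GustFans(float Strength, float Seconds) = 0; // 0..1, duration
};

UCLASS()
class UStageEffectsComponent : public UActorComponent
{
    GENERATED_BODY()

public:
    // Gameplay code calls this the moment the flyer leaves the cliff edge.
    void OnCliffExit()
    {
        if (!Rig) return;
        Rig->PulseHarness(0.6f);                  // jolt as the harness takes the load
        Rig->GustFans(0.3f, 2.0f);                // first rush of air
        Rig->FireScentBurst(TEXT("MountainAir")); // illustrative scent cue
    }

    // ...and this when the virtual canopy deploys.
    void OnCanopyDeployed()
    {
        if (!Rig) return;
        Rig->PulseHarness(1.0f);   // opening shock
        Rig->GustFans(0.1f, 5.0f); // airspeed falls away under canopy
    }

    IStageRig* Rig = nullptr; // injected by the facility's hardware layer
};
```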
Cutting-edge technologies like this can provide the means to accurately simulate a BASE jump. But how can a non-jumper know what the sensation should feel like? By relying on the experience of professionals.

Jensen has brought in experts to advise on the development of JUMP. These include the aforementioned Marshall Miller, a good friend and professional wingsuit pilot with over 10,000 jumps under his belt; and Hartman Richter, another professional wingsuit pilot and talented programmer who wrote the server software for the FLIGHT physics engine to mimic real wingsuit dynamics.

With their help, the team has been able to include details that give a true sense of presence. For example, the wingsuit is filled with compressed air: once you push off the cliff, it inflates within a few seconds. A fan spins up, blowing harder as your airspeed increases. All of these elements add to the realism of the experience.
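As a concrete illustration of that coupling, the sketch below maps the flight model's simulated airspeed onto a fan drive level and holds the suit's compressed-air valve open for the first few seconds after exit. The curve, timings, and terminal speed are invented for the example, not JUMP's real calibration.

```cpp
#include <algorithm>

// Illustrative only: map simulated flight state to stage hardware commands.
struct StageCommands
{
    float FanLevel = 0.f;       // 0..1 drive level for the wind fan
    bool  InflateSuit = false;  // compressed-air valve for the wingsuit
};

StageCommands ComputeStageCommands(float AirspeedMps, float SecondsSinceExit)
{
    StageCommands Out;

    // Keep the suit's compressed-air valve open for the first ~3 seconds,
    // matching the "inflates within a few seconds" behaviour described above.
    Out.InflateSuit = SecondsSinceExit >= 0.f && SecondsSinceExit < 3.f;

    // Perceived wind grows roughly with dynamic pressure (speed squared),
    // so use a squared ramp up to an assumed ~55 m/s terminal glide speed.
    const float Normalized = std::clamp(AirspeedMps / 55.f, 0.f, 1.f);
    Out.FanLevel = Normalized * Normalized;

    return Out;
}
```

In this sketch the fan follows dynamic pressure rather than raw speed, so the wind you feel ramps up in step with what the visuals suggest.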

Flying into the metaverse

One of the biggest lightbulb moments Jensen experienced while developing JUMP came when observing the reactions of fliers. “When I started taking people out to the prototype, I was watching them watch other people,” he says. “They started competing, saying, ‘I want to do what they just did.’ ”

Jensen realized the experience was like snowboarding down a mountain, with people trying to match and out-do their friends. That led the team to explore the idea of an esports-style competition. “We aren't broadcasting live yet, but we fully plan to have wingsuit racing in all of our locations that are streamed on Twitch,” says Jensen.

The team is also developing a multiplayer experience that will enable fliers in different locations to meet in the virtual environment and jump together. And there are plans to use facial scanning technology to create realistic avatars of users, bringing them into the experience and providing the opportunity to offer personalized GoPro-style videos.

For now, JUMP is a location-based experience—and Jensen says several more locations are planned. Tantalizingly, however, he alludes to a future in which a version of the system could be operating in bedrooms and living rooms around the world.

“The JUMP simulator and technology are the foundation for true full mobility inside any metaverse,” says Jensen. “Through a few years of location-based entertainment, we will inevitably derive a perfect virtual reality mobility product for at-home use.”
When that happens, Wade Watts’ state-of-the-art immersion rig—and the full potential of the metaverse—might not be so far away after all.
