How Unreal Engine is Impacting ExerGaming Research

By Joey Campbell

Hi, my name is Joey Campbell. I’m a VR lecturer, developer, and researcher based in Cork, Ireland, currently undertaking a PhD with the Bristol Interaction Group at the University of Bristol. The general theme of my research centres on physical exercise and video games (a.k.a. ExerGaming), with a particular emphasis on physiologically controlled avatars.

I would like to discuss two research projects I have been working on to date, both developed with Unreal Engine:


Project 01: VeRitas

Richer immersive environments viewed through HMDs and interactions based on real-time physiological feedback have paved the way for fresh studies in the field of HCI and physical exertion. Our design approach uses these advances to discover whether it is possible to motivate intense physical activity through adaptive interventions combining immersion, physical resistance, and falsification of physiology. VeRitas addresses these issues by motivating players to surpass the low levels of physical exertion associated with traditional ExerGames.

We designed a VR cycling game in which the avatar’s speed and the distance covered are controlled by the user's heart rate while they pedal a stationary Wattbike; steering is controlled by a standard gamepad. The goal of the game is to cover as much virtual distance as possible in a sixty-second period.

VeRitas from Joey Campbell on Vimeo.


Game output is mapped indirectly to the virtual bike’s speed through the user's heart rate. Mapping the user’s pedaling directly to that of the virtual cyclist would have lent a greater sense of interactive ‘connection’ with the bike; however, we wanted to examine the impact falsified physiological data would have on energy expenditure, which is why we ensured that heart rate played a fundamental role in game control.


For feedback purposes it was necessary to design a GUI that displayed heart rate, virtual distance, bike gear and remaining time. A traditional 2D status bar in an HMD can make users nauseous, so we attached the Heads-Up Display elements to the avatar as 3D widgets. This reduced simulator sickness and kept the important feedback in a prominent location.


In order to save development time we purchased a premade game environment and bicycle rig on the Unreal Engine Marketplace. A cyclist character was designed in Adobe Fuse and exported for Unreal where we merged it with the bike model.


In order to use physiological data as a game variable it was necessary to take in real-time heart rate data. Inaccurate sensor data and impractical heart monitors led to several design iterations. We experimented with the Pulse Sensor and Easy Pulse Monitors but inaccurate readings coupled with the awkward sensor placement for a bike game made them impractical (both were worn on the finger when the user had to simultaneously hold onto bike handlebars and control a gamepad). Eventually we used the Polar Heart Rate Monitor which had been reconfigured to pass heart rate data wirelessly through an Arduino board and into the game engine using the UE4Duino plugin. There was no noticeable latency and the sensor readings were accurate. The monitor readings became weak beyond a two-metre radius, so we placed the receiver beneath the saddle of the bike.
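To give a flavour of the game-side handling, here is a minimal Python sketch of how a serial line from the Arduino might be validated before being used as a game variable. The function name and plausibility bounds are our own illustration; the actual parsing in our setup was done inside the UE4Duino Blueprint graph:

```python
def parse_bpm(raw: bytes):
    """Parse one serial line from the Arduino into a BPM value.

    Returns None for garbled or physiologically implausible readings,
    which are simply skipped rather than fed into the game.
    """
    try:
        bpm = int(raw.decode("ascii").strip())
    except (UnicodeDecodeError, ValueError):
        return None
    # Reject values outside a plausible human heart-rate range.
    if not 30 <= bpm <= 220:
        return None
    return bpm
```

Dropping implausible samples like this is one simple way to smooth over the noisy readings that made the finger-worn sensors impractical.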


A mapping Blueprint system was developed to dynamically adjust the virtual bike’s speed based on the incoming heart rate data. There were 12 gears on the virtual bike (1st gear being the slowest and 12th the fastest); a BPM reading of between 40-50 would activate 1st gear, 50-60 2nd gear, 60-70 3rd gear, and so on up to 170+ for 12th gear (BPM determined the gear, the gear determined the speed, and the speed determined the distance covered). This meant that an unfit user could in theory cover a greater virtual distance than a fitter user exerting greater force, so we made sure to select an exercise bike that could measure energy expenditure rather than relying solely on distance covered as a measure of exertion.
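The gear mapping itself reduces to simple banding. A minimal Python sketch, assuming uniform 10-BPM bands starting at 40 BPM and clamping out-of-range readings to a valid gear (the exact band widths at the top of the range, and the in-game Blueprint logic, differed):

```python
def bpm_to_gear(bpm: int) -> int:
    """Map a heart-rate reading to one of 12 virtual gears.

    Assumes uniform 10-BPM bands from 40 BPM (40-50 -> 1st gear,
    50-60 -> 2nd, ...), clamped so very low or very high readings
    still land on a valid gear.
    """
    gear = (bpm - 40) // 10 + 1
    return max(1, min(12, gear))
```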



We could find no prior study that used altered physiological feedback within an ExerGaming context, so no optimum level of heart rate exaggeration existed. We conducted a pilot test to establish an appropriate percentage range by which the user's visible heart rate could be altered during game play. Seven versions of the game were created in which the user's visible heart rate would be changed by +10%, +20%, +30%, 0%, -10%, -20% and -30%. A single user was asked to play these seven games (in random order) with a two-minute break between each game, and to give feedback on the system. The user didn’t notice when their BPM was raised by any amount, but when it was reduced by more than 20% they commented that they thought there might be inaccuracies with the heart monitor. Based on this feedback we decided to ‘cap’ the falsification reduction at 20%. We hypothesized that decreasing the visible BPM would motivate greater physical exertion.
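The falsification step comes down to a single scaling with a capped reduction. A Python sketch (illustrative only; in the study each game version had its percentage fixed in advance rather than computed at runtime):

```python
def displayed_bpm(true_bpm: int, falsification_pct: int) -> int:
    """Compute the heart rate shown on the HUD.

    falsification_pct is a signed percentage offset (e.g. -20 shows a
    BPM 20% below the true reading). Reductions are capped at -20%,
    the threshold beyond which our pilot user began to suspect the
    heart monitor was inaccurate.
    """
    pct = max(falsification_pct, -20)  # cap the reduction at 20%
    return round(true_bpm * (1 + pct / 100))
```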


Physical Resistance

We were interested in examining the effects of ‘indirect actuation’, i.e. altering limb exertion through involuntary resistance adjustment. The Wattbike includes a physical resistance dial with seven levels which can be adjusted to increase the exertion required to turn the bike’s pedals. We did not want to alert users to changes in resistance, so we constructed a pulley system which enabled remote alteration of the resistance control without bringing it to the user's attention.


Our three independent variables were immersion (VR headset or flat screen), resistance (gear actuation or no actuation) and falsification (true BPM or false BPM).

We used two main measures: Power Output and Rate of Perceived Exertion. The first dependent variable was ‘absolute’ power, measured in Watts and recorded by the Wattbike for each trial. We also normalised these absolute power values to calculate a ‘relative’ power value for each condition, measuring performance in that condition as a ratio of the participant’s average performance across all trials, to factor out individual differences in performance.
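The relative power calculation can be sketched in a few lines of Python (illustrative; the actual analysis was run on the Wattbike’s recorded data):

```python
def relative_power(trial_watts):
    """Normalise each trial's mean power by the participant's average
    across all trials, so a value of 1.2 means 20% above that
    participant's own baseline."""
    baseline = sum(trial_watts) / len(trial_watts)
    return [w / baseline for w in trial_watts]
```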


Project 02: “Virtual Collisions”

Following the previous project (VeRitas), we wanted to develop an ExerGame in which we could dynamically adjust the resistance of a physical object in real time (the idea being that this physical object would be used to control a virtual avatar within an immersive game, and that the game engine could govern the physical object’s ease of mobility).

Virtual Resistance from Joey Campbell on Vimeo.

Physical Computing

At the start of the project we were confident that we would be able to purchase an ‘off-the-shelf’ product, but soon realised that we were going to have to produce it in house. We looked at several different technologies for implementing this type of resistance (electromagnetics, hydraulics, pneumatics) until we decided that friction would be the least complicated and most efficient to implement. We tried various combinations (stepper motors, linear actuators & continuous rotation servos). There were issues with latency as well as power, but in the end we settled on high-torque servos (30 kg·cm) which could be integrated into Unreal Engine via the UE4Duino plugin.

servo motors from Joey Campbell on Vimeo.

We tested the servos on the brake handles of a kid’s tricycle and were able to incrementally adjust the level of tightness/resistance without any noticeable latency. It should be noted that an Arduino board cannot supply sufficient power for high-torque servos; to overcome this obstacle we used a rechargeable golf buggy battery connected to a breadboard.
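As an illustration of the kind of command mapping involved, the game side only needs to turn a normalised resistance level into a servo angle and send it down the serial line. A Python sketch, assuming a hypothetical ‘S&lt;angle&gt;’ wire format and a 0-120° servo range (neither is the exact protocol or range we used):

```python
def brake_command(resistance: float, min_angle: int = 0, max_angle: int = 120) -> bytes:
    """Convert a normalised resistance level (0.0 = free-spinning,
    1.0 = fully braked) into a serial command for the Arduino
    driving the brake servo. The 'S<angle>' format and angle range
    are illustrative assumptions.
    """
    resistance = max(0.0, min(1.0, resistance))  # clamp to [0, 1]
    angle = round(min_angle + resistance * (max_angle - min_angle))
    return f"S{angle}\n".encode()
```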


Synchronization of real and virtual worlds

While we knew the final physical game controller would be some form of wheeled vehicle, we did not know how we would synchronize the movement and position of this object with its in-game clone. We ran a test using a real office chair and an imported .obj model of an office chair: we created a VR pawn class, added a Vive motion controller tag and the static mesh, and positioned the motion controller at the centre of the virtual chair. We in turn taped the real Vive controller onto the corresponding location on the physical chair. Some tweaking was involved to get the sizes of the real and virtual chairs to sync up, and there was also a bit of work to define a floor and border space within the HTC Vive chaperone bounds:

vr & physical chair movement from Joey Campbell on Vimeo.


As you can see above, being restricted by cables in an ExerGame (where resistance and motion play an important role) tends to limit the game flow. To overcome this tethering we made our setup more mobile by using a 12V 8000mAh rechargeable lithium-ion battery along with an adaptor to power the Vive link box. We also substituted the desktop with an Alienware laptop which had an HDMI port.


Finally, we decided to replace the real and virtual office chairs with a wheelchair model. The brakes on the wheelchair were adjusted, repositioned and lubricated to minimise cable friction as much as possible. The servo motors were mounted on servo brackets bolted to the aluminium tubing.


Collision Detection

The motion of the physical chair synced up perfectly between both worlds, but there was no collision detection happening between the wheelchair model and any physics-enabled static meshes placed in the scene. What did work was opening the wheelchair static mesh editor and changing the collision complexity to ‘use simple shapes as complex’. While this did enable collisions with the static meshes in the environment, it failed to trigger any Blueprint events.

As a workaround, we set the wheelchair collision settings in the mesh window to ‘no collision’, reset the collision complexity to ‘project default’, and changed the wheelchair’s VR pawn Blueprint settings to ‘no collision’. We then added a ‘box collision’ component (under “Add Component” in the top left of the VR pawn Blueprint window) and scaled its size and position to match the wheelchair model. This was set to ‘block all’, and “generate overlap events” and “hit events” were enabled. The box collision was dragged to the same level as the chair node in the hierarchy beneath the controller.

With collision detection enabled, the brakes could now be adjusted incrementally, replicating resistance equal to that of the real-world mass which the virtual objects were simulating in the immersive environment.
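One simple way to derive that resistance from the virtual scene is to scale brake force with the mass of the object the avatar is pushing against. A Python sketch under assumed tuning constants (the 80 kg full-brake mass is illustrative, not a value from our setup):

```python
def resistance_for_mass(mass_kg: float, full_brake_mass_kg: float = 80.0) -> float:
    """Map the mass of the virtual object the avatar has collided with
    to a normalised brake resistance (0.0 = free, 1.0 = fully braked).

    full_brake_mass_kg, the mass at which the brakes are fully
    applied, is an illustrative tuning constant.
    """
    return max(0.0, min(1.0, mass_kg / full_brake_mass_kg))
```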


We set out to integrate resistance into a virtual world by seamlessly overlapping a virtual model with its real-world counterpart. As well as generating a higher sense of presence by introducing a tangible version of the virtual object (viewed through the VR HMD and felt in the real world), we now have a mobile system with automated dynamic resistance, so the game engine can simulate a real sense of collision between the avatar and virtual objects.


Our future line of enquiry will focus on exploring the relationship between this type of tangible-virtual interface and physical exertion through the collection and analysis of quantitative data such as physiological responses. We envisage, for instance, that decoupling human and machine haptics when undertaking physical activity in an embodied tangible-virtual environment may facilitate increased physical output. In our next study we intend to place a MyoWare flex sensor on the user's calf muscle and run a series of tests examining the relationship between physical output and dynamic resistance. We also intend to compare user exertion levels based on whether real-time exertion and resistance data is displayed or omitted on the head-mounted display’s (HMD’s) HUD.

The combination of uninhibited access to Unreal Engine along with affordable VR hardware has enabled researchers like ourselves to create compelling immersive environments on a limited budget. Ever increasing graphic fidelity allows the integration of 3D assets which are on par (visually) with their real world counterparts. The use of third-party assets from the Marketplace has freed up time and enabled us to focus all of our attention on immersive aspects such as haptic feedback, resistance and tangible interface design.

For further updates and progress on Joey Campbell’s research, visit


