February 23, 2016

The Lexus Hoverboard ‘Ride The Slide’ Experience Shows Off Slick Moves with Unreal Engine 4

By Rama Allen, Eric Renaud-Houde and Raymond Leung

Lexus fulfilled the world’s Back to the Future dreams in 2015 by creating the first real-life hoverboard, later sharing the one-of-a-kind experience through a video of pro skateboarder Ross McGouran testing the Lexus Hoverboard in a custom-built skate park. The next step was bringing the ride to the public with a bit of help from the team here at The Mill and the power of Epic Games’ Unreal Engine 4.

Whether it’s developing Mill Stitch, the first on-set VR review production tool, or creating prototypes for real-time, room-scale VR with the Vive, we’re always experimenting with evolving technology and exploring new ways to tell stories.

When challenged to recreate the experience of riding the Lexus Slide for the 2015 LA Auto Show, we answered with Lexus ‘Ride the Slide’: a room you can ride, powered by a real-time game engine and a hacked pressure-sensitive controller, created in collaboration with Spinifex Group U.S.

The Experience 

The beachfront skate park simulation was crafted in a custom-built surround video cave, putting users in a visually and aurally dynamic environment as they drop into a quarter pipe and ride through the Lexus Slide Park. The park in the simulation is a replica of the real custom park Lexus built in Barcelona in 2015.

Users start the experience by standing on a hacked pressure-sensitive controller that monitors their weight distribution. They can control speed and orientation by leaning forward, back or side to side on the hardware-hacked physical board.
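The post doesn’t publish the actual control mapping, but the idea can be sketched from how the hardware works: the board reports four corner load-cell weights, and the difference between front/back and left/right pairs gives normalized lean values. The struct and field names below are assumptions for illustration.

```cpp
#include <cassert>

// Hypothetical sketch: four corner load-cell readings from the
// pressure-sensitive board (field names are assumptions).
struct BoardSample { float topLeft, topRight, bottomLeft, bottomRight; };

// Normalized lean in [-1, 1]: +forward = weight on the front pair,
// +side = weight on the right pair.
struct Lean { float forward; float side; };

inline float clampf(float v, float lo, float hi) {
    return v < lo ? lo : (v > hi ? hi : v);
}

inline Lean computeLean(const BoardSample& s) {
    float total = s.topLeft + s.topRight + s.bottomLeft + s.bottomRight;
    if (total <= 0.0f) return {0.0f, 0.0f};  // nobody on the board
    float forward = ((s.topLeft + s.topRight) - (s.bottomLeft + s.bottomRight)) / total;
    float side    = ((s.topRight + s.bottomRight) - (s.topLeft + s.bottomLeft)) / total;
    return { clampf(forward, -1.0f, 1.0f), clampf(side, -1.0f, 1.0f) };
}
```

The normalized values would then drive the speed and orientation of the rider’s pawn in the engine.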

The hyper-real skate park and surroundings were crafted in Epic Games’ Unreal Engine 4, and custom software was written to distribute synchronized content across all 18 HD screens in the video cave at 60 frames per second. Event-based spatial audio was created and integrated into Unreal by Apollo Studios.

Distributed Rendering

In order to distribute UE4 rendering across 18 displays, we had to solve quite a few technical challenges in a very short amount of time, including the hardware and virtual camera configuration, plus networking and state distribution for frame sync. While it was challenging, these are exactly the kinds of problems we thrive on!


The first step in solving distributed rendering is always to ask whether it can be avoided altogether. In this case, with the number of pixels we needed to push (9720 x 3840) and a lack of SLI support, it was clear that we couldn’t afford to use a single machine. We therefore decided on a system of one server and six client rendering machines on a closed network switch.
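The display arithmetic can be reconstructed from the figures given (18 portrait 1080x1920 displays, 9720 x 3840 total, each machine driving one 3-wide half-wall via NVIDIA Surround). The wall/column layout below is inferred from those numbers, not stated explicitly in the post:

```cpp
#include <cassert>

// Layout inferred from the post: 3 walls, each 3 displays wide and
// 2 rows high, portrait-oriented 1080x1920 panels.
constexpr int kDisplayW    = 1080;  // portrait width
constexpr int kDisplayH    = 1920;  // portrait height
constexpr int kWalls       = 3;
constexpr int kColsPerWall = 3;
constexpr int kRowsPerWall = 2;

constexpr int kTotalW      = kWalls * kColsPerWall * kDisplayW;  // full cave width
constexpr int kTotalH      = kRowsPerWall * kDisplayH;           // full cave height
constexpr int kMachineW    = kColsPerWall * kDisplayW;           // one Surround group
constexpr int kMachineH    = kDisplayH;
constexpr int kMachines    = kWalls * kRowsPerWall;              // render clients
```

This is how six GTX-class machines at 3240 x 1920 each cover a 9720 x 3840 canvas that no single GPU of the era could drive at 60 fps.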

Virtual Cameras

Each machine would render either the top or the bottom half of a wall (3 v-synced portrait-mode displays with NVIDIA Surround enabled). For the scene to extend seamlessly across displays, virtual cameras attached to the player pawn had to match the display layout in the room. To that end, we implemented asymmetric view frustum (lens shift) support directly into UE4’s camera component class. Each client instance would then load JSON settings and activate the camera specific to its view portion.
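An asymmetric (off-axis) frustum is a standard projection-matrix construction; the sketch below shows the math in OpenGL-style column-major clip space. UE4’s internal matrix conventions differ, so treat this as an illustration of the lens-shift terms rather than the actual engine patch:

```cpp
#include <array>
#include <cassert>

using Mat4 = std::array<float, 16>;  // column-major, OpenGL-style

// Off-axis perspective projection: left/right/bottom/top define the
// frustum on the near plane. When the frustum is asymmetric
// (left != -right or bottom != -top), the third-column terms act as
// a lens shift, letting each client render its slice of a shared view.
Mat4 offAxisProjection(float left, float right, float bottom, float top,
                       float zNear, float zFar) {
    Mat4 m{};  // zero-initialized
    m[0]  = 2.0f * zNear / (right - left);
    m[5]  = 2.0f * zNear / (top - bottom);
    m[8]  = (right + left) / (right - left);   // horizontal lens shift
    m[9]  = (top + bottom) / (top - bottom);   // vertical lens shift
    m[10] = -(zFar + zNear) / (zFar - zNear);
    m[11] = -1.0f;
    m[14] = -2.0f * zFar * zNear / (zFar - zNear);
    return m;
}
```

With a symmetric frustum the shift terms vanish; each client in a setup like this would compute its left/right/bottom/top from the physical position of its displays relative to the viewer.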

Furthermore, a solid framerate of 60 fps was a requirement; anything below proved too uncomfortable, especially given our fluid camera motion. With NVIDIA GTX 980 Ti cards, ~80% resolution scale and a few more quality-setting tweaks, we were able to obtain solid rendering results. We did have to minimize screen-space post-process effects, since they would cause disparities between machines.


We decided to keep the shared system state as minimal as possible, so that we could rely on UDP and the speed of the switch to keep everything in sync. As the user moves around the skate park, we simply broadcast their view position and orientation, plus some additional timing information to keep Matinee transitions in sync.
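A minimal state packet like the one described might look like the sketch below. The exact fields and wire format aren’t published in the post, so the struct layout and names here are assumptions; the point is that the whole shared state fits in a few dozen bytes, small enough to broadcast every frame over UDP:

```cpp
#include <cassert>
#include <cstdint>
#include <cstring>
#include <vector>

// Hypothetical wire format for the broadcast state: view position,
// view orientation (quaternion), and a timeline time to keep Matinee
// transitions in sync across clients. Packed to a fixed 36-byte layout.
#pragma pack(push, 1)
struct StatePacket {
    uint32_t frameId;                    // monotonically increasing
    float posX, posY, posZ;              // view position
    float quatX, quatY, quatZ, quatW;    // view orientation
    float matineeTime;                   // seconds into current sequence
};
#pragma pack(pop)

inline std::vector<uint8_t> serialize(const StatePacket& p) {
    std::vector<uint8_t> buf(sizeof(StatePacket));
    std::memcpy(buf.data(), &p, sizeof(StatePacket));
    return buf;
}

inline StatePacket deserialize(const std::vector<uint8_t>& buf) {
    StatePacket p{};
    std::memcpy(&p, buf.data(), sizeof(StatePacket));
    return p;
}
```

Each client applies the latest packet it has received; because every packet carries the full state, a dropped datagram is simply superseded by the next one, which is what makes lossy UDP acceptable here.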

We went down this road to avoid the overhead of TCP frame lock, as getting in front of Unreal Engine’s GPU/CPU synchronization points would have been out of scope for this project. We arrived at this decision after many UDP latency and locking tests. One such test was a network ping-pong to measure UDP packet reliability and speed. We were able to achieve sub-millisecond round-trips, and after letting it run for hours we had dropped zero UDP packets. Happy with the results, we decided to move forward with this approach.
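A single-process loopback version of that ping-pong test can be sketched with plain POSIX UDP sockets (the real test ran between machines across the closed switch, and the original likely used Asio; ports here are OS-assigned):

```cpp
#include <arpa/inet.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cassert>
#include <chrono>

// Sketch of a UDP ping-pong latency test: two sockets on 127.0.0.1
// exchange 1-byte datagrams. Returns average round-trip time in
// microseconds, or a negative value on socket failure.
double udpPingPongLoopback(int iterations) {
    int a = socket(AF_INET, SOCK_DGRAM, 0);
    int b = socket(AF_INET, SOCK_DGRAM, 0);
    if (a < 0 || b < 0) return -1.0;

    sockaddr_in addrA{}, addrB{};
    addrA.sin_family = addrB.sin_family = AF_INET;
    addrA.sin_addr.s_addr = addrB.sin_addr.s_addr = htonl(INADDR_LOOPBACK);
    addrA.sin_port = addrB.sin_port = 0;  // let the OS pick free ports
    if (bind(a, (sockaddr*)&addrA, sizeof(addrA)) < 0) return -1.0;
    if (bind(b, (sockaddr*)&addrB, sizeof(addrB)) < 0) return -1.0;
    socklen_t len = sizeof(addrA);
    getsockname(a, (sockaddr*)&addrA, &len);   // recover assigned ports
    len = sizeof(addrB);
    getsockname(b, (sockaddr*)&addrB, &len);

    char ping = 'p', pong = 'q', buf = 0;
    auto start = std::chrono::steady_clock::now();
    for (int i = 0; i < iterations; ++i) {
        sendto(a, &ping, 1, 0, (sockaddr*)&addrB, sizeof(addrB));
        recv(b, &buf, 1, 0);
        sendto(b, &pong, 1, 0, (sockaddr*)&addrA, sizeof(addrA));
        recv(a, &buf, 1, 0);
    }
    auto us = std::chrono::duration_cast<std::chrono::microseconds>(
                  std::chrono::steady_clock::now() - start).count();
    close(a);
    close(b);
    return double(us) / iterations;
}
```

On loopback this typically measures a few microseconds; over a real switch the numbers are higher, which is why a long-running two-machine test was the deciding evidence.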

Implementation-wise, we integrated the Asio C++ library and the new Cinder OSC wrapper directly into Unreal. We implemented an FRunnable object for the asio::io_service to run in, and created configs to choose whether the application would act as a server or a render client. Asio also ended up handling the rest of the installation’s communication layers. For instance, we had to interface with a Cinder application that polled a Wii Balance Board for user input and sent that information on to Unreal to control the speed and movement of the experience. This application also listened to the Unreal application, which sent out information to control a DMX fan as an AR element.

Asset Creation

The asset creation for Lexus Slide varied slightly from our usual VFX pipeline. Special care was needed to create assets that would work well with Unreal Engine’s light-baking system, Lightmass. Each asset was UV unwrapped with UVs for standard diffuse, normal and specular maps, and an additional UV set was created so that we could optimize and increase the quality of the lightmaps.

Real-Time Social Sharing

To capture video of the user in the space for the project’s social sharing element, a multi-camera system was set up within the experience, allowing riders to save and socially share a 15-second movie of their personal Lexus ‘Ride the Slide’ experience.

The Unreal application was connected to a TouchDesigner application to create a custom video capture system that communicates with both Unreal and iOS devices for user queuing and the start of the experience, as well as managing real-time colour correction and editing.


The Mill team was able to distribute Unreal Engine 4 rendering across 18 displays and produce a seamless result at the quality required for an immersive experience. Over the course of 12 days, 4,000 riders were transported to the custom Lexus skate park through the ‘Ride the Slide’ experience.

We’ve continued to explore more advanced methods for frame synchronization, including working more directly with the internal Unreal Engine frame synchronization points, machine clock syncing and reading precise packet timestamps with libpcap. We're also looking for Unreal technical artists to join us! 

Take a look at our Behind the Scenes: Lexus Ride the Slide film and find out more about The Mill on our official website.