Image courtesy of WMG, University of Warwick

Meet the hybrid real-time simulator for testing autonomous vehicles 

With more than 80% of vehicles on the road today still at level 0 autonomy—where the driver must be fully in control of the vehicle—most experts would agree we’re still several decades away from a world of fully self-driving cars.

Human safety is the most important consideration for researchers attempting to achieve level 5 vehicle autonomy, where the driver does not have to be in control at all during travel. The AI systems that power self-driving vehicles must be road tested for several billion miles and in an almost infinite range of situations to be considered safe. 

Ideally, testing would take place in the real world to achieve the most accurate results—but this can be extremely dangerous. On the other hand, testing in a simulation using interactive 3D technology is controllable, repeatable, and safer as well as cost-effective and time-efficient. So what if you could combine the best of both worlds? WMG at the University of Warwick has done just that. 

Combining virtual environments and physical cars

WMG is a multi-disciplinary department of the University of Warwick that works with industry to undertake applied research and education in engineering, management, manufacturing, and technology.

It’s made up of cutting-edge research groups, including the Intelligent Vehicles Group—a team of over 80 people working on the testing and development of autonomous vehicles, sensors, human factors, and communication.

Four years ago, the Intelligent Vehicles Group built the 3xD Simulator—a state-of-the-art system that enables researchers to drive in real vehicles and link them up to a simulated environment, which is displayed on a full 360-degree screen via eight projectors. “We can drive any vehicle inside, connect it to the system, and do road testing and development,” explains Juan Pablo Espineira, Project Engineer at WMG.

The 3xD Simulator is used to test autonomous systems under normal operating conditions in a synthetic environment representing the real world. In the real world, you are limited by conditions like weather or traffic, and you can’t reliably and safely recreate complex situations: turning into a junction, say, while another driver runs a red light, in fog or rain, just as one of your sensors malfunctions. That scenario would be incredibly difficult and dangerous to stage in a real test, but it can easily be run hundreds of times in simulation.

The 3xD Simulator’s hybrid approach enables you to include humans in the loop as drivers, safety drivers, or passengers. That means you can determine how people react to situations, how they interact with the technology, and how much they trust it. 

If you want to test out an autonomous emergency braking system, for example, you can put a person inside a physical car that they operate as if driving. The car thinks it’s in the real world, and as the emergency scenario or accident unfolds and the brakes are applied, you can observe the reactions of both the person and the vehicle.

A customizable autonomous driving simulator 

The visualization software that originally supported this simulator had been custom-made by an outsourced company. This approach had two drawbacks: first, the graphical fidelity was not sufficient for some test cases, and second, the team was unable to modify or extend it easily and quickly. “In research, we often have to provide new solutions or adapt previous solutions, and that is something that we could not do in the previous system,” says Espineira.

The team decided to switch to Unreal Engine to address these issues. Access to the engine’s source code gives the team limitless flexibility to adapt their simulator as required—to implement specific sensor or noise models, for example, or to have things in the virtual environment interact in a specific way. “Unreal allows you to do all of that, and it also allows you to implement things through C++ that are not in the engine,” says Espineira. “So for example, I implemented libraries to communicate with the car, which have proved very useful.”
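
As an illustration of that kind of glue code, here is a minimal sketch of a UDP sender built on Unreal’s Sockets and Networking modules. The class name FCarLink, the choice of UDP, and every detail of the interface are assumptions made for this example, not WMG’s actual library.

    // A minimal sketch, assuming the "Sockets" and "Networking" modules are
    // listed in the project's .Build.cs. FCarLink is a hypothetical name.
    #include "Sockets.h"
    #include "SocketSubsystem.h"
    #include "Interfaces/IPv4/IPv4Address.h"
    #include "Common/UdpSocketBuilder.h"

    class FCarLink
    {
    public:
        bool Connect(const FString& Ip, int32 Port)
        {
            FIPv4Address Addr;
            if (!FIPv4Address::Parse(Ip, Addr))
            {
                return false;
            }

            RemoteAddr = ISocketSubsystem::Get(PLATFORM_SOCKETSUBSYSTEM)->CreateInternetAddr();
            RemoteAddr->SetIp(Addr.Value);
            RemoteAddr->SetPort(Port);

            // Non-blocking UDP socket for streaming simulation state to the car side.
            Socket = FUdpSocketBuilder(TEXT("CarLinkSocket")).AsNonBlocking();
            return Socket != nullptr;
        }

        bool SendBytes(const uint8* Data, int32 Count)
        {
            int32 Sent = 0;
            return Socket && Socket->SendTo(Data, Count, Sent, *RemoteAddr) && Sent == Count;
        }

    private:
        FSocket* Socket = nullptr;
        TSharedPtr<FInternetAddr> RemoteAddr;
    };
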
The team has also leveraged Unreal Engine’s Blueprint visual scripting system to enable more researchers to take a hands-on role in the project. “Blueprint is a very useful tool, because a lot of the time we have researchers who are very good at their specific field—like sensors or electronics or mechanics—but who have a limited exposure to simulators and simulation software,” says Espineira. 

Blueprint enables non-programmers to script code in a more visual way—by connecting nodes rather than writing lines of code. “If we use an open-source simulator and researchers have to go into the source code to modify the sensor model or noise model, that’s actually quite challenging,” says Espineira. “Blueprint provides a simpler way to do these things.”
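One way to achieve that simplicity is to implement a model once in C++ and expose it as a Blueprint node that researchers can rewire and retune without touching source code. The sketch below assumes a standard Blueprint function library; the class and function names are invented for illustration.

    // Hypothetical sketch: a Gaussian range-noise node exposed to Blueprint.
    // USensorNoiseLibrary and AddGaussianNoise are illustrative names.
    #include "CoreMinimal.h"
    #include "Kismet/BlueprintFunctionLibrary.h"
    #include "SensorNoiseLibrary.generated.h"

    UCLASS()
    class USensorNoiseLibrary : public UBlueprintFunctionLibrary
    {
        GENERATED_BODY()

    public:
        // Adds zero-mean Gaussian noise to a raw range reading (radar or LIDAR).
        UFUNCTION(BlueprintCallable, Category = "Sensors")
        static float AddGaussianNoise(float RangeMeters, float StdDevMeters)
        {
            // Box-Muller transform: two uniform samples give one normal sample.
            const float U1 = FMath::FRandRange(KINDA_SMALL_NUMBER, 1.0f);
            const float U2 = FMath::FRand();
            const float Normal =
                FMath::Sqrt(-2.0f * FMath::Loge(U1)) * FMath::Cos(2.0f * PI * U2);
            return RangeMeters + StdDevMeters * Normal;
        }
    };

Researchers can then drop the node into any sensor Blueprint and tune the standard deviation per experiment, without recompiling anything.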

Creating a real-time simulator with Unreal Engine

To create the 3xD Simulator, the team started out with Unreal Engine’s vehicle template, which includes a vehicle with built-in vehicle dynamics that can be modified as required. They then built a road network and surrounding environment. WMG sometimes uses 3D models from a LIDAR scan, but you can also use assets from the Unreal Engine Marketplace.
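A minimal sketch of that starting point, assuming UE4’s PhysX-based vehicle template and invented class and axis-mapping names, might look like this:

    // Sketch only: extends the vehicle template's pawn. Assumes the UE4
    // "PhysXVehicles" module; AWmgVehicle and the axis names are hypothetical.
    #include "WheeledVehicle.h"
    #include "WheeledVehicleMovementComponent.h"
    #include "Components/InputComponent.h"
    #include "WmgVehicle.generated.h"

    UCLASS()
    class AWmgVehicle : public AWheeledVehicle
    {
        GENERATED_BODY()

    public:
        virtual void SetupPlayerInputComponent(UInputComponent* Input) override
        {
            Super::SetupPlayerInputComponent(Input);
            Input->BindAxis("Throttle", this, &AWmgVehicle::OnThrottle);
            Input->BindAxis("Steer", this, &AWmgVehicle::OnSteer);
        }

    private:
        // Forward driver input straight into the template's built-in dynamics.
        void OnThrottle(float Value) { GetVehicleMovementComponent()->SetThrottleInput(Value); }
        void OnSteer(float Value)    { GetVehicleMovementComponent()->SetSteeringInput(Value); }
    };

In the hybrid setup, the same inputs can equally come from the physical car over the CAN link described below, rather than from a keyboard or wheel.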

Espineira needed a way to project the environment onto the 360-degree screens surrounding the physical vehicle being tested. “Generally, you put cameras in your Unreal environment to ascertain where a screen should be, and then send the image to the projector,” he says.

With eight projectors working together to create a single seamless image, however, things are less simple. “You need eight video ports, and you need to run eight instances of Unreal,” Espineira says. “You cannot do it in one computer, so you need to find a way to do that using multiple computers, and also in a way that the computers are synchronized.”

The team found the solution in nDisplay, which deploys and launches multiple instances of Unreal Engine across an array of networked computers to render 3D content simultaneously to multiple displays in real time. “One of the instances will be the master, the other instances are for visualization. These are the ones that are going to come out the projectors,” Espineira says. “Each projector is commanded by one computer, and that saves us a lot of performance.”
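For a sense of how such a cluster is described, here is a schematic UE4-era nDisplay text config, abridged to the master plus a single render node. Every address, size, and position is a placeholder; the real eight-projector layout, and the exact keys, which vary by engine version, will differ.

    # Schematic nDisplay .cfg sketch: the other seven render nodes follow
    # the same pattern with their own addresses, windows, and screens.
    [info] version=23
    [cluster_node] id=node_master addr=192.168.0.10 window=wnd_master master=true
    [cluster_node] id=node_1 addr=192.168.0.11 window=wnd_1
    [window] id=wnd_master viewports=vp_master fullscreen=true
    [window] id=wnd_1 viewports=vp_1 fullscreen=true
    [viewport] id=vp_1 x=0 y=0 width=1920 height=1080 projection=proj_1
    [projection] id=proj_1 type=simple screen=scr_1
    # The screen is the physical rectangle (in meters) this projector fills,
    # placed relative to the car so the eight images form one surround view.
    [screen] id=scr_1 loc="X=2,Y=0,Z=1" rot="P=0,Y=0,R=0" size="X=2.8,Y=1.6"
    [camera] id=camera_static loc="X=0,Y=0,Z=1.2"
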

The next stage of building the simulator involved devising a way to flow information from Unreal Engine to the physical car and vice versa. To do this, Espineira used the ObjectDeliverer plugin, which provides Blueprint nodes for connecting to a server and sending and receiving data. This data might include the speed and RPM of the vehicle in the simulation; its XY position for GPS emulation or navigation; the position of entities in the virtual environment, to send to traffic simulation software like SUMO; and sensor readings such as radar and LIDAR data, for visualization in analytical software like MATLAB.
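As a sketch of what one such message might look like on the wire (the field layout below is invented for illustration, not the ObjectDeliverer plugin’s actual format):

    // Hypothetical packed state message streamed from the simulation.
    #pragma pack(push, 1)
    struct FVehicleStateMsg
    {
        float  SpeedKph;   // simulated vehicle speed
        float  EngineRpm;  // engine RPM for the instrument cluster
        double PosX;       // world X, e.g., for GPS emulation
        double PosY;       // world Y
        uint8  Flags;      // bit 0: brakes applied, bit 1: beams on, ...
    };
    #pragma pack(pop)

    // Sent as raw bytes over a socket link, for example:
    //   FVehicleStateMsg Msg{Speed, Rpm, X, Y, Flags};
    //   Link.SendBytes(reinterpret_cast<const uint8*>(&Msg), sizeof(Msg));
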

The WMG team needed to hook up the physical components in the car to the simulation. It achieved this by tapping into the Controller Area Network (CAN) bus, the system by which the components of modern cars communicate with each other. Accessing this bus enables the team to read data the car sends, such as the position of the throttle or whether the light beams are on, and to write instructions, such as what the speed or RPM dial should show.
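On a Linux test machine, that kind of CAN access can be prototyped with the standard SocketCAN API, as sketched below. The frame IDs here (0x101 for throttle, 0x2F0 for the dials) are invented; real IDs come from the vehicle’s own CAN database, and the 3xD rig’s actual interface hardware may differ.

    // Minimal CAN read/write sketch using Linux SocketCAN; IDs are made up.
    #include <cstring>
    #include <cstdio>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <sys/socket.h>
    #include <net/if.h>
    #include <linux/can.h>
    #include <linux/can/raw.h>

    int OpenCan(const char* ifname)
    {
        int s = socket(PF_CAN, SOCK_RAW, CAN_RAW);
        if (s < 0) return -1;

        ifreq ifr{};
        std::strncpy(ifr.ifr_name, ifname, IFNAMSIZ - 1);
        ioctl(s, SIOCGIFINDEX, &ifr);          // resolve interface index

        sockaddr_can addr{};
        addr.can_family = AF_CAN;
        addr.can_ifindex = ifr.ifr_ifindex;
        if (bind(s, reinterpret_cast<sockaddr*>(&addr), sizeof(addr)) < 0) return -1;
        return s;
    }

    int main()
    {
        int s = OpenCan("can0");
        if (s < 0) return 1;

        can_frame rx{};                        // read from the car
        if (read(s, &rx, sizeof(rx)) == sizeof(rx) && rx.can_id == 0x101)
            std::printf("throttle byte: %u\n", static_cast<unsigned>(rx.data[0]));

        can_frame tx{};                        // write to the car's dials
        tx.can_id = 0x2F0;
        tx.can_dlc = 4;
        tx.data[0] = 120;                      // e.g., 120 km/h on the speedometer
        write(s, &tx, sizeof(tx));

        close(s);
        return 0;
    }
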

Finally, Espineira used Blueprint to create radar and LIDAR sensor models inside Unreal Engine, replicating the data that would come from the vehicle’s non-camera sensors.
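The Blueprint graphs themselves don’t reproduce well in print, but the C++ equivalent of a simple LIDAR model built from line traces looks roughly like this; the beam count, range, and function name are illustrative:

    // Sketch of one horizontal LIDAR sweep as line traces returning distances.
    #include "Engine/World.h"

    void SweepLidar(UWorld* World, const FVector& Origin, const FRotator& Facing,
                    int32 NumBeams, float MaxRangeCm, TArray<float>& OutRanges)
    {
        OutRanges.Reset(NumBeams);
        for (int32 i = 0; i < NumBeams; ++i)
        {
            // Spread beams evenly across 360 degrees around the sensor.
            const float YawDeg = 360.0f * i / NumBeams;
            const FVector Dir = (Facing + FRotator(0.0f, YawDeg, 0.0f)).Vector();
            const FVector End = Origin + Dir * MaxRangeCm;

            FHitResult Hit;
            const bool bHit = World->LineTraceSingleByChannel(
                Hit, Origin, End, ECC_Visibility);

            // No hit means the beam reached max range without a return.
            OutRanges.Add(bHit ? Hit.Distance : MaxRangeCm);
        }
    }

Ranges like these can then be streamed out of the engine for visualization in tools like MATLAB, as described above.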

Real-time technology and the future of intelligent cars

The autonomous vehicles sector is changing at an ever-faster pace, and everyone has their own preferred software toolchain, so being able to generate quick custom solutions that can easily be adjusted is critical. It’s one of the main reasons WMG uses Unreal Engine. “This is something you can’t do as easily and quickly with commercially available software, or that will come at an extra cost,” says Espineira.

As researchers come closer and closer to fully autonomous level 5 systems, game engines will be pivotal in facilitating the all-important safety testing that the autonomous cars of the future will need to undertake.

    Let's talk!

    Interested in finding out how you could unleash Unreal Engine’s potential for training and simulation? Get in touch to start that conversation.