April 26, 2019
Making autonomous vehicles safer before they hit the road
CarSim, TruckSim, and BikeSim use vehicle data that describes suspension behavior, powertrain properties, active controller behaviors, and tire properties, along with environment data such as road slope, obstacles, weather conditions, and asphalt type. At the core of the software is a simulation solver that predicts how the vehicle will react: for example, whether it will tip or skid under specific conditions, or whether it will brake quickly enough on a wet surface. The software also produces a visual representation of the vehicle’s motion from the solved data.
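To give a sense of the kind of question such a solver answers, here is a deliberately simplified point-mass estimate of braking distance on dry versus wet asphalt. This is a back-of-the-envelope physics sketch, not CarSim's actual multi-body model; the friction coefficients are typical textbook values.

```python
# Simplified point-mass stopping-distance estimate. This is a rough
# illustration of the kind of question a vehicle-dynamics solver
# answers, not CarSim's actual multi-body model.
G = 9.81  # gravitational acceleration, m/s^2

def stopping_distance(speed_ms: float, friction_coeff: float) -> float:
    """Distance to brake from speed_ms to rest, assuming constant
    deceleration limited by tire-road friction: d = v^2 / (2 * mu * g)."""
    return speed_ms ** 2 / (2 * friction_coeff * G)

# Dry asphalt (mu ~ 0.8) vs. wet asphalt (mu ~ 0.4) at 100 km/h:
v = 100 / 3.6  # convert km/h to m/s
dry = stopping_distance(v, 0.8)  # roughly 49 m
wet = stopping_distance(v, 0.4)  # roughly 98 m, double the dry distance
```

Halving the friction coefficient doubles the stopping distance, which is why wet-surface braking is one of the canonical scenarios a solver is asked to predict.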
On the flip side, the software can also import real-world vehicle and map data and analyze it for speed, response time, and other aspects of the driving experience. While this capability has obvious uses in accident reconstruction and training simulators, a new use has emerged in recent years: data gathering and machine learning for autonomous vehicles.
Self-driving cars are already available to the public in limited driving situations, yet the technology behind them still needs refinement to meet safety and regulatory requirements before these vehicles can take on more complex driving environments. Part of that process is recording and analyzing the cars’ data during test runs.
Testing an autonomous vehicle
Autonomous vehicles use a variety of physics-based sensors to detect the environment around them: cameras, radar, and LIDAR. The measure of a self-driving vehicle’s success is based largely on its ability to process the data from these sensors and interpret its distance to other cars, pedestrians, cyclists, and even debris left in the road, not to mention the slope, size, and condition of the road itself. The vehicle must also detect lane markings, signal lights, and traffic signs, and respond as a human driver would in all weather and lighting conditions.
The first tests of such vehicles were performed on physical test tracks, but it soon became clear that it was much more efficient—and much safer—to perform such tests on a virtual car first.
A virtual vehicle is fitted with all the sensors of a physical vehicle, and the visual data is fed to the sensors just as with a physical test. The difference is that engineers can easily try out variations in sensor placement, and quickly iterate on variations in obstacles, weather, time of day, and road conditions, all from the safety of a virtual environment.
Whether done virtually or physically, testing of an autonomous vehicle requires hundreds or thousands of hours of driving, and such tests generate an enormous amount of data. When Mechanical Simulation saw this trend coming a few years ago, they set about upgrading their products to meet the challenge.
Enter Unreal Engine
“The first thing we wanted to do is improve our driving simulator product with real-world traffic and road models,” says Robert McGinnis, Senior Account Manager at Mechanical Simulation. “But then, as autonomous driving came on and people wanted to incorporate physics-based sensors, we started presenting our technology as a general-purpose vehicle simulation tool for vehicle dynamics and autonomous driving engineers.”
At the same time, for the software’s visual representations, Mechanical Simulation was aware that they needed to keep pace with advances in computer graphics.
They found that many customers, looking for more visualization options, were already porting the CarSim and TruckSim solvers’ results to Unreal Engine along with their own car models and environments. UE4’s readily available source code, C++ support, and Blueprint visual scripting system made it an attractive choice for processing the volume of data that driving tests generate.
That’s when Mechanical Simulation decided to integrate more of their product into Unreal Engine. “It was pretty obvious that we could get the information from the road and the sensors and interface tools like MATLAB/Simulink, and let people integrate their own active controllers,” says McGinnis.
This gave Mechanical Simulation a clear path to upgrading their offerings using Unreal Engine, leaving them more room to focus on their core technology: the solver inside their products. “Early on, our software did not have a good way to build complex scenes for visualization,” says McGinnis. “One approach we took was to add an Unreal Marketplace plugin that allows a CarSim vehicle solver to be loaded into the Unreal Editor. It allows people to create scenes and scenarios using that tool all by themselves.”
The VehicleSim Dynamics plugin, released for free on the Unreal Marketplace in 2017, gives CarSim and TruckSim users a powerful tool for generating visual representations with all the advantages Unreal Engine has to offer, such as physically based rendering (PBR) materials, realistic lighting, landscape and foliage packs, and cityscape items.
The VehicleSim Dynamics plugin works by converting the solver data to Blueprints, which can then be easily queried to produce data about both the terrain and the vehicle.
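The query pattern described above can be sketched as a thin wrapper around solver output that scene-side scripts (Blueprints, in Unreal's case) poll each frame. All of the names below are illustrative assumptions, not the plugin's real API.

```python
# Hypothetical sketch of the article's query pattern: solver output
# exposed through a wrapper that scene scripts poll per tick.
# These class and field names are invented for illustration.
from dataclasses import dataclass

@dataclass
class VehicleState:
    time: float     # simulation time, s
    x: float        # world position, m
    y: float        # world position, m
    heading: float  # yaw, radians
    speed: float    # m/s

class SolverOutput:
    """Accumulates solver frames; a per-tick query returns the latest."""
    def __init__(self):
        self._frames = []

    def push(self, frame: VehicleState) -> None:
        self._frames.append(frame)

    def latest(self) -> VehicleState:
        # What a per-tick Blueprint-style query would return.
        return self._frames[-1]

out = SolverOutput()
out.push(VehicleState(0.0, 0.0, 0.0, 0.0, 27.8))
out.push(VehicleState(0.1, 2.8, 0.0, 0.01, 27.8))
state = out.latest()  # most recent frame drives the visual vehicle
```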
The anatomy of a plugin
To make the terrain data work well with Unreal Engine, the solver arranges the terrain description into a searchable structure that it can query efficiently. To accommodate customers with less powerful machines, the plugin also separates the graphical and physical terrain representations.
“By completely separating the physical and visual representations of the simulation, we are able to run the solver on a separate machine. We then establish a communication channel back to Unreal to represent the vehicle visually,” says Jeremy M. Miller, Lead Developer at Mechanical Simulation. “It’s a little complicated, but we had to do it to connect with HIL [hardware-in-the-loop] systems that don't have any GPU capacity.”
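Running the solver on one machine and the renderer on another implies some wire format for streaming vehicle state. The sketch below shows one plausible shape for such a channel, using a fixed-size binary pose packet; the layout and field choice are invented for illustration, not Mechanical Simulation's actual protocol.

```python
# Sketch of a pose-streaming channel between a solver host and a
# rendering host, in the spirit of the physical/visual split
# described above. The packet layout is invented for illustration.
import struct

# time, x, y, z, roll, pitch, yaw: seven little-endian doubles (56 bytes)
POSE_FMT = "<7d"

def encode_pose(t, x, y, z, roll, pitch, yaw) -> bytes:
    """Pack one solver tick into a fixed-size packet for the wire."""
    return struct.pack(POSE_FMT, t, x, y, z, roll, pitch, yaw)

def decode_pose(packet: bytes):
    """Unpack a packet on the rendering host."""
    return struct.unpack(POSE_FMT, packet)

# Solver host: each tick, encode and send (e.g., over UDP or TCP).
pkt = encode_pose(0.01, 1.25, 0.0, 0.3, 0.0, 0.0, 0.05)
# Rendering host: decode and apply the pose to the visual vehicle.
t, x, y, z, roll, pitch, yaw = decode_pose(pkt)
```

A fixed-size binary layout keeps per-tick overhead small, which matters when the receiving end may be a GPU-less HIL rig or an Unreal instance updating at frame rate.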
The team at Mechanical Simulation sees the simplicity of the Unreal Engine plugin as a huge plus for their customers. “They don't want to be running $200,000 worth of software on a single machine which requires another engineer just to help the prime engineer get his job done,” says Miller. “We preach that the simplest tool chain is the one that's going to be the most efficient.”
The plugin has also proven useful for training, testing, and previsualization of newly designed vehicles. The team is constantly looking to improve the plugin to better serve their customers; for example, they recently added an FBX converter that brings in physical terrain models compatible with the plugin.
Using Unreal Engine has provided some additional benefits to customers working on the more aesthetic side of vehicle design. “We have customers iterating in Unreal Engine to study topics such as headlight design, and placement of sensors on different vehicles to optimize sensor coverage at the lowest cost,” says McGinnis.
Plugin use in the field
One such customer is VERTechs, a Tokyo-based company that develops AI technologies for self-driving systems. To help test autonomous vehicles, VERTechs developed a virtual town from scratch called AUTOCity.
"With the behavior control data from CarSim, it's possible to get an extremely photorealistic video rendered using UE4 on AUTOCity,” says Yoshiya Okoyama, CEO of VERTechs. “Depth data and segmentation images are indispensable for AI learning. Both of them are created at the same time with the technologies of UE4. Furthermore, a simulation of LIDAR can be performed at the same time by creating virtual point-cloud data for assets on AUTOCity. Those parallelized simulations have already been realized on a general-purpose computer in real time."
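Back-projecting a rendered depth image through a pinhole camera model is the generic technique behind virtual point-cloud output like the LIDAR simulation described above. The function and intrinsics below are illustrative assumptions, not VERTechs' actual pipeline.

```python
# Turning a depth image into a camera-space point cloud: the generic
# technique behind virtual-LIDAR output. The camera intrinsics
# (fx, fy, cx, cy) here are illustrative, not a real calibration.
def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth map (meters, row-major list of lists)
    into 3D points using a pinhole camera model."""
    points = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            if z <= 0:
                continue  # no return for this pixel
            x = (u - cx) * z / fx
            y = (v - cy) * z / fy
            points.append((x, y, z))
    return points

# A 2x2 depth map with one invalid (zero-depth) pixel:
cloud = depth_to_points([[5.0, 5.0], [0.0, 5.0]],
                        fx=100.0, fy=100.0, cx=0.5, cy=0.5)
```

Because the engine already knows the true depth at every pixel, such a point cloud comes out perfectly labeled, which is exactly what makes synthetic data attractive for training perception models.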
Japanese company Rikei Corporation is also using the CarSim Unreal Engine plugin to augment its offerings. Rikei develops photoreal virtual spaces for a variety of fields including vehicle simulation.
“UE4 can provide us with environments that are quite close to the real ones because of the very high reproducibility of light,” says Takanori Tamura, Sales Manager at Rikei. “UE4 makes it possible to simulate the reflection of road surfaces during and after rain.”
“Specific weather conditions, including the location of the sun, can be reproduced in UE4,” says Khusinov Jakhongir, Senior Engineer at Rikei. “Testing can be performed in environments that would be dangerous if they were real, such as a very steep slope and a slippery road surface. And UE4 makes it possible to continue testing 24 hours a day, 365 days a year.”
As the needs of the vehicle simulation industry continue to grow, the team at Mechanical Simulation is determined to keep growing with them. “The goal is a seamless experience for our customers,” says Miller. “A test engineer might want to run a vehicle maneuver test suite with a sunny day scenario, then complicate the test with the addition of rain, and then with rain at night. He shouldn’t have to think about how to represent those things visually—he just wants to know how the car will perform. Unreal Engine has the capacity to produce these visually pleasing scenarios, pleasing to both human simulation participants and simulated vehicle sensors.”
Unreal Engine excels at handling complex data sets and turning them into real-time simulation applications to train both humans and machines. To get started in this field, download Unreal Engine.