April 6, 2020
Toyota evaluates vehicle ergonomics utilizing VR and Unreal Engine
Fifty years ago, the process would have relied on physical prototypes and mannequins. In the late 1960s, measurements of hundreds of people were taken and stored in huge databases, which were used to define the typical shape and size of human bodies. From these databases, physical models of people were built and used to work out things like the steering wheel position and the reachability of pedals.
In recent years, virtual ergonomics technology has transformed that traditional process. Designers and engineers can now simulate human interaction with a vehicle far more realistically by testing the reactions of real people in a virtual environment.
One innovative team at Toyota is harnessing the power of Unreal Engine and VR to validate designs even faster and at far lower cost, capitalizing on the open nature of the engine to connect industry-leading software and technology.
The resulting workflow not only improves the ergonomics of Toyota’s cars today, it opens up opportunities to test out proofs of concept for the autonomous vehicles of the future.
Real-time tools for human factors engineering

Typically, it takes three years for a vehicle to go from the early stages of design to the dealership floor. Ergonomic validation takes place at the beginning of that process in the first year to test out concepts and ideas between the design and engineering stages.
Mikiya Matsumoto is the general manager of the Prototype Division, Digital Engineering Department at Toyota. While it’s now common for automotive companies to harness real-time technology on the showroom floor for interactive configurators, his team has been leveraging Unreal Engine far earlier in the automotive life cycle to assess the user friendliness of vehicle designs and identify areas for improvement.
The process starts with the import of a 3D mockup into a virtual environment built in Unreal Engine. A person wearing a VR headset sits in a real car seat and experiences a series of simulated scenarios to test out the design and usability of the vehicle.
One such validation scenario developed by the Toyota team involved testing the visibility of other road users out of the rear quarter window of a new-generation car. “We prepared several pedestrians and bikers in a virtual city environment,” says Matsumoto. “The evaluator could see the simulated pedestrians and bikers passing near to the vehicle through the rear quarter window from the driver seat position via a VR headset. The test enabled us to improve visibility and we were able to complete it very quickly at a low cost compared to conventional methods.”
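At its core, a visibility scenario like this automates a line-of-sight question: does the sight line from the driver's eye point to another road user pass through the quarter-window aperture, or is it blocked by the surrounding body panel? The following is a minimal geometric sketch of that idea, not Toyota's actual implementation; the coordinates, the axis-aligned window model, and all numbers are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Window:
    """Quarter window modelled as an axis-aligned rectangle on the plane x = const."""
    x: float      # distance of the window plane from the eye along the x axis
    y_min: float
    y_max: float
    z_min: float
    z_max: float

def visible_through_window(eye, pedestrian, win):
    """Return True if the eye-to-pedestrian sight line crosses the window
    plane inside the window's rectangular aperture."""
    ex, ey, ez = eye
    px, py, pz = pedestrian
    if px - ex == 0:                    # sight line parallel to the window plane
        return False
    t = (win.x - ex) / (px - ex)        # parameter where the line meets the plane
    if not 0.0 < t < 1.0:               # plane is not between eye and pedestrian
        return False
    y = ey + t * (py - ey)              # intersection point on the window plane
    z = ez + t * (pz - ez)
    return win.y_min <= y <= win.y_max and win.z_min <= z <= win.z_max

# Illustrative numbers: eye at the origin, window plane 1 m away,
# pedestrian 4 m away at roughly head height.
quarter_window = Window(x=1.0, y_min=-0.3, y_max=0.3, z_min=-0.2, z_max=0.4)
print(visible_through_window((0, 0, 0), (4.0, 0.4, 0.2), quarter_window))  # True
```

A real engine-based test replaces this single ray with the full rendered view through the actual window geometry, with animated pedestrians and cyclists, so the evaluator judges visibility directly rather than from a computed flag.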
The team also leverages the setup to perform accessibility checks, using tracking gloves to evaluate how easy it is to reach various buttons and controls. The setup uses an HTC Vive headset, CarSim for vehicle dynamics, Leap Motion controllers for hand tracking, and a combination of different physical prototype parts and VR simulation, depending on the evaluation being performed.
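Conceptually, a reach check of this kind compares tracked hand positions against the 3D locations of controls in the cabin. The sketch below shows the idea under stated assumptions: it is not Toyota's code, the control names and positions are invented, and the 5 cm tolerance is an arbitrary placeholder for a real reachability criterion.

```python
import math

def reach_report(fingertip_samples, controls, tolerance=0.05):
    """For each control, report whether any tracked fingertip sample came
    within `tolerance` metres of it.

    fingertip_samples: list of (x, y, z) hand positions recorded during a test
    controls: dict mapping control name -> (x, y, z) location
    """
    def dist(a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

    report = {}
    for name, pos in controls.items():
        closest = min(dist(sample, pos) for sample in fingertip_samples)
        report[name] = {"closest_m": round(closest, 3),
                        "reached": closest <= tolerance}
    return report

# Illustrative data: two tracked-hand samples, two hypothetical controls.
samples = [(0.40, -0.10, 0.30), (0.55, 0.05, 0.35)]
controls = {"hazard_switch": (0.56, 0.05, 0.36),
            "glovebox_latch": (0.90, 0.40, 0.10)}
print(reach_report(samples, controls))
```

In the actual VR setup, the evaluator simply tries to touch the virtual control while seated; logic along these lines would turn the recorded hand-tracking stream into a pass/fail result per control.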
Because Unreal Engine is an open platform, Matsumoto’s team doesn’t have to jump through hoops to work with these industry-leading third-party tools. Many software providers are connecting their tools and systems to the Unreal Engine platform via plugins—like the Mechanical Simulation CarSim plugin the team uses.
Car model data is imported into Unreal Engine via Datasmith. Datasmith is a collection of tools and plugins for bringing content into the engine, many of which interoperate with the CAD and product lifecycle management (PLM) systems used in the automotive industry. Datasmith enables the team at Toyota to go straight from CAD to Unreal Engine in a couple of clicks without using any third-party software in between.
With their ability to create complex scenarios that include virtual vehicles and human characters, game engine toolsets are well suited to real-time ergonomics testing in a virtual environment. The Toyota team leverages the Blueprint visual scripting system in Unreal Engine to create these virtual scenarios for each test.
Blueprint is a powerful visual coding toolset that puts programming features into the hands of non-programmers. It gives teams the flexibility to build custom functionality for very specific requirements, such as the unique testing simulations necessary for Toyota’s ergonomics validation.
Other ergonomics tools that are used in the automotive industry generally do not offer VR functionality. Users must validate ergonomic tasks from a third-person perspective, which lacks the immersive realism of VR. In addition to being expensive, these systems tend to be closed, which makes them harder to use in combination with third-party software.
Real-time ergonomics testing for future vehicles

Many automotive ergonomics studies today are still carried out on physical mockups of vehicles. These are costly and time-consuming to build, as well as inefficient: the design has often changed by the time the physical mockup is available.
The workflow developed by Matsumoto’s team saves time and money compared to traditional methods of ergonomic assessment, and provides a more flexible development path.
“Real-time technology allows us to perform virtual user experience testing,” explains Matsumoto. “This reduces the cost of and time taken for proof-of-concepting, leading to a more agile way of development.”
What’s more, Matsumoto believes real-time technology will be pivotal in proving the concepts of the vehicles of the future. “Future vehicles might not have a traditional steering wheel—they might have something totally different to control the direction of movement. By using real-time technology and VR, we can evaluate any type of human-machine interface (HMI) and user experience.”
Want to harness the power of real-time technology for ergonomic evaluation? Download Unreal Engine today for free!