Imagine being able to see what cells, platelets, and viruses look like close up—all without a microscope. It may sound like something from a sci-fi novel (or a particularly memorable school field trip to the science museum), but for the team at Random42, it’s part of everyday life.
The only difference is that instead of shrinking the human body, they’re using technology to expand it. After investing in an Oculus DK1, the London-based studio, which has been creating pioneering medical animation for three decades, began using Unreal Engine’s Blueprint visual scripting system to make realistic blood vessels they could explore in VR. “We were surprised at how easy it was to set up and build a project in Unreal Engine, compared to other game engines we had experimented with before,” says Callum Welsh, Head of R&D at Random42.
Image courtesy of Random42
It was the start of a pipeline that would lead to Random42’s Covid-19 Infection and Vaccine Development Animation: their first animation made entirely in Unreal Engine. The recently released short follows the hugely successful Coronavirus Outbreak Animation, which was released in 2020 and has since received almost two million views, been featured in news programs and documentaries around the world, and spawned this brand-new sequel on vaccine development.
Image courtesy of Random42
No room for error
According to Welsh, the move to Unreal Engine has resulted in more time to iterate on shots, a huge advantage when every frame needs to be as scientifically accurate as possible. This was especially important for their latest vaccine development animation: any lack of clarity could increase vaccine hesitancy or fuel misinformation.
To ensure their medical animation was as accurate as possible, Random42’s science department began by collating information from a variety of sources into a fact-checked narrative. The art and animation team then downloaded point clouds derived from Protein Data Bank structures to serve as reference for building highly accurate virus geometry. A procedural Houdini setup meant they could process dozens of these point clouds at a time and make sweeping changes to all affected exports. Each export was then brought into Niagara to instance meshes in the Unreal Engine level.
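Random42’s own pipeline runs those point clouds through Houdini before Niagara instances meshes onto them, but the first step, turning a Protein Data Bank entry into a simple point cloud, can be sketched in a few lines of Python. The PDB ID (6VXX, a published SARS-CoV-2 spike structure), the RCSB download URL, and the CSV layout below are illustrative assumptions rather than the studio’s actual tooling.

```python
# Minimal sketch: fetch a structure from the RCSB Protein Data Bank and
# export its atom positions as a point cloud (CSV) that a DCC tool such as
# Houdini, or a particle system driven by that data, could instance meshes
# onto. The PDB ID and CSV layout are illustrative, not Random42's pipeline.
import csv
import urllib.request

PDB_ID = "6VXX"  # assumed example: a SARS-CoV-2 spike glycoprotein entry
URL = f"https://files.rcsb.org/download/{PDB_ID}.pdb"


def fetch_pdb(url: str) -> list[str]:
    """Download a PDB file and return its text lines."""
    with urllib.request.urlopen(url) as response:
        return response.read().decode("utf-8").splitlines()


def atom_positions(lines: list[str]) -> list[tuple[float, float, float]]:
    """Extract x, y, z coordinates from ATOM/HETATM records (fixed-width PDB columns)."""
    points = []
    for line in lines:
        if line.startswith(("ATOM", "HETATM")):
            x = float(line[30:38])
            y = float(line[38:46])
            z = float(line[46:54])
            points.append((x, y, z))
    return points


if __name__ == "__main__":
    points = atom_positions(fetch_pdb(URL))
    with open(f"{PDB_ID}_pointcloud.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["x", "y", "z"])
        writer.writerows(points)
    print(f"Wrote {len(points)} points for {PDB_ID}")
```

A file like this is only one of many formats a Houdini or Niagara setup could ingest; the point is that a published structure provides real atomic positions to build and instance geometry from, rather than an artist’s guess.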
Image courtesy of Random42
“Because we were using a real-time workflow, it was effortless to jump into a shot and constantly polish the animation, spawn particles, or try different lighting,” Welsh explains. “We would position another hero protein to fill out some dead space and have an updated rendered playblast in the edit in a few minutes. With Sequencer, we were also able to trigger events and update variables in Blueprints and Niagara systems, which meant we had so much freedom in animation. It was great to be able to scrub the timeline, notice a section that required some improvement, and then jump into the viewport and tweak the layout, materials, and lighting on the fly.”
By the time the project was complete, the Random42 team had also built custom tools to simplify material creation across the project, and saved further time by using MetaHuman Creator to create realistic characters.
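The article doesn’t go into what those material tools looked like, but a small batch utility of this kind can be sketched with the Unreal Editor’s Python scripting support. Everything project-specific below, the parent material path, the parameter names, and the preset values, is a hypothetical stand-in rather than Random42’s actual tool.

```python
# Hypothetical Unreal Editor Python utility in the spirit of the material
# tools described above: batch-creates Material Instances from one parent
# material and fills in per-asset parameters. Runs inside the Unreal Editor
# with the Python scripting plugin enabled. The asset paths and parameter
# names ("Roughness", "BaseTint") are assumptions for illustration only.
import unreal

PARENT_MATERIAL_PATH = "/Game/Materials/M_Protein_Master"  # assumed asset path
DEST_FOLDER = "/Game/Materials/Instances"                  # assumed destination

# Instance name -> (roughness value, RGBA tint) for each material to generate.
PRESETS = {
    "MI_SpikeProtein": (0.35, (0.85, 0.2, 0.2, 1.0)),
    "MI_Membrane":     (0.60, (0.2, 0.4, 0.9, 1.0)),
}


def create_instances():
    asset_tools = unreal.AssetToolsHelpers.get_asset_tools()
    parent = unreal.EditorAssetLibrary.load_asset(PARENT_MATERIAL_PATH)
    for name, (roughness, tint) in PRESETS.items():
        # Create an empty Material Instance Constant asset...
        instance = asset_tools.create_asset(
            name, DEST_FOLDER,
            unreal.MaterialInstanceConstant,
            unreal.MaterialInstanceConstantFactoryNew())
        # ...parent it to the master material and override its parameters.
        unreal.MaterialEditingLibrary.set_material_instance_parent(instance, parent)
        unreal.MaterialEditingLibrary.set_material_instance_scalar_parameter_value(
            instance, "Roughness", roughness)
        unreal.MaterialEditingLibrary.set_material_instance_vector_parameter_value(
            instance, "BaseTint", unreal.LinearColor(*tint))
        unreal.EditorAssetLibrary.save_loaded_asset(instance)


if __name__ == "__main__":
    create_instances()
```

Run from the editor’s Python console, a script like this keeps dozens of material instances consistent with one master material, the sort of repetitive setup that custom tooling typically exists to remove.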
Image courtesy of Random42
“As we were working on the project, the first preview of MetaHumans came out,” Welsh remembers. “Like everyone else, we were super excited to try it out. One of the artists did a quick test scene and found it fit perfectly. It was very intuitive and easy to use, and the various face morphs looked great in animation. The best thing for me is it standardizes a rig for everyone to use. If we produce animations on one MetaHuman, we can easily transfer them onto another. Together with Blueprints and Niagara, this aided in a much more collaborative animation process, as we could do real-time reviews and create the final result in record time.”
The need for more education
“Our goal is to educate people, so ensuring we are a resource you can trust is key,” Welsh adds. “With Unreal Engine, we were working so much more efficiently that it was a lot easier to scrap something if it wasn’t working, as you hadn’t invested as much time in blocking it out. That meant we could potentially even scrap entire scenes if an element wasn’t up to our high standards of accuracy, without missing any deadlines.”
As modern medicine and healthcare become ever more advanced, it can be increasingly difficult for people to understand the complexities of the body and make informed decisions about treatments. With their new real-time workflow, Random42 aims to push the quality of their animation even further, making more engaging content that people can more easily digest and learn from.
Image courtesy of Random42
“In the future, we want to revisit several Unreal Engine R&D tests, like a virtual camera rig that we made using Oculus Touch,” Welsh concludes. “The most important thing, however, is to help save lives by democratizing information. Research has shown that video learning is much more effective than text-based learning styles for engagement, understanding, and retention. Animations have always been an effective way of telling stories, and medical animations are doing the same thing. We are telling the story of how a disease works, and hopefully, this will help us all better understand how to mitigate its effects.”