Nickelodeon’s SlimeZone multi-player VR experience comes to IMAX VR Centers
SlimeZone is a social VR experience that allows players to step inside the Nickelodeon metaverse. Built in Unreal Engine by the talented team at the Nickelodeon Entertainment Lab, SlimeZone lets up to six players select from a range of Nickelodeon character avatars, play games, watch cartoons, create art and slime their friends. SlimeZone debuts today exclusively at IMAX VR Centers in Los Angeles, New York and Toronto and will be coming soon to Shanghai, Bangkok and Manchester.
We spoke with Chris Young, Senior VP of the Nickelodeon Entertainment Lab, about building out an immersive version of the Nickelodeon world.
Can you tell us a bit about Nickelodeon Entertainment Lab?
The Nickelodeon Entertainment Lab was officially announced in May 2017, building on work that was started in early 2014. The Lab was established to take a long-range view of technology and how it could be used to create new forms of entertainment for our audience through a combination of prototypes and content development.
How did the SlimeZone project come together?
The initial prototype for SlimeZone was built by a three-person team. It started out as a networked social experiment using motion controllers and simulated physics to create fun multi-user interactions. From that initial proof of concept, it evolved over the course of a year into a robust tech demo and, ultimately, a fully productized experience.
Were assets from the shows themselves used? What was involved in bringing those assets into Unreal Engine?
We leveraged a fair number of existing CG assets from various Nickelodeon IPs through a straightforward Maya-to-UE4 FBX workflow.
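As a rough illustration of that kind of pipeline (and not Nickelodeon's actual tooling), a minimal editor-side sketch of automating an FBX import into UE4 might look like the following; the content path and import options are hypothetical:

```cpp
// Editor-only sketch of batch-importing a Maya-exported FBX into UE4 via an
// automated import task. This illustrates the kind of workflow described
// above, not Nickelodeon's actual tooling; paths and options are hypothetical.
#include "AssetToolsModule.h"
#include "AssetImportTask.h"
#include "Factories/FbxImportUI.h"

void ImportCharacterFbx(const FString& FbxPath)
{
    UAssetImportTask* Task = NewObject<UAssetImportTask>();
    Task->Filename = FbxPath;                         // file exported from Maya
    Task->DestinationPath = TEXT("/Game/Characters"); // hypothetical content path
    Task->bAutomated = true;                          // suppress the import dialog
    Task->bSave = true;                               // save the imported packages

    UFbxImportUI* Options = NewObject<UFbxImportUI>();
    Options->bImportAsSkeletal = true;   // rigged character, not a static prop
    Options->bImportMaterials = true;    // bring in Maya-assigned materials
    Options->bImportAnimations = false;  // assume animations imported separately
    Task->Options = Options;

    FAssetToolsModule& AssetTools =
        FModuleManager::LoadModuleChecked<FAssetToolsModule>("AssetTools");
    AssetTools.Get().ImportAssetTasks({ Task });
}
```

In a real pipeline, a batch step would typically walk a directory of Maya exports and queue one task per file.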
You started using Unreal Engine in 2013. What are more recent features that have made it easier to build AR/VR experiences in Unreal?
The multi-platform, plug-and-play nature of developing with different VR hardware directly in the UE4 editor, along with built-in client/server networking features and our ability to rapidly prototype through Blueprints, allows us to try anything we can dream up. It's also great that we have the added benefit of extending the source to accommodate any wacky, unfiltered ideas, which ultimately frees us up creatively to bring Nickelodeon's unconventional thinking to this emerging platform.
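For readers unfamiliar with those built-in networking features, here is a minimal replication sketch in UE4 C++; the class, property, and RPC names are illustrative assumptions, not SlimeZone's code:

```cpp
// Minimal sketch of UE4's built-in client/server replication, the kind of
// foundation a networked social experience builds on. The class, property,
// and RPC names are hypothetical illustrations, not SlimeZone's actual code.
#include "GameFramework/Actor.h"
#include "Net/UnrealNetwork.h"
#include "SlimeBall.generated.h"

UCLASS()
class ASlimeBall : public AActor
{
    GENERATED_BODY()

public:
    ASlimeBall()
    {
        bReplicates = true;        // server owns the authoritative state
        bReplicateMovement = true; // transform/physics state replicated to clients
    }

    // Replicated gameplay state, readable from Blueprints for fast iteration.
    UPROPERTY(Replicated, BlueprintReadOnly, Category = "Slime")
    float SlimeAmount = 0.f;

    // Clients request a change; the server validates it and applies it.
    UFUNCTION(Server, Reliable, WithValidation)
    void ServerAddSlime(float Amount);

    virtual void GetLifetimeReplicatedProps(
        TArray<FLifetimeProperty>& OutLifetimeProps) const override
    {
        Super::GetLifetimeReplicatedProps(OutLifetimeProps);
        DOREPLIFETIME(ASlimeBall, SlimeAmount);
    }
};

bool ASlimeBall::ServerAddSlime_Validate(float Amount) { return Amount >= 0.f; }

void ASlimeBall::ServerAddSlime_Implementation(float Amount)
{
    SlimeAmount += Amount; // new value replicates to all clients automatically
}
```

Because the replicated property is also exposed to Blueprints, the same state can be prototyped visually and hardened in C++ later.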
How many characters and environments are there in the experience?
We have a dozen different Nickelodeon avatars to choose from that use waveform VoIP to drive blend shapes on the mouths for networked communication. The environment is approximately 100,000 square feet and includes a cinema, basketball court, ping-pong table, art room with a variety of different particle paint tubes, target gallery, a slime arena complete with slime cannons, and SpongeBob’s pineapple house!
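The interview doesn't go into implementation detail, but driving a mouth blend shape from a voice amplitude envelope generally looks something like this sketch; the morph target name, gain, and smoothing rate are assumptions:

```cpp
// Sketch of driving an avatar's mouth blend shape from a VoIP amplitude
// envelope, the general technique described above. The morph target name,
// gain, and smoothing rate are guesses, not SlimeZone's actual values.
#include "Components/SkeletalMeshComponent.h"

// Reduce a buffer of decoded 16-bit voice samples to a 0..1 loudness envelope.
static float ComputeEnvelope(const TArray<int16>& Samples)
{
    if (Samples.Num() == 0)
    {
        return 0.f;
    }
    float Sum = 0.f;
    for (int16 Sample : Samples)
    {
        Sum += FMath::Abs(static_cast<float>(Sample)) / 32768.f;
    }
    // The x8 gain is an assumed calibration to push speech toward full scale.
    return FMath::Clamp((Sum / Samples.Num()) * 8.f, 0.f, 1.f);
}

// Call each frame with the latest decoded voice samples for one avatar.
void UpdateMouth(USkeletalMeshComponent* Mesh, const TArray<int16>& VoiceSamples,
                 float& SmoothedOpen, float DeltaTime)
{
    const float Target = ComputeEnvelope(VoiceSamples);
    // Interpolate so the mouth doesn't flicker from buffer to buffer.
    SmoothedOpen = FMath::FInterpTo(SmoothedOpen, Target, DeltaTime, 12.f);
    Mesh->SetMorphTarget(TEXT("MouthOpen"), SmoothedOpen); // hypothetical shape
}
```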
Why did you choose to build the experience in Unreal Engine?
Initially, we were attracted to UE4 for the level of rendering quality you can achieve and for its real-time approach to producing linear and interactive content with Sequencer. We also liked that Blueprints gave us a visual way to build out our ideas, and that ultimately we had access to the source if we wanted it.
How important is it to be able to iterate frequently on a project like this?
You cannot create VR (or good VR) without iteration. It is unlike any other authoring process, and it relies on being able to quickly create, check, and challenge your thinking. Spatial understanding of your experience informs every aspect of the end product. SlimeZone perfectly embodies the iterative process for creating in VR: you start with two people networked with primitive spheres on their heads, you enable physics and start passing a cube back and forth, and that unlocks an idea that leads to another, and then another.
Can you elaborate on the liquid simulations and use of NVIDIA FleX fluids?
We were initially using a decal system for creating slime, but we knew that we needed true liquid simulation. We were thrilled to find the NVIDIA branch on GitHub, and even more excited that we could simulate as much FleX fluid as we do while maintaining 90 fps.
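For contrast, the decal approach the team started with can be sketched entirely with stock UE4 API; the material and decal sizing here are hypothetical, while the FleX fluid work that replaced it lives in NVIDIA's UE4 branch on GitHub:

```cpp
// Sketch of the decal-based slime the team started with: project a splat
// material onto whatever surface was hit. The material and sizing are
// hypothetical; the FleX simulation that replaced it is in NVIDIA's branch.
#include "Kismet/GameplayStatics.h"
#include "Components/DecalComponent.h"

void SpawnSlimeSplat(UWorld* World, UMaterialInterface* SlimeMaterial,
                     const FHitResult& Hit)
{
    // Point the decal's projection axis into the surface.
    const FRotator Rotation = (-Hit.ImpactNormal).Rotation();

    UDecalComponent* Splat = UGameplayStatics::SpawnDecalAtLocation(
        World,
        SlimeMaterial,
        FVector(16.f, 64.f, 64.f), // projection depth, width, height (guessed)
        Hit.ImpactPoint,
        Rotation,
        10.f);                     // fade the splat out after ten seconds

    if (Splat)
    {
        Splat->SetFadeScreenSize(0.001f); // keep it visible at a distance
    }
}
```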
What’s next? What are some of the ways you’ve dreamed about experimenting with your content and technology?
In the Lab, we believe that the future is rendered real-time. Entertainment will be delivered as an executable. Real-time technology is foundational to VR, AR, and virtual cinema pipelines, so we will continue to explore these tools and platforms to find new ways to connect with our audience.