Out of This World - An Inside Look at the Making of “Field Trip to Mars”
Ron Fosner, who served as lead developer for Framestore's Field Trip to Mars project, has a laugh so infectious that, even when you’re 4,000 miles away, it’s hard not to get drawn into his world.
“Ha, ha! Yeah, definitely. Thirty-six hours before we were going to go live, that’s when I thought, ‘We’re never going to pull this off!’ I looked in the bus and there were no seats, only two out of four monitors were fitted, and the servers were just sitting in a stack on the bus because they hadn’t been installed yet into the server racks," he says. "I’m just like, yeah, I don’t think this is gonna fly.”
But a mere 12 hours later, Ron’s nightmare had started to shape up into the stuff of VR dreams, as the bus drove away on its inaugural trip to the red planet.
Of course, the bus wasn’t really going to leave Earth’s atmosphere and travel the 140 million miles to our nearest rocky neighbor, but for Framestore - the famed VFX house that has taken audiences to low Earth orbit in the Academy and BAFTA award-winning film Gravity, as well as to a celluloid Mars in The Martian - it may well have proved to be a simpler proposition.
At the tail end of 2015, Framestore were approached by the advertising agency McCann New York to pitch for a project for Lockheed Martin. Leaders from Framestore, including Alexander Rea, Sue McNamara and Theo Jones, assembled a team to deliver something truly groundbreaking. The idea? To take schoolchildren on a virtual reality trip to Mars, but with a difference: The VR experience would be shared and take place within a school bus that would drive around the Washington, D.C. area. Every move the bus made, no matter how small, would be mirrored in the virtual world outside the windows, so that riders truly felt they were driving across the surface of another planet. And that’s where many of the (ultimately surmountable) challenges began.
Gary Marshall, a senior developer on the Field Trip to Mars team, has a background in motion capture and tracking. He takes up the story, “The first thing that I did was build an Arduino microcontroller so that we could get accurate orientation data from a GPS and an accelerometer. I built a little bus mesh within Unreal Engine and used the controller that I’d built to rotate it. From there we ramped up pretty quickly and I actually spent some time walking around the local streets with a laptop, a backpack, and a bunch of cables, looking really suspicious. I definitely got a lot of weird looks from the cops!”
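To give a rough sense of that first prototype loop, here is a minimal sketch, in Unreal Engine C++ terms, of how an orientation sample streamed from a rig like Marshall's might be applied to a proxy bus actor. It's our illustration rather than Framestore's code; the ApplyOrientationSample function and FBusOrientationSample struct are assumptions for the example.

```cpp
// Illustrative sketch only: apply one heading/pitch/roll sample (as parsed from
// the microcontroller's serial stream) to a stand-in bus actor in Unreal Engine.
#include "GameFramework/Actor.h"
#include "Math/UnrealMathUtility.h"

// Hypothetical container for one reading from the GPS/accelerometer unit.
struct FBusOrientationSample
{
    float PitchDegrees = 0.f;
    float YawDegrees = 0.f;   // heading
    float RollDegrees = 0.f;
};

void ApplyOrientationSample(AActor* BusProxy, const FBusOrientationSample& Sample, float DeltaSeconds)
{
    if (!BusProxy)
    {
        return;
    }

    // Interpolate towards the latest reading so sensor noise doesn't jitter the mesh.
    const FRotator Target(Sample.PitchDegrees, Sample.YawDegrees, Sample.RollDegrees);
    const FRotator Smoothed = FMath::RInterpTo(BusProxy->GetActorRotation(), Target, DeltaSeconds, 5.f);
    BusProxy->SetActorRotation(Smoothed);
}
```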
Aside from weird looks, one of the problems the team encountered, and one they had to solve for the project to work at all, was the inherent lack of granularity in current GPS systems. The tiny movements of the bus had to be mirrored exactly within the Unreal Engine-powered Martian landscape, and GPS works in meters rather than the centimeters the team needed.
Marshall continues, “Like a lot of things, the solution came about partly by chance. I happened across a laser that is used to measure the speed of conveyor belts within factories, and we thought that we could flip the paradigm. So instead of the laser being static and the target moving, how about we make the laser mobile and have it pointing at a static road? So we bought these crazy, high-end lasers from a German company (Polytec) and rigged them up to an old record player. I wrote an interface that allowed the lasers to speak to the software and watched as the mesh bus moved back and forth in Unreal Engine as we moved the record player. All of a sudden we had this level of granularity that would allow us to work at a much higher level of resolution than GPS.”
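Stripped right back, the laser trick is dead reckoning: the sensor reports how fast the road is sliding past underneath the bus, and that speed is integrated every frame to move the virtual bus forward. The sketch below is our own rough illustration of the idea rather than the team's actual interface (the function name and the centimeters-per-second input are assumptions), and it also hints at why the approach plays so nicely with Unreal Engine, whose default world unit is the centimeter.

```cpp
// Illustrative sketch only: dead-reckon the bus's position in the Martian scene
// from a laser surface-velocity reading taken under the bus.
#include "GameFramework/Actor.h"

// RoadSpeedCmPerSec is whatever the laser reports for the road moving beneath the
// bus, converted to centimeters per second (Unreal's default unit is centimeters).
void AdvanceBusFromLaser(AActor* BusProxy, float RoadSpeedCmPerSec, float DeltaSeconds)
{
    if (!BusProxy)
    {
        return;
    }

    // Integrate speed over the frame and push the bus along its forward axis.
    const float DistanceCm = RoadSpeedCmPerSec * DeltaSeconds;
    BusProxy->AddActorWorldOffset(BusProxy->GetActorForwardVector() * DistanceCm);
}
```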
One issue was solved, but there were more to come.
Mars is a big place. And Washington, D.C., where the bus was going to be driving around, is also a big place. That was going to require careful thought.
“Originally, we were going to aim for a 250 km / 155 mile square mapset of the D.C. area. This is a huge amount of data, somewhere in the region of 6x the size of Rockstar’s Grand Theft Auto 5. Pretty soon we realized that this was too ambitious, so we settled on a 4 km / 2.5 mile square, which is still an amazing amount of data to deal with. But it was a relative walk in the park compared to our first idea,” said Bill Davey, map and data development lead.
Davey recalls being in awe of the landscape that they were going to create and the volume of data that they were going to have to work with. “As well as the data, we were continually needing to solve problems. The day we tested the laser tracking solution, it rained. We get out on the bus and there’s no data and we have no idea why. It’s an expensive bit of kit and it doesn’t work. Then someone thought to look under the bus, and realized that the lens was covered in water. That’s not a problem in a factory but when it’s on the side of a bus… So we wiped it clean and crafted a rain guard. Another physical problem solved.”
Not all of the obstacles the team faced were physical, or even engine-based. Framestore has been producing VFX for more than 30 years, and is used to dealing with pre-rendered imagery. Despite that depth of experience, Bryan Brown, the lead technical artist, had to educate the group on the implications of working in real-time 3D.
“Some of the biggest challenges I faced were getting the team up to speed on how we’d be using real-time, and managing expectations. When you’re creating a scene for a movie, the sky really is the limit. You have an abundance of time, and can throw processing power at creating exactly the end result you want. But real-time works differently: you’re constantly balancing the different resources you have at your disposal. So, where I wanted to keep the landscape down at 2k textures, the team wanted to push to 4k. I spoke to our technical partners within Epic and they advised that this was experimental at the time, but we tried it and Unreal Engine handled it brilliantly. So, buoyed by that success, the next internal question was, 'Hey, how about 8k?' And I was like…No, real-time doesn't work like that!"
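The arithmetic behind that caution is simple: every doubling of texture resolution quadruples the memory. As a rough, uncompressed illustration, a 2,048 x 2,048 RGBA texture is around 16 MB, the 4,096 x 4,096 version of the same map is 64 MB, and an 8,192 x 8,192 one is 256 MB - before mipmaps, and multiplied across every material in the scene. Block compression shrinks each step, but that four-times-per-doubling scaling is what makes 8k a very different ask from 4k in real time.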
Claude Dareau, senior developer on Field Trip to Mars, elaborates on a few of the procedural solutions that he devised.
“This was the first project in which I’d ever used Unreal Engine, so it was a definite learning curve for me, but I think that as I was coming into it fresh I wasn’t hindered by any preconceptions of what was or wasn’t possible. We had to have this landscape that was largely procedurally generated but still looked like a filmic interpretation of Mars, so creating this landscape and placing the geometry was tough.”
As he explained, “We had Martian objects that would have been very difficult and time-consuming to place by hand, so it was done procedurally. The maps also needed to be flood-filled, which again takes a lot of team time and man hours to do, and would have stretched the time we had available for the project. So I devised a plugin that would auto-generate all of the background landscape features, allowing the rest of the team to concentrate on hand building the hero features within the scene.
"I remember feeling a sense of trepidation ahead of pressing the button but, after two weeks of working on this plugin, I sat there and watched it churn away to provide a populated landscape that went a long way towards enabling us to meet the deadlines that were in place. That was a very cool, very satisfying moment.”
For such a massively tech-heavy project, and with all the pitfalls that could have befallen them, the team were universally full of praise for Unreal Engine.
“I’ve worked in Unreal Engine for the past four years, and I’m still amazed at how much data it can crunch,” says Bryan Brown.
Ben Fox, the project's technical artist, agrees: “The cool thing is that it works with real world stuff really well. You just put stuff in there and say, 'I want a sun here, a sky here…' and it just deals with those concepts very well. I didn’t have to create those natural phenomena as the engine does such an excellent job of doing it. It was the natural choice for the aesthetic we wanted. I don’t really think that there was ever any other option for what we were going to use.”
Unreal Engine is often lauded for its ability to deliver an incredibly rich and detailed real-time user experience. But the visual experience is only a part of the story. Mars not only had to look good, it had to sound great.
“The thing about VR is that spatialisation is the key to making it really convincing,” says Joey Hernandez, sound designer on the project. “We wanted audio to be an integral part of the experience, and a part of that was having the audio playing the whole time. So, I developed a system where non-specific audio would play so that the kids on the bus wouldn’t get bored. I had to be able to turn that off seamlessly so that when you passed by an event, such as the Mars rover, it tells you all about the Mars rover. So it had to be timed exactly. And that wouldn’t have been possible without using Unreal Engine.”
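A simplified version of that hand-off might look like the snippet below. It's our sketch rather than the production system - the function and the two audio component parameters are assumed - but it shows the basic idea: the always-on ambient bed fades out while the location-triggered narration fades in over the same window, so nothing ever cuts dead.

```cpp
// Illustrative sketch only: crossfade from the ambient audio bed to an
// event-specific narration cue as the bus passes a point of interest.
#include "Components/AudioComponent.h"

void PlayEventNarration(UAudioComponent* AmbientBed, UAudioComponent* Narration, float CrossfadeSeconds)
{
    if (!AmbientBed || !Narration)
    {
        return;
    }

    // Fade the ambient layer down to silence rather than cutting it...
    AmbientBed->FadeOut(CrossfadeSeconds, 0.f);

    // ...and bring the narration up over the same period, timed to the event trigger.
    Narration->FadeIn(CrossfadeSeconds);
}
```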
With multiple challenges faced and overcome, the project moved towards the endgame, where the client and, ultimately, the public would experience what it was like to take a bus ride around Mars. But there were still a few surprises in store.
Davey recounts the last demo the team carried out in front of the client, Lockheed Martin.
“It was conducted at Lockheed Martin’s campus. We were all set, and parked out front to await their senior team. Trouble was that no one had told us that the LM campus was built on some kind of semi-magnetic rock. So as we drove the bus around the car park, the gyroscope on the roof of the bus would spin out of control. It just kept constantly going round and round, only stopping when we drove about 50 meters away, at which point we were completely in the opposite orientation. The only way to fix it was to rig up a game pad input system so I could use the triggers on the game pad to bring it around the right way before the user could see it, and then go from there. After everything, all the tech in the world couldn’t have prevented 'The Rock' from screwing things up!”
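As a rough illustration of that workaround - hypothetical class, member, and axis names, not the team's code - the gamepad triggers can be bound as input axes that accumulate a manual yaw offset, which is then added to the sensor heading before the virtual world is oriented each frame.

```cpp
// Illustrative sketch only: a pawn that lets an operator trim the virtual heading
// with the gamepad triggers when the compass/gyro reading can't be trusted.
// Class, member and axis names ("YawCorrectRight"/"YawCorrectLeft") are assumptions.
#include "GameFramework/Pawn.h"
#include "Components/InputComponent.h"
#include "BusPawn.generated.h"

UCLASS()
class ABusPawn : public APawn
{
    GENERATED_BODY()

public:
    virtual void SetupPlayerInputComponent(UInputComponent* PlayerInputComponent) override
    {
        Super::SetupPlayerInputComponent(PlayerInputComponent);
        PlayerInputComponent->BindAxis("YawCorrectRight", this, &ABusPawn::NudgeYawRight);
        PlayerInputComponent->BindAxis("YawCorrectLeft", this, &ABusPawn::NudgeYawLeft);
    }

    // Added to the sensor heading each frame, so the operator can quietly wind the
    // world back to the correct orientation before riders notice.
    float ManualYawOffsetDegrees = 0.f;

private:
    // Degrees per second at full trigger pull.
    float CorrectionRateDegPerSec = 15.f;

    void NudgeYawRight(float TriggerValue)
    {
        ManualYawOffsetDegrees += TriggerValue * CorrectionRateDegPerSec * GetWorld()->GetDeltaSeconds();
    }

    void NudgeYawLeft(float TriggerValue)
    {
        ManualYawOffsetDegrees -= TriggerValue * CorrectionRateDegPerSec * GetWorld()->GetDeltaSeconds();
    }
};
```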
“We also forgot to turn the sound system on!” interjects Joey Hernandez. “The 'on' button was in this little cubby that we thought no one would ever have to go into. But I had to crawl in on my hands and knees and get pulled out backwards by my feet. That was… interesting.”
Talking to the team, it becomes clear just how close-knit they became during development. Everyone had an integral part to play, and everyone’s thoughts and comments carried the same weight. The end result is a testament to the Skunk Works-style boot camp they endured together, right to the end.
“In order to fully outfit the bus we de-camped to a garage in D.C. It was 34°F. We had five heaters in there but the circuit breaker couldn’t handle them – we had a choice of PCs or heaters, so we put one heater under a table, and would all scoot under there to get warm,” recalls Fosner.
“There were moments when we’d be months into the project and we’d see the same boring drive and we’d think 'Oh, we should really figure out a way to make this not boring for the kids,' and then it was like, 'Wait a minute, the kids have literally never been on a bus where the windows turned into an alien planet,' so I had to keep reminding myself that this was cool and something completely different.”
But we’ll leave the final words to Dareau.
“The single best moment was when we did the first run with the children. The build up to that point was fraught with problems, mostly hardware-related, and we’d all been working long, long hours. We get the kids on the bus, the screens go dark and Mars pops up, and they go crazy. Just seeing their reaction was incredible. I’d only had a few hours' sleep and was shattered. I definitely felt emotional when I saw that.”
Field Trip to Mars is an incredible realization of an idea, and a fantastic example of how a small team of people, using the right tools and with the right motivation, can create experiences that are genuinely groundbreaking. It’s not just us who think so, either.
Framestore’s project was the single most awarded campaign at the recent Cannes Lions International Festival of Creativity. It won a total of 19 Lions across 11 categories, including:
- 1 Innovation Lion
- 5 Gold Lions
- 8 Silver Lions
- 5 Bronze Lions
It was also nominated for a Titanium Lion, one of just 22 entries out of over 43,000 across all categories from around the world. Quite an achievement.
We’re proud and honored that the team at Framestore chose to use Unreal Engine, and we’re glad that we were able to help them deliver on the promise of taking the next generation of scientists, programmers, real-time visualization experts and (potentially) astronauts on an interplanetary trip of a lifetime.
If you haven’t already seen the video, we urge you to watch it below.
Hear from the team, and see behind-the-scenes footage. It’s definitely worth five minutes of your time.
And who knows, it may just inspire future Unreal Engine developers to think big and shoot for the stars...or at least for Mars.