February 6, 2020
A comprehensive guide to creating 360-degree game trailers using Unreal
We at Archiact were planning the announcement of our recent VR adventure experience, FREEDIVER: Triton Down, which is available now on Steam and the Oculus Store, and were nosing around for the best way to spread the word about this game we’d made and loved. Since going live, our 360-degree teaser trailer has not only kicked off the most successful announcement week in our studio’s history, but has also shattered every video record our previous trailers ever set. Best of all, no fancy third-party tech or expensive program licences were needed: we accomplished all of this in Unreal Engine with a small team of three developers spending a few hours on setup, plus another day or so of render time. The workflow was amazingly smooth, and we’d love to share it with you.
If you’re wondering if 360 video might be right for your VR game promotions, this post will walk you through the technical steps within Unreal to film, record, and render your in-game content in 360 degrees. We’ll also share some of the learnings we gained along the way regarding the less-tangible side of 360 content creation, such as non-linear storytelling, viewer engagement, and more.
Okay, But Why 360?
For VR developers, the problem is a familiar one: how can you accurately convey just how amazingly immersive and interactive your VR game is, when the vast majority of your marketing materials will be experienced on a 2D screen? Most of us stick to what we know, and rely on creative ways to portray a three-dimensional experience through a 2D medium, largely in the form of trailers, screenshots, and GIFs.

We knew right away that FREEDIVER needed more than that. Between the intense underwater environments and use of one-to-one gestural swimming locomotion, it was screaming for a promotional asset that matched its immersive chops: 360 video seemed like a good place to start.
There was just one catch for us: we had never made 360 content before.
Who Else is Using 360 Content?
What’s the first thing you do when you’re about to do something for the first time? See what everyone else has already done!

Somewhat surprisingly, only a handful of VR games have chosen the 360 format for their promotional videos. The first and most prominent example is the trailer for The Climb, which had an advantage here: its gameplay is straightforward and relatable enough (it’s literally summarized by the title) that the trailer doesn’t need to spend much time establishing features or mechanics. What we noticed right away is the sense of presence. A big part of The Climb’s appeal is the incredible vistas you’re rewarded with during and at the end of each climb, and the way this trailer is shot encourages the viewer to really look around and take in the beautiful scenery. You want to be there, right now; the act of climbing is almost secondary, and that’s okay.

Arizona Sunshine’s 360 trailer takes advantage of both space and time. While you’re whisked from scene to scene in this apocalyptic tableau, time is slowed down enough to give the impression that it’s all happening at once. That lends a real sense of chaos to the view, and sets up expectations for a game that will drop you right in the middle of that intense whirlwind. Interestingly, the storytelling here is quite linear: where The Climb’s trailer rewards the viewer no matter where they choose to look, Arizona Sunshine’s trailer keeps the action right in front of the player, and there isn’t much else to see beyond the immediate action you’re served.

Last, but not least, is the 360 trailer for Psychonauts in the Rhombus of Ruin. At a runtime of 93 seconds, this trailer appears to be pre-rendered and is almost entirely story-based. The use of binaural audio here is key: as the viewer turns their head to explore the spaceship scene, the direction of the characters’ voices always tugs them back to center. One challenge this trailer does highlight is how to tell a clear linear story in a non-linear format; without the use of voice over, there are few context clues to bring the viewer into the story and, therefore, to give them a presence in the world the developers have created. It’s a tough challenge!
What We Learned
- Reward the player for looking around
- Presence is key, gameplay less so
- Find a way to keep your viewer’s attention centered on the most important thing, without punishing them for straying
- In other words, treat it a lot like VR! Many of the same principles apply here.
What Does 360 Content Need to Succeed?
With some new knowledge under our belts, we set our goal for our own 360 trailer. In the end, we decided the focus needed to be on tone and energy: we wanted viewers to feel what it was like to be trapped inside the hull of a sinking ship, to get a taste of the nerves and the shortness of breath that the game so wonderfully douses you in.

Right away, we sketched out the following guiding elements:
- Establish the scene quickly: you are on a boat!
- Establish presence: you are a person! Look at those arms. Those are yours!
- Establish elements of anxiety: this is not a friendly place, and the swimmer is definitely in trouble
- Establish our highest level gameplay mechanics: swimming and oxygen management
- Always give the viewer something interesting to look at
- Put the viewer in danger, then give them hope, and go out with a bang
From that, our first storyboard was born: the viewer opens their “eyes” in the hull of a completely flooded ship galley. They must swim to the nearby intake hatch—past their dead shipmate’s floating body—and get enough oxygen to swim down into the vent and—hopefully—to safety.
The Hidden Challenges of 360
Armed with our storyboard and a rough script, we dove into production. Right away, we faced challenges. Some were story-based: How do we direct the viewer’s eye? How long should we keep the camera in a specific location before moving on? The others were technical: How can we even record this in the way we need? What will our in-game avatar look like when you can swivel the head around in any direction?

With Unreal, the road to answers was much smoother than we anticipated, and even allowed for iteration to get to that perfect end result. Here are the technical steps, one by one, for you to follow along with and/or troubleshoot your existing process.
Step 1: Get Equipped
The first step in production was to get our ducks in a row. And by ducks, we mean plugins. The one we used is the [experimental] Stereoscopic Panoramic Capture plugin from Unreal. Our trailer was created using a pre-4.23 version of the plugin, so be sure to check out the notes on the new version for the most up-to-date workflow!

With the plugin installed, make sure that Instanced Stereo rendering is OFF in the Project Settings, and restart the Editor for the change to take effect. Then add Execute Console Command nodes right after the Begin Play event to configure the capture, as sketched below. Now you’re ready to load up your scene!
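The exact commands depend on your scene and plugin version, but the idea is to set the plugin’s console variables once, on Begin Play, so every capture run uses the same configuration. A minimal sketch using the plugin’s console variables; the values here are illustrative, not the exact ones from our trailer:

```
SP.OutputDir D:/PanoramicCaptures
SP.StepCaptureWidth 4096
SP.HorizontalAngularIncrement 2
SP.VerticalAngularIncrement 30
SP.CaptureHorizontalFOV 30
```

Each line above goes into its own Execute Console Command node, chained after the Begin Play event.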
Step 2: VR Mo-Cap? Easier Than It Sounds!
Since presence is key, we absolutely needed to have the arms of the main character (Ren Tanaka) in every shot. That meant essentially performing motion capture inside the game itself. Sequencer was the best tool for this job, and we used it to record gameplay.

In the FREEDIVER project, our base character is spawned only when the user plays the game. For Sequence Recorder to get a proper handle on the base character, we needed to set the GameMode Override to “GameMode” in the World Settings. Next, we dragged the BaseCharacter manually into the level, selected it, and set its Auto Possess Player property to ‘Player 0’. With those set, the BaseCharacter took inputs from the controller and could then be used to record Sequencer animation.

From there, we opened the Sequence Recorder window, selected the BaseCharacter, and pressed the Add button at the top of the window. Now it’s time to dive into the motion capture:

- Launch the game in VR mode.
- Press Shift+F1 to release focus from the VR window and return to the Editor while VR mode is still playing.
- Press the Record button at the top of the Sequence Recorder window.
- Click back on the VR window to focus it. You should regain control over the character and camera, and see a countdown overlay indicating that the recording will start in 4, 3, 2, 1 seconds.
Lights, camera, action! Once the recording begins, move about the virtual world and perform your actions as you planned. Remember, every action, from head movement to controller inputs, will be recorded as Sequencer animations, so don’t forget to act the part from head to toe.
Tips for VR Mo-Cap:
- Don’t be afraid of multiple takes. Just like capturing live action, it will take a few run-throughs to get everything right.
- Keep your head as steady as you can. Avoid swinging around wildly.
- The viewer needs two to three seconds to fully focus on a new object or action.
- If you don’t have a player avatar with visible hands, definitely consider it! We were amazed by how much character and storytelling was possible through Ren Tanaka’s gestures.
- Exaggerate: tiny motions may not register in the final animation sequence, so keep your arms up high and wide, and your movements slow.
Once the performance is wrapped, press Shift+F1 again to release focus from the VR window, return to the Editor, and stop the recording in the Sequence Recorder window. A recorded sequence will then be created. Open it up to see the animation track contents; you can inspect the character body animation by right-clicking the SM_VRPlayer animation track and double-clicking the recorded animation asset.
Step 3: Fine-tuning Your Animation
While our method of motion capture was effective in recording the essence of the player’s movement through the scene (timing, head movement, and general placement/interaction of hands), no IK system is perfect, and there will likely always be room for improvement in the resulting animation.

Because this teaser trailer would be the VR community’s first ever glimpse of what FREEDIVER has to offer, we wanted the animation to be perfect. It’s more than just a quality issue; it’s also a question of storytelling. First-person footage from a VR game is indistinguishable from first-person 2D game footage unless you have significant player presence in the form of hands and interaction fidelity.
In order to tell the story of Ren’s underwater struggle for survival in the 30 seconds or so we had to tell it, ensuring that her hands were believable “actors” in their movement and interactions was key. Thanks to Sequencer, you can export FBX files of your recorded gameplay motion directly to the animation software of your choice. In our case, that was Maya, which our animator used to fine-tune the animation keyframes written by Sequencer.
Remember: When you export to the animation software, you are only going to see the player avatar, and not the game world/objects around it. Since this makes orientation and fine interactions difficult, we recommend recording an additional 2D render from the viewpoint of your player avatar in Unreal, and attaching it to the head bones of your animating character.
Step 4: Final Tweaks & Export
Reimport your polished animations into Sequencer. Now you have the chance to make any final tweaks to your in-game world. You can add lighting to guide your viewer’s eye, nudge in-game objects to better fit the frame, and even hand-animate moving elements for exact timing. This was by far one of the most useful steps in the process, and Unreal gives you the flexibility to make changes as you need.

Remember: If you want non-in-game visual elements to appear in your 360 video (such as a logo in the corner, legal text, etc.), now is the best time to add them. This way, they won’t be “stuck” to the viewer’s gaze like a sticker on their virtual eyeball, which is a distraction and an instant immersion-killer. If a logo “floats” along with the viewer, but remains in place while they look around, it’s much less intrusive.
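As a sketch of that idea in UE4 C++ (a Blueprint setup works just as well; the class and component names here are hypothetical): attach the logo to something that follows the pawn’s position rather than the HMD camera, so it travels with the viewer but holds still while they look around.

```cpp
// In your pawn's BeginPlay. LogoPlane is a hypothetical
// UStaticMeshComponent* member with the logo material applied.
void AFreediverPawn::BeginPlay()
{
    Super::BeginPlay();

    // Attach to the pawn root, not the camera: the logo then follows
    // the viewer's position but stays put while they look around.
    LogoPlane->AttachToComponent(GetRootComponent(),
        FAttachmentTransformRules::KeepRelativeTransform);

    // Park it ahead of and a little below the default eye line.
    LogoPlane->SetRelativeLocation(FVector(250.f, 0.f, -60.f));
    LogoPlane->SetRelativeRotation(FRotator(0.f, 180.f, 0.f)); // face the viewer
}
```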
When you’re ready to capture, all that’s left is to play in Standalone Game mode and render out the sequence exactly as you staged it. You can choose the resolution your render will output at; for ultra-crisp 360 video, we’d recommend 8K, or 4K at minimum. 360 images for both the left and right eye views will be rendered out and saved individually.
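How the render is actually kicked off depends on your plugin version; with the pre-4.23 plugin we used, the plugin’s console commands do the work (SP.PanoramicScreenshot grabs a single test still). A sketch, assuming a 30-second shot at 60 fps; the frame range is yours to choose:

```
SP.PanoramicMovie 0 1800
```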
Remember: Because the render outputs as 360 stills in sequence, there will be no attached audio. To capture your in-game soundscapes organically, use a screen recorder such as OBS or ShadowPlay to record the Sequencer events independently, then import that audio into your editor later.

Next, fire up your linear editing software of choice and import the images as a sequence. From here, you can color correct as needed, and render out the final master file in the desired video format.
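If you’d rather assemble the master on the command line, ffmpeg can do the stitching as well. A sketch, assuming 60 fps frames named Frame_00000.png onward and an OBS audio capture called audio.wav (both file names are hypothetical):

```
ffmpeg -framerate 60 -i Frame_%05d.png -i audio.wav \
  -c:v libx264 -pix_fmt yuv420p -crf 16 \
  -c:a aac -b:a 320k -shortest freediver_360_master.mp4
```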
VLC can play the resulting 8K 360 panoramic file, complete with click-and-drag gaze control.
Step 5: Editing Your 360 Footage
This is it: you’re finally ready to take your master 360 files and edit them into your trailer, or whatever video asset you’re creating. 360 files will behave just like 2D files in most editing software, so simply arrange the sequences as needed, color correct them, add transitions and title cards, and render out your final. If you recorded the game audio separately, this is the time to add it back in.

Quick Tips for Editing in 360:
- Because your footage will be at least 4K, you will likely need a beefy PC to handle the render.
- You can add 2D elements such as text and still image files, but they must be projected in 360/VR mode to avoid severe distortion in the final render. (Many editing suites have this function built in; otherwise, you should be able to find a plugin to handle your projections.)
- Not all graphics cards are equipped to render video effects in 360. Ensure you have a supported graphics card and update your drivers.
- As mentioned above, avoid overlaying text or graphics in layers directly over the 360 video, as they will remain static and become a severe distraction in the “corner” of your viewer’s virtual eye. If you want to have a logo permanently on-screen, for instance, add it to the game world and attach it to the player avatar instead.
Step 6: Audio
For us, we knew that audio was going to be paramount to the success of this trailer, and that meant leaving it to the audio experts!

The audio for FREEDIVER was created by Interleave, and was designed to be as realistic and immersive as possible. The instrumentation is meant to function organically with the ship’s sounds, since the ship is a primary character in FREEDIVER, one that sings and speaks through its sinking. Instead of approaching the score with traditional brass or strings, the audio designer and composer settled on a musical direction where the ship became the actual source of tone and tension throughout the score. They even rubbed different sustained frequencies against each other based on the user’s input as a way of enhancing the tension the user would feel underwater.
Melodies were conjured by manipulating sounds like dry ice placed in large ventilation shafts, and string instrument bows drawn across different densities of metal in software samplers. When designing the ship’s large impacts and huge metallic whines, Interleave tuned them to work together, with the goal of blending them into the music until it was difficult to distinguish one from the other. In-game, they played with this blending of music and sound design even further: the pause menu’s swelling metal sounds, for instance, play back in 3D, with randomized sample choices, volume, and position moving around the listener’s head.
The teaser provided the additional challenge of hitting many emotional beats in a short time span. At the beginning, the viewer hears the uncertain whines of the ship which crescendo into large haunting blasts, as Ren Tanaka struggles to get to the air pocket in the hatch. As she submerges, the music shifts and builds into a more triumphant and courageous section. The percussion and bass kick off with a rising heartbeat, settling down to begin a shift to an increasingly panicked heartbeat aligned with Ren's fight to survive. The teaser sound effects were edited to picture primarily from gameplay assets and then tweaked and sweetened. Interleave wanted a strong contrast between the air and water-filled environments, so they used a tonal contrast as well as a perceptual one. Plugins were used to add credible space to the air environments and to keep the underwater environment intimate. They also smeared the position of underwater sounds to accent sound wave speed differences in the two mediums, and many of the sounds were positioned using 360° surround tools.
In the end, Interleave supplied us with impeccable 5.1 surround audio, as well as 2-channel files in case we uploaded the final video somewhere without 5.1 support. The trailer’s final form is a haunting, powerful representation of the FREEDIVER experience, and the audio plays a huge role in telling the story of Ren Tanaka’s plight.
Step 7: Rendering & Upload
Once you’ve obtained the audio and laid it under your visuals, it’s time to kick off your final renders. Here are the exact settings we used for our two master renders: one for 5.1 surround, and one for 2-channel.
Video Settings
Format: H.264 with MP4 wrapper
Width: 4096
Height: 2048
Frame Rate: 60
Field Order: Progressive
Aspect: Square Pixels
TV Standard: NTSC
Performance: Software Encoding
Profile: Main
Level: 5.2
Bitrate Encoding: VBR, 2 Pass
Target Bitrate [Mbps]: 30
Maximum Bitrate [Mbps]: 35
Video is VR: YES
Frame Layout: Monoscopic
Horizontal Field of View: 360
Vertical Field of View: 180
Audio Settings
Audio Format: AAC
Audio Codec: AAC
Sample Rate: 48,000 Hz
Channels: 5.1 (or 2-Ch)
Quality: High
Bitrate [kbps]: 320
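A note on the “Video is VR” flag above: it is what writes the 360 metadata that tells players and platforms to enable gaze control. If your encoder of choice has no equivalent flag, Google’s Spatial Media Metadata Injector can stamp the metadata onto a finished file after the fact. A sketch, with hypothetical file names:

```
python spatialmedia -i trailer_master.mp4 trailer_master_360.mp4
```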
Remember: Many video hosts, including Steam, do not support 5.1 sound, and will crunch it down to a messy 2-channel format. To avoid unexpected results, take a page from our book and have a custom 2-channel audio file rendered out for this exact purpose.
Now that you have your final renders (hooray!), it’s time to upload to the video host of your choosing. YouTube, Vimeo, and Facebook all support fully interactive 360 content, but uploads can take a very long time to process, so be sure to give yourself plenty of time before the big reveal.
Once your file has processed, take a moment to quickly double check that the video can play in 360 and is interacting as you expect, then share that awesome creation with the world!
Thanks for reading! If you create your own 360 content using this guide, go ahead and share it with us via Twitter (tag us @ArchiactVR and @UnrealEngine) so we can see all your hard work.