March 9, 2020
Redesigning Audio in Unreal Engine’s Shooter Game Sample Project
This dev blog is intended for anyone who’s interested in the sound design process or implementation side of working with Unreal Engine. I was hired by Epic Games to perform a complete audio overhaul of their “Shooter Game” project, which can be installed via the Epic Games Launcher. It’s also Epic’s most downloaded starter template, so if you haven’t done so, grab it and check it out.
I not only redesigned all of the audio assets, but also included scalable Blueprint logic and tried my best to depict things in a clear and digestible manner, while still incorporating a few useful parameters, systems, variables, and components that I feel most sound designers will want to look into. What I've set up is merely a design suggestion and one of many ways to approach audio implementation in Unreal Engine. I feel there's a misconception within the game-audio community that in order to be "next-gen sounding" (whatever that actually means), you need to incorporate middleware into your UE project, which simply isn't true.
My hope is that we see an increased number of game audio folks who recognize the value of becoming a more technically savvy content creator. There’s no disputing the massive improvements to the native Unreal Engine tools and features that Epic’s programming and QA teams have been implementing, testing, and modifying over the past few years, and they have a very ambitious and exciting roadmap ahead. I feel it’s our responsibility as professional sound designers to take advantage of these tools and technology, becoming better game designers to boot, rather than relying on a programmer to complete often simple implementation tasks for us, especially considering the dawn of the Blueprint era.
As someone who has been a huge advocate of Unreal Engine since 2012, and who's passionate about the art of sound design and the craft of implementation, I was humbled to be hired for this project. After taking everything into consideration (sound design, implementation, discovery, bug fixing, playtesting, meetings, syncing, etc.), I estimated this to be a 15 to 20-day job. Once I synced to the project, I've gotta say, it was pretty rewarding to select all the actors in the audio sub-level and delete them. Here's a quick overview of my experience and some key things I kept in mind.
The Project Goal
My main objective on this project was to update the soundscape and make it more modern sounding and dynamic. The previous assets were over-compressed, the mix wasn't attended to as well as it should have been, and the assets had a pretty dated sound. There was also a rather imposing music track, which was pretty demanding on the ears in terms of frequency and volume. It took away many opportunities to experience individual game events and ultimately led to gameplay that was less fun. Among the bigger offenders: the weapons lacked punch, the ambience needed to offer more of a sense of space and texture, the footsteps were super heavy and loud, and the pickups were borderline explosive. I could go on about individual assets, but for the sake of time, I felt that all assets needed an update. I basically just nuked everything that was in the game and started from scratch.
Here’s rather satisfying footage of me nuking everything in the audio sub-level
Sound (Re)Design
The audio in Shooter Game had a dated aesthetic and an imbalanced mix. Since it is still one of the most commonly downloaded sample projects for new licensees and people new to Unreal Engine, this may have given people the wrong impression regarding what's possible with the native audio engine. Aside from that, the dated sound, combined with an inconsistent implementation strategy, made the sample game much less fun to play. The audio assets and implementation lacked balance between the main classes of sound (e.g. weapons vs. footsteps vs. pickups). The .wav assets had limited dynamic range and were competing with each other in the same frequency space. They also lacked individuality and articulation. The soundscape simply seemed bland and, as a result, diminished any emotional connection to the experience, which is arguably the most crucial role of audio in any media. Sound designers want to ensure that any cause or effect in a video game has purpose-driven and meaningful audio design that supports the overarching game design and story. Because of these issues, I felt that a complete overhaul was necessary.

For a couple of days, I recorded a good amount of original source material by going around different areas of San Diego armed with mics and field recorders. I grabbed all sorts of stuff, ranging from ambiences to footsteps on different surfaces. I also recorded custom foley, such as metal impacts and other textured recordings, at my studio. Finally, I captured some excellent source material using different synthesizers and ran all sorts of weird stuff through FX chains to evoke a mysterious and futuristic sci-fi world.

Implementation
Some gameplay systems in this project were hardcoded in project-level C++ code. In those cases, I kept the existing implementation as-is and simply swapped out SoundCues and source files with newer sound content. For example, the weapon-shooting system uses a hard-coded looping sound system. That system references a custom SoundCue node that only exists in Shooter Game; called Sound Node Local Player, it switches assets based on first- and third-person instantiation. Aaron and Dan informed me that this is usually done in higher-level gameplay code systems and not in Sound Cues. But for Shooter Game, because of engineering priorities, changing how the project implements local vs. non-local player sounds wasn't an option; I had to do this redesign without involving any programmers for new features or code work. Thus, I continued using this implementation method. For other implementation cases, like impact templates, stingers, and pickups, I generally left the sound hooks alone and only swapped out the sounds themselves and their associated cues.

For the remaining implementation work, I felt there were three main areas I could focus on that most people would likely find insightful:
- Overlap Events: Changing ambience and one-shot sounds based on overlap events.
- Actor Blueprints: Creating custom utilities to repurpose and customize.
- Material-Based Footsteps: Changing footstep (or other) sounds based on surface-type hit results.
Overlap Events

In my opinion, overlap events represent Unreal Engine implementation 101. They allow us to create a Blueprint event from something we can place in the level. Once we are in Blueprints, sound designers have a lot of control over any sound system they might design.

Figure 1. Placing a Brush Actor to trigger an overlap event in an area where I want to have custom audio behavior in the level.
To illustrate this, in Figure 1 above, I've placed a custom volume (for the reverb) and a trigger (for the ambience) to match the layout of the room. Any time the player overlaps these invisible Brush Volumes in the level, we can do any of the following:
- Change the room tone or ambient bed that’s playing in the space.
- Switch to more unique one-shot sounds that trigger randomly around the player’s world position while we’re in that space.
- Change the reverb for any sounds inside this space.
- Turn sounds on and off either inside or outside the volume.
- Apply high-pass and low-pass filters (HPF/LPF) to sounds playing either inside or outside the volume.
Figure 2. Overlap events execute logic that controls ambience.
In Figure 2, I demonstrate a simple Blueprint script, which executes when an overlap event occurs. First, I check that the actor that caused the overlap event and the controlled pawn are the same. I then retrieve an audio component I have in the world by reference and fade it in. Boom, simple. Note that the reason I need to check that the right actor is triggering the volume is that NPCs (or a character controlled by another player across a network connection) might otherwise change the ambient sounds on our local client based on their actions. That would get confusing really fast.
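For anyone who'd rather read this logic as code than as a node graph, here's a rough C++ sketch of the same idea. This is not the actual Shooter Game implementation; the AAmbienceTriggerVolume class and the AmbienceActor reference are hypothetical stand-ins for the Brush Volume and the ambient sound actor I reference in the Blueprint:

```cpp
// A sketch (not the actual Shooter Game code) of the Figure 2 overlap logic:
// when the locally controlled player enters the volume, fade in a looping
// ambience that already exists in the level.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/AudioComponent.h"
#include "Kismet/GameplayStatics.h"
#include "Sound/AmbientSound.h"
#include "AmbienceTriggerVolume.generated.h"

UCLASS()
class AAmbienceTriggerVolume : public AActor
{
    GENERATED_BODY()

public:
    // Hypothetical reference to the ambient sound actor placed in the level,
    // assigned in the editor (the "get by reference" step in the Blueprint).
    UPROPERTY(EditInstanceOnly, Category = "Audio")
    AAmbientSound* AmbienceActor = nullptr;

    virtual void NotifyActorBeginOverlap(AActor* OtherActor) override
    {
        Super::NotifyActorBeginOverlap(OtherActor);

        // Only react to the locally controlled player pawn. NPCs and remote
        // players overlapping the volume shouldn't touch our local ambience.
        if (OtherActor != UGameplayStatics::GetPlayerPawn(this, 0))
        {
            return;
        }

        // Fade the room tone in; the duration and target volume are made up here.
        if (AmbienceActor)
        {
            AmbienceActor->GetAudioComponent()->FadeIn(/*FadeInDuration=*/2.0f, /*FadeVolumeLevel=*/1.0f);
        }
    }
};
```

A matching NotifyActorEndOverlap override would call FadeOut the same way when the player leaves the space.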
Figure 3. Here’s audio logic that spawns one-shot sounds at random vector coordinates around the player’s world location.
In figure 3, I’ve created some logic that uses overlap events and a Timer By Function that periodically spawns a random one-shot sound around the player’s location. The distances and frequency in which these sounds are spawned are completely user defined via public float variables that I can easily tweak and iterate on as I try different sounds out in PIE. This is, of course, in addition to our standard pitch and volume variations within the cue.
Figure 4. Audio logic for the “BP_Audio_Spline” Blueprint.
Actor Blueprints
If you haven’t already discovered their power, take it from me: Blueprints are extremely powerful and unlock an immense amount of audio functionality and customization. For example, in Figure 4, I show my utility script that handles the logic of playing a sound on the point of a spline closest to the listener. Figure 5 below shows an example spline path for the sound to travel.Figure 5. I draw a spline which follows the balcony’s edge in the High-Rise Level. My looping wind sound will emit from the closest point on the spline to the player’s position.
Typical use cases for this kind of thing might be water flowing in pipes, looping water sounds for curved rivers or streams, and so on. The reason you'd want to do this is to create the sense of a larger, distributed sound source rather than a single point source or a series of point sources.

Figure 6. A visual depiction of audio following the player’s closest position to a spline.
In this project, I’m using it in a few areas. Figure 6 shows one case where I use it along the balcony edge. I simply created a spline path for the audio to traverse along and used a windy/exterior looping sound to play on that path. As a result, it feels like the closer you get to the edge, the more wind you hear. But it won’t rise and fall in volume (i.e. a series of one-shots).
Material-Based Footsteps
Figure 7. The Blueprint script that changes footstep sounds based on the hit result of a line trace.
In general, changing which footstep or other foley sounds play based on what physical material the character is walking on or interacting with does not require code support in Unreal Engine. Figure 7 shows my logic for footsteps and jumplands (i.e. the sound that plays when the character hits the ground after a jump). Something like this script can be used for any case where you want to use a line trace, obtain information about what was hit, and change sounds based on that result.
Figure 8. Playsound notifies trigger sounds while animation notifies trigger Anim BP graph events.
Figure 8 shows that the events seen in this Animation Blueprint are being triggered by individual animation notify events that I've set up and named "Footstep" in various run and sprint animations.
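In the project these are Blueprint notify events living in the Anim BP, but the same hook can also be expressed as a custom UAnimNotify subclass in C++. Here's a sketch, assuming the hypothetical AFootstepCharacter class shown in the footstep sketch further below:

```cpp
// A C++ alternative to the Blueprint "Footstep" notify event, placed on the
// run/sprint animations the same way.
#pragma once

#include "CoreMinimal.h"
#include "Animation/AnimNotifies/AnimNotify.h"
#include "FootstepCharacter.h" // hypothetical; see the footstep sketch below
#include "AnimNotify_Footstep.generated.h"

UCLASS()
class UAnimNotify_Footstep : public UAnimNotify
{
    GENERATED_BODY()

public:
    virtual void Notify(USkeletalMeshComponent* MeshComp, UAnimSequenceBase* Animation) override
    {
        // Forward the notify to the owning character, which runs the line
        // trace and picks the right surface-specific sound.
        if (MeshComp)
        {
            if (AFootstepCharacter* Character = Cast<AFootstepCharacter>(MeshComp->GetOwner()))
            {
                Character->PlayFootstepSound();
            }
        }
    }
};
```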
Before utilizing this method, you first need to ensure that the project actually has all the physical materials set up on different surfaces. The physical surface types that exist are defined in the Project Settings. In this sample project, I'm using only tile, grass, and metal materials. However, if your project has a unique material like the "liquid metal" in the movie Terminator, you'll want to define it in the project's settings. Once the surface types are set up, you'll be able to assign them to physical materials and meshes that have been placed in the level. Usually environment artists have set these up for graphics purposes already, so sound designers often just need to go and see what exists and use them.
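For reference, defining surface types via Project Settings > Physics just writes entries into the project's DefaultEngine.ini. Assuming surface names matching the ones I use here, the result looks something like this:

```ini
; DefaultEngine.ini -- surface types added via Project Settings > Physics.
; The SurfaceType slot numbers are arbitrary; the names are what show up
; in the editor and in physical material assets.
[/Script/Engine.PhysicsSettings]
+PhysicalSurfaces=(Type=SurfaceType1,Name="Tile")
+PhysicalSurfaces=(Type=SurfaceType2,Name="Grass")
+PhysicalSurfaces=(Type=SurfaceType3,Name="Metal")
```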

Figure 9. Using a "map" variable to associate physical material types and sound cues.
In Figure 9, I’ve made a variable type map where I associate the hit result of my line trace to play different sounds based on the surface that’s been hit. The Blueprint script performs the following logical sequence:
- The anim notify triggers an event
- It gets the pawn owner and casts to the Player Pawn BP
- It then gets the world location (my starting vector)
- It then subtracts 100 units (my ending vector)
- It performs a line trace using those coordinates
- If the line trace hits a surface, it tells me what was hit
- It returns the physical material that was hit and finds the sound reference I've set up in my Map variable
- It then sets a sound variable called "Sound To Play" using the result found in the Map
- Finally, it references the mesh on the player pawn and spawns my footstep sound attached to the mesh
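Here's how that whole sequence might look as C++, sketched with hypothetical names (AFootstepCharacter, FootstepSounds, PlayFootstepSound); the actual project does this in the character's Blueprint. One small difference: the Blueprint Map keys on the physical material asset itself, while this sketch keys on the EPhysicalSurface enum, which achieves the same mapping:

```cpp
// A sketch of the Figure 9 footstep sequence: trace down, read the physical
// surface that was hit, look up a SoundCue in a map, and spawn it attached
// to the character mesh.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Character.h"
#include "Kismet/GameplayStatics.h"
#include "Sound/SoundCue.h"
#include "FootstepCharacter.generated.h"

UCLASS()
class AFootstepCharacter : public ACharacter
{
    GENERATED_BODY()

public:
    // Surface type -> footstep cue (e.g. Tile, Grass, Metal), filled in via
    // the editor. Equivalent to the Blueprint Map variable in Figure 9.
    UPROPERTY(EditAnywhere, Category = "Audio")
    TMap<TEnumAsByte<EPhysicalSurface>, USoundCue*> FootstepSounds;

    // Called by the "Footstep" anim notify.
    void PlayFootstepSound()
    {
        // Trace straight down from the character to find the floor.
        const FVector Start = GetActorLocation();
        const FVector End = Start - FVector(0.0f, 0.0f, 100.0f);

        FCollisionQueryParams Params(FName(TEXT("FootstepTrace")), /*bTraceComplex=*/true, this);
        Params.bReturnPhysicalMaterial = true; // required to get the surface back

        FHitResult Hit;
        if (!GetWorld()->LineTraceSingleByChannel(Hit, Start, End, ECC_Visibility, Params))
        {
            return; // nothing beneath us (e.g. mid-air)
        }

        // Find the sound mapped to whatever surface the trace hit.
        const EPhysicalSurface Surface = UGameplayStatics::GetSurfaceType(Hit);
        if (USoundCue** SoundToPlay = FootstepSounds.Find(Surface))
        {
            // Attach the one-shot to the mesh so it follows the character.
            UGameplayStatics::SpawnSoundAttached(*SoundToPlay, GetMesh());
        }
    }
};
```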
Closing Notes
There are often several ways to implement an idea or system in Unreal Engine. While there's usually not a right or wrong way to do things per se, there's often a more efficient way. If you're working on a team, I recommend asking a programmer or designer to periodically review and critique your Blueprint logic as a general best practice. The more you educate yourself and tap into systems outside of audio, the more deeply engaged you will become in the design process, which allows you to be a co-equal developer alongside other game designers and developers using Unreal Engine. That's a pretty cool thing.

For anyone looking to learn more about Unreal Engine audio, I recommend joining my Facebook group "Unreal Engine Audio Tips & Tricks," checking out some of the UE courses on Udemy, joining the Unreal Engine Forums, and exploring YouTube channels like Mathew Wadstein's WTF Is series. Epic has also recently released Online Learning courses that you should check out. Feel free to contact me via my LinkedIn or visit my website, Craftmediagroup.com.
Thanks for reading!