Hello, I am Isaac Ashdown, co-founder and Technical Director at inbetweengames, a small team of three former YAGER devs (Spec Ops: The Line, Dead Island 2) in Berlin. We’re running a Kickstarter campaign for All Walls Must Fall, a tech-noir tactics game set in the nightclubs of Berlin in 2089. In this post, I’m going to go into how we’ve used Unreal Engine to create the procedurally mixed music and sound effects that are such an important part of this setting.
In All Walls Must Fall, everything happens on the beat of the music. You control time-travelling secret agents who undertake missions in the underworld of the Berlin club scene. As a tactics game, the agents perform actions as directed by the player, and these actions have a duration measured in beats, with sound effects synced up to add an additional percussive element to the music. On top of the soundscape, we also synchronise visual elements, like lighting and animation, to the beat. Our aim is to create a synaesthetic effect, where all the audio and graphics in the game are pulling you into the pumping club experience.
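The arithmetic behind beat-length actions is simple but central, so here's a minimal sketch of it as plain C++ outside the engine. The function names, and the idea of quantising a start time to the next beat, are ours for illustration; the game's actual tempo data is per-track.

```cpp
#include <cassert>
#include <cmath>

// Seconds per beat for a track at the given tempo.
double SecondsPerBeat(double BPM) { return 60.0 / BPM; }

// Convert an action duration measured in beats to seconds.
double BeatsToSeconds(double Beats, double BPM) { return Beats * SecondsPerBeat(BPM); }

// Delay until the next beat boundary, so an action (and its sound
// effect) can be scheduled to land exactly on the beat.
double TimeToNextBeat(double ElapsedSeconds, double BPM)
{
    const double Beat = SecondsPerBeat(BPM);
    const double Phase = std::fmod(ElapsedSeconds, Beat);
    return Phase == 0.0 ? 0.0 : Beat - Phase;
}
```

At 120 BPM a beat is half a second, so a four-beat action lasts two seconds, and an action requested a quarter-second after a beat waits another quarter-second before it starts.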
To create the music for the game, we’ve taken inspiration from the way electronic music is composed, and created an in-game procedural sequencer which layers different loops together as you play. This allows the game to take into account what’s happening in the gameplay to bring in instruments and variations that raise or lower the intensity as needed, and contribute to a sense of tension as the player is making their tactical decisions. Each track consists of a Data Table that lists all the loops along with metadata for each one, such as its instrument type, intensity level and what gameplay modes it’s compatible with.
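To give a feel for how the sequencer uses that metadata, here's a hedged sketch in plain C++. The row struct and field names are hypothetical stand-ins for what the real Data Table stores, and the filter is deliberately simplified to two criteria: intensity and gameplay mode.

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical per-loop metadata row, mirroring the kind of data the
// track's Data Table holds.
struct FLoopRow
{
    std::string Instrument;         // e.g. "Bass", "Lead", "Drums"
    int Intensity;                  // 1 (calm) .. 5 (full-on)
    std::vector<std::string> Modes; // gameplay modes this loop fits
};

// Pick the loops that suit the current gameplay state: at or below the
// requested intensity, and tagged as compatible with the active mode.
std::vector<FLoopRow> SelectLoops(const std::vector<FLoopRow>& Table,
                                  int MaxIntensity, const std::string& Mode)
{
    std::vector<FLoopRow> Result;
    for (const FLoopRow& Row : Table)
    {
        if (Row.Intensity > MaxIntensity)
            continue;
        for (const std::string& M : Row.Modes)
        {
            if (M == Mode)
            {
                Result.push_back(Row);
                break;
            }
        }
    }
    return Result;
}
```

As gameplay raises the intensity ceiling, more rows pass the filter and more stems layer into the mix.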
This is a pretty complex system, with loops (or stems) being added and removed dynamically based on a number of gameplay systems. Luckily, using Unreal’s UMG system we were able to quickly put together a live visualiser that we can bring up at any time to see what the music system is up to. By simply exposing a few parameters of the system to Blueprints, we can use some simple UMG widgets to create a visualiser that shows what loops are currently playing and what gameplay tags are set, and even lets us interact with those systems by changing the parameters with a couple of drop-downs. This has the added benefit of making the whole system intuitive to new users, and has allowed us to work with a number of different composers who can create music for our system and ensure it works in-game exactly how they envisioned.
The whole audio system uses Unreal’s standard Audio Components to actually play all the sounds. This allows us to use Sound Cues to modify the sounds themselves as they’re playing. Our Audio Designer, Almut Schwacke, has created awesome futuristic gunfire sounds that are built up procedurally from several layers: Random nodes select from a set of variants for each layer, a Modulator node adds some final pitch and volume variety, and a Mixer node combines them together. This allows us to create a huge number of variants from just a small set of source audio.
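To see why a small set of source audio goes so far, here's a plain C++ sketch of the idea behind that node graph. Everything here is illustrative: the struct names, the pitch and volume ranges, and the use of a standard RNG in place of the engine's; the real logic lives in the Sound Cue nodes themselves.

```cpp
#include <cassert>
#include <cstddef>
#include <random>
#include <vector>

// Illustrative stand-in for one layer of the cue: a set of
// interchangeable variant samples, identified here just by index.
struct FLayer { std::size_t NumVariants; };

// One randomised "firing" of the cue: a variant chosen per layer, plus
// the pitch and volume multipliers a Modulator node would apply.
struct FCueInstance
{
    std::vector<std::size_t> ChosenVariants;
    float Pitch;
    float Volume;
};

FCueInstance RollCue(const std::vector<FLayer>& Layers, std::mt19937& Rng)
{
    FCueInstance Out;
    for (const FLayer& L : Layers)
    {
        std::uniform_int_distribution<std::size_t> Pick(0, L.NumVariants - 1);
        Out.ChosenVariants.push_back(Pick(Rng));
    }
    // Ranges are made up here; the real values sit on the Modulator node.
    std::uniform_real_distribution<float> PitchDist(0.9f, 1.1f);
    std::uniform_real_distribution<float> VolDist(0.8f, 1.0f);
    Out.Pitch = PitchDist(Rng);
    Out.Volume = VolDist(Rng);
    return Out;
}

// Number of discrete variant combinations, before the continuous
// pitch/volume modulation multiplies the variety further.
std::size_t CountCombinations(const std::vector<FLayer>& Layers)
{
    std::size_t N = 1;
    for (const FLayer& L : Layers) N *= L.NumVariants;
    return N;
}
```

Three layers with four, three and five variants already yield sixty discrete combinations, each further varied by continuous modulation.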
Powerful as it is out of the box, the Sound Cue framework is also very easy to extend by creating custom Sound Nodes. By simply deriving a new C++ class from USoundNode and overriding a few functions, we can add brand new functionality that applies custom effects to a sound. We wanted the club’s music to sound as though it were coming through layers of thick concrete when you’re outside, but to surround you completely once you’re on the dancefloor, so we’ve created a simple Global Club FX node. This node uses the player’s location in the game world to dynamically adjust parameters such as a low pass filter to modify the music and achieve this effect.
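The core of that effect is just a mapping from listener position to filter cutoff. Here's a minimal sketch of one such mapping as plain C++; the function name, the cutoff constants and the linear interpolation are all our assumptions for illustration, not the node's actual values.

```cpp
#include <algorithm>
#include <cassert>

// Map the listener's distance from the dancefloor to a low-pass cutoff,
// approximating music heard through concrete. Constants are illustrative.
float ClubLowPassCutoffHz(float DistanceFromFloor, float MaxDistance)
{
    const float OpenCutoff = 20000.0f;  // on the floor: effectively unfiltered
    const float MuffledCutoff = 400.0f; // behind thick walls: heavily muffled
    const float T = std::clamp(DistanceFromFloor / MaxDistance, 0.0f, 1.0f);
    // Linear blend; the in-engine node could just as well use a curve.
    return OpenCutoff + T * (MuffledCutoff - OpenCutoff);
}
```

Evaluating this every frame from the player's position gives a smooth "walking out of the club" transition rather than a hard switch.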
The audio system has been in development for several months now and we’re very happy with it. Berlin has a great techno and electronic music scene, and our plan for the game has always been to find different composers and producers to create a diverse soundtrack. However, with our system each track is added to the game as a set of individual wav files, one for each loop, along with a whole set of metadata for each loop - and our more complex tracks have over seventy loops. We input this data into the engine using Unreal’s Data Table system, but flexible as that is, entering that much data can be quite time-consuming, and we want our composers to be able to tweak these values themselves to iterate on their tracks. That’s where the Automation System comes in. With a few simple calls to the Asset Registry API, we’ve created a workflow where a user simply drops their wavs into the Content Browser, and then runs an automation task to have the editor automatically populate the Data Table with all the wavs. It even does some simple filename parsing to extract what metadata it might find already in the file!
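The filename parsing step can be sketched like this in plain C++. The naming convention, struct and function names are all hypothetical examples; our real importer handles more fields, but the shape of the idea is the same.

```cpp
#include <cassert>
#include <optional>
#include <sstream>
#include <string>

// Hypothetical metadata recoverable from a loop's filename.
struct FLoopMeta
{
    std::string TrackName;
    std::string Instrument;
    int Intensity = 0;
};

// Parse filenames of the (assumed) form "Track_Instrument_3.wav".
// Returns nothing if the name doesn't match the convention, in which
// case the row would be filled in by hand instead.
std::optional<FLoopMeta> ParseLoopFilename(const std::string& Filename)
{
    std::string Stem = Filename;
    const std::size_t Dot = Stem.rfind(".wav");
    if (Dot == std::string::npos) return std::nullopt;
    Stem.resize(Dot);

    std::stringstream Ss(Stem);
    FLoopMeta Meta;
    std::string IntensityStr;
    if (!std::getline(Ss, Meta.TrackName, '_')) return std::nullopt;
    if (!std::getline(Ss, Meta.Instrument, '_')) return std::nullopt;
    if (!std::getline(Ss, IntensityStr, '_')) return std::nullopt;
    try { Meta.Intensity = std::stoi(IntensityStr); }
    catch (...) { return std::nullopt; }
    return Meta;
}
```

With seventy-plus loops per track, even recovering two or three fields per file this way saves a lot of manual data entry.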
It’s not just the actions and gunfire that happen on the beat - elements of the environment do too. Clubbers dance to the music, lights move, and the dancefloor pulses to the rhythm. Much of this is set up in Blueprints using timelines. For maximum flexibility, we wanted to ensure that we can take any timeline or curve and have it ticked to the rhythm, no matter the length of the timeline or the speed of the track. To do this, when the track begins we calculate a multiplier that converts the natural length of the timeline to a set number of beats, and use the Set Play Rate node to speed up or slow down the timeline as appropriate. We can then play any timeline when the track begins, using its output to drive anything from light intensity to material parameters to rhythmic movement.
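That multiplier is a one-liner, but it's worth spelling out, since getting the ratio the right way up is the whole trick. A plain C++ sketch (the function name is ours; in-engine the result just feeds the Set Play Rate node):

```cpp
#include <cassert>

// Play rate that stretches or squeezes a timeline of NaturalLength
// seconds so it completes in exactly Beats beats at the track's tempo.
float TimelinePlayRate(float NaturalLength, float Beats, float BPM)
{
    const float TargetSeconds = Beats * 60.0f / BPM;
    // Set Play Rate multiplies playback speed, so rate = natural / target:
    // a rate above 1 plays faster, below 1 plays slower.
    return NaturalLength / TargetSeconds;
}
```

For example, a five-second timeline that should pulse once per beat at 120 BPM needs a target duration of half a second, so a play rate of 10; the same timeline over two beats needs a rate of 5.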
With many years of Unreal Engine experience under our belt from our previous work at AAA studios, using Unreal for our first independent project was an obvious choice. But it’s really incredible that even with a small team and a small budget, we can punch far above our weight due to the scalability and power of the engine. I’d only really dabbled in audio coding and sound design before we began this project, but the familiarity of the toolchain and extensibility of the underlying systems has meant we can take an innovative approach to creating audio for an indie game, and have it pay off.
We’ve been working on All Walls Must Fall for over a year now, and are currently running a Kickstarter campaign to raise funds for our closed Alpha access, which will begin in May this year. If you’re interested in Alpha Access, check out the Kickstarter now!