Opinion Pieces

Sound Effects – Simulating Reality?


 

Sound effects not only immerse the player in the virtual realm but also have an incredible influence on their overall gaming experience. Sound adds familiarity and excitement to what is happening on screen. From environmental ambience to player feedback sounds, the sound designer imagines, records, creates, and edits sound effects. We spoke to Paul Boechler, Sound Designer at Electronic Arts, and Ashton Mills, Audio Designer at Jagex, about creating sound effects for games.

 

Video games give sound designers the freedom to create more immersive audio and make it possible for the gamer to experience stories in a way that is completely unique. Sound design plays an important role in the experience of a game, creating tension, emotion and immersion in the game world. Boechler and Mills talk us through how they create sound effects. “When designing a sound, you need to be aware of all the environments and consider them in your design,” said Boechler. “The footsteps you make might sound great for, say, a dirt road in an open field, having a lot of character, crunch, debris, and scuffs. But if that same material is reused in a warehouse or cave, suddenly all that debris might get way too busy and washy when a long reverb for that environment is applied. Obviously, a proper game mix can help alleviate this. But if you’re thinking about the environments the player will encounter your sounds in, then it can help to avoid unnecessary work and ensure an easier mix with fewer bugs down the road.”

“One thing that makes sound design in games different from linear audio is that so much of the content you create is agnostic to the game environment at the design stage, and you can then create rules for how that sound will be heard in the game,” said Mills. “You often give the game mono assets and attach them to objects, and it (often as instructed by middleware such as Wwise) will handle your panning for you based on where your player or camera is in relation to that object. You tell it how close you need to be to that object in order to hear those sounds and how much the sound rolls off over distance, and the game will ride the volume of that sound for you as you move closer or further. The same is true of reverb. It depends on the game of course, but a great deal of the assets you create will be played back in a range of different environments in the game. If you swing your axe at a baddie in a forest, it needs to sound different to when you swing it at a baddie in a cave or in a small room. So you make your assets dry, then use a dynamic reverb system, with different FX applied to different ‘reverb volumes’, which are big invisible boxes in the game levels. So the game will add the type of reverb you want when you enter the respective volume.”
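
As a rough illustration of the behaviour Mills describes, rather than any particular middleware’s API, the sketch below models a mono emitter whose gain is rolled off between a minimum and maximum distance and whose reverb preset is chosen by whichever invisible ‘reverb volume’ box the listener is currently inside. All names, shapes and numbers here are hypothetical.

```python
import math
from dataclasses import dataclass

@dataclass
class ReverbVolume:
    """An invisible box in the level that selects a reverb preset."""
    name: str
    min_corner: tuple  # (x, y, z)
    max_corner: tuple
    preset: str        # e.g. "cave_long_tail", "small_room"

    def contains(self, pos):
        return all(lo <= p <= hi for p, lo, hi in zip(pos, self.min_corner, self.max_corner))

def distance_attenuation(emitter_pos, listener_pos, min_dist=2.0, max_dist=40.0):
    """Linear rolloff: full volume inside min_dist, silent beyond max_dist."""
    d = math.dist(emitter_pos, listener_pos)
    if d <= min_dist:
        return 1.0
    if d >= max_dist:
        return 0.0
    return 1.0 - (d - min_dist) / (max_dist - min_dist)

def active_reverb(listener_pos, volumes, default="outdoor_dry"):
    """Return the preset of the first reverb volume the listener is inside."""
    for v in volumes:
        if v.contains(listener_pos):
            return v.preset
    return default

if __name__ == "__main__":
    volumes = [ReverbVolume("cave", (0, 0, 0), (20, 10, 20), "cave_long_tail")]
    listener = (5.0, 1.0, 5.0)
    axe_swing = (8.0, 1.0, 6.0)
    print("gain:", round(distance_attenuation(axe_swing, listener), 2))
    print("reverb:", active_reverb(listener, volumes))
```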

 

At the design stage you don’t need to worry about whether you’re cluttering up the game with too many sounds. “A game audio mix is a reductive thing, like sculpture,” added Mills. “Our big block of stone is made up of all the assets we put into the game. Then we use chisels (distance rolloff, mixer snapshots, spatial rules, ducking, voice management rules etc) to carve out a mix in real time that communicates key information and delivers the narrative. Iteration between the design phase and implementation phase is crucial. What you design in Reaper might sound different when it’s there in your game. You can design sfx that sound absolutely mint in Reaper and you’re really chuffed with yourself, so you go and make a cup of tea, you come back and plumb it into the game and it sounds crap. Maybe in context it’s dull and boring and doesn’t bear much repeating, or maybe it sticks out like a sore thumb or clashes with other sounds that are there. Maybe the implementation of it needs to be tweaked or completely rethought. So often this is a process of trial and error.”

 

We asked what techniques and processes they use, as creating sounds can be rather time-consuming. “I try to have my office set up to make it easier to jump around and get the sounds I need quickly,” said Boechler. “There’s nothing that destroys creativity or flow quite like needing to authorize a plug-in or to plug in a hard drive. So, if I have some consistent tools I use, then making it easier to get up and running quickly helps keep me focused and creating content much quicker and easier. Whether that’s making a template, setting up hotkeys, or organizing sound libraries, that extra bit of work really helps when you’re trying to keep focused on a project. It also helps to keep track of sound libraries and plug-ins you may have forgotten about.”

 

Mills starts by using a video capture program to get footage of the actor he is working on and then pulls that into Reaper. “If it’s still early days and animations and VFX aren’t done then I’ll usually work from concept art instead. I’ll do an informal one-man spotting session where I scribble down which moments to focus on and what the general arc of the sound design is going to be,” he said. “I’ll then start building up a palette of layers I’m going to use. This might involve sifting through the sound library, doing some recording or designing some textures in a soft synth. I’ll then start putting layers together, synced up to my video footage. To me, layering audio is all about combining sound textures both vertically and horizontally. Vertical layers are how my sounds stack on top of each other: so for a dinosaur footstep, for example, we’d have the step layer (a human footstep, perhaps), the boom layer (an explosion with the transient chopped off), and the crunch layer (some crunchy leaves and twigs snapping). Horizontal layering is the beginning, middle and end of the sound: the short fade-in foliage rustle before the big stomp, followed by the debris sound of rocks and earth rolling around, and a long, low-passed reverb tail. When designing audio, some of the time I’ll plan out the recipe, gather the ingredients I need, then cook up my sound effects. More often than not though it’s more of a mad, improvised raid of the store cupboard and spice rack, adding this, tasting it, adding that, stirring it, realising it tastes disgusting and chucking it out and starting again etc. Over time you get an intuitive sense of what flavours are going to work together and what they will bring to the dish.”
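
A minimal sketch of the vertical and horizontal layering Mills describes, using synthetic decaying tones as stand-ins for the recorded step, boom and crunch layers (a real version would of course use sampled audio; numpy is assumed to be available):

```python
import numpy as np

SR = 48_000  # sample rate, Hz

def place(layer, start_s, total_s):
    """Horizontal layering: drop a layer onto the timeline at a time offset."""
    out = np.zeros(int(total_s * SR))
    start = int(start_s * SR)
    end = min(start + len(layer), len(out))
    out[start:end] += layer[: end - start]
    return out

def tone(freq, dur_s, gain=0.5):
    """Decaying sine used here as a placeholder for a recorded layer."""
    t = np.linspace(0, dur_s, int(dur_s * SR), endpoint=False)
    return gain * np.sin(2 * np.pi * freq * t) * np.exp(-4 * t)

# Vertical layers for one dinosaur footstep (placeholders standing in for recordings):
rustle = tone(2000, 0.15, 0.2)   # short fade-in foliage rustle
stomp  = tone(60,   0.40, 0.9)   # boom layer
debris = tone(400,  0.60, 0.3)   # rocks and earth rolling around

total = 1.2  # seconds
footstep = (
    place(rustle, 0.00, total)    # beginning
    + place(stomp, 0.12, total)   # middle: layers also stack vertically where they overlap
    + place(debris, 0.25, total)  # end / tail
)
print("peak level:", round(float(np.max(np.abs(footstep))), 2))
```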

 

Once Mills has made a sound effect he needs to think about how it’s going to play back in the game. “Sounds that get synced to animations need to be broken into chunks so they can be attached to their corresponding frames, and this can take some thought. A lot of sound is driven by game data; for example my dinosaur footsteps are going to need to be different depending on what surface she walks on, and maybe I want her footsteps to be louder or softer depending on how angry she is, so I use the game data to ride the volume fader on one of the layers. I believe that making the assets is only half of the design job; the process of implementation is just as important and just as creative!”
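
The kind of data-driven playback Mills mentions might look something like the sketch below: a surface type selects a round-robin asset set, and a hypothetical ‘anger’ value rides the gain of one layer. None of this is Jagex’s actual system; the names and scaling are illustrative only.

```python
# Hypothetical mapping of game data to playback decisions (not any real engine's API).
FOOTSTEP_SETS = {
    "dirt":  ["dino_step_dirt_01", "dino_step_dirt_02", "dino_step_dirt_03"],
    "stone": ["dino_step_stone_01", "dino_step_stone_02"],
}

def footstep_event(surface, anger, step_index):
    """Pick an asset by surface and ride one layer's gain with the 'anger' parameter (0..1)."""
    variations = FOOTSTEP_SETS.get(surface, FOOTSTEP_SETS["dirt"])
    asset = variations[step_index % len(variations)]    # simple round-robin to avoid repetition
    stomp_gain = 0.5 + 0.5 * max(0.0, min(1.0, anger))  # angrier dino -> louder boom layer
    return {"asset": asset, "stomp_gain": round(stomp_gain, 2)}

print(footstep_event("stone", anger=0.8, step_index=4))
```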

 

There are several factors that can impact a sound as it’s being created for the game world. “I think some of the biggest factors for any sound in a game will be how many other sounds will be playing simultaneously and how often the user is hearing this sound,” added Boechler. “A great sound can quickly become the worst if it’s repeated ad nauseam without any change over time. And if a sound will be constantly buried by other sounds, you may want to know before you start spending hours or days designing it. Footsteps and foley are important, until a gunfight breaks out. Hearing the chants from a crowd is important, until a goal is scored. So, I’d say the most common technique to help balance these factors is a solid dynamic mix. Whether that’s using side-chain compression, mixer snapshot changes, instance limiting, or dynamic systems for the sounds themselves, each of these can help balance the number of concurrent sounds your user is hearing and help them change over time to avoid fatigue from natural gameplay.”
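
In the spirit of Boechler’s comments, here is a hedged sketch of two of the techniques he names: instance limiting (capping concurrent voices per category and stealing the quietest one) and a crude mixer-snapshot style duck when a gunfight starts. The categories, limits and decibel values are invented for illustration.

```python
from collections import defaultdict

class VoiceLimiter:
    """Cap how many instances of each sound category play at once; steal the quietest voice."""
    def __init__(self, limits):
        self.limits = limits             # e.g. {"footsteps": 4, "gunfire": 12}
        self.active = defaultdict(list)  # category -> list of (voice_id, gain)

    def request(self, category, voice_id, gain):
        voices = self.active[category]
        limit = self.limits.get(category, 8)
        if len(voices) >= limit:
            quietest = min(voices, key=lambda v: v[1])
            if gain <= quietest[1]:
                return False             # refuse: it would be buried anyway
            voices.remove(quietest)      # steal the quietest instance
        voices.append((voice_id, gain))
        return True

def duck(bus_gains_db, trigger, amount_db=-9.0):
    """Crude mixer-snapshot style duck: pull down low-priority buses when 'trigger' fires."""
    ducked = dict(bus_gains_db)
    if trigger == "gunfight":
        for bus in ("footsteps", "foley", "crowd"):
            ducked[bus] = ducked.get(bus, 0.0) + amount_db
    return ducked

limiter = VoiceLimiter({"footsteps": 2})
print(limiter.request("footsteps", "step_a", gain=0.8))  # True
print(limiter.request("footsteps", "step_b", gain=0.6))  # True
print(limiter.request("footsteps", "step_c", gain=0.4))  # False: refused, would be buried
print(duck({"footsteps": 0.0, "gunfire": 0.0}, "gunfight"))
```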

 

So how do you paint the ‘audio canvas’ for non-existent sci-fi tech, environments, and creatures? “I try to gather as much information as I can, and then use that information to inform decisions for sound selection,” added Boechler. “Asking questions of designers or asking for concept art can provide a lot of additional information about how to change a sound. For example, a sci-fi weapon has a sound, but an old and poorly maintained sci-fi weapon has a totally different sound. I find that’s a common starting point, but sometimes you just go with only what feels right. We don’t always need to know how a spaceship works; it just needs a modulating deep energy sound because that’s what we think spaceships sound like. Never be afraid to listen to other references either. We may not know what a laser gun sounds like, but popular films and games have conditioned audiences to expect certain sounds. It’s great to see what worked in the past and build on that.”

 

“When designing something that doesn’t exist in the real world, it’s often a case of considering the physical properties of the actor and its movement, and what references there are to things that do exist in the real world,” added Mills. “For example, when working on the RuneScape boss Solak, who is a big tree monster, I looked at how he moves: if a real tree got up and started walking around, it would be all rustling leaves, creaking wood and snapping twigs. There is a big source of inspiration there. What do things like this sound like in other games and films? I think about references and precedents. My game’s consumers’ ears aren’t a blank slate; they will have (usually non-conscious) expectations for what these imaginary things will sound like based on their consumption of other media. That doesn’t mean I have to go and copy what other sound designers have done, but being aware of it is really helpful and can be inspiring.”

 

“When you read articles and listen to podcasts of experienced sound people talking about their work they are always banging on about ‘storytelling’ and ‘narrative’, and for my first year or so in the industry I struggled to grasp what that really meant. I could see how certain parts of certain types of narrative-driven games could be relevant here, but if I am spending my days working on the sound of a guy going around smashing people with an axe, ‘story’ can feel like quite a distant concept,” continued Mills. “Eventually I came to understand what storytelling can mean for sound design and I think it’s helpful to define it. I separate it into four categories. 1. Narrative-with-a-big-‘N’ is all about the dialogue and the actual stories of how a game and its characters develop. 2. Communicating information includes things like: did I hit anybody with that axe swing? How much damage did I do? Did the skeleton die? Did I get any loot? 3. World building is about the sounds that imply things about the game and its characters that aren’t communicated in the visuals; for example, some heavy breath and panting before and after an axe swing might imply that the character is tired, or that the weapon is heavy for them, or that it’s a special attack. 4. Expressionistic sound design is when you use sound design more like music, applying it in a more abstract and arty-farty way, and you are trying to hack directly into the player’s emotions to make them feel a particular way. Loud rain and thunder in the mix aren’t just there as an in-game weather report; they make us feel a certain way too, complex emotional states that we can’t really define with words. Heavy use of reverb might convey a sense of emptiness or something long forgotten. There are certain sounds people just naturally find really satisfying, like pulling a cork from a bottle, or running water. You can encourage players to do certain things by making satisfying sounds that they want to keep hearing.”

 

Sound for gameplay needs to support the player’s mind so they feel like they are actually holding the weapon of their choice. “First, I make no claims of being a weapon sound expert and I don’t work on an FPS title; but with that in mind, I have had to cut similar sounds and have taken a lot of existing knowledge from weapon sound design,” added Boechler. “Knowing the firing rate and how the user can interact with a weapon plays a big part. If it takes a long time to fire, like a cannon or rocket launcher, you’ve got a lot more time to sell all the actions that go into firing the weapon and therefore you can make them a lot bigger. If it’s a faster rate of fire you may want to just focus on the start and end of the user’s actions. So, making the first shot and last shot particularly impactful will ensure that bursts don’t feel too busy while still having the desired punch and power. Pushing reverb tails on these particular shots will help sell the sound of the space and help it feel more natural.”
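
A simple way to express the first-shot/last-shot emphasis Boechler describes is to pick different assets based on a shot’s position within a burst. The asset names below are hypothetical placeholders, not from any shipping title:

```python
def burst_assets(num_shots):
    """Emphasise the first and last shot of a burst; keep the middle lighter."""
    assets = []
    for i in range(num_shots):
        if i == 0:
            assets.append("rifle_shot_first")      # biggest transient, full mechanics
        elif i == num_shots - 1:
            assets.append("rifle_shot_last_tail")  # last shot carries the long reverb tail
        else:
            assets.append("rifle_shot_loop")       # lighter body so bursts don't get busy
    return assets

print(burst_assets(1))  # a single shot just gets the impactful first-shot asset
print(burst_assets(5))
```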

 

As for creatures, it’s important that they sound menacing. “With creatures, ensuring the speed and pacing of sounds match the creature’s size really helps with any emotion,” said Boechler. “It might seem basic, but big things tend to move slow and small things tend to move fast. This rule can obviously be broken when needed, but getting a feel for pacing first will let you know how much time you have to work with to convey your emotion. As for actually sounding more menacing, I think just pushing more air and noise can add more aggression to a sound. This can come from either finding a suitable air sound or matching the volume envelope needed for a particular vocalization. Or sometimes a bit of distortion to add harmonics and noise gives a bit of that bite and rasp needed.”
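
One common way to get the extra harmonics and ‘bite’ Boechler mentions is waveshaping distortion, for example a tanh soft clipper. The sketch below is a generic illustration of the idea (the drive amount and the synthetic ‘vocalisation’ are arbitrary), not his specific chain:

```python
import math

def soft_clip(sample, drive=4.0):
    """tanh waveshaping: adds harmonics, giving a clean vocalisation some bite and rasp."""
    return math.tanh(drive * sample) / math.tanh(drive)  # normalised so full scale stays at 1.0

# A pure 100 Hz "vocalisation" gains harmonics once driven through the shaper.
SR = 48_000
clean = [0.8 * math.sin(2 * math.pi * 100 * n / SR) for n in range(SR // 10)]
gritty = [soft_clip(s) for s in clean]
print("clean peak:", round(max(clean), 2), "| driven peak:", round(max(gritty), 2))
```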

“I think creatures sound menacing when they make a range of different sounds. It’s not just about the roars and screams, it’s about the breaths and ruckles and chuffing and mouth clicks etc as well,” added Mills. “You can be really creative with source material too. I’ve done a lot of recordings where I dangle a DPA lav mic into my throat and do all sorts of weird gargles and croaks and stuff. I’ve used recordings I’ve made of pigs, pugs, a leopard and other creatures. Another thing I’ve learned through experimenting is that it’s amazing how much creature-like source material you can get from friction-based sounds, things like scraping your hand around a wet bowl, or the squeaking of wet rubber shoes. I’m forever hearing sounds and thinking ‘ooh, that’s a creature vocalisation!’”

 

When creating sound effects there are often challenges. “Sometimes just starting can be the most difficult part. Either ideas aren’t flowing, or you’re not sure what tool or sounds to start with,” said Boechler. “The task can seem so big and daunting that it makes it hard to start. Luckily, this goes away after time, but it still pops up every now and again. If this happens, I usually just grab a bunch of relevant source, toss it into a sampler and start playing around. At least you’re making sound, and from there you’re starting to learn what you like/don’t like, and that leads to further ideas and iterations. Suddenly the screen isn’t blank, and things aren’t so bad.”

 

“One of the biggest challenges is friction in iteration, i.e. how long it can take to loop from making a tweak to hearing it in the game, and being able to hear it over and over. My way of working is very much about trial and error,” added Mills. “If you’re working on an SFX in a post-production context, you can just keep slamming that space bar to hear what you’re doing in the context it’s going to be heard in. To hear your work in a game you have to render your asset, implement it into your game (whether that’s directly or using middleware like Wwise), load up the game, get to the part you need to test and play it, and all that time you need to hold in your head the thing you were meant to be listening for, maybe an EQ tweak or a change to the reverb tail, whatever. Middleware is great because you can make some changes like levels and pitch adjustments whilst the game is running, pulling in new assets can be quite speedy, and then you have game cheats to get you to exactly the bit you need to play straight away. I’ve worked on some projects though where the time between making a tweak to a sound and then hearing it played back in game can take as long as 5 minutes, which feels like years if you’re working on finer details.”

 

When designing sound for a game it’s important to work with the design team to ensure congruency. “Generally audio gets involved quite early and signs off on various features that require audio,” continued Boechler. “I find a great way to work with other game team areas is to create prototypes and videos to show the work you’re trying to accomplish. This does two main things. It gets buy-in from other game areas that need to allocate their own resources to help audio meet the expected vision. It also sets quality expectations for the project, which can be helpful to reference back to at certain points during the project. Once other areas hear what you’re trying to accomplish, it’s much easier to communicate what’s needed and ensure audio is getting any necessary support from those teams.”

 

“I find that movement is a really important part of sound design, just as it is with sound in real life,” said Mills. “Without movement there would be no sound. So when I’m spotting for audio hooks, very often I’m looking for what is moving, and this means I’m very closely tied in with what the animators and VFX artists are doing. Because the main aim of the sound design is to communicate gameplay information and support narrative, a close dialogue with game designers is crucial, because they are the ones that create that narrative and information. Most importantly though, when I get lost in ‘audioland’ and I’m really absorbed in what I’m doing, it’s when I’m in the zone and can produce some of my best work, but it’s also when I have the highest risk of straying from the path and the narrative, and the game designers are there to pull me back into the context of the game.”

 

As for kit, Boechler uses Reaper, Ableton Live, and Pro Tools: Ableton Live for design work, utilizing Max for additional audio processing, and he really enjoys it for prototyping. “Reaper for asset creation, and some projects I’ll do entirely in Reaper. On most projects I work with a larger team and Pro Tools is still standard among the group. I’ve got a bunch of hardware toys, but mainly just some recorders and mics for gathering sounds.”

 

As for Mills, he is a Reaper worshipper like many of his game sound design brothers and sisters. “Reaper users are a bit like vegans: they need to talk about it all the time and try and convert people! For sound design it’s the ideal DAW. It has a fantastic region rendering system which is perfect for spitting out large numbers of short sound files, which is what game sfx and dx is all about. Another great feature is that tracks are dynamic, so you don’t need to worry about whether a track is audio (mono or stereo), MIDI, video, a folder, an aux bus etc. This is perfect for sound design because you have total freedom to just dive in and get creative and make a big fat mess, then when you need to you can tidy up and organise everything really easily. Reaper also allows you to completely customise your workflow, and if you’re a total geek like me you can even write code to script up your own behaviours, so you can make your work completely frictionless, letting the code take care of all the mindless mouse clicking and giving you more bandwidth to be creative. My favourite soft synth at the moment is Arturia Pigments. My quick-draw FX plugins are Metric Halo Channel Strip for its fabulous EQ, the UAD Teletronix LA-2A compressor, Valhalla Room and ReaPitch, the stock Reaper pitch shifter. FabFilter Pro-L is my levels safety net. I prefer to use Wwise to implement game audio, and Unreal is my favourite game engine to work with, but I have a soft spot for Unity.”
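
As a flavour of the scripting Mills refers to, the sketch below is a small ReaScript-style Python script that drops a named render region over each selected item, so the region render matrix can export one file per item. It is only meant to run inside Reaper’s ReaScript environment, and the function names and the asset-naming convention used here are assumptions that should be checked against the ReaScript documentation.

```python
# ReaScript-style sketch (Python flavour), intended to run from Reaper's Actions > ReaScript.
# The RPR_* functions are provided by Reaper's ReaScript environment, not a standalone module;
# verify names and signatures against the ReaScript docs before relying on this.

ASSET_PREFIX = "sfx_axe_swing"  # hypothetical naming convention

def regions_from_selected_items(proj=0):
    """Create one render region per selected item, named with a numbered asset prefix."""
    count = RPR_CountSelectedMediaItems(proj)
    for i in range(count):
        item = RPR_GetSelectedMediaItem(proj, i)
        pos = RPR_GetMediaItemInfo_Value(item, "D_POSITION")
        length = RPR_GetMediaItemInfo_Value(item, "D_LENGTH")
        name = "%s_%02d" % (ASSET_PREFIX, i + 1)
        RPR_AddProjectMarker(proj, True, pos, pos + length, name, -1)  # True = region, not marker
    RPR_ShowConsoleMsg("Created %d render regions\n" % count)

regions_from_selected_items()
```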

 

Hardware-wise, Mills uses a UA Apollo Twin audio interface and controls his DAW with an Icon QCon Pro G2. “I also use a TC Electronic Clarity M hardware loudness meter to monitor my LUFS and true peak. I like having this as its own thing rather than using a plugin because it’s always on and it’s not taking up valuable screen space. My favourite mic to use day-to-day is a small but mighty DPA omni lav mic; it’s so quick and versatile and perfect for little experiments in and around the studio.”

 

With a wealth of opportunities to create effects, from location recording and library sounds to foley and VO recording, Boechler uses whatever is right for the sound. “I’ve got a Zoom H6 that’s with me most of the time. But if it’s a more planned session for field recording or library sounds, then I’ve got a Sound Devices MixPre II and a bunch of microphones depending on the source. 32-bit recording and clean, low-noise preamps from the MixPre are solid no matter what source or environment though. But if I’m doing a drop recording rig somewhere or just happen upon a sound, then I’ve almost always got the H6 on me. There’s always a time and place for pulling something from a library and it might work without much fuss. This will obviously change based on the project, budget, and timeline. Most of the time though, I record my own content and perform my own foley.”


As for Mills, “I use a mix of library and bespoke-recorded source material. My passion as a game sound designer lies in the manipulation and shaping of source material, and the technical problem-solving for implementing the resulting assets and fitting all the parts together into a mix. So although I love to get out and about and record source material from time-to-time, I see myself as more of a designer than a recordist, and I’m happy diving in and working with content recorded by other people.” 

 

“If I’m going to need a lot of source material of a particular type, or I’m working on something that I think will really benefit from it, then I’ll organise a field recording session,” he continued. “For example, when we worked on the Land Out of Time, which was a large map expansion for RuneScape, we knew we had a lot of area to fill up with ambiences, so my colleague and I went on a short camping trip up to the Norfolk coast where we recorded lots of beds along cliffs and beaches (and various dogs we met along the way), and stopped at a zoo on the way back to record their tropical birds. RuneScape’s Solak boss is a giant tree monster and his sounds all come from a trip we took to some local woodland to record lots of wood and foliage sounds. I also came back with a couple of bags of sticks and twigs and leaves to record Solak’s foley in the studio. In 2018 we reworked RuneScape’s mining and smithing skill system, and this involved a remastering of the audio as well. My colleague and I went out to a blacksmith in rural Essex to record the sounds of the equipment and processes used in forging iron. Working with voice actors is also one of the real highlights of the job, and I’ve been really lucky over the last few years to work with some brilliant voice talent.”

 

Sound design for video games spans a range of sub-disciplines, both creative and technical. Whether creating sound effects for characters, creatures, environments, weapons or vehicles, there are unlimited opportunities for audio designers to tell stories, communicate information and breathe life into the games they make.

 

www.jagex.com

www.ea.com

 
