Interview - Matt Simmonds


Matt Simmonds, Game Audio Manager at VR studio nDreams, runs the audio side of projects from pre-production and the creation of audio style guides, through systems design, to day-to-day production up to release. Known for his work on Phantom: Covert Ops and The Assembly, he spoke to us about creative techniques and immersive audio.

 

Simmonds began his career in the games industry in the early ‘90s, when the 16-bit machines were just arriving and sample-based audio was becoming the norm. “I freelanced for various companies, some of which still exist today like Codemasters. When the industry moved fully over to consoles I went in-house, starting at Climax where I did a lot of licensed titles and franchise spin-offs, such as for Silent Hill, before moving on to Jagex to work on an MMO and then over to nDreams.”

 

With everyone working from home, we asked how he has seen game audio technology evolve over the past decade, and how he sees it evolving over the next ten years. “For high-end titles, the last decade has seen content limitations shift more towards what can be achieved in a reasonable timeframe, rather than the severe memory or streaming constraints of the past. I imagine spatial audio will continue making inroads beyond VR, and with more focus on it there's the possibility of more rapid improvements to performance. CPU limitations will always be a factor, though with the ever-present idea of cloud-based content, certain areas of a game's audio could well be generated externally to alleviate that problem. It'll be interesting to see where that goes and what we can make use of in real-time.”

 

Covid has been ‘kind’ to the games industry and many have adapted to working from home, but how might the industry be affected once we can return to normality? “I think there could be a shift to people working a greater percentage from home, seeing how much of this work can comfortably be done outside of the office environment. nDreams shipped their last full game during the UK lockdown, so if we need to work under those exceptional conditions again we have a previous case where it worked out for us. Personally, having worked freelance for several years I am used to working without a regular office environment, but it's not a solution for everyone.” In the VR industry, while home headsets are a large part of the market, another sector of the business is location-based entertainment. “These have been severely restricted by Covid so it will be great to see them back in operation again.”

 

Interactive audio often relies on headphones for sound playback, so we asked how he mixes object-based audio to immerse the player. “With VR obviously we use spatializers as much as possible, and objects will have at least a couple of emitters to distract from the 'point of sound' effect you can get with singular 3D positioning in VR. One other feature we've been using in our new games is a more realistic type of ducking effect with filters, more in line with how you imagine the human ear to react than the kind of cinematic ducking that has been used before. I think that certainly helps with immersion.”
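The filter-based ducking he describes can be sketched outside any middleware: instead of dipping the bed's volume wholesale, a sidechain envelope closes a low-pass filter over the rest of the mix, dulling it the way a loud nearby source masks high frequencies first. This is a minimal one-pole illustration, not nDreams' implementation; the attack/release times and cutoff range are placeholder values.

```python
import math

def envelope_follower(signal, attack=0.01, release=0.2, sample_rate=48000):
    """Track the amplitude of the sidechain (e.g. dialogue) sample by sample."""
    env = 0.0
    a = math.exp(-1.0 / (attack * sample_rate))
    r = math.exp(-1.0 / (release * sample_rate))
    out = []
    for x in signal:
        level = abs(x)
        coeff = a if level > env else r
        env = coeff * env + (1.0 - coeff) * level
        out.append(env)
    return out

def filter_duck(bed, sidechain, sample_rate=48000,
                open_cutoff=18000.0, closed_cutoff=800.0):
    """Duck the 'bed' mix under the sidechain by closing a one-pole low-pass,
    mimicking how the ear masks highs first, instead of a broadband volume dip."""
    env = envelope_follower(sidechain, sample_rate=sample_rate)
    y = 0.0
    out = []
    for x, e in zip(bed, env):
        # Louder sidechain -> lower cutoff (duller bed); clamp envelope to [0, 1].
        amount = min(e, 1.0)
        cutoff = open_cutoff + (closed_cutoff - open_cutoff) * amount
        alpha = 1.0 - math.exp(-2.0 * math.pi * cutoff / sample_rate)
        y += alpha * (x - y)
        out.append(y)
    return out
```

In a real engine the same idea would be driven by the middleware's metering and filter effects rather than per-sample Python; the point is that the bed loses brightness before it loses level.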

 

Simmonds has been trying out new techniques with spatial audio. “For Phantom: Covert Ops we started experimenting with 'fake' spatial reflections, basically trying to create the feel of a room bounce effect without the current CPU expense. I think that's something we'll continue iterating on for the moment.”
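One cheap way to fake a room bounce, which may be in the spirit of what he describes, is the classic image-source shortcut: mirroring an emitter across each wall yields a handful of delayed, attenuated copies that suggest reflections for far less CPU than a real spatial reverb solve. A hedged sketch assuming a simple cubic room; the absorption value and room model are invented for illustration.

```python
import math

SPEED_OF_SOUND = 343.0  # metres per second

def fake_reflections(source, listener, room_half_extent, absorption=0.4):
    """Approximate first-order wall bounces with the image-source trick:
    mirror the source across each axis-aligned wall of a cubic room,
    then derive a delay and gain per mirrored image."""
    reflections = []
    for axis in range(3):
        for wall in (-room_half_extent, room_half_extent):
            image = list(source)
            image[axis] = 2.0 * wall - image[axis]  # mirror across the wall
            dist = math.dist(image, listener)
            reflections.append({
                "position": tuple(image),
                "delay_s": dist / SPEED_OF_SOUND,
                # 1/distance spreading loss plus one surface absorption hit
                "gain": (1.0 - absorption) / max(dist, 1.0),
            })
    return reflections
```

Each returned entry could feed a spatialised delay tap; six extra emitters per source is a fixed, predictable cost, unlike a full geometric reverb.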

 

Spatial audio is still a big growth area in game audio tech, so where does he see it heading? “I feel there'll be a breakthrough in performance with spatial reverb and realistic room simulation audio. For a lot of the work we do it's not usable to a great extent, which is why we've tried our own solutions to get a similar effect. When I started doing VR five years ago we could have maybe 10-15 spatial emitters and you could hear objects switch between the HRTF fields; that's a thing of the past with today's solutions. There are already some examples of hardware-assisted spatial room simulation using the GPU, which makes sense, seeing as to an extent it has a lot in common with path tracing.”

 

As for projects, Phantom has been the most challenging to date. “It launched simultaneously on both the Rift & Quest, two very different systems in terms of performance.  We decided early on to try and keep parity in content between both machines in terms of audio and it's near exact on both.  Plus, we got to work on some challenging systems design such as a water audio system that had to be very performant.”

 

Simmonds uses various creative techniques to remain focused on the listeners’ perception of the story. “One problem with VR is you never know where the player is looking, so having them get the full effect of cut-scenes and the sense of a location can be a problem. For narratives we do still tend to rely on quite traditional diegetic elements a lot of the time: an audio log playing from a radio, or NPC characters deep in conversation conveying some flavour of the game state. As the player can be in locomotion or interested in something else, they then have the option of picking up the narrative while focused on their own goals.”

 

With consistency in gameplay audio being paramount, we asked how he provides seamless transitions in mix, assets and aesthetics, which sit at the forefront of the player experience. “For first-person games in VR we've tried to work more with 'clean' source assets without a lot of production work applied. Unless we're going for a stylised approach, moving the processing mostly to real-time FX and the mixing can give a much more consistent feel in VR. The other big thing we're using is what we've dubbed 'reactive' audio: sounds that can listen to the audio environment and change to reflect what is happening around them without our direct control. All of that happens within whatever audio middleware we use, so it's a lot more performant than scripting.”

 

An example of this in Phantom was the strength of water ripples affecting structures standing in the water, in turn affecting objects further out. “You could feel motorboats approaching from a distance by the creaking wood of the structures becoming more animated, and had a chance to find cover long before you saw the boat. You could also tell how stealthy your own rowing was by the paddle ripples affecting the environment around you.”
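Reactive audio of this sort can be sketched as emitters that poll a value sensed from the environment and smooth towards it, rather than being driven by script. The class and ripple model below are hypothetical toy stand-ins, not Phantom's actual water system.

```python
import math

class ReactiveEmitter:
    """A sound source that 'listens' to a value sensed from the audio
    environment (e.g. ripple strength at its position) and smoothly
    retargets its own intensity, with no per-frame script calls."""

    def __init__(self, smoothing=0.1, threshold=0.05):
        self.smoothing = smoothing  # 0..1, higher = faster response
        self.threshold = threshold  # ignore tiny disturbances
        self.intensity = 0.0        # drives e.g. creak rate and volume

    def update(self, sensed_level):
        target = sensed_level if sensed_level > self.threshold else 0.0
        self.intensity += self.smoothing * (target - self.intensity)
        return self.intensity

def ripple_strength(boat_pos, emitter_pos, source_strength=1.0, falloff=0.15):
    """Toy model: ripples decay exponentially with distance from the boat."""
    dist = math.dist(boat_pos, emitter_pos)
    return source_strength * math.exp(-falloff * dist)
```

A distant boat produces ripples below the threshold, so the creak emitter stays quiet; as it approaches, the sensed level rises and the creaking ramps up on its own, which is the "find cover before you see it" effect described above.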

 

“For aesthetics, at the beginning of each project we work out a consistent set of audio mood boards and style guides that we can refer to. These can evolve over the lifetime of a project, but usually there'll be some initial essence we'll carry through to the end.”

 

Simmonds will sometimes use traditional focus audio to alter the sound mix in favour of certain objects, but that depends largely on the style guide. “If we can make the player feel the essence of the scene without this, we usually go in that direction. It depends on the project, but we try to make VR feel like more of a personal thing to the player and, where possible, make the audio react how they would react if the situation were real.”

 

“We have, though, tried some extra-perception effects with spatial audio. For Phantom the enemies could be looking at the player from any angle, so we needed some way beyond a visual radar to depict this. We ended up using a positional ring of audio that travelled with the player when they were being watched; from this they could tell where they were being seen from, even if the enemies were behind them. It also gave a good sense of dread that the player could try to escape from.”
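The being-watched ring can be sketched as a cue emitter pushed out to a fixed radius from the player, along the direction of whichever enemy sees them; spatialised playback of that cue then points at the watcher even when they are behind the player. Function name and ring radius here are illustrative assumptions, not the shipped system.

```python
import math

def watcher_cue_position(player_pos, enemy_pos, ring_radius=1.5):
    """Place a 'being watched' audio cue on a ring around the player,
    in the direction of the enemy observing them.  Positions are (x, z)
    pairs on the horizontal plane."""
    dx = enemy_pos[0] - player_pos[0]
    dz = enemy_pos[1] - player_pos[1]
    dist = math.hypot(dx, dz)
    if dist == 0.0:
        return player_pos
    # Normalise the direction and push the cue out to the ring radius.
    return (player_pos[0] + ring_radius * dx / dist,
            player_pos[1] + ring_radius * dz / dist)
```

Because the cue sits at a constant distance, its loudness stays stable while its direction carries the information, so the player reads "where" without the sound itself giving away "how far".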

 

Simmonds uses spreadsheets, planning and middleware to keep on top of the abundance of audio in a game. “I would say middleware is a definite must for us. While it helps us deliver a better product, it also gives the team a central project point that is purely audio-focused. Being able to visualize the heart of your game audio in one place is a huge benefit.”

 

As for common problems with creating immersive audio, Simmonds thinks a lot of it comes down to inconsistencies in style.  “It is why at the start of a project we'll spend a large amount of time working on mood boards, style guides and making prototype demos to show the team.  VR is a much more personal experience for the player and they're more likely to be pulled from the experience by any jarring elements because the world in the headset becomes their world.  So it's important we set boundaries for where realism and extra-perception or UI audio can meet, and then find ways to present that in a way the player is comfortable with from the outset.”

 

www.ndreams.com