During the PlayStation 5 marketing campaign, Sony surprisingly highlighted not only the usual graphical upgrades but also three-dimensional sound: sound that surrounds players like a sphere. And spatial sound is gaining more and more interest in films and music as well. People can experience it through headphones (Dolby's Atmos system, Sony's 3D Audio, Nx by Waves, etc.) or a properly arranged multi-speaker setup.
The whole idea is to recreate real-world sound in any medium, on everyday devices such as laptops, smartphones, and TVs. The key is the recording and/or mixing process; no extra hardware is needed. Regardless of the device, 3D audio relies on the HRTF (head-related transfer function, sometimes called the anatomical transfer function), which describes how parts of our body, such as the shape of our ears, alter sounds before they reach our eardrums. Since everyone's ears are different, producers have had to settle for compromises based on averaged measurements from their studies.
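In practice, HRTF-based rendering boils down to filtering a mono sound with a pair of measured head-related impulse responses (HRIRs), one per ear, for the direction the sound should come from. Here's a minimal sketch of that idea in Python; the HRIR values are made up for illustration, not taken from any real measurement set.

```python
import numpy as np

def render_binaural(mono, hrir_left, hrir_right):
    """Render a mono signal to binaural stereo by convolving it with
    a head-related impulse response (HRIR) pair for one direction."""
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    return np.stack([left, right], axis=0)

# Toy example: a short click and hypothetical 4-tap HRIRs for a source
# slightly to the listener's left (louder, earlier energy in the left ear).
click = np.array([1.0, 0.0, 0.0, 0.0])
hrir_l = np.array([0.9, 0.3, 0.1, 0.05])
hrir_r = np.array([0.4, 0.5, 0.2, 0.1])
stereo = render_binaural(click, hrir_l, hrir_r)
```

A real system would interpolate between HRIRs measured at many directions and update them as the source or the listener's head moves.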
The technology is still evolving, and each company pursues accuracy in its own way. Dolby Atmos lets engineers assign a sound to an object positioned anywhere in 360-degree space. So far, the biggest competitor to Dolby Atmos is Auro-3D, which instead uses three layers to create a three-dimensional impression: the top layer sits directly above the listener; the height layer, around 40 degrees above the lower layer, helps the listener locate sounds; and the lower layer places sound horizontally between 0 and 20 degrees, carrying dialogue. Auro-3D, unlike Atmos, prioritizes the highest sound quality. DTS:X does too, while also pinning sounds to objects, like Atmos.
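To get a feel for object-based positioning, here is a toy panner that turns an object's azimuth and elevation into speaker gains: a constant-power pan between left and right, crossfaded between a lower layer and a height layer. This is a simplified illustration of the general idea, not the actual Atmos, Auro-3D, or DTS:X renderer.

```python
import math

def object_gains(azimuth_deg, elevation_deg):
    """Toy object panner: constant-power left/right pan by azimuth,
    then a crossfade between a lower layer and a height layer by
    elevation. Returns gains for (low_L, low_R, high_L, high_R)."""
    az = math.radians((azimuth_deg + 90) / 2)      # map -90..90 deg to 0..90 deg
    gl, gr = math.cos(az), math.sin(az)            # constant-power stereo pan
    el = max(0.0, min(1.0, elevation_deg / 90.0))  # 0 = ear level, 1 = overhead
    low = math.cos(el * math.pi / 2)               # lower-layer share
    high = math.sin(el * math.pi / 2)              # height-layer share
    return (gl * low, gr * low, gl * high, gr * high)

# A sound dead ahead at ear level lands equally on the two lower speakers.
gains = object_gains(0, 0)
```

Because both crossfades are constant-power, the four squared gains always sum to one, so an object keeps the same loudness as it moves around the room.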
There's a really promising initiative run jointly by Eidos-Montréal (Shadow of the Tomb Raider) and McGill University. Since 2019 they've been working on a music-oriented video game demo with spatial sound that combines several techniques. For instance, they use a microphone with at least 32 capsules built into a single body to capture sound from many directions at once; think of it as a 360-degree camera for sound. The game is going to be simple and closer to a film. As Rob Bridget, Senior Sound Director at Eidos-Montréal, revealed in his interview, he is strongly inspired by "Upstream", a short film by Rob Petit and Robert Macfarlane: a black-and-white portrait of a natural environment shot with drones, where the music leads the viewer seamlessly from one point to another. That is also the developers' approach to the project.
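Spherical microphones like that are typically used for ambisonics: the full sound field is encoded into directional components that can later be rotated and decoded to any speaker layout or to binaural headphones. As a minimal sketch (my simplification, not the project's actual pipeline), here is how a single mono sample arriving from a given direction is encoded into first-order ambisonic B-format; a 32-capsule array captures enough detail for much higher orders.

```python
import math

def encode_first_order(sample, azimuth_deg, elevation_deg):
    """Encode a mono sample into first-order ambisonic B-format
    (W, X, Y, Z) for a given arrival direction."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    w = sample / math.sqrt(2)                  # omnidirectional component
    x = sample * math.cos(az) * math.cos(el)   # front-back axis
    y = sample * math.sin(az) * math.cos(el)   # left-right axis
    z = sample * math.sin(el)                  # up-down axis
    return (w, x, y, z)

# A sound straight ahead at ear level: all directional energy on X.
w, x, y, z = encode_first_order(1.0, 0, 0)
```

The appeal for a game is that the encoded field is layout-independent: the same recording can be decoded for headphones, a 7.1.4 room, or anything in between.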
They once again (after the Tomb Raider title) invited Brian D'Oliveira and his team to collaborate on the music and sounds. One important point Bridget made in the interview concerns composing 3D music: you have to be careful not to distract and confuse players. In other words, the music should remain distinguishable from the sound effects, which are placed in 3D space as well. In-game experience is, after all, about keeping navigation easy, so music that overlaps the sound effects badly could leave players lost in the gameplay.
For now, Eidos-Montréal has roughly built the game's environment and tested some music prototypes. Having learned the limits and laid the groundwork, the team is free to experiment and explore new ways of implementing the music. If you're curious about the project, you can find many more details and insights in the interview with Rob Bridget.