Virtual Reality applications are getting closer and closer to end users, and music production is starting to take advantage of this technology. It is now possible to offer the public not only images that can be viewed on 360º devices, but also sound that responds dynamically to changes of position.
Andrés Mayo, former AES President, was responsible for the Virtual Reality mix of a concert by outstanding Argentine musician Pedro Aznar (a former member of Pat Metheny’s band). The performance took place recently in his home country and was specially recorded to be released in a VR environment. Andrés is also currently working on the VR mix of a recent concert by Oscar winner Gustavo Santaolalla, performed in Buenos Aires.
In this interview, Andrés tells us about these projects, comments on the differences between Virtual Reality, 360º and Surround projects, and gives his opinion about the future of this technology.
ZioGiorgio.es: What is the difference between 360º, Surround and VR audio projects?
Andrés Mayo: A project can contain audio in 360º, but this alone cannot be considered Virtual Reality audio. To qualify, it must be produced specifically to be experienced on VR helmets, commonly called HMDs (Head Mounted Displays). For example, audio mixed in 360 degrees with Dolby Atmos technology, which was initially intended for cinema, wouldn’t be considered Virtual Reality. Surround audio projects can be prepared in many different formats, from 5.1 up to 22.2, and all are considered Surround. Even quadraphonic sound is a form of surround, but it is far from being 360 degrees.
ZioGiorgio.es: At what stage of development are projects put forward for VR?
Andrés Mayo: Where more developed productions are taking place, such as on the West Coast of the United States, most VR projects already incorporate audio in 360 degrees. In Latin America we are just starting. In fact, Pedro Aznar’s concert is the first produced in this format in the whole region, and the first to be officially approved by Dolby.
ZioGiorgio.es: Can you tell us a little more about this project? What will the public experience?
Andrés Mayo: The audience will be able to see the concert as if they were standing on the stage, using an HMD or Virtual Reality helmet. Ideally, I recommend experiencing it on an Oculus Rift. What is most interesting in this case is that, because the audio is mixed in 360º and thanks to the incorporation of head tracking technology, the helmet knows where you are looking and changes the audio playback in real time. This means that you hear each instrument exactly as it would arrive at you as the listener. For example, if I look at Pedro Aznar singing, I perceive that his voice comes from his mouth and the sound of his guitar comes from where the instrument is. But if I turn 180 degrees to see the audience, Pedro is located behind me and I hear him sing behind me, with the audience in front. Thus, the audio is 100% realistic and makes the VR experience much more immersive and credible. If you suddenly switch to a conventional stereo mix, it sounds flat and far less interesting; and since it no longer changes when you turn your head, the sensation of immersion disappears.
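The head-tracking behaviour Andrés describes can be sketched as a simple coordinate rotation: each source keeps a fixed azimuth in the scene, the head yaw is subtracted from it, and the result drives the panner. The function names and the crude front-hemisphere stereo panner below are illustrative assumptions, not part of any Dolby tool:

```python
import math

def apparent_azimuth(source_az_deg: float, head_yaw_deg: float) -> float:
    """Azimuth of a fixed source relative to the listener's current head yaw.
    0 deg = straight ahead, positive = to the listener's right."""
    return (source_az_deg - head_yaw_deg + 180.0) % 360.0 - 180.0

def constant_power_pan(az_deg: float) -> tuple[float, float]:
    """Very crude constant-power stereo gains for a front-hemisphere source."""
    # Map [-90 deg, +90 deg] onto a pan angle in [0, pi/2].
    pan = (max(-90.0, min(90.0, az_deg)) + 90.0) / 180.0 * math.pi / 2
    return math.cos(pan), math.sin(pan)  # (left gain, right gain)

# A singer mixed dead ahead (0 deg): turn the head 180 deg and the
# apparent azimuth becomes -180 deg, i.e. directly behind you.
print(apparent_azimuth(0.0, 180.0))  # -180.0
```

A real renderer would rotate a full Ambisonic sound field and convolve with HRTFs rather than pan per source, but the rotation step is the same idea.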
ZioGiorgio.es: What tools did you use in production?
Andrés Mayo: We worked together with engineer Martin Muscatello using a tool that is still in the test stage, called Dolby VR Suite. As Beta Tester of this tool, I was given access to versions not yet released commercially. This allowed us to greatly simplify the production process.
ZioGiorgio.es: Did you share the production with Pedro?
Andrés Mayo: The high quality 360º video recording was made by VRTIFY, the leader in content development for VR in Latin America. Martin and I mixed the 360º audio, and it was approved firstly by Pedro, and then by Dolby International, based in Burbank, California.
ZioGiorgio.es: Are you working on other VR audio projects?
Andrés Mayo: Yes – the 360 degree recording and mixing of the historical concert of Gustavo Santaolalla in Buenos Aires. At the show, he reviews his entire career, from the days of Arco Iris to Bajofondo. We recorded it together with VRTIFY. Audio engineer “Tatu” Estela was responsible for the audio recording of the concert and I am mixing it in my studio with Martin Muscatello, under the review of the artist and his partner, Anibal Kerpel.
ZioGiorgio.es: Do you consider that the audio for VR only finds a natural place in video games, or is there also a solid place for the application in concerts and other types of audio-visual productions?
Andrés Mayo: There is a lot to take advantage of in different applications, not just in games. Concerts are a great example, but there are also applications in sports, tourism, and other audiovisual content in general. 360º audio drives the VR experience: it does not merely follow what you look at; rather, it is the audio that makes you decide where you are going to look.
ZioGiorgio.es: What do you take into account when it comes to the processing resources of users’ equipment?
Andrés Mayo: We must try to make the application in which we are working as universal as possible so that the viewer can access the VR experience without worrying about processing limitations. This is achieved by working with compression margins, for video and audio, that allow the maximum enjoyment of the experience without overloading the player. It is a mixed technique, quite complex and also quite empirical, based often on trial and error. The most complicated thing is that there are no standards of any kind, so we are in a “Wild West” state where it is still difficult to impose a viewing criterion that serves everyone equally.
ZioGiorgio.es: The sound in a VR project moves with the user. For example, if I walk through my house bouncing a tennis ball, the sound of the ball will change as I move from an outdoor environment to an enclosed space. How are these changes handled? What tools do you work with to achieve them?
Andrés Mayo: For now, what you have mentioned only happens in animation, not in video. With video, there is no possibility of moving through the virtual spaces, as the current technology doesn’t allow for it. What can be done is to build different virtual spaces, each with its own acoustic conditions (early reflections, reverb, dimensions and construction materials, etc.), which allows the listener to move from one environment to the next in an instant, automatically taking on the acoustic conditions of each space. There are plug-ins that allow you to define exactly what the parameters of each space should be, and also to add HRTFs (Head Related Transfer Functions), which define how each sound stimulus reaches the inner ear, taking into consideration whether the sound comes from above, below, in front or behind. The accuracy in directionality is astonishing, on the order of 5 to 10 degrees.
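The horizontal-plane directional cue that an HRTF encodes can be approximated with the classic interaural time difference. The sketch below uses Woodworth’s spherical-head formula; the head-radius constant and function name are illustrative assumptions, and a measured HRTF is still needed to resolve front/back confusion and elevation:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, in air at roughly 20 degrees C
HEAD_RADIUS = 0.0875    # m, a common average used in spherical-head models

def itd_seconds(azimuth_deg: float) -> float:
    """Interaural time difference via Woodworth's spherical-head formula.

    azimuth_deg: 0 = straight ahead, +90 = directly to the right.
    Valid for far-field sources in the front hemisphere (|azimuth| <= 90).
    """
    az = math.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (math.sin(az) + az)

# A source dead ahead gives no time difference; one at the side gives
# roughly 0.66 ms, close to the measured human maximum.
print(round(itd_seconds(0.0) * 1e6))   # 0 (microseconds)
print(round(itd_seconds(90.0) * 1e6))  # 656 (microseconds)
```

Arrival-time differences of this scale, combined with frequency-dependent level differences and pinna filtering, are what give HRTF rendering its few-degree directional accuracy.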
ZioGiorgio.es: From your perspective, could applications like “The Music Room” come to replace MIDI controllers in recording MIDI tracks in a professional environment?
Andrés Mayo: As always, I think a lot of users will try it and adopt it, but I really don’t think it will completely replace conventional MIDI use, at least for the next 5 years.
ZioGiorgio.es: What can users come to expect when it comes to VR? How long do you think it will be before it is as normal to use as it is today to listen to music or watch a video on a phone?
Andrés Mayo: I think some applications are going to be much more common than you might think in a very short space of time. This can especially be said for those that use augmented reality, which is already advancing in different types of appliances for everyday use. Pure VR applications, on the other hand, will have a slower inclusion rate as the new generation adopts them more naturally and also (especially) as the devices become smaller. But without a doubt, this is the beginning of a gigantic change in the way we perceive and enjoy the world around us.
© 2001 – 2017 NRG30 srl. All rights reserved