Therapy Studios sound designer and mixer Eddie Kim brought his know-how of spatial mapping and spatial audio to the immersive sound design for Chris Milk’s game-changing CGI-rendered 3-D virtual reality film Evolution of Verse, which recently premiered at the 2015 Sundance Film Festival. To complete the sound effects, Eddie worked closely with Jeff Anderson, Founder of 3Dio Sound and inventor of the Free Space Binaural Microphone.
For the latest installment of our ongoing STORYTELLERS series, we chatted to Eddie and Jeff about breaking new ground and changing the technological landscape of sound with VR.
EDDIE KIM & JEFF ANDERSON | THE AUDIO ARCHITECTS
How did you come to be involved with Evolution of Verse? Was this the first time you worked with Chris Milk? Give us some background.
Eddie: I’ve worked with Chris Milk on most of his projects that needed sound design. The first time was on his Kanye West video for “Jesus Walks.” We’re longtime friends from film school.
Jeff: I had previously worked with Chris Milk to help engineer Beck’s “Hello Again” – the 360-degree interactive concert experience in which Beck reworked David Bowie’s “Sound and Vision.” Collaborating with Chris and his team on the Beck experience was amazing. As a result, they asked me to help with their latest project. I also had the right tools for what they wanted to do, although I had not tried them at this scale before.
What was different about your approach to a VR project as opposed to a “regular” film project?
Jeff: I personally have not worked on “regular” film projects at a commercial level. Maybe this is an advantage for me? I’m approaching the binaural sound design for VR from a completely inventive and open perspective. Essentially, I use my own binaural software to create the 3-D audio soundscape so that it sounds as natural as possible. My goal is to make you feel like you are actually there.
Eddie: When sound designing and mixing for a traditional film, sound is placed in a space where the viewer’s focus is in one direction. In VR, sounds are designed for visuals that surround the viewer and are placed in spatial relation to wherever the viewer is looking. For instance, in Evolution of Verse, all the birds were sound designed and mixed in a similar fashion. Once the birds were placed in the software, the ones behind you became less apparent when your focus was forward and louder when you turned around.
Jeff: For this project, the approach was to make it sound like a train is actually about to run you over, and that thousands of birds are flying all around your head. The main difference from a sound design perspective is that you must think in terms of a full 3-D space for the sound design. That includes distance, direction, Doppler effect, reverb, and everything that makes the real world sound the way it does. Audio also must be considered from the perspective that the user can be “looking” in any direction at any time, and the audio must be directionally accurate. Otherwise, the immersive experience is diminished.
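To make the geometry Eddie and Jeff describe a little more concrete, here is a rough sketch in Python of how a single sound source’s level and pitch might be shaped by the listener’s head orientation, its distance, and its motion. It is only an illustration under simple assumptions – real binaural engines like Jeff’s use HRTFs rather than a bare gain curve – and every function name and constant here is hypothetical.

```python
# A minimal sketch (not the 3Dio engine) of how a point source's level and
# pitch might depend on the listener's head orientation and the source's
# distance and motion. All names and constants are illustrative.
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s

def directional_gain(listener_forward, to_source, rear_attenuation=0.5):
    """Attenuate sources behind the listener: full gain straight ahead,
    reduced gain directly behind, smoothly interpolated in between."""
    cos_angle = np.dot(listener_forward, to_source) / (
        np.linalg.norm(listener_forward) * np.linalg.norm(to_source))
    # Map cos_angle in [-1, 1] to a gain in [rear_attenuation, 1.0].
    return rear_attenuation + (1.0 - rear_attenuation) * (cos_angle + 1.0) / 2.0

def distance_gain(distance, reference=1.0):
    """Simple inverse-distance rolloff (1/r beyond the reference distance)."""
    return reference / max(distance, reference)

def doppler_ratio(radial_velocity):
    """Pitch-shift ratio for a source moving along the line to the listener.
    Positive radial_velocity means the source is approaching."""
    return SPEED_OF_SOUND / (SPEED_OF_SOUND - radial_velocity)

# Example: a bird 5 m behind a listener who is facing +Y.
listener_forward = np.array([0.0, 1.0, 0.0])
to_bird = np.array([0.0, -5.0, 0.0])
gain = directional_gain(listener_forward, to_bird) * distance_gain(np.linalg.norm(to_bird))
print(f"gain = {gain:.3f}")                        # quieter: behind the listener and 5 m away
print(f"pitch ratio = {doppler_ratio(10.0):.3f}")  # a source approaching at 10 m/s sounds higher
```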
What new technology did you use in the project?
Eddie: Jeff’s 3Dio software.
Jeff: I used my own binaural audio synthesis engine, which is similar to a binaural audio SDK for video game developers. But Evolution of Verse involved a highly customized version of the software, mainly because of the large number of sound sources required. We were able to simulate the motion of nearly all of the audio objects in the film to reinforce the deep sense of immersion.
Simulating the sound of thousands of birds is not computationally simple. Sure, there were a few tricks I used based on my experience with binaural audio, but I certainly wanted it to sound realistic. So it was a challenge pushing my software to its limits to achieve the large number of sound sources for this specific project.
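Jeff doesn’t say exactly which tricks he used, but one common way to spatialize thousands of sources without running out of processing headroom is to cluster nearby sources and render only the cluster centroids. The Python sketch below illustrates that idea under simple assumptions; the function name and cell size are ours, not 3Dio’s.

```python
# One common way to keep thousands of moving sources tractable (not
# necessarily the trick Jeff used): cluster sources that are close together,
# mix each cluster down, and spatialize only the cluster centroids.
import numpy as np

def cluster_sources(positions, cell_size=2.0):
    """Group source positions into cubic grid cells and return, for each
    occupied cell, the centroid and the number of sources it represents."""
    cells = {}
    for p in positions:
        key = tuple(np.floor(p / cell_size).astype(int))
        cells.setdefault(key, []).append(p)
    return [(np.mean(members, axis=0), len(members)) for members in cells.values()]

# Example: 5,000 birds scattered in a 40 m cube collapse to far fewer
# spatialized voices, each weighted by how many birds it stands in for.
rng = np.random.default_rng(0)
birds = rng.uniform(-20.0, 20.0, size=(5000, 3))
clusters = cluster_sources(birds, cell_size=4.0)
print(f"{len(birds)} sources -> {len(clusters)} spatialized voices")
```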

Therapy Studios sound designer and mixer Eddie Kim
Can you explain what each of your roles was separately, and how they integrated?
Eddie: I envisioned what each visual should sound like and designed accordingly. Then I re-recorded each element separately and gave Jeff the elements to incorporate into his proprietary software.
Jeff: Eddie would perfect the sound sources, and I would place them in the 3-D space. But sometimes we decided to make changes after hearing the sounds in the 3-D binaural space. We took turns on the headphones to evaluate.
With this kind of workflow, you really don’t know if you’ve perfected the sounds until you hear them in the binaural 3-D space, and with the film. It has to feel realistic during the viewing experience, so changes might be needed in any or all stages of the workflow. For example, the dragonfly sounds in the film are actual dragonfly sounds. That is what worked best.
How do you see audio technology changing in the future in response to VR?
Eddie: I think it’s the wave of the future, and more films will be presented this way. To meet the demand, audio workstations will be redesigned to handle VR mixing and sound design. It will be a union of track-based programs (an X-Y relationship) functioning in a full 3-D space (X-Y-Z).
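As a rough illustration of the X-Y-Z idea Eddie describes, the sketch below extends a familiar time-versus-value automation track to a track whose position in 3-D space is keyframed over time. The class and field names are hypothetical and do not come from any existing workstation.

```python
# A minimal sketch of extending a track's automation from time-vs-value (X-Y)
# to full position automation in 3-D space (X-Y-Z). Names are illustrative.
from dataclasses import dataclass
import numpy as np

@dataclass
class PositionKeyframe:
    time: float              # seconds
    position: np.ndarray     # (x, y, z) in metres, listener at the origin

class SpatialTrack:
    """An audio track whose position in 3-D space is automated over time."""
    def __init__(self, keyframes):
        self.keyframes = sorted(keyframes, key=lambda k: k.time)

    def position_at(self, t):
        """Linearly interpolate the track's position at time t."""
        ks = self.keyframes
        if t <= ks[0].time:
            return ks[0].position
        for a, b in zip(ks, ks[1:]):
            if t <= b.time:
                alpha = (t - a.time) / (b.time - a.time)
                return (1 - alpha) * a.position + alpha * b.position
        return ks[-1].position

# Example: a train that passes from far left-front to right-rear over 10 seconds.
train = SpatialTrack([
    PositionKeyframe(0.0, np.array([-30.0, 10.0, 0.0])),
    PositionKeyframe(5.0, np.array([0.0, 2.0, 0.0])),
    PositionKeyframe(10.0, np.array([30.0, -10.0, 0.0])),
])
print(train.position_at(2.5))   # halfway between the first two keyframes
```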
Jeff: I see customized tools being developed to make content creation easier and more efficient. Translating from 2-D to 3-D is a challenge with audio, and to date VR audio has not been given as much attention as VR graphics. I think we will see a lot of development in the near future around accurate 3-D binaural sound for VR, from capture and creation through to playback. Personally, I hope to see a lot of focus on binaural audio, and on its understanding and acceptance as the standard for VR.