A New Vision For Sound: Chris Milk Breaks Down How “Beck: Hello, Again” Was Made

For Lincoln’s “Hello, Again” project, director Chris Milk and a team of experts created new sound and vision technology for a 360-degree rendition of “Sound and Vision,” all in the service of a more human interactive experience.


Modern interactive experiences are meant to be immersive, inviting people into a well-crafted world and handing over the narrative controls. But barriers to true immersion remain: a keyboard or mouse to navigate, and an audio track that usually squeaks out of tinny laptop speakers.


All that has changed with “Beck: Hello, Again,” an interactive concert that not only allows people to control what they’re seeing by moving their head, but also serves up an astounding 360-degree binaural audio experience that makes it feel as if you’ve been dropped right in the middle of the action.

Conceived by automaker Lincoln and agency HudsonRouge, and produced by, “Hello, Again” is a yearlong program that invites contemporary icons to transform classic works of art, fashion, film, and music into a fresh, new creation. Beck, the project’s first featured artist, reworked David Bowie’s “Sound and Vision,” while director Chris Milk set out to reimagine the concert experience in the process by placing a 160-plus-piece orchestra in a ring around the audience.

While those in the audience enjoyed the unique aural event, Milk (known for his interactive pieces like Arcade Fire’s The Wilderness Downtown and The Johnny Cash Project) also worked to replicate the moment online. The site managed to make viewers feel a part of a rare and intimate experience, evolved the mechanics of interactive navigation, and invented a brand-new head-shaped, 360-degree binaural recording device in the process. We asked Milk to break it down for us.


Co.Create: How did you first get involved in this project?

Chris Milk: The initial brief I was given was to find a way to reinvent an audience’s experience of live music, both at the event and when it’s broadcast online. Beck was already involved–though the song wasn’t locked down yet–and he was interested in working with a wide array of different musicians.

When you stand in a traditional audience you have a wall of amplified sound coming at you from one direction. Everyone’s familiar with that. I’ve played in bands myself, and sat on the floor photographing some of the greatest bands in the world while they rehearse: What’s always struck me is how different the sensory, especially auditory, experience is when you’re in the middle of the music with the musicians playing off each other around you. I wanted to find a way to unlock the intensity of that, to recreate that unique perspective, first for the hundreds of people who attended the concert, and eventually for a much larger online audience. It gives them access to an exclusive phonic experience, one usually reserved for the musicians themselves.

From there, my first challenge was figuring out how to execute that environment in a live venue. The idea I had was a sort of inverted theater-in-the-round approach. Beck is on the center stage rotating in one direction, the audience is sitting around him on a stage that rotates in the opposite direction, and a huge group of musicians of many disciplines are in a ring around the audience. Sitting in the crowd, the sounds of individual instruments and voices would be coming from every direction around you, changing all the time. I should point out that Willo Perron and Associates were the ones who helped take my initial idea for the staging and figure out a way to physically manifest it all, no small task.

What were you trying to achieve with this online experience?


Given the unique setup of the live event I was proposing, I wanted to create an experience that gave the user the closest possible perspective to being there, not just visually but becoming immersed in the music from an auditory standpoint. In some respects, the goal was to create something even better than being at the concert, because we give the user even greater freedom–you can stand on stage with Beck, or walk around the outside of his stage, or walk through the musicians–all behaviors that would have been severely frowned upon at the actual event.

Tell us a bit about the technical elements. Apparently you’ve invented (patent pending) a 360-degree binaural recording device…


I was already familiar with binaural recording and had played around with it a bit previously. It’s typically accomplished with a model head that has a microphone in each ear: The microphones then hear and record sounds like our own ears do, so when you listen back on normal headphones the sound is spatially in 3-D. Not only can you hear an instrument to the left or right of you but you can hear if it is in front of or behind you.

Those heads only record with the ears facing in one direction at a time, though. So if you have it facing a band, the sound you’ll hear upon playback will be as if you were standing in front of the band facing them. In essence, it’s the soundtrack to just one camera angle, and because I was recording the Beck performance with three 360-degree cameras, I needed the heads facing in every direction simultaneously. I needed the sound recorded as if you were facing both towards and away from the musicians, in all 360 degrees, all at once. The solution was a head with ears all around its circumference. I assumed the solution already existed. It didn’t, so I had to invent it.


So, how does it work?

It’s a multi-axis binaural recording head that was physically created through the advice and hard work of a multitude of talented people and companies. Jeffrey Anderson–who has his own incredible and affordable binaural recording system called “The Free Space”–helped us tremendously by testing various multi-ear configurations and providing us with endless advice on how to achieve the best binaural recordings. We also used his ear model in our finished head. Stopp, a post house in Sweden with an interactive division in L.A., built the interactive site along with DinahMoe, an interactive sound company also based in Sweden; they helped us test the head, figure out how many ears we really needed, and the best way to configure them. They also built the player tool, which allows the eight tracks of audio to play back in sync with the video and re-creates whatever binaural audio perspective you’re witnessing through the video.
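The article doesn’t detail how the player derives a binaural perspective from the eight ear tracks, but one plausible approach–sketched below purely as an illustration, with the ear layout, function names, and 90-degree ear offset all assumptions of mine, not details from the production–is to treat the eight ears as evenly spaced around the head and, for any viewing angle, build the left and right channels by crossfading the two physical ears nearest each virtual ear position:

```python
def ear_weights(angle_deg, n_ears=8):
    """Return (track index, weight) pairs for the two physical ears
    nearest a given angle, assuming ears evenly spaced every 360/n_ears
    degrees, with ear 0 at 0 degrees."""
    step = 360.0 / n_ears
    pos = (angle_deg % 360.0) / step
    i = int(pos) % n_ears
    frac = pos - int(pos)
    return [(i, 1.0 - frac), ((i + 1) % n_ears, frac)]

def binaural_mix(tracks, view_deg):
    """Mix n mono ear tracks (lists of samples) into a left/right pair
    for a given viewing direction, by linear crossfade between the two
    ears nearest each virtual ear position."""
    def mix(angle):
        out = [0.0] * len(tracks[0])
        for idx, w in ear_weights(angle, len(tracks)):
            for n, sample in enumerate(tracks[idx]):
                out[n] += w * sample
        return out
    # The listener's left ear points 90 degrees counter-clockwise of the
    # view direction; the right ear points 90 degrees clockwise.
    return mix(view_deg + 90.0), mix(view_deg - 90.0)
```

As the viewer pans the 360-degree video, re-running the mix with the new angle would smoothly hand the left and right channels off between adjacent ear recordings, which matches the described behavior of the perspective “following” the camera.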

Once we had the technical requirements figured out we brought on Alan Scott and Legacy Effects in L.A. to help us design the shell of the head. Legacy normally does amazing models and makeup effects for big Hollywood feature films. I really wanted something that was functional from the binaural recording standpoint, but also aesthetically intriguing. I wanted the head to lend itself to a larger narrative about how this moment in time was being captured. It was one of those magical coincidences that the song we ended up doing for this project was “Sound and Vision,” since the final head has sound for vision: ears for eyes.

What do you think some of its future uses might be?


As entertainment and storytelling move in the direction of more immersive environments, binaural sound will begin to play a larger and larger role in those experiences. My head is a way to capture live 360-binaural audio on location, but depending on the project, the subject matter, and the distribution model, there are a few different ways it could all be accomplished. Jeffrey Anderson is working on a software solution, for instance, which could make things even easier.

Now from sound on to vision… I understand the 360-degree camera was new as well.

The 360-camera that we used is a fantastic rig called 360Heros, invented by Michael Kintner. It was really a stroke of luck and some hard work by my producer Samantha Storr. We were researching online all the existing 360-video camera systems. I was a little disappointed because all of them had what’s called a “nadir hole,” which is the black circle you’re accustomed to seeing at the bottom or top of most 360-videos online. I was on a deep Google search dive when I found a guy who had posted extensively on 360-systems and seemed to be something of an expert. Samantha tracked him down and learned he was working for a brand-new company that hadn’t released their system yet. It didn’t have a nadir hole. He showed us a test and I knew these were the guys we should work with.

Michael’s 360Heros system uses six GoPro cameras to record in every direction. Because of the way he places the support structure to hold the rig, he’s able to visually erase the support later, as it sits in the parallax gap between two camera positions. It’s truly remarkable, and it was even more remarkable to be the first production to try out the technology.


Tell me a bit about the facial tracking controller. Were there any challenges with implementing this?

My goal was to create the most organic, human-centric, first-person perspective and experience I could, and when you’re in meatspace and you want to look up, you tilt your face up; you don’t use a mouse or a keyboard. The controls emulate, as best we can with current technology, the actual experience you would have in real life. So, if you enable your webcam you will be able to tilt and pan the 360-camera perspective simply by tilting and turning your head. Is it a perfect system? No. It’s an experiment for me, though. I would have liked it to be eyeball tracking, but the technology isn’t quite there yet. I expect the evolution of interface design will be more and more about emulating the real physical human actions required to do the virtual digital task–then the interface disappears.

I’ll let Zachary Richter from Stopp L.A. explain the mechanics of our particular facial tracking system…

Richter: We’re using a Flash framework called “Project Marilena,” which detects objects using ActionScript. Essentially, the framework captures the bitmap data from a user’s webcam and runs face detection on it, which tells us exactly where their face is within the webcam image frame. We then wrote some code that pairs that face position to the 360 footage, so when a user moves their head to the right within the webcam frame, we quickly detect that movement and the code tells the footage to move to the right at a calculated speed.
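The mapping Richter describes–face position in the webcam frame driving pan direction and speed–can be sketched as a small pure function. This is a minimal illustration of the idea only, not Stopp’s actual code (theirs was ActionScript inside Flash); the dead zone, maximum speed, and function name are all my assumptions:

```python
def pan_speed(face_x, frame_w, dead_zone=0.1, max_speed=90.0):
    """Map the detected face's horizontal center (face_x, in pixels within
    a frame of width frame_w) to a camera pan speed in degrees/second.
    Positive means pan right. A dead zone around center keeps a roughly
    centered head from drifting the view; outside it, speed scales
    linearly up to max_speed at the frame edge."""
    offset = (face_x / frame_w) - 0.5        # -0.5 .. 0.5, 0 = centered
    if abs(offset) < dead_zone:
        return 0.0
    sign = 1.0 if offset > 0 else -1.0
    scaled = (abs(offset) - dead_zone) / (0.5 - dead_zone)
    return sign * scaled * max_speed
```

Each video frame, the detector’s latest face box would feed this function and the result would nudge the 360-degree viewport; the same scheme extends to vertical tilt with the face’s y position.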


You’ve done a ton of work with interactive experiences… what does this project represent in terms of interactive storytelling? Were there things you learned in this experience that were “aha” moments?

Milk: My primary goal is always to tell a story that will resonate with people on a deeply emotional level. But what fascinates me, and what I’ve been experimenting with a lot lately, is figuring out how we can use modern developing technology to tell stories that feel more human and have more poignancy than was possible before that technology existed.

We don’t know what the established models of interactive storytelling will be in 100 years, just like the pioneers of cinema didn’t envision a 90-minute feature film with a three-act structure. We can only experiment, keep creating new canvases, keep painting new things on them. The best part about this rapidly evolving interactive canvas is that the viewer or listener or user isn’t a passive receiver anymore: They’re participating in the narrative; they’re co-creating the art. Look at web-based interactive films, video games, or virtual reality environments–all of them have resonance because they’re as much about what the participant says to the piece, as what the piece says to them.

Most of my projects are experimental on some level; I always discover or learn something that I carry forward. This Beck project for me was really a big experiment in sensory immersion. How can you create a surround sound and vision environment that can be distributed on a global scale using the digital tools and web technology as they exist today? Full virtual reality storytelling, though, where you live inside the narrative, is closer than we think. I took this opportunity that Lincoln provided me to test some of the audio-visual immersion theories I’ve been kicking around lately. The results of this experiment will surely be incorporated into the next one.

About the author

Rae Ann Fera is a writer with Co.Create whose specialty is covering the media, marketing, creative advertising, digital technology, and design fields. She was formerly the editor of ad industry publication Boards and has written for Huffington Post and Marketing Magazine.