The Algorithm That Lets James Murphy Turn Tennis Matches Into Music

Each player’s movement is mapped to audio samples that the LCD Soundsystem frontman will remix and release.


The drama of a tennis match can be heard in the rhythmic sound it produces. The ball pops and whooshes, a player grunts, then come cheers. In its own way, it's almost musical. But what if you could turn the players' movements into actual, listenable songs? With a combination of data, algorithmic smarts, and human creativity, you can, at least if you're James Murphy.
Murphy, the electronic musician best known as the founder of LCD Soundsystem, teamed up with IBM, Ogilvy & Mather New York, and Tool of North America creative director Patrick Gunderson to take tennis matches from this week's U.S. Open tournament and turn them into over 400 hours of music. The finished product is a browser-based environment that lets you hear the game-generated songs and create your own by selecting different players and courts from the U.S. Open.

The songs probably aren’t something you’d leave on in the background at your desk. “I’m not writing music,” Murphy says in one of the promotional videos released by IBM. “I’m generating probabilities for music.”

For more digestible tunes, you’ll want to keep an eye out for the 14 remixes Murphy will be producing based on the sounds generated by the U.S. Open matches.

Gunderson and his team of developers built an algorithm that maps each player’s movement to audio samples and automatically composes electronic soundscapes. IBM is using these sonic matches to market its Cloud Orchestrator suite of tools, which ingests data about everything from serve speed to the number of volleys.

“Each match we’re scoring actually has people with little handhelds inputting all kinds of data for every point,” says Gunderson. “Data like: number of shots and serve speed. Was there a net approach? Was there an unforced error? Where was the player when they hit the shot?”

That data is then passed through a custom-built music sequencer that turns the data points into musical tones.
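As a rough illustration of that mapping step, here is a minimal JavaScript sketch. The field names (`serveSpeed`, `shotCount`, `unforcedError`) and the pitch and duration scales are assumptions for the example, not IBM's or Gunderson's actual schema:

```javascript
// Hypothetical sketch: turning one point's stats into musical parameters.
// The field names and numeric ranges here are illustrative assumptions.
function pointToTone(point) {
  // Map serve speed (assume roughly 80-140 mph) onto a MIDI pitch range (48-84).
  const clamped = Math.min(140, Math.max(80, point.serveSpeed));
  const pitch = Math.round(48 + ((clamped - 80) / 60) * 36);

  // Longer rallies get shorter, busier notes (floor of a 32nd-note-ish value).
  const duration = Math.max(0.125, 1 / point.shotCount);

  // An unforced error could flag a dissonant accent.
  return { pitch, duration, accent: Boolean(point.unforcedError) };
}

const tone = pointToTone({ serveSpeed: 120, shotCount: 8, unforcedError: false });
// A 120 mph serve in an 8-shot rally maps to pitch 72 with a short duration.
```

The point is only that each hand-entered statistic becomes a knob on the sound; the real system's mappings are Gunderson's own.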

"It doesn't work like a standard sequencer," Gunderson explains. "Instead of going in and setting where beats are, I devised a system that uses composite periodic functions to determine when notes are played."

To illustrate the concept, Gunderson cites the example of the moon and Earth traveling around the sun in a “spirograph” pattern. Imagine that when the moon and sun are at their farthest apart, a musical note is triggered. That’s essentially what’s happening with the more complex, less constant tennis match data. “Throw in a few more levels of cycle and things begin to become predictably and controllably random, while remaining based on a linear, repeatable function,” Gunderson says.

In addition to wrangling code (the project was built almost entirely in JavaScript), Gunderson's team built an interface for Murphy to use to tweak sounds using knobs and sliders, much like the traditional synthesizers that fill Murphy's studio. For inspiration, Gunderson spent time looking at Murphy's synthesizers and familiarizing himself with his creative process. The end result is a semi-skeuomorphic nod to physical instruments, with a fresh approach to how the buttons manipulate sound.

From now until the end of the U.S. Open on September 8, you can tune in to a live electronic soundtrack to each match. After that, the sounds will be archived online, and the remixes will be made available on iTunes, Spotify, and SoundCloud once Murphy has finished producing them.

About the author

John Paul Titlow is a writer at Fast Company focused on music and technology, among other things.