Music + geeks = magic. As a self-proclaimed nerd and music fangrrl, it makes me really excited to see these two trains on a collision course in Boston. (Right? An unlikely suspect.) Within a few weeks, Boston hosted two different future-of-music events—a wild mix of artists and scientists. I got a chance to talk a little music nerdery with Jim Lucchese, the CEO of The Echo Nest and a big believer that Boston has just the right mix of tech chops and soul.
The fact is, music and data aren't that separate from each other. I've recently been cramming music theory alongside some self-taught guitar, so I see the light. The Echo Nest uses machine learning (think IBM's Watson) to turn everything that's said, sung and played on the web into a huge amount of data, or what the company calls its "Musical Brain." But the interesting part is that they choose to remain behind the scenes and leave it up to the genius of the tech world to decide what to do with that information. This approach makes a bunch of data, well, decidedly human.
"I don't see what we're doing as a man versus machine battle at all," said Lucchese. "We're capturing people's interactions with music—and making that understanding usable." He calls The Echo Nest a "palette of paints"—a combination of data on people's cultural understanding of music plus technical aspects like a song's tone, key, etc.—for developers to create literally whatever they want. This piece highlights a few examples.
So why remain behind the scenes? Are they the roadies of the musical data revolution?
"We're never going to say we're excellent marketers," said Lucchese. "We capitalize on what we do really, really well—and put that into the hands of companies like We Are Hunted to build engaging apps."
I say it's about time roadies got their due, too. This stuff is about to be huge. And I love that Boston is at the center of it all. Hear that, Silicon Valley? Everyone loves an underdog story.