On a recent Friday afternoon, I was riding the San Francisco subway when the doors opened at a station and Mark Johnson stepped into the car. It was a pleasant surprise to see him . . . but not that much of a surprise. Over the years, Johnson has worked at a bunch of startups I’ve covered, including Kosmix (acquired by Walmart), Powerset (later one of the foundations of Microsoft’s Bing) and Zite (which was bought by CNN, which later flipped it to Flipboard). He’s the sort of guy I think of when I think of Bay Area startups.
As we rode a couple of stops together, I asked Johnson what he had been up to since he left CNN, where he had ended up after the Zite acquisition, last year. He told me that he’d moved to Los Alamos, New Mexico, to start a company. That was startling.
The company in question is Descartes Labs, and there’s a very good reason why it’s in Los Alamos. It aims to commercialize image-recognition technology developed at the Los Alamos National Laboratory (LANL) under the supervision of Steven Brumby, Johnson’s cofounder, who was accompanying him on his subway trip.
Johnson’s previous gigs were at consumer technology companies. At first blush, Descartes is a wildly different sort of business, and maybe–as long as we’re being superficial–a drier-sounding one. Its goals include helping governments and companies understand environmental and agricultural trends by analyzing decades’ worth of satellite images.
Then again, all of Johnson’s jobs have been at companies that aimed to give computers a deep understanding of big data. Descartes is doing the same. It’s just that its ambitions involve deeper understanding, bigger data, and more profound long-term implications than ever before.
The idea that became Descartes Labs originated with Brumby, and is a continuation of his life’s work. A native of Australia, he received a PhD in particle physics from the University of Melbourne and then came to the U.S. for a logical reason: “Australia has no space program, and I grew up wanting to do space stuff.”
That interest led to an interest in looking at the Earth from space and, more broadly, teaching machines to understand images. Thanks to ongoing artificial-intelligence breakthroughs, “brain-inspired algorithms had the potential to revolutionize computer vision,” Brumby says. “There was the potential for this qualitative advance in the computer’s vision to see.”
At the Los Alamos National Laboratory, which he joined in 1998, Brumby co-invented GENIE, image-analysis software that was capable of identifying elements such as water and beaches in satellite photos. Later, he developed a neuroscience-based deep learning algorithm with applications in social media, satellite imagery, surveillance video, and other areas.
Brumby can’t talk about much of the work he did at LANL, though. The Laboratory opened in 1943 and remains best known for its founding effort, the Manhattan Project. As stated on its website, its present-day mission is “to solve national security challenges through scientific excellence.” Those challenges include nuclear threats, terrorism, and cyberattacks; by their nature, they tend to lead to research that remains classified.
But Brumby does provide a couple of examples. “On September 12, 2001,” he says, “I was analyzing satellite images, trying to figure out how the plumes of toxic stuff coming out of the World Trade Center site were blowing over Manhattan and Queens.” Later, “when Katrina happened, we were looking at the Gulf Coast analyzing how much of it had been destroyed.”
As Brumby and his LANL colleagues were exploring deep learning’s power to help computers understand photographs, plenty of other researchers elsewhere were making progress on related research. As they did, the science started looking less experimental and more like a business opportunity.
“About a year ago, the Googles and Facebooks of the world started to recruit the deep learning experts out of academia, to turn this technology into something commercial,” Brumby explains. “At some point, the technology has to come out of the lab and go into the real world.”
Brumby began to think about building a company around the work he had been doing at LANL. He consulted with Francine Sommer, a venture capitalist based in Santa Fe. They concluded that he should partner with someone from Silicon Valley with experience in data-intense startups. That led them to Johnson, who was contemplating his options after leaving Zite.
“I almost didn’t take the call, because I was skeptical of video search in general and especially from a national lab,” Johnson says. “After talking to Steven on Skype for almost an hour and a half, I was so impressed that I booked a ticket to Santa Fe for a few days later. Steven and I clicked, I met the rest of the team, and the rest is history.” Johnson became Descartes’ cofounder and CEO, with Brumby as cofounder and CTO.
The work that Brumby had been doing at LANL gave Descartes a head start of a sort that built-from-scratch startups do not enjoy. For one thing, the Lab had already invested $15 million in developing the technology. For another, six of Descartes’ nine employees are former LANL staffers, and a seventh spent time there as a student.
Of course, declaring that you’re founding a startup to do image recognition hardly narrows things down at all. Descartes almost ended up concentrating its efforts on technology for analyzing photographs from social media feeds. Brumby had conducted such research at LANL; it’s such a logical application of image-recognition technology that several other startups are pursuing it, such as Ditto Labs, which helps companies spot their products in photos posted by consumers.
Ultimately, though, Brumby and Johnson decided to zig while others were zagging. Descartes may tackle a variety of image-recognition challenges over time, but for now, it’s analyzing satellite photographs.
“The first thing we’re really focused on is how you can look at natural resources from space,” says Brumby. But it’s not about teaching computers to analyze satellite imagery as competently as an experienced human being. Properly trained, a computer might be able to do the job far better.
“What we’re really interested in is stuff where if you look at the data, a human couldn’t even tell what they’re looking at,” says Brumby. Descartes’ technology can analyze “spectral data and types of light invisible to the human eye, and understand it through long time sequences at global scale.”
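To make the “light invisible to the human eye” point concrete: a standard remote-sensing calculation (a textbook example, not necessarily what Descartes itself computes) is the Normalized Difference Vegetation Index, which combines a visible red band with a near-infrared band to reveal healthy vegetation that looks unremarkable in an ordinary photograph. A minimal sketch in Python with NumPy, using made-up reflectance values:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index.

    Exploits the near-infrared band, invisible to the human eye but
    strongly reflected by healthy plants. Values near +1 suggest dense
    vegetation; values near 0 suggest bare soil or pavement.
    """
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + 1e-9)  # epsilon avoids division by zero

# Toy 2x2 scene: the top row has high NIR reflectance (vegetation),
# the bottom row does not (bare ground). These numbers are illustrative.
nir_band = np.array([[0.50, 0.45], [0.10, 0.08]])
red_band = np.array([[0.08, 0.10], [0.09, 0.07]])
print(ndvi(nir_band, red_band))
```

Run over a long time series of such frames, per-pixel indices like this are what let a machine track crop health or drought across seasons, exactly the kind of signal a human looking at raw pixel values could never eyeball.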
The implications of this understanding could be huge. One example: “We’re looking at showing people what crops are being grown where, to a high degree of accuracy,” says Johnson. “As we live in a time of drought, agriculture is going to have to change.”
“There’s a bunch of interesting questions relating to cities that we think are important for people to understand,” adds Brumby. Such as why slum areas develop and whether growth happens through uncontrolled sprawl or via thoughtful central planning. Descartes could turn satellite images into helpful intelligence on such matters.
When Brumby and Johnson explained their technology’s governmental origins, I wondered if it could lead to a windfall for LANL if Descartes really takes off–a bit like what happened when Stanford University was awarded 1.7 million shares of Google stock for allowing Larry Page and Sergey Brin to turn their academic research project into a company.
Nope. Descartes is licensing the technology that Brumby and his colleagues developed, but Johnson told me that if Descartes succeeds at doing something useful with it, LANL considers that reward enough. The Laboratory’s research “is funded by the taxpayers, and when security is no longer a concern, they try to transfer it out for the general good of the public,” he says.
By deciding to analyze satellite imagery, Descartes narrowed its focus. But it didn’t choose a small, baby-steps sort of project. Its data set includes decades’ worth of photographs spanning the entire planet; it’s tough to think of another image-recognition project that involves recognizing so much imagery.
The numbers associated with the project tell the story. The satellite photos that Descartes has processed so far encompass 10 billion square kilometers of land, an amount of territory a thousand times the size of the U.S. They contain the equivalent of 50 million one-megapixel photographs’ worth of pixels, and take up 700 terabytes of storage.
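Those figures hang together arithmetically. A quick back-of-the-envelope check, treating the quoted numbers as approximate:

```python
# Back-of-the-envelope check of the quoted figures (all values approximate).
land_km2 = 10e9                # 10 billion square kilometers processed
us_km2 = 9.8e6                 # land area of the U.S., roughly 9.8M km^2
print(land_km2 / us_km2)       # roughly 1,000x the size of the U.S.

pixels = 50e6 * 1e6            # 50 million one-megapixel photos' worth
storage_bytes = 700e12         # 700 terabytes
print(storage_bytes / pixels)  # roughly 14 bytes per pixel
```

Fourteen bytes per pixel is far more than an ordinary photo needs; it’s consistent with each pixel carrying several spectral bands rather than a single color value, though that reading is my inference, not the company’s statement.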
So far, the company has converted the equivalent of 20 Earths from satellite photography into maps. But that’s only 5% of the satellite imagery that NASA has collected over the past 35 years. Descartes intends to process all of it. Once it’s finished, it says that it will have what amounts to a 400-frame video of the Earth since 1980–with each frame containing 4.5 billion (yes, billion) pixels of data.
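These numbers also cross-check against one another: 20 Earths at 5% done implies about 400 Earths in total, which lines up neatly with the 400 frames, roughly one per month of archive since 1980. A rough sketch (the assumption that each frame covers only the planet’s land area is mine, not the company’s):

```python
# Cross-checking the quoted figures against one another (approximate values).
earths_done = 20
fraction_done = 0.05
print(earths_done / fraction_done)   # ~400 Earths in total = ~400 frames

years = 35                           # 1980 to the mid-2010s
frames = 400
print(frames / (years * 12))         # ~1 frame per month of archive

pixels_per_frame = 4.5e9
land_km2 = 149e6                     # Earth's land area, ~149M km^2
km2_per_pixel = land_km2 / pixels_per_frame
print(km2_per_pixel ** 0.5 * 1000)   # pixel side in meters, if land-only
```

Under that land-only assumption, each pixel would span roughly 180 meters on a side, coarse by spy-satellite standards, but plenty to watch crops, coastlines, and cities change over 35 years.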
Crunching all that information is the sort of undertaking that once required a supercomputer. Thanks to ultra-powerful machines such as IBM’s Roadrunner and Cray’s Cielo, LANL was one of the few outfits equipped to do it. And it kept tight control over the process. “With a national lab, you have legitimate security concerns and interests, and therefore a whole lot of rules about how to use the computers,” Brumby says.
For Descartes, though, the situation is radically different. Over the past few years, companies such as Amazon and Google have begun offering Internet-based computing-on-demand services that provide startups such as Descartes with access to tremendous amounts of computing power on a pay-as-you-go basis. Unlike LANL, Descartes can outsource the heavy lifting to third parties, and only pay for as much computational muscle as it needs. Effectively, the cloud is its supercomputer.
Startups are by no means unknown in New Mexico. (In fact, venture capitalist Sommer, who helped make the connection between Brumby and Johnson, lives there so she can invest in them.) But it’s safe to say that Los Alamos is not San Francisco or Palo Alto, where entrepreneurs are among their own kind and there’s nothing remarkable about running into a founder on the street, in a coffee shop, or aboard public transportation.
“A bit of isolation is not a bad thing,” claims Johnson, who argues that it’s easier for people building a company to go into a productive, heads-down mode if they’re free of the distractions of a startup-centric community. He adds that the area offers some unique benefits, such as the presence of the Santa Fe Institute, a scientific research organization founded by former LANL employees. “There’s a surprising number of resources here if you just know where to look for them.”
Still, the resources of the Bay Area startup scene are available when Descartes needs them; they’re just farther away. The company already has a small office there, and plans to do more hiring. And when I ran into Johnson and Brumby on the subway, it was because they’d traveled to San Francisco for a board meeting.
Descartes’ short-term challenge, as Johnson puts it, is “turning the science into a product and turning the product into a business.” What’s next involves more refinement: The company is still in the process of “understanding the market we’re attacking, and helping the scientists to guide that science towards the problems we’d like to solve.”
That’s not to say that Descartes’ founders think the potential for its technology is restricted to the applications it’s currently pursuing. Before we’re done talking, in fact, Brumby riffs on the possibility that analysis of the sort the company performs could eventually run directly on everything from consumer gadgets to robots, using ever more powerful processors, so they can not only see, but understand what they’re seeing.
“It’s cheap to put little camera systems in everything,” he muses. “The problem is, what do you do with the data coming from all those cameras?”
High-end image recognition was once the province of well-funded research labs equipped with supercomputers. Today, a tiny startup with world-class scientific expertise and access to the cloud, such as Descartes, can leverage it for specialized purposes. If the technology keeps on progressing–and there’s no reason to think it won’t–it could end up everywhere there’s a camera.
“In just four months, we’ve come a long way,” Johnson says. Given enough time, there doesn’t seem to be any limit to where this vision of machine vision could go from here.