“Your job, having been blindfolded and transported, is to figure out what place you’re at.”
That’s Foursquare senior vice president of enterprise and core technology Matt Kamen, issuing a challenge to visitors to the company’s New York office during a Fast Company Innovation Festival session on Monday afternoon. Audience members chimed in with a variety of ideas as to how they might ascertain their whereabouts, from using their sense of smell to asking a nearby stranger. The point of the exercise was less about right and wrong answers than about demonstrating the difficulty of the problem that Foursquare has been working on for the past nine years.
Foursquare is still best known for letting smartphone users check into restaurants, clubs, and other venues—an activity that was originally part of its namesake app and was spun off into a new one called Swarm in 2014. But a large component of the company’s current business is making its database of locations available for use in third-party apps, which has attracted more than 125,000 developers and customers ranging from Samsung to interactive jukebox maker TouchTunes.
“If you ask a person, ‘Hey, does your phone know where you are?,’ they’ll say yes,” Kamen said. However, knowing someone’s GPS coordinates is only a starting point for understanding location and creating app experiences built around it, a fact that a slide in his presentation to festival attendees summed up as “Congrats, you’re at 40.724210, -73.996937.” And an app that’s trying to determine location might not even have accurate GPS coordinates to work with, especially in locales such as New York City, where tall buildings interfere with satellite signals. Oftentimes, apps such as Uber will show a blue dot with a giant ring around it, an implicit acknowledgement that they have only a rough idea of your position on the map.
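To see why raw coordinates aren’t the whole story, consider what an app actually has to do with them: match a latitude/longitude pair against a database of known places. A minimal sketch of that lookup, using a great-circle distance formula and a tiny, entirely hypothetical venue list (not Foursquare’s actual data or method), might look like this:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinate pairs, in meters."""
    R = 6371000  # mean Earth radius in meters
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

# Hypothetical venue database: name -> (latitude, longitude)
venues = {
    "Coffee Shop A": (40.724300, -73.996800),
    "Bookstore B":   (40.725900, -73.998500),
    "Pizza Place C": (40.722100, -73.994200),
}

def nearest_venue(lat, lon):
    """Return the venue closest to the given coordinates."""
    return min(venues, key=lambda name: haversine_m(lat, lon, *venues[name]))

print(nearest_venue(40.724210, -73.996937))  # -> Coffee Shop A
```

Even this toy version exposes the hard part: with GPS error of tens of meters in a dense city, several venues can plausibly be “nearest,” which is exactly why coordinates alone aren’t an answer.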
Foursquare uses additional signals, such as the Wi-Fi networks visible from a user’s position, to pinpoint location more precisely than it could through GPS alone. More important, all the billions of check-ins the company has collected give it an understanding of location that goes beyond coordinates. “We did a thing with phones that nobody else had done,” Kamen said. “We mapped the world through check-ins and created a phone’s-eye view of the world.”
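One common way to use visible Wi-Fi networks as a location signal is fingerprint matching: compare the set of networks a phone currently sees against the sets previously observed at known venues. The sketch below illustrates the general idea with set overlap (Jaccard similarity) and made-up identifiers; it is an assumption-laden toy, not a description of Foursquare’s actual pipeline:

```python
# Hypothetical fingerprints: venue -> set of Wi-Fi network IDs observed there
fingerprints = {
    "Cafe Uptown":  {"aa:01", "aa:02", "bb:07"},
    "Gym Downtown": {"cc:11", "cc:12"},
}

def jaccard(a, b):
    """Overlap between two sets: |intersection| / |union|."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0

def best_match(visible):
    """Return the venue whose stored fingerprint best overlaps the visible networks."""
    return max(fingerprints, key=lambda v: jaccard(visible, fingerprints[v]))

print(best_match({"aa:01", "bb:07", "dd:99"}))  # -> Cafe Uptown
```

Because Wi-Fi fingerprints are tied to specific rooms and floors rather than to satellite geometry, this kind of matching can resolve places that look identical from GPS coordinates alone.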
However, most of that value lies not in raw data, but in the understanding that the company can gain by analyzing it, sometimes with the use of machine-learning technology. For instance, Foursquare doesn’t just have a rough idea that peak business hours for fried-chicken joints tend to be different from those of airports–it knows the precise details because it’s collected data on fried-chicken joints and airports around the world.
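The peak-hours example above is, at its core, an aggregation over timestamped check-ins grouped by venue category. A minimal sketch of that kind of analysis, on a fabricated six-row log (illustrative only, nothing like Foursquare’s real data volume or tooling):

```python
from collections import Counter
from datetime import datetime

# Hypothetical check-in log: (venue category, ISO 8601 timestamp)
checkins = [
    ("fried_chicken", "2024-05-01T12:15:00"),
    ("fried_chicken", "2024-05-01T12:40:00"),
    ("fried_chicken", "2024-05-02T19:05:00"),
    ("airport",       "2024-05-01T06:30:00"),
    ("airport",       "2024-05-01T06:55:00"),
    ("airport",       "2024-05-02T17:10:00"),
]

def peak_hour(category):
    """Return the hour of day with the most check-ins for a category."""
    hours = Counter(
        datetime.fromisoformat(ts).hour
        for cat, ts in checkins
        if cat == category
    )
    return hours.most_common(1)[0][0]

print(peak_hour("fried_chicken"))  # -> 12
print(peak_hour("airport"))        # -> 6
```

At scale, the same grouping logic, run over billions of check-ins worldwide, is what turns raw coordinates and timestamps into the category-level patterns Kamen describes.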
For everything that Foursquare has learned, Kamen says that location is still only a partially solved problem. For one thing, smartphones still understand horizontal position a lot better than they do vertical position–an issue given that many businesses aren’t on the ground floor. And machine learning is still a rapidly developing field, one whose advances should give the company deeper insight into its data in the years to come.
Referencing Victor Hugo’s declaration that “Nothing is as powerful as an idea whose time has come,” Kamen wound up his talk by arguing that an understanding of location is key to the experiences currently being built by an array of industries, from retail to healthcare. “Coming to work every day is continuing to be on the frontlines of this idea,” he said.