Man and His Machines

Forget sci-fi. What to really expect from artificial intelligence, today and tomorrow.

"Imagine a machine with a full range of human emotions," says Dr. Will Caster. "Its analytical power will be greater than the collective intelligence of every person in the history of the world." Caster isn't man or machine: He's the latest in a long line of Hollywood-created fictional characters (Johnny Depp plays him in the recent sci-fi movie Transcendence) designed to make you feel simultaneously elated and scared about the future.

While Hollywood projects what it thinks artificial intelligence will look like, we are already starting to coexist with actual learning machines. And instead of waiting for AI's Godot--a machine we can converse with--what we really need are ways to use machine intelligence to augment our ability to understand our increasingly data-rich and complex environment.

The world of information has surpassed human cognitive powers. More than 100,000 tweets and nearly 250,000 Instagram photos are shared every minute. Now add sensor data from accelerometers, gyroscopes, and the like. "It is not really just a human world," says Sean Gourley, cofounder and CTO of Quid, a data-insights company in San Francisco.

Think about what happens when you visit a website. It may take 800 milliseconds to load, but about 20% of that time is devoted to an algorithm that takes everything it knows about you, calls out to an online ad exchange, offers the ad slot to bidders, and serves the winning ad. All in about 160 milliseconds. No human could do that. Yes, that ad will likely annoy the hell out of you, but that doesn't mean we don't need machines to make correlations between concepts, sentiments, events, and entities and offer help in making decisions.
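To make that concrete, here is a minimal sketch of how such a real-time ad auction might be structured around a hard time budget. The bidder names, prices, and helper functions are all hypothetical, invented for illustration; real exchanges speak standardized protocols such as OpenRTB and price impressions far more elaborately.

    import time
    import random
    from dataclasses import dataclass

    @dataclass
    class Bid:
        bidder: str
        price_cpm: float  # price per thousand impressions, in dollars

    def collect_bids(user_profile, timeout_ms=100):
        """Ask placeholder bidders for offers while staying inside a time budget."""
        deadline = time.monotonic() + timeout_ms / 1000
        bids = []
        for bidder in ("dsp_a", "dsp_b", "dsp_c"):  # made-up demand-side platforms
            if time.monotonic() >= deadline:
                break  # late responders are simply ignored
            # A real bidder would price the impression from the profile; we fake it.
            boost = 1.5 if "travel" in user_profile["interests"] else 1.0
            bids.append(Bid(bidder, round(random.uniform(0.5, 2.0) * boost, 2)))
        return bids

    def run_auction(user_profile):
        """Pick the highest bid and return the ad to serve (second-price rules omitted)."""
        bids = collect_bids(user_profile)
        if not bids:
            return None  # fall back to a house ad
        winner = max(bids, key=lambda b: b.price_cpm)
        return f"ad from {winner.bidder} at ${winner.price_cpm} CPM"

    profile = {"user_id": "abc123", "interests": ["travel", "photography"]}
    print(run_auction(profile))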

Google Now and Apple's Siri may each possess the rudimentary intelligence of a 6-month-old baby, but at least they're taking information around us and putting it into a narrative that's easily understandable to average folks like you and me. A less heralded but telling example is Jetpac, a San Francisco–based app startup. The company takes publicly available Instagram photos and analyzes them down to the pixel, along with any tags and location data, to create automatic city guides based on what it learns. Jetpac's algorithms can look at, say, the lower half of any face they detect in photos: if a lot of painted lips show up at a single location, it may be classified as a glamorous bar or nightclub; if there are a lot of mustaches, it might be dubbed a hipster bar. Jetpac can draw inferences from the photographers and their family names, and it can distinguish locals from tourists by seeing how often they check into places in a specific city. It can even tell leisure travelers from business travelers by their photos.
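The pixel-level computer vision is the hard part, but the venue-tagging step itself can be pictured as a simple counting heuristic. The sketch below assumes the detection labels (painted_lips, mustache) have already been produced by some image-analysis stage; the sample data, thresholds, and function names are invented for illustration and are not Jetpac's actual code.

    from collections import Counter, defaultdict

    # Hypothetical photo records: in reality the labels would come from a
    # vision pipeline run over public Instagram photos; here they are given.
    photos = [
        {"venue": "Bar Alpha", "labels": ["painted_lips", "cocktail"]},
        {"venue": "Bar Alpha", "labels": ["painted_lips"]},
        {"venue": "Cafe Beta", "labels": ["mustache", "latte_art"]},
        {"venue": "Cafe Beta", "labels": ["mustache"]},
        {"venue": "Cafe Beta", "labels": ["mustache", "bicycle"]},
    ]

    def classify_venues(photos, min_count=2):
        """Tag each venue by the visual cue that dominates its photos."""
        counts = defaultdict(Counter)
        for photo in photos:
            counts[photo["venue"]].update(photo["labels"])
        tags = {}
        for venue, labels in counts.items():
            if labels["painted_lips"] >= min_count:
                tags[venue] = "glamorous bar / nightclub"
            elif labels["mustache"] >= min_count:
                tags[venue] = "hipster bar"
            else:
                tags[venue] = "unclassified"
        return tags

    print(classify_venues(photos))
    # {'Bar Alpha': 'glamorous bar / nightclub', 'Cafe Beta': 'hipster bar'}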

As intriguing as an app like Jetpac is, the future beyond it looks something like Google's self-driving car. Although it's still many years from commercial production, Google's car is a good showcase for machines taking real-time data from our physical surroundings and mapping it to what's already stored in the machine. Instead of waiting for us to make sense of that data, the car makes intelligent decisions on its own. Eventually Google will take what it has learned from its car project and apply it to other information it knows about us, such as the data streams from our Android phones, Nest devices, and Google Glass, creating a hybrid man-machine experience.
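As a loose illustration of mapping real-time data onto what a machine already knows, the toy sketch below matches a live sensor detection against a stored map of landmarks and turns the match into a driving decision. Everything here, from the coordinates to the braking rule, is a made-up simplification rather than anything Google's system actually does.

    import math

    # A stored map of known landmarks (hypothetical coordinates, in meters).
    stored_map = {
        "stop_sign_42": (12.0, 3.5),
        "crosswalk_7": (30.0, 0.0),
    }

    def nearest_landmark(detection, landmarks):
        """Match a live detection to the closest landmark already on the map."""
        name, pos = min(landmarks.items(), key=lambda item: math.dist(detection, item[1]))
        return name, math.dist(detection, pos)

    def decide(detection, speed_mps):
        """Toy decision rule: brake when a detection lines up with a stop sign."""
        name, error = nearest_landmark(detection, stored_map)
        if name.startswith("stop_sign") and error < 1.0 and speed_mps > 0:
            return "brake"
        return "continue"

    print(decide((11.7, 3.4), speed_mps=8.0))  # -> brake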

This brings up many moral and sociological questions: What will happen to serendipity and the joy of discovery? If we as humans struggle with Google Glass, how will we fit into this society? As Transcendence asks, Is it all really worth it? That's one question that people, rather than a machine, will need to tackle.

[Illustration by Raymond Beisinger]
