Google Was Building Smart Speaker Prototypes Well Before Amazon Echo Showed Up

Google saw back in 2013 that natural language search (and other assistive services) should respond to ambient voice queries.

Google was working on a natural language device similar to Amazon’s Echo back in 2013, more than a year before Amazon’s breakthrough device was released.


During a meeting Wednesday at the Google I/O developer conference, two Google execs, VP of Google Assistant Scott Huffman and VP/GM of Home Products Rishi Chandra, told me that Google had already built prototypes of such a device back in 2013. In one, additional microphones were added to a tablet that had been outfitted with natural language software.

The device was designed to sit in a room and listen for a trigger word (“Hey Google”) from a user in the room, then deliver information via a speaker. Not that it worked very well. “I remember waking up one morning and hearing my kids screaming at it ‘Hey Google! Hey Google!’ and it wasn’t responding,” Huffman joked.

Chandra told me Google even had a joke prototype of a voice assistant device shaped like a large hockey puck; when you lifted off its lid, there was just a phone inside talking through its speaker.

The story of the prototypes came after I asked the two men whether the success of Amazon's Echo had influenced Google to release its own smart speaker (Google Home, in 2016). "Definitely," said Chandra. "Amazon made a huge bet on ambient voice and it paid off."

But Huffman and Chandra don't credit Amazon for the idea of a smart natural language speaker. Google's own smart speaker, Google Home, was announced a year ago at Google's developer event, and only went on sale six months ago.

You can look at Google Home as the second major step in Google’s vision for search, and search-powered assistant features.

The first step came to life in 2013 when Google started making text search more contextual and conversational. Google began supporting nested searches where people could make a series of requests to zero in on the desired search result. The search engine, for the first time, would understand that all these queries were related to the same subject matter. It was the beginning of natural language search.

With Home, Google allowed users to speak, not just type, requests in the back-and-forth fashion of natural language.

In fact, as my colleague Harry McCracken wrote for Time back in 2013, Google’s vision for search is to make it something like the computer in Star Trek or 2001. That is, the computer is always listening, always there to assist.

Here's Google's then-SVP of search Amit Singhal talking to Harry back in 2013:

“As a little child growing up in India, I watched way too much Star Trek. That’s the vision that stuck with me. You can talk to it naturally, you can ask it whatever you need to. It fades into the background. It’s just there for you,” Singhal said.

It looks like that vision is still alive at Google today, as it continues to sink lots of research and development dollars into natural language and machine learning technology. And it points directly at what may be a third major step for natural language search at the company: The smart speaker device might disappear into the walls. It might be a series of tiny microphones distributed around the house, each reporting back (wirelessly) to the Google Assistant’s brain.


Google, Samsung, and Microsoft have each followed Amazon with their own smart speaker products. Apple may join in the fun, too. For now, however, Amazon leads the pack with Echo, and shows no signs of giving ground. A research report earlier this month said Amazon has sold an estimated 10.7 million Echos since the device began shipping in June 2015. An eMarketer report says Amazon controls 70% of the smart speaker market, while Google has already grabbed a quarter of it in the six months since Home appeared on shelves.