Google Glass's Unexpected Lessons In Product Launching

Google's Leila Takayama talks about what her team learned by letting Google Glass loose in the real world.

Given the level of hype around Google Glass—which recently announced a fashion-forward redesign—it seems nearly every angle of this piece of technology has been explored. And yet there’s been less focus on the variety of people actually using the technology, and on what they do with it. Leila Takayama, one of Fast Company’s newly announced MCP 1000 honorees, last year joined Google[x], the company’s research arm, where she is a senior user experience researcher who has recently been focused on the development of Google Glass.

Fast Company recently spoke with Takayama about what she's learned from watching Glass users in the wild—lessons that may apply equally to any technology aimed at a diverse, general audience.

Leila Takayama [Image: Flickr user Pop Tech]


The Google Glass team takes its product for a spin in various parts of the country to see how people react to it in real life. "The first surprise was who walked in the door," says Takayama. "We were expecting the people to be extremely tech savvy—to kind of look like us Googlers. But what we actually saw were people who were much more diverse. They were people who were just sort of curious: ‘Hey, here’s a line outside a building—let’s check out what’s going on in there.'" It’s important, says Takayama, not to "develop in a conference room."


As soon as Glass got out into the world, Takayama and her colleagues were surprised to see its manifold uses. Glass had been viewed mainly as a utility for the busy and on-the-go, a kind of on-face extension of the smartphone. But many uses were more casual: "The camera was often used for capturing moments that would otherwise be hard to capture, like when a toddler needs help to walk," says Takayama. A whole new genre of pet video emerged—one in which the user had both hands free to play with and prompt their cat, dog, or hamster. Glass had more of a future in the domestic setting than many had initially imagined.


Takayama and her colleagues were also surprised to find that markets already served by analogous products were nonetheless taking an interest in Glass. Take surgeons, who often operate with a camera trained on the site of the incision—even they were finding uses for the device, particularly when it came to education. "When you can see an operation from the surgeon’s point of view, it’s different from having a camera mounted at the site of the incision. Just being able to share the world in the way that you see it—Glass is really good for that." The bounciness of the video becomes the point—because the experience of actually operating on someone does, in fact, involve swiftly changing points of view.


Likewise, Google had assumed that fans of on-helmet cameras—there’s a cottage industry of them, for extreme-sports types who like to broadcast their adventures—would have no use for Glass. That market, Google figured, was already served. These cameras are often "super waterproof, super robust," says Takayama—surely a specialized market that Google couldn’t crack. "You’d figure that niche industries that needed a heads-up display would already have it," is the way Takayama puts it—and the extreme-sports community seemed like a case in point. "Yet many are adopting it for some reason," says Takayama, much to her surprise.


In a sense, Google Glass’s "users" include the people around the person wearing it, and their reactions matter. This was a lesson Takayama learned early, when studying telepresence robots: "if you make a telepresence robot taller, more people are persuaded by you" (something Takayama, who counts herself as short, found particularly useful). As with a rolling avatar of yourself, so with the funny camera-looking thing affixed to your glasses: "With any of these computing technologies, where they’re sort of mediating your experience in the world, it’s going to change our interactions with each other," she notes.


Okay, Vogue editor Anna Wintour may not be trading in her famed shades and bob for a pair of Google’s eyewear just yet. But in August, her magazine did indeed feature a photo spread on Google Glass and a "futuristic vision of fashion." Here, Takayama admits to being a little out of her depth, but as computers become wearable—and subject to all the trends and forces that "wearability" implies—the pages of fashion magazines become newly relevant to the field of "human-computer interaction." Takayama says she’s not a fashionista herself, "but we do work with people who are, for sure. When I’m sitting in meetings with them, I feel like I can’t compete." If you thought iPhones came out with infuriating regularity, just think of the whims of the stylish. "They know what colors are coming up next season, and I have no clue!" says Takayama. That’s one thing she didn’t learn when getting her Ph.D. at Stanford.