Google Buys Its Way Into Home Automation And Adds Another Hairball To Privacy Debate

Some see disturbing possibilities in Google’s acquisition of connected-home startup Nest Labs. Others cringe at the idea of regulation based on them.


When Nest cofounder Matt Rogers announced yesterday that his home-automation company would soon become part of Google, he included a preemptive Q&A that, among other things, addressed potential privacy concerns. The final question in his blog post read: “Will Nest customer data be shared with Google?”

Public concern about the combination of Google’s massive advertising business and Nest’s perch on home walls took many forms. Will any Google+ user be able to control the temperature in your home (after all, they can already email you without knowing your address)? Will Cottonelle send an email to advertise its toilet paper after someone leaves the bathroom? How long before Nest’s Internet-connected thermostats and smoke detectors have mics to record conversations for Google?

Neither Google nor Nest answered questions like these, but Rogers addressed them indirectly. “Our privacy policy clearly limits the use of customer information to providing and improving Nest’s products and services,” his blog post says. “We’ve always taken privacy seriously, and this will not change.”

You’ll have to take his word for it, because aside from what the FTC deems to be deceptive or harmful business practices, Nest can change its policies about the data it collects inside your home however it wants. While there are laws that protect user data about children, finances, and health, there is no sweeping legislation in the United States that stipulates what companies can or cannot do with their users’ data. There’s a raging debate about whether this is a good or bad thing, and connected tech like Nest has added fuel to both sides.

With an estimated 50 billion connected objects coming online by 2020, some see good reason to put policies in place that regulate the new categories of data they will collect about the people who use those products. “The basic problem with the Internet of Things, unless privacy safeguards are established up front, is that users will lose control over the data they generate,” Marc Rotenberg, the president of the Electronic Privacy Information Center, told Fast Company in an email. Others see the emerging category as a perfect reason to block omnibus attempts to regulate user data. “If we spend all of our time living in fear of hypothetical worst-case scenarios, then the best-case scenarios will never come about,” says Adam Thierer, a senior research fellow at George Mason University’s Mercatus Center. “That’s the nature of how innovation works. You have to allow for risks and experimentation, and even accidents and failures, if you want to get progress.”

In an upcoming book, Thierer argues that the United States should allow technologies to develop before preemptively regulating them over privacy, security, or safety fears. “Only through trial and error and experimentation and gradual societal adaptation do we find out what technologies work and then which ones are actually harmful,” he says. At that point, class-action lawsuits, the FTC, industry self-regulation, and, if need be, congressional legislation can help prevent abuse.

He doesn’t believe the Internet of Things, and the big data innovations it could power, is at that tipping point. Dan Caprio, who spoke at an FTC conference about connected objects, agrees. “It’s too early to be thinking about regulation,” he says. “This is the beginning of the beginning. Business models are developing and evolving very quickly.”

Some legislators, however, would rather not wait for a technology to develop definite abuse patterns before acting to prevent them. As companies like Google and GM announced new connected-car initiatives at CES, for instance, members of the Senate were busy preparing a bill that, if passed, would give car owners control of the data collected about their movements through a “black box,” or event-data recorder (EDR). “The major concern is not what an EDR gathers now, but that future in-vehicle technologies will make it possible to virtually record and track a vehicle’s movement from point A to point B,” Thomas Kowalick, who wrote a book about the black boxes, told The New York Times.

Not all privacy concerns, of course, are directed at the future. Many consider ads featuring content from Facebook’s and Google’s users, for instance, to be a privacy violation. Path recently agreed to an $800,000 settlement with the FTC for mining its users’ address books, without permission, to find potential new customers. And then there’s all the data tech firms collected that ended up with the NSA. “This is not about regulating a company,” Rotenberg says. “It’s about protecting consumer privacy. Companies that don’t collect user data don’t have a problem.”

Rotenberg says he plans to return his recently purchased Nest thermostat.

Thierer, meanwhile, imagines how he might have used a Google-connected Nest while he was visiting the Consumer Electronics Show last week. Perhaps Google Now could have told him that it was freezing in Washington, D.C., and he could have turned down the heat. “Would that have been creepy?” he says. “To me it would have been helpful. So for everything that people regard as a negative, I can usually find a positive. And if there’s that balance there, then it should be left to individuals to decide for themselves how to decide that balance.”

About the author

Sarah Kessler is a senior writer at Fast Company, where she writes about the on-demand/gig/sharing "economies" and the future of work.