
Technology

Does Privacy on Facebook, Google, and Twitter Even Matter?

Farhad Manjoo discovers the problem with Web privacy — and it's us.

Illustration by Frank Chimero

What was Google thinking? When the search company launched Buzz a couple of months ago, engineers came up with what they thought was a clever way to kick-start the new social-networking service: They would build Buzz directly into Gmail. Kick-starter number two: Buzz would mine your contacts so you didn't land in a sterile service all by your lonesome (which is what happened with Google Wave). There was only one problem with that plan. Buzz showed the world a list of the people you emailed most often. Apparently no one at Google considered that gaping flaw. What's more, the service was public by default, so anything you said on Buzz would be visible to everyone. Cue Internet kerfuffle, played out mostly over fuming Facebook and Twitter posts. Which was hilarious, because everything you say on Twitter is public by default and the first thing you do after joining Facebook is hand over your email user names and passwords so it can build your social network. Perhaps Google was thinking that we wouldn't mind.

Every few months the Web angrily flares up over some supposedly new invasion of our privacy — the Buzz imbroglio, say, or Facebook's decision to make its users' profile data public by default. Do these companies screw up sometimes? Sure. And they usually move quickly to console users and correct any shortcomings. The real problem may be far less easy to write off in 140 characters: It's all our fault.

These infrequent privacy blowups are actually a sideshow to a much bigger trend. We don't give a flying tweet about privacy. If we did, why would we willingly geotag photos, tell friends when we're at our favorite restaurant, and reveal so many other once-private details of our lives? Run into the rare Flickr photo restricted to friends and family, or a private Twitter account, and only one thought comes to mind: This person doesn't get it. If we truly cared deeply about preserving a private sphere, none of these phenomenally popular Web services could exist.

Google's mistake — and it's the tell-tale wrench in every privacy dustup — was forgetting what we'll call the paradox of privacy. We want some semblance of control over our personal data, even if we likely can't be bothered to manage it.

In fact, what has to be most galling to Google in hindsight is that if it had followed its own example, it could have avoided being a privacy piñata. Last year, the company made its first foray into "behavioral advertising" — an effort to target ads to people based on their long-term Web-surfing habits. Google was late to this party; most other ad networks already relied on lucrative behavioral ads, but Google had delayed its effort for fear of a privacy firestorm.

When it finally unveiled behavioral ads, Google added a brilliant option to the program: a control panel that tells you why each ad was targeted to you, lets you change the kinds of ads you're shown, and lets you opt out entirely.

The gambit worked. Google heard hardly a peep from regulators or users when targeted ads went live. That wasn't because lots of people chose to opt out of targeted ads. According to Mike Yang, Google's managing product counsel, only a tiny fraction of users, numbering in the "tens of thousands," visit the Ads Preferences Manager each week. And for every 15 people who land on the page, 10 leave their settings unchanged and 4 change only the kinds of ads they're shown. That leaves just one curmudgeonly user in 15 who actually turns off behavioral ads.

The lesson here is striking: Control matters. Privacy doesn't. And as long as we're secure in the knowledge that whatever cool new Web toy comes along can be turned off, we're fine letting the world peer deeper and deeper into our lives.

A version of this article appeared in the May 2010 issue of Fast Company magazine.