
Privacy’s not dead—it’s just not evenly distributed

Digital privacy is about far more than our personal data.


[Illustrations: rosadu/iStock; cundra/iStock]

By Alex Pasternack | 6 minute read

This story is part of The Privacy Divide, a series that explores the fault lines and disparities–economic, cultural, philosophical–that have developed around digital privacy and its impact on society.


It’s been almost six years since an NSA whistleblower exposed the reach of government surveillance and a year since a Cambridge Analytica whistleblower unveiled the corporate side of the surveillance coin, and the risks of something as simple as an online quiz. But the public has been hearing and sounding alarms about digital privacy for an eternity. “Issues about violating people’s privacy don’t seem to be surmountable,” Mark Zuckerberg told the Harvard Crimson way back in 2003, after his Hot or Not-esque Facemash website, built from photos he scraped in bulk from dormitories’ online “face books,” caused a campus uproar.

Three months later, thefacebook.com launched. By 2010, he was declaring digital privacy essentially extinct.

In the face of all the data abuse, many of us have, quite reasonably, thrown up our hands. Some have even parroted Zuckerberg and the other big tech executives who have heralded the end of privacy. But privacy didn’t die. It didn’t go away. It’s just been beaten up, sold, obscured, diffused unevenly across society. To mutate William Gibson’s quote about the future and its unevenness and inequalities, privacy is alive, it’s just not evenly distributed.

Like the disparate impacts of surveillance, what privacy is and why it matters has always depended on who you are: your age, your income, gender, ethnicity, where you’re from, where you live, what you value.

And yet, even if we don’t all define or care about privacy the same way, we can mostly agree on what it looks like when it’s in jeopardy.

Privacy is personal. It’s about the creepy feeling that our phones are literally listening even if they’re not, and the numbing feeling from an endless parade of data breaches that tests our ability to care anymore. It’s the unsettling sense of having given “consent” without knowing what that means, of having “agreed” to contracts we didn’t read with companies we don’t really trust. (Forget about understanding all the details; researchers have shown that most privacy policies surpass the reading level of the average person.)

Privacy is the annoyance of being bothered and the pain of being violated. The concept of modern privacy is often traced to an 1890 law review paper by Samuel Warren and Louis Brandeis, who were frustrated by the loss of privacy the elite had suffered at the hands of journalists and gossip columnists. The right to privacy, they wrote, was primarily “the right to be let alone.” These days, one of the central tenets of stringent data privacy law is the right to be forgotten.

“Privacy” is about the data about us that’s harvested, bought, sold, and traded by an obscure army of data brokers without our knowledge, feeding marketers, landlords, employers, immigration officials, insurance companies, debt collectors, as well as stalkers and who knows who else. It’s the body camera or the sports arena or the social network capturing your face for who knows what kind of analysis. Don’t think of personal data as just “data.” As it gets more detailed and more correlated, increasingly, our data is us.

[How some New Yorkers think about their digital privacy]

But privacy isn’t just about controlling your own data. Privacy harms don’t just hurt us individually. They impact societies and economies and policies, in subtle and not-so-subtle ways. The harms come from everyday software that aims to nab you and nudge you by design, or from million-dollar spyware that governments use to nab terrorists but also political dissidents, and send them to prison or worse. Privacy is damaged by supposedly fair and transparent algorithms that aren’t, turning our personal data into risk scores and advertising that can help perpetuate race, class, and gender divides, often without our knowing it.

Privacy harms are dark ads bought with dark money and the micro-targeting of voters by overseas propagandists or domestic political campaigns. That kind of influence isn’t just the promise of shadowy political consultants or state-run misinformation campaigns, but the premise of modern-day digital advertising. (Facebook’s research division hired one of the psychology researchers behind Cambridge Analytica’s data-scraping Facebook quiz.) “Privacy” isn’t just privacy, but fairness, security, freedom, justice, free speech, and free thought.

Whatever we paid for these increasingly indispensable, sometimes wonderful “free” services, whatever our data was worth, it wasn’t nothing. In the last few years, it’s become clear just how unfair this arrangement was, and the surge in public frustration over a litany of data abuses has reinvigorated the idea of privacy, forcing us to reexamine what we really value and what we expect from the tech giants and other parts of the digital economy.

Even the tech giants themselves have gone into the privacy business, some with a little more credibility than others. At Apple, which guards the mountains of data it collects on customers and isn’t a major player in the ads business, privacy itself has become, essentially, one of its central products. At Facebook, privacy has apparently come back from the dead to become the company’s core philosophy.

Enhancing digital privacy laws is also now a rare point of bipartisan agreement among lawmakers, and even the big tech companies say regulation is important. But what those new privacy rules look like—what privacy will look like in the next few years—will depend largely on why we think privacy matters, and how we define it.


The Privacy Divide

This week we’re breaking the concept out, examining some of the fault lines and disparities–social, cultural, economic, philosophical–that underlie modern-day privacy, as part of a series we’re calling The Privacy Divide.


Ciara Byrne looks at the split along class lines: Increasingly the wealthy can pay for more privacy, but the poor have always surrendered their privacy for survival.

Chris Gilliard writes about an experiment in modern-day privacy, and its unexpected lesson in power, and who gets to see whom.

An eighth grader explains how her family’s use of Facebook completely changed her mind about social media.

Glenn Fleishman examines how Do Not Track got caught between the hopes of privacy advocates and the fears of the ad industry.

Also:

The digital bargain: Increasingly, people say they want to own their data; some want to get paid for it. Still others are more than happy with the bargain they’ve struck with the tech companies—some want to hand over even more of their personal data.

Data encryption: The digital security community and the law enforcement community remain divided over end-to-end encryption. Companies are also caught in a central tension in the era of big data and encryption: the more secure the data, the harder it tends to be for those companies and others to actually make use of it. While the law enforcement debate rages on, a long-awaited encryption technology could offer industry and academia a new way to resolve the personal data dilemma.

The workplace: Workers have typically had few privacy protections in their work-related communications, but a new array of technologies may erode what protections remain. Still, American workers appear divided over how worrisome this surveillance is.

Awareness: One central irony about modern-day privacy is that we consumers know very little about the companies and data brokers that know so much about us. After all, that’s their business. And it’s a very big business, with what most people would agree is very little transparency and few choices for consumers to opt out. Increasingly, this “privacy asymmetry” is playing out not just online but in public, where a growing number of random things are surreptitiously tracking you.

Follow the series here and read more digital privacy coverage here.


ABOUT THE AUTHOR

Alex Pasternack is a contributing editor at Fast Company who covers technology and science, and the founding editor of Vice's Motherboard. Reach him at apasternack@fastcompany.com and on Twitter at @pasternack

