Last week I said: We live in a world of unrelenting transparency. What can we do about it?
There are three possible strategies—none of them perfect.
Fight to the Last
We could fight every attack upon our privacy, large or small.
This has been the default response by privacy advocates and civil libertarians for some time. Groups like the Electronic Privacy Information Center (EPIC) and the Electronic Frontier Foundation (EFF) pay close attention to the various ways in which privacy rights can be eroded or ignored through the proliferation of digital tools, and do their best to focus attention on emerging problems. While there have been occasional legal victories, these groups have arguably been more effective at educating the public than changing policy.
The problem is that a growing number of transparency dilemmas come not from straightforward government or corporate policy decisions, but from myriad shifts in personal behavior that are not readily amenable to legal controls. I doubt that many of us would welcome the kind of intrusive state control necessary to fully rein in surreptitious cameraphone use, for example. Moreover, many of the potentially problematic uses are the very uses that many of us find useful, even necessary. We want to be able to capture the surprising and ephemeral. We want to be able to document our lives.
In the end, this is a story of a desperate defense against privacy intrusions, but one where the "attackers" have an increasing technological advantage.
Embrace Transparency
Conversely, a vocal minority has come to advocate not just the acceptance of greater social and personal transparency, but the furtherance of it. This argument holds that privacy is, if not dead, increasingly moribund, and that efforts to shore it up are doomed. Instead, we should strive for greater transparency, in a way that makes us all equally visible. This is sometimes called "symmetric transparency," and ideally includes not just individual citizens, but powerful institutions as well.
Unfortunately, a symmetric-transparency society may be even harder to create than a high-privacy society. We may be able to create a world of ubiquitous documentation of each other (we could probably do so easily, as we're heading that way), but extending that to the ready documentation of powerful institutions and people would require that those in power agree to give up a significant chunk of that power. After all, if I can know everything about you, but you can only know a bit about me, I have a clear advantage. Why should I give that up?
There have been cases of those in power giving up or sharing that power without being forced to, but they're few and far between. It's possible that the wave of demands for more corporate financial transparency—and the cases of abuses of power by law enforcement authorities I mentioned last week—could push the notion of symmetric transparency forward. I wouldn't hold my breath, however.
More likely is a world of mutual assured transparency between citizens, but an even greater capacity for institutions of authority to know whatever they want without giving up much in exchange.
Deception
The last strategy, deception, boils down to this: we may be able to watch each other, but that doesn't mean what we show is real.
Call it "polluting the datastream": introducing false and misleading bits of personal information (about location, about one's history, about interests and work) into the body of public data about you. It could be as targeted as adding lies to your Wikipedia entry (should you have one) or other public bios; it could be as random as scattering junk info about yourself across Google-indexed websites and message boards. Many of us do this already, at least to a minor degree: at a recent conference, I asked the audience how many give false date-of-birth info on website sign-ups; over half raised their hands.
The goal here isn't to construct a consistent alternate history for yourself, but to make the public information sufficiently inconsistent that none of it can be considered entirely reliable. Granted, this won't do much about ubiquitous documentation (although there are techniques that can help), but it could be very effective in the Justice Scalia example I mentioned last week. If, in digging up info about him, they found a dozen sites saying that he had four kids, another half-dozen saying that he was childless, a Wikipedia page saying that his middle name was Mario, and an "official" bio saying that it was Luigi (all of the preceding being entirely imaginary, of course), how would they know what to trust?
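As a purely illustrative sketch of the idea (every field name and value below is hypothetical, not anything the column concretely proposes), "sufficiently inconsistent" can be expressed in a few lines of Python: generate a batch of decoy profiles, then check that they contradict one another.

```python
import random

# Toy rendering of the "polluting the datastream" idea: decoy profiles
# whose details disagree, so no single record looks authoritative.
# All fields and values here are made-up examples.
FIELDS = {
    "children": ["none", "two", "four", "seven"],
    "middle_name": ["Mario", "Luigi", "Alex", "Jordan"],
    "hometown": ["Trenton", "Chicago", "Sacramento", "Atlanta"],
}

def make_decoys(count, rng=None):
    """Build `count` decoy profiles, each picking a random value per field."""
    rng = rng or random.Random()
    return [
        {field: rng.choice(values) for field, values in FIELDS.items()}
        for _ in range(count)
    ]

def is_inconsistent(profiles):
    """True if the profiles contradict each other on at least one field,
    meaning the set as a whole cannot all be accurate."""
    return any(len({p[field] for p in profiles}) > 1 for field in FIELDS)

decoys = make_decoys(12, random.Random(1))
print(is_inconsistent(decoys))
```

Note that the check deliberately measures disagreement rather than falsehood: the strategy's protection comes from doubt about every record, not from any one particular lie.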
There's the implicit potential for a world where such obfuscation of facts in the name of privacy becomes not just commonplace, but commercialized. Depending upon the legality, one might start to see companies that provide "data-pollution" services, commodifying opacity. Sifting services, conversely, would offer to identify misinformation and provide "trustable" data about a target. All for a price.
This strategy, too, has its own considerable drawbacks—not the least of which would be the evisceration of the Internet as a semi-reliable source of information. Are we ready to poison the well in the name of protecting our privacy?
An Imperfect World
None of these strategies gives us a clean, workable response to unrelenting transparency. All have serious downsides, and each offers the potential for a worse situation than before if implemented incompletely or only temporarily.
In the end, it's likely that the best we can do, for now, is to do what we have been doing. We can keep a close watch on our own visibility, for example, whether through regular checks of credit scores or through a basic "Google alert" on our names (those of you with names less... unique... than mine might find that one its own kind of challenge). This is hardly a satisfying response, but it's very human. We rarely if ever reach final conclusions about how we integrate technologies into our societies. More often, it's a constant give-and-take, a coevolution that demands compromise and adaptation—and offers the potential to fix mistakes and pull back from extremes.
It comes down to this: fight what you can; accept what you need to; never give up more than you must.
"Privacy Icon," Electronic Frontier Foundation http://www.eff.org/press/logos
"Digital Decoy," Jamais Cascio http://www.flickr.com/photos/jamais_cascio/3571416788