Privacy: What We Can Do to Protect Ourselves Online

Issues of online privacy and rights are complicated. What can we do as individuals and as a society to protect ourselves? Here are some good ideas about what works…and what doesn’t.

Issues of online privacy and rights are complicated (see my previous two posts). What can we do as individuals and as a society to protect ourselves? James Grimmelmann ("Saving Facebook," Iowa Law Review, 1139-1205, 2009) provides some good ideas about what works … and what doesn't. Here is my take.


What Doesn’t Work

Allow market forces to work it out: commercial interests are clearly at odds with people's desire for privacy, so depending on the Facebooks of the world to police themselves is clearly not a good idea. The more information these companies can share, the more valuable they become to advertisers. And most of us are simply too unaware of how much information is 'out there' for a consumer backlash to materialize.

Offer Privacy Policies: people don't read or understand the small print. A 2007 poll found that only 31% of Facebook users carefully read the privacy policies 'most of the time.' While that number may be a bit higher today, many of us still skip the fine print and simply click 'I agree.'


Add Technical Controls: installing technology to help users classify information as 'private' doesn't work because people don't think of social interactions in terms of access lists or permissions, so they ignore or misuse these controls. Case in point: an official UK report found that almost half of UK social network site users left their privacy settings on the default, even when presented with the option of classifying information.
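To see why these controls feel alien to users, here is a minimal sketch (all names hypothetical, not any real site's API) of the per-item classification the paragraph describes: each post carries an audience setting, and the default mirrors the permissive settings the report found most users never change.

```python
# Hypothetical sketch of per-item privacy classification.
# The permissive default stands in for the real-world settings
# that nearly half of users reportedly never changed.
DEFAULT_AUDIENCE = "everyone"

class Post:
    def __init__(self, author, text, audience=DEFAULT_AUDIENCE):
        self.author = author
        self.text = text
        self.audience = audience  # "everyone", "friends", or "only_me"

    def visible_to(self, viewer, friends_of_author):
        """Decide visibility from the post's access classification."""
        if self.audience == "everyone":
            return True
        if self.audience == "friends":
            return viewer == self.author or viewer in friends_of_author
        return viewer == self.author  # "only_me"

# A user who never reclassifies the post exposes it to anyone:
post = Post("alice", "holiday photos")  # default audience applies
print(post.visible_to("stranger", friends_of_author={"bob"}))  # True
```

The point of the sketch is the mismatch: a social decision ("who should see my holiday photos?") has to be expressed as an access-list value, and whoever skips that step inherits the site's default.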

Institute Commercial Data Collection Rules: People will continue to post
phony, denigrating, or undesired information on the Internet. Data collection
rules don’t address these issues.

Impose Use Restrictions: trying to limit access to social
networks to people over 18 years of age does not work. Teens lie about their
age and join anyway.


Assign Data Ownership: trying to assign rights to online digital data is extremely complicated. While we may own our profile information, do we own data relating to our relationships? If I wanted to port my social graph to a new social networking site, would my friends from the old network have a say? Locking down data ownership would likely squelch the online social experience, and it is therefore not going to be acceptable to most of the online population.

What Might Work

Institute Public Disclosure Torts: clear guidelines about what is considered legitimate access and exposure of information should be established. For example, information accessed through surreptitious methods online, such as pretending to be someone else, should be treated differently than information posted openly on social network sites.

Define Rights of Publicity: online sites need to make it clear how they plan to use information and get permission for each type of use, such as advertising. While this might be onerous for users, I believe it is the only fair use of personal information. Some sites, including Facebook, have improved their statements over time regarding how information will be used.


Provide Reliable Opt-out: sites should allow people to opt out of publicizing information, and this operation should be reliable. While advertisers would disagree, it is better to require users to 'opt in' rather than 'opt out' of posting information. With opt-out, the onus of protecting information falls on the unsuspecting user rather than the knowledgeable site. Opt-in places the onus of publishing information on the online site, which seems fairer, even if it impacts the user experience.
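The opt-in/opt-out distinction can be reduced to a single default value; this toy illustration (hypothetical function, not any real site's API) shows how the default alone decides who carries the burden of acting.

```python
# Toy model of opt-in vs. opt-out publication defaults.
def is_published(user_choice=None, *, opt_in=True):
    """Return whether a piece of information gets publicized.

    opt_in=True  -> default is private; the site must obtain consent.
    opt_in=False -> default is public; the user must act to withdraw.
    """
    if user_choice is not None:  # an explicit decision always wins
        return user_choice
    return not opt_in            # otherwise, the default decides

# A user who never touches the setting:
print(is_published(opt_in=True))   # False: nothing published without consent
print(is_published(opt_in=False))  # True: silence counts as consent
```

Either regime respects an explicit choice; they differ only in what silence means, which is exactly why the opt-out default favors the site and the opt-in default favors the user.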

Provide a Predictable Experience: changes to the way a site uses personal information can lead to problems. For example, when Facebook changed the way its news feed works, information that was previously obscure suddenly became prominently displayed. To maintain a stable social environment, a service provider must roll out changes slowly, publicize the implications of the change in use, and get users to opt in.

Eliminate 'Chain Letters': many online sites employ incentives to get users to recruit others into promotions; for example, offering a discount if they get other users to register for a service. This practice should be discontinued because it encourages people to encroach on others' rights for personal reward.


Facilitate User-driven Education: telling people, particularly teens, about the dangers of posting information online largely fails. Getting teens themselves to speak about privacy issues, through examples to which their peers can relate, is a much more effective method.


The issues of online privacy are complex. Traditional privacy protections largely work because they assume that most people are obscure and that venues for publishing information are few. Online, both of these assumptions are invalid: 'privacy through obscurity' doesn't work. Legal, technical, and educational methods must be used in concert to bring order. Furthermore, the state of privacy will necessarily change as the online forum stabilizes and people become increasingly aware of the implications of their online actions.


About the author

A technology strategist for an enterprise software company in the collaboration and social business space. I am particularly interested in studying how people, organizations, and technology interact, with a focus on why particular technologies are successfully adopted while others fail in their mission. In my 'spare' time, I am pursuing an advanced degree in STS (Science, Technology, and Society), focusing on how social collaboration tools impact our perceptions of being overloaded by information. I am an international scholar for the Society for the History of Technology.