What the first porta-potty can teach designers about digital privacy

Our insecurities about public bathrooms have evolved over time—and so, too, should the way we design for digital privacy.

In 16th-century Paris, the starkly primitive clashed with the beautifully refined. The dainty bourgeoisie would gather in their finest frocks, wearing tall, veiled, conical headpieces atop golden brocades and long, pointed shoes.

As the wealthy wined and dined and their bladders filled, distinguished citizens “did not go to the toilet—the toilet came to them,” wrote Witold Rybczynski, the University of Pennsylvania professor and architect. It was an early version of the porta-potty: a movable chamber pot in a box. Luckily, individual self-awareness had yet to emerge in Renaissance life; the words “embarrassment” and “self-consciousness” only appeared toward the end of the 17th century.

Let’s flush forward some 500 years. A customer uses the restroom in a coffee shop, one with a flimsy click-lock handle. Someone yanks on the door. Panic strikes; their stomach clenches into a knot. Had the lock failed, the door would have swung open to reveal them on the toilet, frozen and wide-eyed with fear and embarrassment. Clearly, norms about physical privacy and social shame have evolved across contexts and time.

Today, we are grappling with how to control and design for privacy in the digital realm—and how to better communicate the urgency of these problems to users, many of whom may not be concerned.

“Physical and digital privacy are more closely aligned than different, but data is abstracted from a person or community,” explains University of Florida associate professor Jasmine McNealy. “Data privacy doesn’t have a body or tangible object to look at, which means we can’t necessarily imagine the harms.”

Without sight, sound, or touch, a data-privacy violation is practically invisible. People treat walls of legalese as a mental safety net for digital privacy, but that’s far from foolproof. Often it is only when someone sees a sudden, unbidden change in their bank account or health insurance status that they realize their privacy has been breached. By that point, it’s likely too late.

What do we call this era of digital privacy? How might we better understand it as we try to shape norms that empower citizens to both protect themselves and be free to be themselves?

To create stronger privacy norms and policies in the digital world, we must stop thinking about privacy one-dimensionally. Privacy, like architecture, is a concept that embodies human values. It draws on psychology, sociology, economics, and politics, and on the complex systems that surround them. User preferences are not static, either.

We must recognize the nuances of privacy in different contexts and cultures and design experiences that reflect the values of privacy as empowerment, freedom, and anonymity. Privacy is more than a tech problem with a tech solution.

When I interviewed dozens of participants donating genetic data for precision-medicine research, privacy concerns surfaced as fear of social stigmatization. “If I have an STD, abuse drugs, or have a terminal disease, I don’t want everyone to know that,” one participant explained. “I don’t want people to find out [my information] and jack up insurance policies,” said another. That same person had originally donated genetic data to “improve science so future generations might not have the same issue.” The tension lies in wanting both precision and anonymity.

In another context, data-privacy harms can appear in the form of racial bias, foreclosing future opportunities. Najarian Peters, assistant professor at Seton Hall’s Institute for Privacy Protection, is studying Black parents who seek alternative education for their children, such as homeschooling or unschooling. “[Parents expressed they] do not want a permanent document to be created about their child because of possible misinterpretation and bias,” she explained. “Children should get their fair shake and not be categorized as a problem.”

Because there is often a gap between how users expect their data to be handled and how designers actually build these features, we must translate privacy norms into product design that end users can understand. Norms like anonymity and solitude, achievable in the physical realm, are often difficult to achieve online, and the trade-offs never end: user empowerment versus convenience, individual versus population benefits.

No single product can provide a privacy experience that will work for everyone. We need to shift how we embed privacy in products and services by measuring user expectations against actual outcomes and by challenging cultural privacy norms.

For instance, it has become the norm for video-conferencing tools to offer a visible option to turn off video before joining a call. That is good progress: it gives users choice and agency over how much to reveal. But we should keep questioning such norms and measuring expectations against outcomes. Could “unsubscribe me” be replaced by “delete me” in marketing emails? Unsubscribing merely stops the messages; deletion would remove the data behind them.

Is surveillance less invasive if it is two-sided? Amazon Key is a service that lets delivery drivers enter your home via a smart lock on your door and leave your packages inside. The service is all about building a system of trust, but it’s a privacy conundrum: Amazon surveils you, the delivery person surveils your home and has physical access to everything you own, and you surveil the delivery from your phone. Who’s controlling, or consuming, whom?
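To make that “unsubscribe” versus “delete” distinction concrete, here is a minimal TypeScript sketch. The record shape, store, and function names (UserRecord, unsubscribe, deleteMe) are hypothetical, invented for illustration rather than taken from any real product.

```typescript
// A minimal sketch contrasting "unsubscribe me" with "delete me"
// as product-design outcomes. All names here are hypothetical.

interface UserRecord {
  id: string;
  email: string;
  marketingOptIn: boolean;
  behavioralData: string[]; // e.g. click history, assumed for illustration
}

const users = new Map<string, UserRecord>();

users.set("u1", {
  id: "u1",
  email: "reader@example.com",
  marketingOptIn: true,
  behavioralData: ["opened_newsletter", "clicked_link"],
});

// "Unsubscribe me": flips a flag but keeps everything the company holds.
function unsubscribe(userId: string): void {
  const user = users.get(userId);
  if (user) user.marketingOptIn = false;
}

// "Delete me": erases the record itself, aligning the outcome with
// what many users assume the button does.
function deleteMe(userId: string): void {
  users.delete(userId);
}

unsubscribe("u1");
console.log(users.get("u1")); // record persists; only the flag changed

deleteMe("u1");
console.log(users.get("u1")); // undefined: the data is actually gone
```

The design point: the unsubscribe path leaves everything the company holds intact, while only a deletion path changes what actually happens to the user’s data.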

We are in the age of ubiquitous human data collection. Like an endless game of tag, society and technology keep racing to catch up with each other.

We may need more than the occasional terrifying rattle at the bathroom door from data-collecting companies; we need something that translates privacy violations into tangible harms worthy of change. Even after the Equifax breach and YouTube’s violations of children’s privacy law, some argue that current enforcement mechanisms are not enough.

“People don’t understand why [data privacy] is important—it needs to be clearer,” Emily Peterson-Cassin, digital rights advocate at Public Citizen, tells me. “What we need is a recapturing of the idea that our digital lives are our lives now.”

Until the privacy norms of our time begin to calcify, we’ll need to double-check the lock on that shoddy bathroom door . . . and make sure nobody is recording what happens behind it.


Stephanie Thien Hang Nguyen is a research scientist at MIT Media Lab focusing on data privacy, design, and tech policies that impact marginalized populations. She previously led privacy and user experience design projects with the National Institutes of Health and Johns Hopkins’ Precision Medicine team. Follow her on Twitter.
