Recently, Mark Zuckerberg released a 3,200-word Facebook privacy manifesto, highlighting six privacy principles around which the platform will be rebuilt over the next several years. Critics were quick to cast doubt on Zuckerberg’s sincerity. Others questioned how the new manifesto would square with Facebook’s business model, which hinges on advertisers’ liberal access to user data.
But I see another fundamental problem: What does it even mean to focus on privacy? Privacy is amorphous. There are many definitions of privacy, and they depend on countless factors. For one person, the value of privacy may be about freedom of choice–the ability to manage her own well-being without her data getting scooped up and monetized. For another person, privacy might be about anonymity–the expectation of being left alone to mind one's own business.
Intriguingly, there is no single word or phrase for “privacy” in Vietnamese. Privacy might refer to physical seclusion or isolation, or to secret information hidden from the public–it depends on the context and the people involved. Even then, some words are still interchangeable. Riêng tư means personal or private. Bí mật means secret. Chuyện mật means a personal or confidential issue. The word “privacy” has similar complexity in Russian and French.
Zuckerberg’s plan doesn’t account for such nuance. Instead, it spells out key tenets such as interoperability–the ability to communicate easily and securely across networks such as WhatsApp, Facebook, and Instagram–and reducing the “permanence” of information so it doesn’t exist (and remain vulnerable) longer than it has to.
These are admirable goals in the abstract. And to Zuckerberg’s credit, he acknowledges that the principles are a “first step” in a long journey. But they also reflect a classic engineer’s approach: Identify a problem, then offer a one-size-fits-all solution for a product with over 1.7 billion users. What’s missing is a reckoning with the complexity of the problem itself–the myriad ways people interpret privacy. And to do that, you need human-centered design.
In my work, I have conducted dozens of interviews with individuals who have donated or transferred their personal data to tech companies in the context of healthcare, medical research, online social networks, immigration, and financial services. Their stories offer insight into how human-centered design can be crucial to uncovering different user needs. Here are some of my findings:
Not all privacy issues are equal
For many users, the worst-case scenario is not that advertisers might sell them more things. It’s that their private information may fall into the hands of someone with malicious intentions. One person I interviewed who used mental health apps shared the fear that this information might leak to their boss and impact potential earnings or upward mobility at work. Another person, who had participated in an online research study–in which they contributed multiple sources of health and personal information–was primarily concerned with, “Will anyone know my immigration status?” The stakes around privacy are different–and higher–when you face the risk of being separated from your family and home.
By contrast, take the example of an anonymous Instagram account run by a banh mi sandwich enthusiast. There is no personally identifiable information in the account, but its owner sees a wave of ads for local sandwich shops and restaurants. Even if someone traced the account back to the individual, the risk of negative repercussions for their personal, financial, or professional life would be low.
Design practitioners should understand these different privacy needs. Conducting observational research as people use a product or service is an important first step. It allows teams to determine what information is most critical to users.
Users want transparency, not “interoperability”
Not a single person I interviewed wanted “interoperability” or a single app to “get replies from [their] friends in one place,” as Zuckerberg’s manifesto promises. Both the technobabble language and the vision of an ultra-social platform highlight the divide between what users want and what the company wants.
Many people I interviewed wanted transparency into how the company uses their data. They wanted to know whether there would be a fallback plan in case the technology that is supposed to protect their privacy fails: Will the company support them? Will the law?
Facebook’s missed opportunity
Human-centered design is all about prioritizing people. It means understanding the “why” behind people’s wants and needs, taking social and cultural contexts into consideration, and making products and services that work well for users rather than just a company’s bottom line. It’s the opposite of solving a problem as fraught and complex as privacy with a list of principles.
Facebook missed the mark. This was an opportunity to shift the discussion from feature-building visions to an acknowledgment of how people use technology in real life. Some Facebook leaders say design is “incorporated in everything [Facebook] does.” But that commitment clearly doesn’t go deep enough. With more human-centered language and insights, Facebook could have included populations and communities that are often left out of the privacy discussion.
With state and federal privacy legislation looming in 2019, Facebook’s manifesto should be a stark reminder for researchers, designers, and engineers who create products and services with user data: Be proactive about understanding what privacy means to your users. Legislation might force companies to consider privacy in a way they haven’t before, and human-centered design is a crucial step toward getting it right.
Stephanie Nguyen is a user experience designer and policy researcher working on user data ethics and privacy in precision medicine at NIH and at Harvard. She was recently a digital services expert at the U.S. Digital Service in the Obama White House, where she served as lead designer on projects improving access to services at the Centers for Medicare & Medicaid Services, the State Department, and the Department of Education. She is a master’s in public policy candidate at Harvard Kennedy School and a Gleitsman Fellow at the Center for Public Leadership.