As the host and managing editor of WNYC's Note To Self podcast, which calls itself "the tech show about being human," Manoush Zomorodi spends a lot of time thinking about that fleeting, uneasy feeling you likely get every time you download a new app and grant it access to your data. You might have had that feeling two months ago, when Uber started asking for permission to track your location even after you exit the app—or perhaps the feeling surfaces each time you see an ad on Facebook that mirrors what you were searching for minutes earlier on Google.
This is the inspiration behind a five-day interactive project Zomorodi has dubbed the Privacy Paradox, which starts today. Each day this week, Note To Self will present listeners with a challenge, podcast, and newsletter to nudge them toward a better understanding of what being connected means for their digital privacy. I spoke to Zomorodi about what listeners can expect from the Privacy Paradox, and why it all matters.
Fast Company: Tell us a bit about the idea behind your Privacy Paradox series.
Manoush Zomorodi: For me, it was really that split-second moment that I think we all experience when we go to download a new app or sign up for a new platform, and the Terms of Service pops up. And we've all been through it, right? We don't read it. We know that we're not supposed to. I think there's some study that says it would take us 22 days out of every year to read all the Terms of Service. But there's that split-second of ickiness, where you're like, "Am I a bad person? Am I giving away access to my entire identity for the rest of my life by clicking 'agree'?" And then we do it anyway. I was like, "Wow, this is happening to me all the time."
I was curious to know more about it, so I dug into what research there was. I learned that behavioral economists actually call this the Privacy Paradox. It turns out Americans do care very deeply about their privacy: 74% of Americans, according to Pew, say it's extremely important to them that they be in control of their personal information. And yet, to be a person in the modern world, we have no choice. We need to be on all these platforms to be searchable, to be relevant, to pay our colleagues back for lunch, to sign our kids up for camp.
So is it just too difficult for us to disentangle ourselves from all these platforms and apps?
I don't think it has to be on or off. Really, that's what the project is: the idea being that each person has to figure out where they set their own personal privacy boundaries. But also, all of us need to think more broadly about what our digital civil rights are. I think it's totally fine to be on Facebook. What bothers me is that we don't have control over our data once we put it on there. We know that if we post something, Facebook is going to look at it, and they're going to target ads at us. But there's all sorts of new technology that a lot of people don't know about.
For example, people know that cookies are a thing, but they don't know about digital fingerprinting. Even if you opt out of cookies, you can be followed around the web because of all these data points that are put together to figure out who you are. They don't realize that there is now technology that can parse not just what you post online, but what punctuation you use—what sort of word choices you make. It can literally read between the lines of what you post. So I think we would feel better, and it would also be more American, if we knew more about what was being done with our data and were given a choice to opt out or even to take it back.
On the very smallest scale, I think that might be deciding whether or not to use a particular app. But what I'm really interested in is exploring what we can do more broadly and systemically. I went to see Sir Tim Berners-Lee at his lab at MIT—the guy who invented the web—and he was like, "This is not what I had in mind." So one of the things he's looking at is personal data stores—this idea that instead of you logging into Facebook, Facebook would log into you, and you would decide what you gave Facebook, and you could also take it back. It sounds sort of fantastical, but this guy invented the web, so I feel like if anyone could do it, it's him.
Another thing that I think we need to talk about is: Maybe there should be a Hippocratic oath for technologists. We ask that of physicians, lawyers, even us journalists—truth to power. People are like, "Oh, no one will ever stick to that." Well, you have to start somewhere. The other paradox about privacy is that we need to start talking more publicly about it—about what sort of role it plays. Our digital rights have not been defined by the courts yet. We're at this crucial moment where we need to have a public conversation about what we want going forward.
During each day of the Privacy Paradox project, you're sending out a newsletter, posting a short podcast, and assigning listeners a challenge. Can you talk about what you'll be covering each day, and what you're hoping listeners will get out of this series?
We want to start on a very straightforward, technical basis. It's time we start talking on a micro scale, with each person thinking a little more purposefully about how they use their digital tools and what they are or aren't okay with. So for example, on day one we're going to ask people to check the privacy settings on their phone. That's pretty basic, probably, for your readers. We're also going to ask them to get on Signal, the encrypted messaging app—not because they necessarily have to have a private conversation, but for people who maybe these days are worried that they could be targeted for their beliefs or affiliations or origins going forward. We're also talking about metadata. I think people hear the word and they start to space out. So we want to explain: What is metadata?
I'm learning more and more that we really have to start from the basics. Open your phone, go to the settings app, scroll down to privacy, and see what each of your apps has asked permission for. Why does this app that's for your to-do list want access to your microphone? There's no reason it needs that. We know that the tech companies obviously want as much information as possible. But make a choice—take back just a little bit of your metadata. What I've learned with these projects is that just starting to feel slightly empowered is a big thing. And it keeps you from feeling like, "Oh, this is just the way it is. There's nothing I can do, so I might as well give up." No! We have to keep the fire alive.
Our goal, then, is to look at the various aspects of privacy through the week. On day two, we're talking specifically about marketing and advertising, and how it works. And it's not just advertising. We're partnering with ProPublica on day two to talk about their work on how Facebook categorizes people. And then we get a little bit more psychological—which I think is different—and talk about the importance of privacy to how we become fully formed human beings. We talked to Dr. Elias Aboujaoude, who runs the psychiatric clinic at Stanford University, and he talks about the idea of individuation—that the mind needs a private space in order to become a fully formed adult. He's dealing with a lot of Stanford undergrads. (One could say that our current president also shows some regressive behavior online, as well.)
We're also talking to Luciano Floridi, a professor of philosophy at the Oxford Internet Institute, who was Google's in-house philosopher and advised the company on decisions that algorithms can't make, like when the right to be forgotten was passed in Europe. How does a tech company start to make moral and ethical decisions when it's dealing with things that can't be automated? I thought that was really cool—that seems like a sign of progress to me. We have a totally weird guest on day four, which is [Elan Gale], the executive producer of The Bachelor. He has some really insightful things to say about how people's behavior changes when they know they're being observed.
And then day five is with Sir Tim Berners-Lee. We've made a Mad Lib for your personal Terms of Service. His, obviously, are far more stringent than, perhaps, yours or mine might be. But if you just lay out some very basic principles to download by, maybe you can feel a little better and feel like your technology, and the way you use it, aligns with your values. It sounds kind of lofty, but I don't think it's too big an ask. We tech reporters get this. But I'm hearing from my listeners that they're like, "I feel like it matters, I can't quite put my finger on why, and I certainly don't know what to do about it. But it's bugging me."
That brings me to: How do you try to convince people who don't think much about privacy that it is important?
I think it's particularly related to what our country is going through right now, which is defining what it means to live by American principles. And that has really been what has connected it for me. This sounds so cheesy, but I went to Washington, D.C. with my 9-year-old over the winter break, and we went to the National Archives, and we looked at the Bill of Rights to see what the Fourth Amendment said. It's weird: The word privacy is not even mentioned in it once. But this idea that is enshrined—it's not about having something to hide. It's just self-determination; it's autonomy. It's free will.
That, to me, matters far more. I find that far more motivating than "I should have a strong password." That really is what gets me excited—the deeper idea—and instilling this in my kid, that privacy is important. In your real life, it means being able to go to your room and close your door and read and think. You should also be able to do that in the other places we live a lot of the time, which is online. We're treating our online lives differently than we're treating our real life. The Supreme Court has barely ruled on any of this, and yet this other virtual world is where we spend so much of our day.
In this moment of media turmoil, what have you been thinking about and hearing from listeners?
For me, what was very powerful was after the election, I sort of figured—just by what we know about the typical public radio listener—that maybe [my audience] leaned toward voting for Hillary. But we had a lot of Trump people get in touch and say, "I'm here. I'm listening to you, and these are issues that matter to me too." It was really gratifying.
We've been trying to do matchmaking with listeners, because you figure, well, you both listen to this show, so at least you're starting from one spot of common ground. We're going to do this, I think, in a couple of months. So I don't feel like I quite know where that's going to go, but I'm intrigued by it. And I felt great that this was a bipartisan thing—I found a bipartisan thing, everybody! We can all care about privacy and living life a little smarter and better online. And I feel energized as a journalist, so that's a good thing, too.
This interview has been edited and condensed.