Sherry Turkle would prefer not to tweet.
“My publisher said, ‘Look, you have to tweet, you have to force yourself, you have to learn how to do that!’ ” Her publisher being the one that just released The Empathy Diaries, a gripping, elegant memoir in which the psychologist and scholar and critic of technology finally puts herself under the microscope. The megaphone of social media is more complicated. “I’m really not very good at it, so I just keep saying things like ‘surreal that . . .!’ ‘Thrilled to see my exciting . . .’ I never say that. I feel like such a jerk. And then I started an Instagram account. And I said, ‘I can’t do this. . . . I mean, I barely can keep up with my email.’ And considering all the people we have to be, it was just one extra person that I couldn’t attend to right now.” Though, she admits, “Once the pandemic is over, I may change my mind.”
Minds and selves and how they change have long been fascinations for Turkle, the Abby Rockefeller Mauzé professor of the social studies of science and technology at the Massachusetts Institute of Technology. So this pandemic time, as awful and deadly and isolating as it is, is also interesting. It’s also a time when the digital technology that she’s studied for so long has become increasingly entwined with our minds and bodies—and just at the moment when we were asking some of the urgent questions that Turkle’s been asking for years now. Like, How do our objects objectify us?
In The Empathy Diaries, Turkle describes a long personal relationship with what anthropologist Victor Turner calls “liminal moments,” transitions and inflection points that can also be thresholds to new ways of thinking. Born in postwar Brooklyn to a working-class Jewish family, she read and wrote her way to Radcliffe—which she attended when it was folded into all-male Harvard—and then studied in Paris in 1968, when student protests were giving way to new ideas about the mind. As Turkle tiptoed deeper into academia in her twenties, she got a therapist, but found that writing and talking about these changes—and specifically what they meant in the world of psychoanalysis—was its own kind of therapy.
She is interested in how we communicate, and in how that process fails in interesting ways. New to MIT, she hosted French psychoanalyst Jacques Lacan for a whirlwind campus visit; between a necktie snafu and a bizarre scatological lecture to a topology class, the whole episode was a bit of an embarrassment, maybe career-ending. But it wasn’t, and there she was, soon at another juncture: the dawn of personal computing, when few were thinking about the human impacts.
That’s when Turkle recounts playing host to another giant: Steve Jobs, who was coming over to a party at her place after a day of campus sit-downs, none involving her. She assembled the hors d’oeuvres and Jobs ended up stopping by only briefly. What she remembers him saying, just before leaving, is: “This is the wrong kind of vegetarian food.”
Which is to say, Turkle has been thinking about a world of uncanny machines—from the first chatbots to virtual reality and intelligent assistants—within a world of sometimes uncanny humans. It is a world of legendary, far-seeing men like Marvin Minsky, the “father of AI,” and Seymour Papert, Turkle’s first husband, who helped found the MIT Media Lab. It’s also a world, Turkle writes, that “laundered bad behavior in exchange for brilliance.”
The revelations in 2019 of the Media Lab’s ties to sex offender Jeffrey Epstein underscore the point: In a world where brilliance beats values, we might just start treating other humans like objects or robots—avatars of utility or efficiency or whatever. The question comes up again and again as you read Turkle: In a world of ever-increasing optimization, what exactly are we as humans optimizing for?
And yet, these are the kinds of questions she had never asked of herself. In The Empathy Diaries, she excavates the liminal moments in her own life, in those sometimes painful schisms between public and private. Even her own name contains a secret.
Turkle was born in 1948, but a few years later discovered her original name: Zimmerman. That was covered over by the name of her mother’s second husband, Milton Turkle, a man whom Sherry came to detest. But it turns out that Charles Zimmerman, her father, was even more complicated: A “mad scientist,” he made an infant Turkle an unwitting part of various Skinnerian behavioral experiments. You don’t need to be a clinical psychologist, although Turkle is, to see how this may have subconsciously fueled her own interest in human behavior.
As a title, The Empathy Diaries sounds a bit hokey and too vague for such a deeply personal, richly told story. But it’s also accurate: Here’s how someone came to not only research empathy, but to understand what it is and cultivate it. Turkle, who has written deeply about the importance of face-to-face conversation, is now having a conversation with herself. Listening in is intriguing, thought-provoking, and heart-pulling.
“I wanted to raise an empathic child,” she writes. “And I knew that without the ability to spend quiet time alone, that would be impossible. But that was where screens began to get us into trouble. Our capacity for solitude is undermined as soon as we introduce a screen. Screens not only distract us but encourage us to look to others for our sense of self. What is lost when this new circle draws us in? Attention to others. Attention to oneself. The capacity for solitude without stimulation—which is where the capacity for empathy is born. We can’t relate to others until we are comfortable with ourselves. That’s a psychoanalytic first principle: If you don’t teach your children to be alone, they’ll only know how to be lonely.”
Turkle tells me this is “a pandemic book,” and a hopeful one, in that it’s about how transitions make us vulnerable, in ways that can be revelatory and transformative and empathetic. But her lesson also arrives at a boom time for screen time, when polarizing algorithms and virtual connections obscure the vulnerabilities that make us human. In search of eye contact, we get on video calls, but end up staring into webcams. So Turkle, at her home office in Provincetown, spoke with me about all of this by telephone.
Fast Company: Congratulations on The Empathy Diaries. It’s beautiful and moving and opened up so many interesting tabs in my mind. What led you to write it in the first place?
Sherry Turkle: I’ve worked on things that truly have meant so much to me, but this book has a very special place in my life. Because I’ve always said that thought and feeling go together, and in this book I’m able to explain why my work has been lit from within, and try to turn it on myself. Like, I write about evocative objects in all my work, and I said, “Well, why not me?” I’ve written about internet ethnography all my life, and I said, “Well, why not me?”
Actually there was a precipitating event that really made me know that I was going to tell this story. After I wrote The Second Self, which was my first book on computer culture, I was going to be in Esquire. I went to a photographer’s studio and they gave me clothes to wear. It was because they had a vision of who I was. It was a blouse that was light-coffee colored, with a paisley skirt, silk, very flouncy, so I looked like a rich Victorian. Maybe they were thinking of giving it a Freudian feeling. I don’t know what they had in mind, but I remember this shoot: I was wearing someone else’s clothes. So there was a feeling of being masked. And of course my personal story is that I’m used to being a masked figure. So probably there was something anxiety-provoking about this.
At least it wasn’t a cyberpunk outfit.
[Laughs.] They went the other direction. They put me in something like Tory garb. Later, I’m in my office—I’m wearing my own clothes now—and Esquire sent a journalist who was also a psychiatrist. And he sits down and says, “I know from your acknowledgments that your mother’s dead, you have some grandparents, your aunt. But tell me a little bit about your father. What’s he do for a living?”
Of course, in the book there’s no acknowledgment of my real father, Charles Zimmerman, because I’m not allowed to talk about him. I’m still obeying my mother’s rules of not mentioning him. And my half-sister and brother still don’t know about him. That’s my mother’s secret. I’m still obeying her secret even though she’s been dead for 10 years. And there’s no mention of [my stepfather] Milton Turkle, because I’m still so angry at him. And so when this very nice journalist says, “Tell me about your father,” I freeze. I say, “I can’t talk about this. I can’t talk about my personal life. I’m just here to talk about my work.”
And in the piece, he writes something very elegant like, her brand is the integration of thought and feeling. So why wear masks if the project is unmasking? And as I walked into my office, I said to myself, “This is not okay. This is the end of all of that.” I pivoted and I called my half-sister and my half-brother, and I told them the story. It turns out that they’d found out about it, because Milton had been so angry that he wasn’t in the acknowledgments that to get back at me he told them the truth.
And then I called Milton and said, “You’re not in the acknowledgments not because I was taking anything out on you, but because you didn’t read a copy of the manuscript. You haven’t been supportive of me, but you want my acknowledgment? I’m no longer keeping my mother’s family secret. I’m someone else’s biological child, and you adopted me. And that’s our story. And that’s my story. And there is nothing shameful about that.” And the weight lifted.
For so long, that had been you. Your mother’s shame, or fear, had turned your own name into a secret.
Like, it became clear to me: I didn’t know that my father was a mad scientist, but after I found out, it sort of put into perspective that I’d married a man old enough to be my father [Seymour Papert]. And he was an eccentric. He certainly wasn’t a mad scientist—he was a respected, brilliant contributor, there was nothing mean about him—but he was an eccentric, and a far older man. I believe that there was something in what I knew without knowing; since I had met my father, I must’ve heard fragments of conversations. It certainly was grist for my mill in my analysis. And for somebody who had never been able to speak her father’s name, the fact that my first book, on the first thing that I studied for 10 years, was about a psychoanalyst [Lacan] whose main theory was about the name of the father—
Paging Dr. Freud.
You can’t make this stuff up! And for somebody [like my father] to show no empathy, and the fact that empathy became the thing that I was most preoccupied with in my career . . . I mean, I’m not saying empathy is not an important topic for itself, but my outsider status meant that I always knew that there was another story to be told, and another way to get at the problem, and another thing I should be listening to—because I knew my story wasn’t true. So I’m always listening for the story behind the story.
Completing a trilogy
How do you see your own story complementing the other stories you’ve been telling, about our relationship with digital technology?
This book should be seen as part of a trilogy. Alone Together defines a problem: that we’re not connecting. Reclaiming Conversation is a solution: that we need to talk. And then here is kind of a practical application. I have this very intense conversation with myself, and I really take it to the limit. I take it as far as I feel I can go, I try to create an arc, but I tried to seriously have a conversation with myself.
I think we write what we want to read. When I first met the computer, I realized, my God, this is an intimate machine. Then I called it “the second self,” the “evocative object,” the “computer as Rorschach.” Those were the first three images I used to talk about the computer. I was searching for these powerful images of connection and attachment. When I got to MIT, I said, “Where can I find books where people are talking about how they feel about the computer?”
People at MIT were saying, “No, the computer is just a tool.” And that was the official story. I’m at MIT, and when I give a speech or something, and I tell them what I’m hearing, they get mad at me? And they think the computer is just a tool? So I wrote the book I wanted to read. The Second Self was the book I wanted to read. And this book is the book I wanted to read, about a person’s relationship to their life’s work. The publishing house writes copy for the front jacket and you contribute. And my editors, Virginia Smith and Caroline Sydney, wrote a sentence like, “[This book] offers a master class in finding meaning through a life’s work.” I started to cry. I mean I thought, that was so what I had wanted to do.
The lessons of your book are coming amid so much noise that feels counter to mindfulness and conversation and assigning meaning. We are in kind of an uncanny valley of screens, where “reality” is fiction and fiction blurs with reality, where we are alone together with our screens, but also now together, alone.
We’re in a tough spot. And our screens are not helping. I did a Zoom webinar with somebody yesterday. And I did that thing where I tried to give the person I was in dialogue with the feeling I was looking at him. And so that meant I was looking at the green light the whole time. I could see he was happy. He felt in contact with me because I was staring at the green light. And of course I wouldn’t be able to identify this man in a lineup. Because for him to feel I was in contact with him, I couldn’t look at anything but the light.
The idea that this is successful communication is very dangerous. Because it’s not an empathetic connection. It’s not a good path toward putting yourselves into someone else’s place and someone else’s problem. You’re not looking at their face, you’re picking up zero. We can’t make this our true north. That would be a big mistake.
The incentives around the technology often seem built against empathy.
The incentives are built against it. And saying, “Let’s just do stuff on Zoom,” or “Let’s just package Sherry Turkle and sell her, so there’s no physical mentoring and normal office hours, and no one-to-one mentorship of kids,” that’s also not an educational model that’s gonna work. Let’s be more deliberate and ask ourselves the hard questions, in business and how we work, and how we raise our children, and the mentors we want them to have. I don’t want my child to have an avatar as a mentor.
The search for these human attachments has also made this a boom time for screen time and for virtual companionship of all kinds. People are socially distant, and lonelier than ever, and want to reach out and touch someone—or something.
I worry about that. Another one of my favorite sentences is “things go from better than nothing to better than anything.” It goes from, “my grandmother needed an Aibo because she’s allergic to dogs” to “My grandmother needs an Aibo because it’s better than anything because it’ll never die and she’ll never have to be sad.” And when I heard a kid say that, I thought, Oh my God, that is fundamental. Everything clicked, because I’d heard that in so many different ways, in so many different contexts.
It was the same line my daughter used in the Museum of Natural History, when she was a kid, and we saw these turtles. And she said, “Why do they bring these live turtles from the Galapagos? They could have used robot turtles.” And all the grownups were like, What? The whole point is that they’re alive. And the kids said, “Oh yeah, of course, what a smart idea.” So I kept going back to the museum just to interview people about this, because this was such a great setup for my fundamental question: Does it matter that the turtles are alive? And a stunning number of people did not care if the turtles were alive. The robot turtles would have been fine for them.
I think it’s a cultural point you have to keep making. It reminds me that in the middle of the pandemic, when I was most afraid of COVID, and prone to feeling most alone, my daughter and her husband were so sustaining to me. It was really the first time in my life that I turned to them, and found that kind of comfort: “It’s time to come down for dinner, soup is ready.” You know, that sense of being taken care of, especially because I was so anxious, really settled me down. I’m so grateful to her.
And it was in the middle of this that I get this call from this New York Times reporter who says, “There’s more chatbot therapy than ever—everybody wants it.” So I set up the chatbot, I give her a name—I call her Tate—and I said, “Can I talk to you about anything?” She says, “Absolutely.” I say, “Can we chat about what’s really most on my mind?” “Yes, absolutely. I’m happy to talk about that.” I’m totally giving it my expression. I don’t want to be churlish. And I said, “What can you tell me about loneliness?” And it said, “It’s warm and fuzzy.” I say, “Thank you very much. I appreciate your attention.” I take a screenshot. I log out.
Back to the Times reporter. I said, I know this is a programming error, and tomorrow it is going to be fixed. But fundamentally, after they fix the programming error, I’m still worried about COVID, I’m worried about my body. I’m worried about loneliness. My daughter is leaving. I need somebody who has a body. I need somebody who knows what it is to die. That’s what’s on the minds of people now. That’s all. The thing that pretends it knows about this—or some system to fix our problems? I want to talk to a person. I want to talk to somebody who’s a mother, who’s nervous, who’s vulnerable.
You write about people’s early encounters with robots—including LOGO turtles—and ELIZA, the world’s first chatbot, a virtual therapist built in the 1960s. Given what you learned, at the dawn of the computing age, about how humans interact with computer screens, and everything since, where did we go wrong?
We went wrong everywhere. We were happy to have pretend empathy. The lesson people seemed to learn from ELIZA is that people will accept pretend empathy even when it’s very stupid, so why don’t we make pretend empathy really more sophisticated? But pretend empathy is still pretend empathy. So the only lesson people seemed to learn from ELIZA is what cheap dates people are.
How much like machines we are, or are becoming.
Yeah, exactly. And that’s no lesson at all. So I’m still holding fast to a set of principles, which doesn’t have to do with what clever engineers can come up with next. It’s not the point. I don’t want to see the next best thing and how it’s ever cooler.
What do you hope we will take from this pandemic time and this past year, in terms of how we think about empathy?
Victor Turner has this notion of the “liminal moment,” a time when rules break down, people become confused and become less attached to what they know. In the past, when Republicans rezoned for racism, gerrymandered, or passed rules to block mail-in balloting, people used to know what they were doing, but the [newspaper] stories were not written. And I think now there’s a new sense about hypocrisy—there’s a new fluidity in what Americans have seen. We’ve seen a stadium full of Texans and nice cars on a food line, waiting four hours to pick up food. And so when their elected officials lie about, “This was the fault of the Green New Deal,” when the grid goes down, are people really going to go for that? People have seen systemic racism, and now really see it for what it is.
We once had very complicated ways to convince ourselves it wasn’t there. Now, we know it’s there. And those were the rules that used to make us. It doesn’t feel like May ’68, but this moment has some of the qualities of that. It doesn’t mean that there’s a revolution happening in America, but it means that we say things that we didn’t use to say. How many people knew about the Tuskegee experiments before this year? Everybody knows now.
Exactly, that’s the question. Is it yet a reckoning moment, or is it just a moment when people are aware that, “Oh my God, there’s so much to reckon with?” So the next time, for example, somebody says to you, “I can solve all your problems by giving you a computer curriculum for your child,” which sounded so fabulous. You say, “Really? I think I had that. My child needs a mentor, my child needs a person who’s interested in them.”
And just at the dawn of personal computing!
And they just kept saying what my aunt said: “Push it as far as you can, because you’re really very good at this, and if it doesn’t work out, you’ll get a job, or something. You’ll go to law school, you’ll take an executive training program, you’ll go to business school.” I just thought I would be good at it. And [Harvard professor] David Riesman said “Give it a shot,” and everybody just said “Give it a shot.”
My point—about what I think is different after a pandemic—is that I think when somebody suggests to us, “Oh, it should all be on the computer,” we are in a position now to say, “No no no no no. I want to think deliberately, like Thoreau. I want to think deliberately about what comes next.”
Thoreau didn’t want people to be alone. He wanted people to think deliberately about how much they should be together, and how much solitude they should have, and what their political choices were. It was all about thinking deliberately and putting yourself in a position to think deliberately. And at liminal moments, things are shook up and new combinations can seem possible.
What role do the tech companies play here—and what role can individuals play—in terms of a healthier relationship with our screens? Even amid all the hand-wringing, legal penalties, and promise of regulation, the companies are now larger and richer than ever.
I didn’t say that I can just make it so, like Captain Picard. This is the struggle. Because as you say, there are a lot of forces that aren’t pushing in the right direction, and there are a lot of very powerful forces against what I’m saying. But to give up on this, to leave the internet companies to their own devices, is not pushing us in a good direction.
I’m not in the employ of these companies. On the other hand, I’m not working against these companies. It’s not like that. I’m not out there to get them. I’m no Luddite. I love technology. My daughter calls them “The robots in the basement.” I’m not looking for trouble. But things are not going in a good direction when it comes to how technology companies formulate policy, when they’re just trying to increase their bottom line by getting eyeballs and by getting and selling our data.
You write about Seymour Papert—your marriage, his pioneering explorations in education—and about a moment when some saw computers as vehicles for learning, for creative exploration, what Steve Jobs once called “bicycles for the mind.” These days, in many ways, that’s flipped: Humans are getting duped and yelling at each other online, while computers are learning a lot from us and about us. As you write, we have become the killer app.
Companies have said, “You don’t need privacy anymore.” Mark Zuckerberg—that was his first big comment. And my first tweet was that Mark Zuckerberg was wrong. Privacy is a thing of the past? Who says? Now that Mark Zuckerberg thinks privacy is a thing of the past, it’s no longer a relevant social norm? Really, Mark Zuckerberg? Is that so?
Now we’re in a position to say, “Hold on a second: I know enough now to want to rethink that. I’ve read enough now, I’ve experienced enough now, and let’s rethink that.” One of my favorite sentences that I ever wrote was essentially: “We grew up with the internet, so we think the internet is all grown up.” It’s an illusion of our maturity to think that the things that mature with us are old and fixed. The internet is young, but the internet or some kind of internet is going to be here long after we’re gone. We’re the ones who have to fix it. It’s a baby.
If the internet baby got started with no privacy, we have to fix it. If the business model of the first internet companies is causing political carnage around the world, it’s a good thing to fix that. If the internet babies are trending us toward fascist and authoritarian governance, then it’s a good thing to fix that. And that’s how we should be coming out of the pandemic. This is exactly what liminal moments look like.
This interview has been edited for clarity and brevity.