Meet Facebook’s Compassion Czar

Humans have spent millennia learning how to read each other’s emotions and treat each other right. Facebook is now trying to figure out how to help them do that online.

Last year, Facebook started noticing something odd in the way people were using its "Report This Photo" feature. The company had put the link under every photo on the site so users could flag images that violated its terms of service, like ones that included scenes of illegal drug use or graphic violence.

But now the company was getting reports of images that didn't seem to include anything out of the ordinary. Just regular pictures of regular people.

Facebook eventually realized that the people reporting the photos were actually in the images themselves. The problem was that those users didn't like the pictures or, in some cases, particularly among younger users, felt the images were being used to bully them.

"The two default pathways we provided to users to deal with this," Arturo Bejar, one of Facebook's directors of engineering, tells Fast Company, "were either to comment on the picture in front of all of your friends, which is calling further attention to the picture, or hit Report."

So Bejar's team, which is responsible for safety on the social network, developed new workflows to help users communicate privately among themselves. 

But what started out as a simple effort to solve the mystery of why the pictures were being reported is slowly blossoming into a larger initiative at Facebook, one spearheaded by Bejar, that seeks to develop new ways of facilitating conflict resolution between users and, in the process, promote more empathy and compassion within the community itself.

While trying to solve the photos problem, Bejar sought help from experts in real-world social interaction, like Emiliana Simon-Thomas at Stanford's Center for Compassion and Altruism Research and Education and Dacher Keltner at the University of California, Berkeley's Social Interaction Laboratory.

Along the way, Bejar discovered that, while humans are very good at reading each other's emotions in the real world and usually modify their behavior to avoid hurting each other's feelings, software interfaces have not evolved to give people the same capabilities in the digital world.

Online communities are still relatively young, and the typical mechanism for mediating disputes, one that emerged from online bulletin boards and forums, has been to turn to a moderator for help. That, Bejar says, explains why Facebook users started clicking the Report link: they were used to reporting issues rather than handling them themselves.

But with almost a billion people using Facebook, Bejar's team realized it would quickly become unsustainable for the social network to have to mediate every conflict. Plus, he says, it's not even the ideal solution for users.

"People think that communities online are different from communities in real life," Bejar says. "But they're not. Communities online are a mirror of communities in real life."

And in the real world, you don't ask some anonymous third party for help when a friend inadvertently hurts your feelings. You take it up directly with them.

So with Facebook becoming a place where people increasingly socialize in ways previously limited to the real world, Bejar says it's important to develop online mechanisms that let people negotiate their social interactions in digital space as easily as they do offline.

As a result, Bejar has started investigating ways to modify Facebook's interface to aid this evolution. Earlier this month, the company hosted a Compassion Research Day at its Palo Alto campus, where Simon-Thomas, Keltner, and other researchers in the fields of human behavior and social interaction from Yale, Berkeley, and local schools shared what they know about human-to-human interaction.

This past spring, Bejar's team released enhancements to the Report This Photo feature, called "Social Reporting," that let users handle photo issues themselves. 

Now when you click the Report link, one of the options says "I don't like this photo of me." Select it, and you'll see an option to send a message to your friend. Click that, and Facebook opens a text box where you can type a message. For those who aren't sure what to say, the box comes with a pre-populated note: "Hey, I don't like this photo. Please remove it."

It's not rocket science, of course, and some might wonder why Facebook users didn't simply think to message their friends on their own.

"While it feels very intuitive, no one was doing it," Bejar says.

While humans have spent millennia fine-tuning interpersonal interactions in the real world, online, people are sometimes at a loss. Bejar says that before Facebook started pre-populating the text box, users would abandon the workflow before hitting Send, as if they didn't know what to say. Once Facebook added the canned text, usage of the feature skyrocketed.

And it's working, Bejar says. Over 75% of people who receive those messages delete the image in question.

Facebook's enhancements so far, which also include a way for teenagers to get help from a trusted adult when they feel a picture is being used to harass them, are just the beginning of a long list of potential innovations to strengthen human-to-human interaction in the social network.

"Every time I hear one of these people talk," Bejar says of Keltner, Simon Thomas, and their colleagues in the field of human social interaction, "I walk out with a to-do list of things we need to do."

[Image: Flickr user Jesslee Cuizon]

E.B. Boyd is FastCompany.com's Silicon Valley reporter.
