Meet Facebook’s Compassion Czar

Humans have spent millennia learning how to read each other’s emotions and treat each other right. Facebook is now trying to figure out how to help them do that online.

Last year, Facebook started noticing something odd in the way people were using its "Report This Photo" feature. The company had put the link under every photo on the site so users could flag images that violated its terms of service, such as scenes of illegal drug use or graphic violence.

But now it was receiving reports of images that didn't seem to include anything out of the ordinary. Just regular pictures of regular people.

Facebook eventually realized that the people reporting the photos were actually in the images themselves. The problem was that those users didn't like the pictures or, in some cases, particularly among younger users, felt the images were being used to bully them.

"The two default pathways we provided to users to deal with this," Arturo Bejar, one of Facebook's directors of engineering, tells Fast Company, "were either to comment on the picture in front of all of your friends, which is calling further attention to the picture, or hit Report."

So Bejar's team, which is responsible for safety on the social network, developed new workflows to help users communicate privately among themselves. 

But what started out as a simple effort to solve the mystery of why the pictures were being reported is slowly blossoming into a larger initiative at Facebook, spearheaded by Bejar, that seeks to develop new ways of facilitating conflict resolution between users and, in the process, to promote more empathy and compassion within the community itself.

While trying to solve the photos problem, Bejar sought help from experts in real-world social interaction, like Emiliana Simon-Thomas at Stanford's Center for Compassion and Altruism Research and Education and Dacher Keltner at the University of California, Berkeley's Social Interaction Laboratory.

Along the way, Bejar discovered that while humans are very good at reading each other's emotions in the real world, and usually modify their behavior to avoid hurting each other's feelings, software interfaces have not evolved to give people the same capabilities in the digital world.

Online communities are still relatively nascent, and the typical mechanism for mediating disputes, which emerged out of online bulletin boards and forums, was simply to turn to the moderator for help. That, Bejar says, explains why Facebook users started clicking the Report link. They were used to the idea of reporting issues instead of handling them themselves.

But with almost a billion people using Facebook, Bejar's team realized it would quickly become unsustainable for the social network to have to mediate every conflict. Plus, he says, it's not even the ideal solution for users.

"People think that communities online are different from communities in real life," Bejar says. "But they're not. Communities online are a mirror of communities in real life."

And in the real world, you don't ask some anonymous third party for help when a friend inadvertently hurts your feelings. You take it up directly with them.

So with Facebook becoming a place where people socialize in ways previously limited to the real world, Bejar says it's increasingly important to develop online mechanisms that let people negotiate their social interactions in digital space as easily as they do offline.

As a result, Bejar has started investigating ways to modify Facebook's interface to aid this evolution. Earlier this month, the company hosted a Compassion Research Day at its Palo Alto campus, where Simon-Thomas, Keltner, and other researchers in the fields of human behavior and social interaction from Yale, Berkeley, and local schools shared what they know about human-to-human interaction.

This past spring, Bejar's team released enhancements to the Report This Photo feature, called "Social Reporting," that let users handle photo issues themselves. 

Now when you click the Report link, there's an option that says "I don't like this photo of me." When you select it, there's an option to send a message to your friend. And when you click that, Facebook opens a text box where you can type a message. For those who aren't sure what to say, the box comes with a pre-populated note saying, "Hey, I don't like this photo. Please remove it."
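The routing logic behind that flow is simple enough to sketch. Below is a hypothetical TypeScript sketch; all names, types, and functions are illustrative assumptions rather than Facebook's actual code, and only the canned message text comes from the feature described above.

```typescript
// Hypothetical sketch of the Social Reporting flow described above.
// All names and types here are illustrative assumptions, not Facebook's API.

type ReportOption = "violates_terms" | "dont_like_photo_of_me";

interface PhotoReport {
  photoId: string;
  reporterId: string;
  option: ReportOption;
}

// The pre-populated note mentioned in the article.
const DEFAULT_MESSAGE = "Hey, I don't like this photo. Please remove it.";

// Route the report: policy violations go to moderators, while personal
// objections open a private message to the photo's owner instead.
function handleReport(report: PhotoReport, customMessage?: string): string {
  if (report.option === "violates_terms") {
    return `Photo ${report.photoId} queued for moderator review`;
  }
  // Pre-filling the text box means a user who doesn't know what to
  // say can still complete the flow without typing anything.
  const message = customMessage ?? DEFAULT_MESSAGE;
  return `Private message to photo owner: "${message}"`;
}

// Example: a user reports a photo of themselves without typing a message.
console.log(
  handleReport({ photoId: "p123", reporterId: "u456", option: "dont_like_photo_of_me" })
);
```

The interesting design choice is the default: as the article goes on to explain, the feature doesn't add new capability so much as lower the cost of a socially awkward first move.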

It's not rocket science, of course, and some might wonder why Facebook users couldn't simply have thought to message their friends on their own.

"While it feels very intuitive, no one was doing it," Bejar says.

Humans have spent millennia fine-tuning interpersonal interactions in the real world, but online, people are sometimes at a loss. Bejar says that before Facebook started prepopulating the text box, usage of the workflow would drop off before users hit Send, as if they didn't know what to say. Once Facebook added the canned text, usage of the feature skyrocketed.

And it's working, Bejar says. Over 75% of people who receive those messages delete the image in question.

Facebook's enhancements so far, which also include a way for teenagers to get help from a trusted adult when they feel a picture is being used to harass them, are just the beginning of a long list of potential innovations to strengthen human-to-human interaction in the social network.

"Every time I hear one of these people talk," Bejar says of Keltner, Simon Thomas, and their colleagues in the field of human social interaction, "I walk out with a to-do list of things we need to do."

[Image: Flickr user Jesslee Cuizon]

E.B. Boyd is FastCompany.com's Silicon Valley reporter.

8 Comments

  • missionmom

    How about Facebook just follows their terms of use?  For the most part, the reporting just doesn't work.  Groups advocating violence (usually against people with special needs) remain, and photos of kids with Down Syndrome posted solely for a laugh are rarely removed.  Wonder how Mr. Bejar thinks vulnerable people can protect themselves from being exploited on Facebook when in fact they can't just write to a friend saying that a picture is upsetting to them.

  • Stan

    Regarding the comment from the article: "And in the real world, you don't ask some anonymous third party for help when a friend inadvertently hurts your feelings. You take it up directly with them."

    Sometimes. However, in the "real world" we have mediators, lawyers, family therapists, etc. who often help people communicate in the midst of conflicts. We DO take conflict situations to third parties who can help us work with them when there's a breakdown. That Facebook hasn't taken the lead in creating best practices for working with conflict situations (e.g., sample videos on useful ways to communicate) indicates that they don't really understand the nature of conflict and communication--either online or in the so-called real world. They have been VERY slow to protect users from cyber-bullying.

    Also, Fast Company:  stop using the STUPID metaphor of "Czar" to describe someone who's been selected to run an initiative.  Czar isn't the appropriate metaphor.  Let's move away from dictatorial, totalitarian metaphors to something a little more constructive like "lead organizer".

  • Kristie Ebeling

    I would like a way to remove photos of my daughter. A woman I met at my son's baseball games has become obsessed with my daughter. She took pictures of her and posted them on Facebook. My daughter was a year and five months old when the photos were taken. I am not friends with this woman. I have asked her repeatedly to remove the photos and my requests are ignored. I have repeatedly reported them with no success. I've even had the police go talk to her. I am so frustrated because I cannot get Facebook to remove them. Now some rapper is tagged in my daughter's photo. Facebook needs a way for parents to protect their young, innocent children. It's sad that lunatics can post pictures of children without the parents' consent and there is nothing the parents can do about it.

  • Allan Hytowitz

    "Technology is the use of increasingly accurate, self-evident, and reproducible information to replace energy and matter.  The benefit of technology is NOT in what it lets people accomplish, but in how it improves the character of people."
    Thank you for the additional documentation.

  • Bob Jacobson

    PS Wired, "The Epic Saga of The WELL," Katie Haffner, May 1997, provides a retrospective of The WELL in its early days.  The system ( www.well.com ) still survives as a subsidiary of Salon and is having a sort of resurgence in these days of big, Big, BIG "social networks."  Of course.

    http://www.wired.com/wired/arc... 

  • Bob Jacobson

    Facebook's to be commended for addressing this issue of intimacy and "compassion," but really, a billion-user system may find it difficult to be either of these. Hooray for those users who find the means for promoting these values, but Facebook itself, as an institution, remains so impervious that it's unlikely to remain anyone's favorite medium for close contact, if ever it was.

    The WELL (Whole Earth 'Lectronic Link) in the 1980s and 1990s established perhaps the most favored, hospitable, compassionate online community. It relied heavily on people, not interfaces, to make connections, mediate conflicts, provide knowledge or perspectives, and hold "F2F" in-person events that solidified relationships.

    I don't see FB -- which remains totally aloof from its users (while it gets closer to its advertisers and apps vendors) -- achieving any of those goals. In fact, many of the most sophisticated early users are on the verge of leaving for smaller, more intimate associations. Google+ awkwardly anticipates this migration, though how to use its highly abstracted interface successfully still eludes most users. More of us are going back to the time when people used email to write real letters that meant something and graphics that showed items of special interest, not every event in users' lives or the lives of their cats or imaginary wild fowl. Twitter suffices as a Web pointer, once one learns to master its cacophony.

    I note that for the last two years I have been unsuccessful in having FB acknowledge, let alone address, a simple request: that it delete a label (a former employer) associated with my name each time someone communicates with me on FB. That's the sort of mechanical dullard that the system has become. I hope the new research leads FB in another direction so that it can give me back an unencumbered identity. Attention to user needs on a granular basis might do more for FB's reputation than a host of interaction innovations that further mechanize human relations.

    PS Fast Company:  when will you fix my account so that I can display my face alongside my name? I've been after you for that simple request for over two years, also.  It seems large systems inevitably become inhuman (and inhumane) the bigger they get and the longer they last.  Look at the US Government.

  • cia

    Robert Jackson - Have you tried updating your profile on Disqus.com? If you log in through there using the same email address you use here on FastCompany.com and upload a profile photo, it should give your name a face. Please email me at cia at fastcompany dot com and let me know if that doesn't work.

    Thanks,
    Cia
    Producer, Fast Company