Why Academics Are Incensed By Facebook's Emotion-Manipulating Social Experiment

It calls into question the notion of "informed consent."

For a long time, Facebook operated under a famous motto: "Move fast and break things." Acting on this mantra has a tendency to upset Facebook's change-averse user base—and sometimes, Facebook doesn't even have to break anything.

Over the weekend, New Scientist revealed that Facebook's data team manipulated the news feeds of 689,003 users for one week in January 2012. Using an algorithm, Facebook and researchers at Cornell University reduced the number of either positive posts or negative posts appearing in a user's news feed. The joint study was published in the Proceedings of the National Academy of Sciences.

As New Scientist first pointed out, the team then evaluated those users' moods by looking at the content they went on to post, and discovered that "emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks." In other words, seeing positive status updates nudges users toward posting more positively themselves, and negative updates do the opposite.
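To make the mechanism concrete, here is a minimal sketch of the kind of sentiment-based feed filtering the paper describes, which reportedly relied on simple word counting. This is purely illustrative and not Facebook's actual code; the word lists, the `post_polarity` helper, and the `omit_polarity` parameter are all hypothetical.

```python
# Illustrative sketch only -- approximates the experiment's described mechanism:
# score each post's emotional tone by counting positive/negative words, then
# omit posts of one polarity from a user's feed. Word lists are hypothetical.

POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible", "lonely"}

def post_polarity(text):
    """Return 'positive', 'negative', or 'neutral' based on naive word counts."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

def filter_feed(posts, omit_polarity):
    """Drop posts whose polarity matches omit_polarity ('positive' or 'negative')."""
    return [p for p in posts if post_polarity(p) != omit_polarity]

# Example: a user assigned to the "reduced positivity" condition
feed = [
    "Had a wonderful day at the beach, so happy!",
    "Feeling lonely and sad tonight.",
    "Picked up groceries.",
]
print(filter_feed(feed, omit_polarity="positive"))
# -> ['Feeling lonely and sad tonight.', 'Picked up groceries.']
```

The published study then measured the emotional tone of what the affected users themselves posted afterward, which is how the researchers inferred the "contagion" effect.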

The Internet was not pleased. Even though the measured effect was relatively small, Slate, for example, called the research "unethical." Comments on Metafilter ran the spectrum from "meh" to furious: "I don't remember volunteering to participate in this. If I had tried this in graduate school, the institutional review board would have crapped bricks."

Regardless, the outcry prompted lead researcher Adam Kramer to take to—where else?—Facebook to defend his work. "The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product," wrote Kramer. "We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out." He continued:

The goal of all of our research at Facebook is to learn how to provide a better service. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.

The study calls into question the notion of "informed consent," which Facebook claims it obtained, since users must agree to Facebook's rather liberal data-use policy when they sign up. Academics have specifically questioned the "informed" part of that claim, since the agreement itself runs to some 2,200 words. "One reason is simply that some walks of life are regulated, and Facebook shouldn’t receive a free pass when it trespasses into them simply because it does the same things elsewhere," writes University of Maryland law professor James Grimmelmann. "The unwitting participants in the Facebook study were told (seemingly by their friends) for a week either that the world was a dark and cheerless place or that it was a saccharine paradise. That's psychological manipulation, even when it's carried out automatically."

Another key question is whether the experiment had approval from an institutional review board, or IRB, the body that judges whether an experiment is ethical. (Susan Fiske, the Princeton University professor who edited the PNAS study, told The Atlantic that it was IRB-approved, although, as The Atlantic also reports, there has been some confusion about whether that was actually the case.)

The study, while instructive about how we use Facebook, enters murky new territory, and it speaks to a growing discomfort that large corporations like Google and Facebook hold unprecedented troves of our personal data.

But not everyone feels Facebook was out of bounds with its experimenting. Tal Yarkoni, a psychologist at the University of Texas at Austin, writes that "it's not clear what the notion that Facebook users’ experience is being 'manipulated' really even means, because the Facebook news feed is, and has always been, a completely contrived environment." He continues:

I hope that people who are concerned about Facebook "manipulating" user experience in support of research realize that Facebook is constantly manipulating its users' experience. In fact, by definition, every single change Facebook makes to the site alters the user experience, since there simply isn’t any experience to be had on Facebook that isn’t entirely constructed by Facebook.

