
Why Scientists Are Upset About The Facebook Filter Bubble Study

Facebook says users, not its algorithm, are responsible for the so-called filter bubble. But Facebook’s own research says otherwise.


Yesterday, the journal Science released a study by Facebook employees examining what content you do (and don’t) see on Facebook’s news feed. Its conclusion, at first glance, was that the Facebook news feed algorithm does not keep users from seeing opinions they disagree with (a reference to the so-called filter bubble of social media, in which you assume most people agree with you because you are not exposed to other viewpoints). But after prominent media outlets covered the study’s findings, data scientists began to speak up. The study, they argued, has major flaws, and its own data suggest that the news feed algorithm does hide news stories it thinks you will disagree with.


New York Times writer Farhad Manjoo covered the study in a fairly straightforward way. The researchers observed 10.1 million Facebook users who self-identified as either liberal or conservative from July 2014 to January 2015. They found that 29% of a person’s news feed content contained contrary opinions, articles the study authors call “cross-cutting.” But the study also found that the news feed algorithm effectively hides 1 in 20 cross-cutting links if you are a self-identified conservative, and 1 in 13 if you identify as a liberal.

Backlash to the study came overnight. Zeynep Tufekci, a professor at the University of North Carolina, Chapel Hill, criticized the study in a Medium post. Among other problems, she points out that the study’s sample is far from representative of Facebook as a whole:

The research was conducted on a small, skewed subset of Facebook users who chose to self-identify their political affiliation on Facebook and regularly log on to Facebook, about ~4% of the population available for the study. This is super important because this sampling confounds the dependent variable.

Tufekci also accuses the study authors of minimizing the impact of Facebook’s algorithms. The study’s conclusion emphasizes that people are more likely to click on and like stories that support their own beliefs, which means users tend to create their own filter bubbles. That’s true, but it’s not the point: the point is that the news feed algorithm also filters out diverse opinions. As Tufekci says, it is disingenuous of the researchers to shift the focus of their paper this way. Her analogy is apt:

Comparing the individual choice to algorithmic suppression is like asking about the amount of trans fatty acids in french fries, a newly-added ingredient to the menu, and being told that hamburgers, which have long been on the menu, also have trans-fatty acids — an undisputed, scientifically uncontested and non-controversial fact.

The problem lies with the way that the study’s authors, members of Facebook’s data team, framed the results. “This may go down in history as the ‘it’s not our fault’ study,” wrote social scientist Christian Sandvig in a blog post. In short, the data scientists saw that Facebook’s algorithm limits the diversity of articles you see, but they abstained from taking a stance on whether that is a good or bad thing. To Sandvig, that neutrality is ridiculous.

“So the authors present reduced exposure to diverse news as a ‘could be good, could be bad’ but that’s just not fair. It’s just ‘bad.’ There is no gang of political scientists arguing against exposure to diverse news sources,” wrote Sandvig.

The debate soon spilled over onto Twitter as well.


This isn’t the first time Facebook research has angered the academic community. In January 2012, the site manipulated users’ news feeds to show either more positive or more negative content, then looked at whether those users went on to post more positive or more negative status updates. Because Facebook did not obtain informed consent for the experiment, it may have violated research ethics standards.

As election season approaches in the U.S., it’s important to be aware that the opinions and stories you see on Facebook are strongly influenced by what Facebook’s news feed algorithm thinks you should see. And because that algorithm changes frequently, it is hard to know why you’re seeing one article instead of another. The news feed, despite what Facebook repeatedly claims, is not simply a reflection of your interests.