The Filter Bubble Is Your Own Damn Fault, Says Facebook

If you’re not seeing content on Facebook that challenges your personal views, it’s because of your own choices, not an algorithm. But that could change.

Whether you’re shopping on Amazon, searching Google, or browsing Facebook, algorithms personalize the experience and point you to content that a machine thinks you want to see. In the sphere of civic discourse, experts have feared that this “filter bubble”–exposure only to news and opinions that users already agree with–will erode political dialogue and create a more polarized society.

In a peer-reviewed study published in the journal Science on May 7, Facebook’s own data scientists offer evidence that the filter bubble is a myth. Their message? “It’s not us–it’s you.”

“We conclusively establish that on average in the context of Facebook, individual choices more than algorithms limit exposure to attitude-challenging content,” the three authors, Eytan Bakshy, Solomon Messing, and Lada Adamic, write. “Our work suggests that the power to expose oneself to perspectives from the other side in social media lies first and foremost with individuals.”


The team looked at an anonymized data set of 10.1 million active U.S. Facebook users who share their political affiliation on their profiles, and isolated the “hard” news links (e.g. national news, politics, world affairs) that this group shared between July 2014 and January 2015. Studying who shared what, they measured the partisan alignment of each story: Fox News links, for example, tended to be shared by conservatives and Huffington Post links by liberals. The researchers then looked at the political alignment of what appeared in these users’ News Feeds (shared by friends, then filtered by Facebook’s algorithms) and which links they actually clicked on.

The good news is that people were exposed to “ideologically cross-cutting viewpoints” from hard news content shared by their friends. Overall, 24% of hard news content shared by liberals’ friends was cross-cutting, compared to 35% for conservatives.

But the degree of exposure to challenging views was reduced by the News Feed algorithm: conservatives saw 5% less cross-cutting content than what their friends actually shared, and liberals saw 8% less. Comparatively, however, people’s own choices about what to click on filtered things further: conservatives clicked on 17% fewer of the cross-cutting articles that did appear in their News Feeds, and liberals 6% fewer. And of course, all of this was shaped by whom people choose to befriend in the first place: among people who identified their politics, a median of about 20% of liberals’ friends were conservatives, and about 18% vice versa.
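
To get a feel for how those percentages compound, here’s a rough back-of-the-envelope sketch in Python. It treats each reported reduction as a simple multiplicative step applied to the cross-cutting share of friends’ posts; that staging is an assumption made for illustration, not the study’s actual statistical model.

```python
# Illustrative arithmetic only: the percentages below come from the article,
# and chaining them as multiplicative reductions is an assumption for
# illustration, not the study's methodology.

shared_by_friends = {"liberals": 0.24, "conservatives": 0.35}    # cross-cutting share of friends' hard-news posts
algorithm_reduction = {"liberals": 0.08, "conservatives": 0.05}  # relative drop from News Feed ranking
click_reduction = {"liberals": 0.06, "conservatives": 0.17}      # relative drop from users' own clicks

for group, shared in shared_by_friends.items():
    in_feed = shared * (1 - algorithm_reduction[group])   # what the ranking lets through
    clicked = in_feed * (1 - click_reduction[group])      # what people actually click
    print(f"{group}: shared {shared:.0%} -> in feed {in_feed:.1%} -> clicked {clicked:.1%}")
```

Read this way, each stage trims the cross-cutting share a little more, with the ranking stage trimming more for liberals and the clicking stage trimming more for conservatives, which is the comparison the authors emphasize.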

So there is a filter bubble, though it’s smaller than the one we build through our own behavior. But this study has limits, and the situation isn’t frozen in time.

Or, in the words of David Lazer, a computer scientist at Harvard University who studies the effects of algorithmic filtering and wrote an opinion piece accompanying the study: “The deliberative sky is not yet falling, but the skies are not completely clear either.”

He warns that a small effect today might become a large effect tomorrow. Lazer points out that changes to News Feed curation, announced by Facebook on April 21, after the study was finished, could enhance the filter effect by showing more updates from “friends that you care about.” Also, the study didn’t get into a host of other questions raised by algorithmic curation, such as whose voices the system prioritizes over others, or the consequences of a ranking system that may generally favor pets over politics.

Lazer applauds Facebook for conducting the research: “The information age hegemons should proactively support research on the ethical implications of the systems that they build,” he writes. But it can’t only be Facebook’s own scientists who study this data, he says: as privacy becomes a bigger concern, the access that independent researchers have to Facebook’s user data is shrinking.

“There is a broader need for scientists to study these systems in a manner that is independent of the Facebooks of the world. There will be a need at times to speak truth to power, for knowledgeable individuals with appropriate data and analytic skills to act as social critics of this new social order,” he says.

About the author

Jessica Leber is a staff editor and writer for Fast Company's Co.Exist. Previously, she was a business reporter for MIT’s Technology Review and an environmental reporter at ClimateWire.
