
Can we trust Facebook’s users to decide what news we should trust?

[Photo: Flickr user Marcus Quigmire]

Facebook really, really wants people to stop thinking of it as a forum for the spread of fake news. So it’s been making changes, like those earlier this month to the News Feed algorithms that will prioritize posts from friends and family and downplay those from brands.

Now, Mark Zuckerberg’s company says it’s going to start leaning more on its 2 billion-plus users to determine which news sources are, in fact, trustworthy. In a post today, Zuckerberg wrote:

The hard question we’ve struggled with is how to decide what news sources are broadly trusted in a world with so much division. We could try to make that decision ourselves, but that’s not something we’re comfortable with. We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem. Or we could ask you — the community — and have your feedback determine the ranking.

We decided that having the community determine which sources are broadly trusted would be most objective.

Zuckerberg said that Facebook will ask users if they’re familiar with specific news sources, and if so, if they consider them trustworthy. “The idea is that some news organizations are only trusted by their readers or watchers, and others are broadly trusted across society even by those who don’t follow them directly.”

Sounds fine. Except, can we really trust Facebook’s own users to make that kind of determination? Look no further than the fake news story from earlier this week, which alleged that the Centers for Disease Control and Prevention had said this year’s flu shots were potentially deadly. It spread like wildfire, gaining more than 176,000 engagements.

In general, you might expect the wisdom of crowds to mean that the cream rises to the top. But it’s worth questioning the wisdom of crowds that have already proven far too credulous when it comes to fake, and often truly damaging, content. After all, would people really share content that they don’t trust? If the answer is yes, we have a serious problem. If the answer is no, then people are evidently trusting too many untrustworthy sources.

Read more: What Facebook’s Fight Against Fake News Got Wrong (And Right)