As has been well-chronicled, one of the most frustrating dynamics of the 2016 election cycle was the spread of “fake news” across social platforms like Facebook and Twitter. No matter what side of the political aisle you were on, you were doubtless bombarded with articles of dubious origin, often posted by your own friends or family members. Today, Facebook said it was making changes to the algorithms that govern its trending news section in an attempt to improve the quality of articles that get enough traction to trend.
A Facebook spokesperson told Fast Company that one thing the company is not doing is compiling a list of publications to blacklist or promote as it strives to improve the quality of trending articles. Rather, the company said, the main criterion will be engagement around topics and articles, as well as a publication’s history (likes, comments, shares, and so forth) and the number of links an article gets in other articles. The more engagement a story attracts, the theory goes, the less likely it is to be fake news. Facebook will also incorporate signals from users’ news feeds, such as people flagging articles as fake or spam. But this is a process, Facebook notes, which means there will likely be growing pains as it works to improve the algorithms. And that means questionable articles are still likely to rise to the level of trending from time to time.