Facebook is once again acknowledging what users have been murmuring for years—that the website has become a vessel for supercharged political commentary, and people are growing tired of it.
Earlier this year, in the aftermath of a particularly fraught election season, the social media platform revealed it would temporarily reduce the distribution of political content in a handful of markets, including parts of the United States, and test new algorithms for ranking such content in users’ news feeds. The company said the move was a response to widespread feedback from users who don’t want politics to overwhelm their news feeds—a point CEO Mark Zuckerberg made on the company’s first-quarter earnings call.
In theory, the algorithms could allow Facebook to continue supplying a healthy diet of political content to people who have an appetite for it, while also lowering the political noise for people who feel it’s grown too loud.
Now, six months later, the results are in: On Tuesday, Facebook said it had collected positive data and would expand the tests to four more countries: Costa Rica, Sweden, Spain, and Ireland.
According to the company, it has also learned that “some engagement signals can better indicate what posts people find more valuable than others.” As for which signals matter more, Facebook says it will consider how often it receives negative feedback from users when political topics or current events are ranked high in their news feeds. As for what matters less, it will de-prioritize a post’s likelihood of racking up comments or shares—a change it is perhaps hoping will dampen mounting criticism of social media’s role in propagating viral misinformation and disinformation.