Is Facebook at fault for providing an echo chamber of rumors and lies?
The automatic answer one often gets when asking that question is that the First Amendment reigns supreme—and social networking channels simply provide a neutral medium where people can say whatever they want.
Facebook definitely provides such a medium, and it does relatively little to check free speech. But it’s also set up in a way that encourages and nurtures echo chambers where the most outrageous rumors, half-truths, and outright lies can go completely unchecked.
Perhaps most troubling is that many news consumers on Facebook aren’t concerned with questioning the legitimacy or political bent of the sources of the stories they read and share. I know this from the stories my Trump-supporting friends and family have shared with me. Sure, there are voters whose dissatisfaction with the Washington establishment is rooted in righteous anger, and many voted for Trump accordingly—not because Hillary is a felon and has cancer, or because Obamacare kills the elderly via death panels, or other popular falsehoods.
But let's not forget that Trump’s political rise was jump-started by the false claim that President Obama isn’t a U.S. citizen, and he kept using falsehoods like it to generate heat throughout his campaign. Remember him boosting the story that Ted Cruz's dad was connected with the JFK assassination? "All I did is point out the fact that on the cover of the National Enquirer there was a picture of him and crazy Lee Harvey Oswald having breakfast," Trump said at the time.
Goal Number One of the Trump campaign was to send (and repeat) messages that elicit emotional responses (usually anger or fear) from disaffected people. He often used as fodder bogus or half-true news stories circulated on social media.
While genuine political arguments do happen among friends on Facebook, the company's "personalized news front page" is not designed to foster meaningful discussions. On the contrary: The stories surfaced there are based on the interests of like-minded friends. Facebook believes, and has from the start, that agreement and harmony make for better engagement than argument and disharmony. That’s why there’s never been a "thumbs down" button on Facebook, and the reason that adding the "angry face" emoji was such a tough internal decision for the company.
Take a forum like that, add a truth-averse persona like Donald Trump and his millions of followers, and you have an echo chamber that feeds on its own volume and gets louder and louder. Facebook and the Trump movement are a bad combination for reasoned discourse.
And let’s call Facebook what it is: the biggest media organization in the world. It exerts more control over the news people see than any other single entity. It’s bigger than the New York Times, and yet you’d never refer to Facebook as a "newspaper of record."
Traditional news sources are accountable for what they publish, but because Facebook claims to be merely a "platform," it escapes accountability for the information it surfaces for users, even as it makes editorial judgments—like allowing Donald Trump to promote hate-fueled speech that would not be permitted from everyday users.
Perhaps regulating free speech on social media channels would create more negative side effects than benefits, but this election should be a lesson that Facebook (and other social networks like Twitter) have not provided safe places for measured political discussions among people who disagree. And they've done a poor job of managing the bigoted talk that has tainted the national discussion this election cycle.
Facebook provided a forum, a breeding ground, for misinformation—and benefited handsomely from it. Social engagement is Facebook's business. Each new engagement—a comment, a like, a share—is a new opportunity to deliver an ad message. Or a new opportunity to add another piece of user data to the "social graph," which might be used to help target some ad. In Facebook's political echo chamber, each echo represents opportunity.
And by the time election season came to a close, the denizens of that echo chamber were full of passionate intensity—and Trump rode it all the way to a win.
Is Facebook partly responsible for Trump's rise? You bet it is. Facebook will say "we only provide the platform; what people do on it is their business." For years, that stock answer has worked pretty well for internet companies. But in the year of Trump, it rings a little hollow. Facebook provided the public space where the wild expression of frustration and hatred was OK. It played host to the oxygen fire of pent-up working-class and rural frustration that carried Trump to the White House.
Ironically it’s people like Donald Trump and Peter Thiel who have benefited so richly from social media free speech who want to stifle the free speech of news outlets they don’t like. Trump wants to "strengthen the libel laws," while Thiel believes that people with enough money should be able to bleed enemy publications out of existence through the courts.
Full disclosure: I quit Facebook last week and deleted my account. I have been very suspicious of Facebook’s business model for a long time, but it was the tiresome political drone of the site over the past year that, I believe, finally did me in. I suppose I’ll miss some of my acquaintances on there, but I feel good about my decision so far.
For me, Facebook looks different in the wake of this painful election cycle. It’s not such a friendly place. It’s a place where millions of disaffected people met and talked each other into electing the most dangerous president in U.S. history.