How Facebook’s News Feed Shift Could Lead To A Big Spike In Xenophobia

A news feed that emphasizes posts from friends and family may leave people feeling more like members of tribes than ever, and resistant to outside views.

After spending much of 2017 acting like a prospective politician, Mark Zuckerberg unveiled a new persona this past Thursday: Martyr to his own algorithm.


In an interview with the New York Times’ Mike Isaac, Zuckerberg announced that he was changing Facebook’s algorithm. Facebook’s research had found that the “passive viewing” of news articles and videos stressed or harmed users, Zuckerberg explained. Seeing posts by friends and family in your existing circle is much healthier. So Facebook is adjusting its algorithm accordingly.

Zuckerberg acknowledged that the change would likely result in people spending less time on Facebook. Following those comments, Facebook shares dropped by 4.4% on Friday, and Zuckerberg lost $3.3 billion from his personal fortune. But, Zuckerberg reasoned, the change would be worth it, if his two young children could one day look back on his legacy and “feel like what their father built was good for the world.”

Science says that Zuckerberg might be making a very wrong and very dangerous bet. By removing stories that come from outside users’ close circle of connections, Facebook will likely make the world more divisive, not bring it closer together.

Xenophobia, And The Danger Of Tribes

Xenophobia is the fear of those who are culturally different. In the early days of humanity, it was an extremely useful instinct, since it helped us identify people outside a tribe who could be a threat to its survival.

The antidote to xenophobia is a neurochemical called oxytocin, an empathy drug produced in our brains. It sends us a signal that we should care about someone–that they’re a part of our tribe, and not a threat.

But how do you develop empathy for someone–and thus fight off xenophobia–if you don’t know them already?


As I cover in my upcoming book, The Storytelling Edge, great stories are a primary driver of empathy. When we read or watch a great story, our brain releases oxytocin, and we develop empathy for the main character. We turn that person into a member of our tribe.

This works for all kinds of stories. One of my colleagues, for instance, recently had his brain activity monitored by a device called INBand to measure his oxytocin levels as he watched a heartwarming video produced by HP. As he became engrossed in the story of a father’s challenging relationship with his teenage daughter, his oxytocin levels spiked off the charts. He developed empathy for the main characters.

Which brings us back to Facebook. This decade, one of the most successful content types on Facebook has been short, engrossing videos posted by publishers like Upworthy, Vox, NowThis, and the New York Times that tell us the story of a person–or a group of people–outside of our own tribe. While Upworthy was often derided for its clickbait headlines, it was extremely successful at packaging stories about serious issues like anti-Muslim bigotry and the Dakota Access Pipeline and getting people to share them hundreds of thousands of times.

Facilitating the spread of these stories is arguably one of the most impactful things Facebook has done. It’s helped raise money for causes like charity:water by telling the story of people outside most Americans’ circle, and encouraged the spreading of social justice movements like Standing Rock and Black Lives Matter.

But now, the social giant wants to restrict those stories in favor of reinforcing the connections we have with people who are already in our tribe.


The Disturbing Downside

Last year, the FBI and other U.S. intelligence agencies concurred that the Russian government used Facebook to spread false stories among over 125 million Americans and influence the U.S. election, while amplifying partisanship in the electorate. Many people have viewed Facebook’s algorithm change as a response to the heavy criticism that ensued–if fake news is tricking and dividing people, let’s just stop showing the news altogether.

But the existence of news isn’t the problem. For years, Facebook has been designed to reinforce the homogenous posts and viewpoints of the like-minded people you’re connected with and engage with the most.

In the quest to deliver what he calls “more meaningful interactions” between users, Zuckerberg is emphasizing our existing echo chambers over stories from the world beyond them. He’s reinforcing the power of our tribes, while limiting the opportunity Facebook users will have to hear the story of someone outside of their tribe, and to feel empathy for them as a result.

With over two billion users, Facebook is incredibly powerful. It’s troubling to think of the horrors that could result from even a small drop in empathy across the globe. If Zuckerberg truly wants to leave a legacy of doing good for the world, this is a very dangerous start.

Joe Lazauskas is the head of content strategy at Contently and co-author of The Storytelling Edge, an upcoming book about the science of storytelling and how to use it to transform your business.