Ever wonder what happens to Twitter or Facebook users after they’re thrown out for hate speech? A team of researchers from Germany, the U.K., and the United States found out.
Their research process was creative: They gathered 29 million posts from Gab, a right-wing platform known for its neo-Nazis, conspiracy theorists, and anti-Semitism, and then backtracked to find users’ other profiles on Twitter or Reddit, some of which had been suspended.
They found that users banned from Reddit or Twitter simply jump to sites with even less moderation, such as Gab and Parler, where they post more frequently and with increased toxicity. “It does have a positive effect on the original platform, but there’s also some degree of amplification or worsening of the behavior elsewhere,” says coauthor Jeremy Blackburn, an assistant professor of computer science at Binghamton University.
Fun fact: Banned users tend to use the same profile names on multiple platforms for recognizability.
Overall, the researchers found it questionable whether banning helps in the big picture. After all, much of the January 6 attack on the Capitol was planned on Parler. “Reducing reach probably is a good thing, but their hardcore [followers], who may be the group we’re most concerned about, are the ones that probably stick with someone if they move elsewhere online,” says Blackburn. “Is it worse to have more people seeing this stuff? Or is it worse to have more extreme stuff being produced for fewer people?”