With the election just weeks away, tensions are running high. False stories are becoming increasingly difficult to dodge, especially on social media.
Paul Barrett, an expert on political disinformation and the deputy director of the Center for Business and Human Rights at New York University, has some advice for people who are feeling overwhelmed at the prospect of sifting through misinformation just to get the daily news.
“I think one thing to do that spans platforms and individuals is just to slow down,” he says. “Just stop and read things more carefully and don’t just send a headline because it attacks a politician you dislike or supports a politician you like. Instead, think about it a little bit—both what you send out yourself and what you read and believe. That’s pretty much the core smart thing to do in a chaotic environment like this.”
October has brought a deluge of questionable information in the lead-up to the election. As usual, most of it is coming from inside the White House. The New York Post published a story—planted by President Trump’s lawyer Rudy Giuliani—claiming Joe Biden used his position as Vice President to materially benefit his son. The story was so dubious that at least two reporters who worked on it refused to put their names on the byline.
Meanwhile, since President Trump’s hospitalization for COVID-19, he’s returned to the campaign trail with a vengeance, taking aim at Anthony Fauci, the director of the National Institute of Allergy and Infectious Diseases. The president is claiming victory against the pandemic and denouncing Fauci, the nation’s most reliable authority on the virus, as an idiot. Trump has also been pushing the idea that there will be widespread voter fraud in this election, an idea that multiple news outlets have now debunked.
And there may be more misleading and downright false news coming your way. “I do expect a ramp up in disinformation and misinformation—both in intentionally false material and perhaps, in the case of misinformation, unintentionally false information,” Barrett says.
The level of misinformation is so unprecedented that YouTube, Twitter, and Facebook have all committed to banning content about the right-wing conspiracy theory QAnon from their respective platforms—a move they had all long avoided.
“The platforms have gotten noticeably more aggressive in trying to police some of this,” says Barrett.
He points to Twitter, which removed a tweet from one of the President’s advisers doubting the efficacy of wearing a mask to curb the spread of COVID-19. Along with applying labels to misleading tweets, Twitter has started to encourage people to read an article before they retweet it and to promote quote tweets over retweets—much of which aims to convince users to slow down, to Barrett’s point.
“It seems to be a fairly innocuous and even pallid gesture—the idea is that if people have to stop and at least write a sentence about something before they retweet it,” he says, “it might occur to them that what they’re retweeting is not good, substantial information.”
He notes that keeping misinformation off these platforms entirely is an insurmountable challenge. “There’s already been some material online about liberal cabals destroying ballots, and the post office being involved with some big scheme to destroy ballots, and I think all of that needs to be viewed with the skepticism it deserves,” he says. “People just need to focus on what’s important, which is going out and casting their vote for their candidate, however they choose to do it.”