TikTok videos are quick bursts of comedy, home-made ingenuity, dancing, weirdness, and personality that make user-created content look like a really good idea again. But the medium’s best traits may also help spread falsehoods and propaganda.
You don’t have to look far to find various forms of misinformation on TikTok, from anti-vaxxers to people selling the flatly false claim that 5G networks cause coronavirus symptoms. You can also find plenty of pro-Trump accounts featuring the president spouting half-truths.
Now, TikTok is debuting a new set of videos on the platform that aim to educate its users on how to recognize misinformation posted by other users, then refrain from sharing it. The campaign, called “Be Informed,” features a number of TikTok’s most popular video makers, who address topics such as how to scrutinize the credibility of the sources of TikTok videos and how to distinguish fact from opinion.
TikTok is right to be nervous about the threat of misinformation on its platform. With the coronavirus surging, the economy struggling, and a major election looming, the short-form video platform can’t afford any big scandals. It’s already facing the real possibility of a U.S. ban, as both legitimate security concerns and more abstract worries over TikTok’s China connections have grown (its parent company, ByteDance, is Chinese).
TikTok tapped the National Association for Media Literacy Education (NAMLE) for help with the content of the videos, which look like funny infomercials. TikTok says it’s also working with online safety organizations, such as the Family Online Safety Institute and ConnectSafely.
“The series is meant to provide advice on how to evaluate content and use those skills to protect against incorrect or misleading information,” says TikTok’s director of creator community, Kudzi Chikumbu. “So that at the end of the day they will really think about what they consume and what they create.”
TikTok has become a breakout hit by offering a fresh video format and simple but effective creation tools that none of the large U.S. tech companies managed to anticipate. Naturally, when the crowds of users arrive, so do the propagandists, conspiracy hawkers, and tinfoil-hat wearers.
But the misinformation section of TikTok’s community guidelines is a work in progress. In response to the coronavirus, TikTok prohibited videos containing misinformation “that could cause harm to an individual’s health or wider public safety.” It added language prohibiting videos meant to incite “fear, hate, or prejudice.” It also prohibits videos that contain “hoaxes, phishing attempts, or manipulated content meant to cause harm” (malicious or defamatory deepfakes might fall into this category), as well as content that misleads people about elections.
It created a “misleading information” category in the app, with which users can report content they believe may violate its misinformation policies. The reports are sent to a dedicated group of moderators based in Los Angeles that reviews the accounts and videos in accordance with TikTok’s policies on misleading information. On videos that relate to the coronavirus, TikTok now affixes a link to a coronavirus information page, and it reminds users to seek accurate information from credible sources.
Alex Stamos of the Stanford Internet Observatory said during a recent episode of The Verge’s podcast that TikTok is only now assembling the infrastructure and people needed to deal with misinformation.
“. . . They’ve got, you know, some really legit trust and safety people who’ve joined here thinking a lot about it,” Stamos said. “But they’re also having to work within the strictures of ByteDance, a Chinese company, and they’re having to bootstrap very quickly.”
TikTok denies that the timing of the new videos has anything to do with current world happenings or an uptick of misinformation on its platform. But TikTok’s window for getting out in front of the problem will stay open only so long. And U.S. regulators are watching closely.