Right now seems like a particularly bad time to be looking at the Internet. Between Gaza, Syria, Iraq and Ferguson, Missouri, it can be hard to know what image or video is safe to click on.
Of course, those working online face exposure to violent imagery every day—and the effects of repeated viewing can go way beyond momentary discomfort. At the feminist site Jezebel, an anonymous troll had for months been zealously posting rape GIFs in the comments section. On Monday, Jezebel’s staff—who had been tasked with removing each violent image individually—published an open letter to the site’s parent company, Gawker, asking it to provide a technical solution to the problem. The letter prompted action from Gawker bosses, but when they shut down image uploading for Jezebel comments, the problem actually got worse: all Gawker properties were hit with violent and disturbing images flooding the comments, over and over, until image, GIF, and video uploads were suspended for every commenter.
"None of us are paid enough to deal with this on a daily basis," the Jezebel editors wrote.
Police officers who investigate child pornography, prosecutors, and safety teams at social sites like Facebook, all of whom accept repeated exposure to violent images as part of their jobs, also accept real mental health risks. "Definitely when you look at these images, you will have a stress reaction to that," says Heather Steele, the director of an organization called The Innocent Justice Foundation that runs training programs for these workers. "It’s not just whether you are a tough or not tough person. You will have a fight, flight, or freeze response, and chemicals will dump into your system."
Much of the discussion of Jezebel’s problem has focused on Gawker’s commenting system, Kinja. BuzzFeed asked other Gawker employees to comment on Gawker's comments. Joel Johnson, Gawker’s editorial director, invoked the "vision of Kinja" in an apologetic comment to Business Insider. And The Washington Post has used the incident to argue against comments in general.
Anyone exposed to violent imagery can be damaged by it. But for those tasked with moderating user-generated content (think the small army employed to surf YouTube for beheadings and porn, or the outsourced workers charged with monitoring smaller sites for actionable child porn), the psychological impact can go deeper.
Steele says that employees who are repeatedly exposed to violent images sometimes exhibit warning signs, like nightmares, that might indicate a "vicarious trauma" similar to that experienced by counselors, emergency nurses, and others who are exposed to violence indirectly offline. Though her organization focuses mostly on supporting investigations of child pornographers, this type of trauma extends to other dark corners of the Internet. Employees in safety and moderation roles from GoDaddy, Facebook, and Yahoo, as well as anti-terrorism investigators, have also attended its training programs. "If it’s adult rape or adult snuff films, where someone gets killed, that’s all traumatic as well."
Research supports the idea that graphic violence doesn’t necessarily need to occur to you, or in person, to cause trauma. A study published last week in the Journal of the Royal Society of Medicine Open found that journalists who viewed user-submitted images of violent incidents—even though they were doing so from the safety of an office—were more susceptible to depression, anxiety, PTSD, and alcohol abuse. Another study, this one in Psychological Science, found that people who viewed a lot of 9/11 news coverage experienced more acute and post-traumatic stress symptoms two or three years later.
Steele stresses that there’s no "typical" reaction to viewing violent online images repeatedly, and that people cope differently depending on past traumas, difficulties in their personal lives, and other factors. "Some people cope well with viewing this stuff over and over again, especially if they know they are helping defeat evil," she says.
This morning, Jezebel posted an update detailing Gawker’s response to the problem. In addition to continuing the image ban in the short term (with no word on when that feature will come back), the company is bringing back a "pending" comment system that requires users to click to see unapproved comments. But as the editors of Jezebel, online moderators of all stripes, and unlucky Internet users already know, it’s impossible to unsee such filth as a rape GIF. No one should have that job who hasn’t signed up for it.