When Emily May founded Hollaback in 2005, the organization’s mission was to fight street harassment. But as it grew—expanding to local chapters in 92 cities and 32 countries—May began to take an interest in fighting harassment in another public space.
The Internet, which 77% of employers say they reference for information about job candidates and where one-third of married couples find their spouses, had become, for many, prohibitively ugly. A quarter of young women have been sexually harassed online, according to a Pew survey released last year, and among all Internet users, 18% have experienced severe harassment online, like physical threats and stalking.
Women who talk about this harassment, meanwhile, are often told to “ignore the trolls” or to shut off their computers. May recognized the pattern.
“We had 10 straight years of being told we were crazy [because we thought] people shouldn’t be able to hurl nasty comments at one another on the street,” May says. “And we felt that experience made us ready to go into this battle and to apply those lessons meaningfully.” There was one important difference between harassment on the Internet and harassment on the street. “Even if nobody is totally paying attention at the time,” she says, “you can send out a bat signal, and people can fly through time and space and the interwebs to get to you and help you out.”
May wanted to build that bat signal.
She proposed the idea to the Knight Foundation, which gave Hollaback a $35,000 grant to build a prototype last summer. Hollaback unveiled the result, called HeartMob, when it launched a Kickstarter campaign in April. It will use funds from the campaign to build a closed pilot version, slated to launch in July, and eventually a public version planned for September.
When someone is being harassed online, she will be able to go to HeartMob for help. Depending on what help she wants, volunteers will be able to use the platform to send her a supportive message, report abuse to platforms like Twitter and Facebook, or document harassment on her behalf. More unexpectedly, HeartMob wants to develop a framework for addressing harassers directly in an attempt to help them understand why what they’re doing is hurtful.
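The article doesn't describe HeartMob's internals, but the consent-based flow it outlines (volunteers only act on the kinds of help a user has asked for) can be sketched in a few lines. Everything below is hypothetical: the names `SupportType`, `HelpRequest`, and `actions_for` are illustrative, not part of any real HeartMob code.

```python
from dataclasses import dataclass
from enum import Enum, auto

class SupportType(Enum):
    """Kinds of help a user might request, per the article's description."""
    SUPPORTIVE_MESSAGE = auto()
    REPORT_ABUSE = auto()
    DOCUMENT_HARASSMENT = auto()

@dataclass
class HelpRequest:
    """A user's request, listing only the support types she opted into."""
    user: str
    wanted: set

def actions_for(request: HelpRequest) -> list:
    """Map a request to the volunteer tasks it actually authorizes."""
    tasks = {
        SupportType.SUPPORTIVE_MESSAGE: f"send {request.user} a supportive message",
        SupportType.REPORT_ABUSE: "report the abuse to the platform",
        SupportType.DOCUMENT_HARASSMENT: "screenshot and archive the harassment",
    }
    # Iterate in enum order so the task list is deterministic.
    return [tasks[t] for t in SupportType if t in request.wanted]

req = HelpRequest("alex", {SupportType.SUPPORTIVE_MESSAGE,
                           SupportType.DOCUMENT_HARASSMENT})
print(actions_for(req))
```

The key design point, reflected in the sketch, is that the user's choices gate what volunteers can do, rather than volunteers deciding unilaterally how to intervene.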
Hollaback could have built any number of tools for addressing online harassment. It could have focused on filtering harassment from Twitter timelines. Or built tools for lobbying lawmakers. Or found some way to shame harassers, like calling their mothers. But it built HeartMob after listening to the stories of, and collaborating with, people who have been harassed online: people who often feel alone when law enforcement, technology platforms, and even friends don’t take their situations seriously.
An Anti-Oppressive Design Process
“Usually the designer comes in, maybe does a little research, but then kind of goes off and creates a design,” says Jill Dimond, who led the development of the Hollaback platform. “We have done it in a way that has people participating in the whole way.”
Dimond calls this process, which she developed as part of her PhD dissertation while working on Hollaback’s first app in 2010, “anti-oppressive design.” The idea is that people who are experiencing a problem can offer the best solutions. “It’s like designing with rather than designing for,” she says.
Though both May and Dimond have been harassed online, neither of them knew what it felt like to be, say, a woman of color facing harassment, or a journalist. To help fill in the blanks, they conducted one-on-one interviews with 40 targets of online harassment who had diverse experiences. One woman explained how she had a friend go through her Twitter mentions and screenshot rape threats and harassing comments so that she could avoid looking at them herself. She saved them in a folder on her desktop that she still hasn’t opened—she just wanted to be prepared with documentation in case the situation escalated. That led Dimond to think about HeartMob as a place to store online harassment history, even if only privately, which is now a feature in the current prototype.
Another woman, a political analyst and rape survivor named Zerlina Maxwell, had gone on Fox News to discuss the possibility of arming women to prevent rape (she argued the responsibility to not rape should lie with men), only to be flooded with a torrent of harassment, including rape and death threats. A cohort on the Internet reacted with a hashtag, #tyzerlina (thank you, Zerlina) in support, which Maxwell found comforting. “[The hashtag] completely transformed the experience from one where I was basically in tears, feeling isolated, attacked, and alone, into one where I realized that there were thousands of survivors and allies out there who not only agreed with what I was saying, but also who understood that the backlash was a manifestation of the ‘rape culture’ I was talking about,” she says. Her story reinforced the idea that support was important, and that the ability to say something encouraging to someone facing harassment can be powerful.
Building A Task Force
Online harassment can be an isolating experience. When targets report threats to local police departments, many of their complaints are met with confusion or dismissed. Though many types of online harassment—like stalking, defamation, and true threats—are illegal, the laws that make them so are rarely enforced. And technology companies typically rely on the targets themselves to report or tag inappropriate content. On top of all this, the problem is widely misunderstood. “It’s hard to convey the seriousness and influence it can have on your life to a lot of people who don’t understand the Internet,” Zoe Quinn, who faced so much harassment as a Gamergate target that she fled her home, told me recently. “So you’ll talk to friends and family, and they will be like, why don’t you stop posting? And that’s the most useless advice you can give somebody who works online or who has to be online for work.”
In response to this feeling of isolation, Quinn and other targets of online harassment have built informal networks of people who have had similar experiences. “People reach out,” feminist author Jaclyn Friedman told me. “If you see somebody being targeted, it’s natural to say, ‘hey, there’s this community of us, and we offer support.'” Quinn formalized her network by launching a support site called Crash Override with her partner, Alex Lifschitz, in January.
Last year, Hollaback invited a group of targets of online harassment to be part of a “task force” that would work together on solutions, including helping with the design for HeartMob. Eventually it would become a group of about 70 women who kept in touch through an email list.
Together, the group worked on user scenarios for the product: What would it be like to experience the platform as a user who had faced a couple of forms of harassment? As someone who receives 500 harassing messages per day? As a volunteer who wanted to help? As a troll who was trying to break into the platform? They decided against a Facebook login, worried it could make some people unsure about the security of the site, and suggested creating tiered levels of volunteers who gain access to certain jobs after building trust doing less sensitive tasks.
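The tiered-trust idea the task force suggested can be modeled as a small state machine: volunteers start at the lowest tier and unlock more sensitive tasks only after completing enough lower-tier ones. This is a speculative sketch, with the tier contents and the promotion threshold of 10 completed tasks assumed for illustration, since the article doesn't specify them.

```python
# Hypothetical tier structure: least sensitive tasks first.
TIER_TASKS = {
    1: {"send supportive message"},
    2: {"report abuse to platforms"},
    3: {"document harassment on a user's behalf"},
}
PROMOTION_THRESHOLD = 10  # assumed number of tasks needed to earn the next tier

class Volunteer:
    def __init__(self, name: str):
        self.name = name
        self.tier = 1
        self.completed = 0

    def allowed_tasks(self) -> set:
        """All tasks available at or below the volunteer's current tier."""
        return set().union(*(TIER_TASKS[t] for t in range(1, self.tier + 1)))

    def complete_task(self) -> None:
        """Record a finished task; promote once the trust threshold is met."""
        self.completed += 1
        if self.completed % PROMOTION_THRESHOLD == 0 and self.tier < max(TIER_TASKS):
            self.tier += 1

v = Volunteer("sam")
print(v.allowed_tasks())   # only the lowest-tier task at first
for _ in range(PROMOTION_THRESHOLD):
    v.complete_task()
print(v.tier)              # promoted after enough completed tasks
```

The structure matters more than the numbers: sensitive work like documenting someone's harassment is gated behind demonstrated trust, which is the design property the task force was after.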
Then, last December, about 20 members gathered at a summit in New York to test the working prototype on which they had collaborated.
When Hollaback originally pitched the idea for HeartMob to the Knight Foundation, it used the name “Report a Troll.” It was the task force that helped change “Report a Troll” into HeartMob.
On one level, the group agreed that “troll” wasn’t the right word. The word encompassed people who were just being annoying or pesky, rather than the problem Hollaback actually wanted to address, which was violent and discriminatory harassment. “Report a Troll” also didn’t quite convey what the platform hoped to become: a space for reducing trauma.
This focus was solidified during the summit. At one point, Dimond passed out supplies like pipe cleaners, fluorescent-colored paper, googly eyes, and glitter glue, and invited participants to create their ideal solutions for mitigating harassment, ranging “from practical to magical.” May says the results had two common themes: One was a place of support for people who were being harassed. The other was a process by which harassers might be reformed. Considering that many targets of online harassment have received death and rape threats (with their addresses sometimes attached to the messages), have had trouble finding jobs after defamatory information was posted about them online, and in some cases have been frightened to the point that they’ve left their homes, it was a surprisingly empathetic desire.
That doesn’t mean it will work. Months later, Lindy West would tell a story on This American Life and The Guardian about confronting her worst troll, who impersonated her deceased father on Twitter. He apologized, writing, “I think my anger towards you stems from your happiness with your own being. It offended me because it served to highlight my unhappiness with my own self.” But hoping for this type of resolution on a regular basis seems idealistic.
Some summit participants were skeptical, but it was still something most of them would want in a perfect world. “And so,” May says, “we went with it and we figured, we’ll just keep testing it and figuring out whether it works.” Part of this project is going to involve research on whether bystander help, including attempting to talk with harassers, actually has any positive effect.
One thing is certain: HeartMob won’t end online harassment. Fully addressing the problem will require more rigorous technical solutions, as well as changes in existing technology platforms, the law, police enforcement, and culture. Part of what HeartMob hopes to change by focusing on bystander intervention is the feeling that targets of online harassment are alone—and by gathering a task force on the issue, the organization has already made a very small difference.