
Could The Tech Purge Of Hate Sites Backfire And Actually Harden The Views Of Extremists?

The crackdown helps make hate speech less visible online, but it also serves to further ostracize extremists and could harden their convictions, say former extremists turned peacemakers.

[Photo: H. Armstrong Roberts/ClassicStock/Getty]

By Marcus Baram | 6 minute read

The tech crackdown on hate sites has been fast and furious: In the four days since white supremacists and neo-Nazis marched in Charlottesville, GoDaddy, Google, and WordPress all denied domain registration for The Daily Stormer, Facebook started deleting white nationalist accounts, and PayPal said it wouldn’t do business with hate groups, among other efforts. “There is no place for hate in our community,” wrote Facebook CEO Mark Zuckerberg in a post. And yet the push to remove such bigots and racists from the public sphere could backfire, say experts in how to counteract white supremacism and neo-Nazism.

Many of the web-hosting firms and services that are cutting off hate sites say that it’s necessary to help prevent violence at future rallies by those inspired by such rhetoric. “That’s why we’ve always taken down any post that promotes or celebrates hate crimes or acts of terrorism—including what happened in Charlottesville,” explained Zuckerberg. “With the potential for more rallies, we’re watching the situation closely and will take down threats of physical harm.”

That potential for future violence was also cited by GoDaddy, which had long defended its decision to register the Daily Stormer’s domain as a free speech issue. The company reversed course after the hate site published a post mocking Heather Heyer, the young woman killed when she was struck by a car reportedly driven by a white supremacist in Charlottesville on Saturday afternoon. “Given their latest article comes on the immediate heels of a violent act, we believe this type of article could incite additional violence, which violates our terms of service,” a spokesperson told CNN.

Certainly, extremists from neo-Nazis to ISIS members have long recruited new members online, via websites or their social media presence. And making their vile content harder to find is sure to reduce its reach among young minds who might be lured by such heinous ideologies.

But when it comes to reintegrating such extremists back into the community and teaching them the value of empathy and love, such censoring or ostracizing tactics may actually backfire, says Sammy Rangel. The former gang leader spent years in a maximum security prison, seething with violence and taking part in race riots, before he learned the power of forgiveness. Now, he helps lead Life After Hate, a group founded by former white supremacists who seek to help extremists transition out of their belief system and way of life.

Shutting down these sites is going to have a double-sided effect, says Rangel. It makes such extremist rhetoric less visible, but “it fuels the extremists to dig in further with their justifications because they take it as proof of their grievances rather than an indication of their own wrongdoing,” he tells Fast Company. It also could play into their narrative of themselves as an oppressed group that’s being unfairly maligned, even helping them attract new recruits.

Should The Focus Be Isolation Or Engagement?

But Rangel does believe that online platforms have an important role to play and should be much more “socially conscious and responsible.” These extremists tend to use the internet within filter bubbles and they need to be exposed to new ideas and ways of thinking.

In the midst of his rage, Rangel says his needs were being met through his gang lifestyle—”violence felt like the natural way to express what we were all going through and anyone outside our group was the enemy.” But in prison, a counselor started talking to him and exposed him to new ideas. “Once I felt that he was listening, I wanted to talk more. Not challenging me and make me feel like I was evil or crazy.” Eventually, that counselor showed him how to empathize with the main scapegoat in his life, his mother, and to see that she was a victim too. “I started to identify with her, to seek and pursue forgiveness.”

That experience guided his new life and how he approaches extremists. Instead of condemning them and shaming them, he tries to have a dialogue. “We try to position ourselves to be ready to talk to someone who’s vulnerable.”

The majority of the group’s contacts happen through social media, he says; it uses its presence on Twitter, Facebook, and other platforms as a “counter-narrative for those who are questioning their way of life” and then develops a connection with that person.

One particularly compelling example of social media’s role in changing minds is that of Megan Phelps-Roper, the granddaughter of the founder of the Westboro Baptist Church, the extremist group that pickets the funerals of fallen soldiers to preach its message of hate. She ran the church’s Twitter account, posting homophobic and anti-Semitic diatribes, until a Los Angeles rabbi reached out and engaged her in a months-long dialogue about religion over Twitter.


They eventually became friends, and she left the church; she now calls herself a peacemaker. She says that the best way to deal with extremists is to engage with them and share your own views, outlining her four principles: don’t assume the worst, ask questions, stay calm, and “make the argument.”

Phelps-Roper believes that the general approach to Charlottesville is flawed, and that shutting down sites is not an effective strategy.

“Isolating people with those ideas often serves only to push them deeper into echo chambers and further their sense of persecution,” she wrote in an email. “From my own time with Westboro and also from the hundreds of stories I’ve heard from others with similar experiences, it seems so clear that civil engagement is an incredibly potent antidote to extremism. I absolutely believe that the best way to oppose bad ideas is by promoting and advocating and defending and living better ones — not by using force or violence (which tends to push people deeper into extremism and will surely make it more likely that those we oppose will also resort to violence) and not by isolation.”

Her example serves as one way to increase engagement, but in general social media can make it very difficult to encounter different points of view. As has been much analyzed since Trump’s election, Facebook and Twitter allow users to create their own information bubbles, in which they are exposed only to the views of their ideological allies and isolated from opposing viewpoints.

As Rangel explains, the same algorithms that surface ads on Google for products you searched for just minutes ago also serve to reinforce your worldview on platforms like Facebook. “They’re tracking your keystrokes and providing what you think is a choice but it’s focused on what you’ve already expressed as your interests.” As a result, extremists tend to be insulated in a world of similar viewpoints and won’t see that post about a good deed performed by someone they consider their enemy.

Several years ago, he set out to disrupt that cycle of what he calls “selective information processing” and, with the support of Google Ideas and the Gen Next Foundation, started a project with another group, Against Violent Extremism (AVE), to use technology to stop radicalization. “We wanted to create counter-narratives with a ‘counter algorithm’ to expose extremists to different viewpoints when they were online,” he says. “We couldn’t stop hateful messages from surfacing online, but there’s nothing to stop us from putting our message out there too, exposing them to those opinions and stories.” But that project remains a work in progress, and for now AVE publishes stories of redemption to increase the peace.
