
Technology

Digg Bans 20-30 Users for Hate Speech In Response to Fast Company's Women in Web 2.0

After Fast Company's Women in Web 2.0 article elicited a controversial reaction from Digg users, we interviewed Beth Murphy, the site's Director of Marketing and Communications, about Digg's culture, problems, and policies.

After Fast Company's Women in Web 2.0 article and the blog post that followed both ended up on Digg, I talked to Beth Murphy, the site's Director of Marketing and Communications, about Digg's culture, its problems, and its policies.

What are Digg's policies about offensive comments?
All users have to agree to Digg's Terms of Service, which prohibit spam, defamatory comments, and hate speech. Users are permanently banned for hate speech; we banned between 20 and 30 users who commented on Fast Company's article. Sometimes, if it's a grey area, we try to influence a person's behavior rather than ban them. The problem is that people can easily set up alternate accounts.

Digg is a massive site. How are offensive comments found and policed?
Digg has over 30 million users every month. It gets about 16,000 submissions on any given day, and about 32,000 comments. We have a skeleton team, one to two people on the site answering emails, deleting spam, that sort of thing. We basically rely on the wisdom of the crowds. People can Digg comments up and down, and they can report other users; internally we call it the jerk report.

What happens when your community is largely skewed towards a particular demographic? Does the wisdom of the crowds still hold?
Crowd-based community management is always imperfect. We've tried to give the community as many tools as possible to do this. Digg does skew younger, it does skew male, it does skew early adopter. But it's tough to characterize Digg as a monolithic community. There are certainly pockets that are sexist, racist, and homophobic. There are trolls, and people might see an over-indexing of these folks. But equally, there's the other side of the coin.

Digg has seen a broadening of its demographic since its inception. The site has also experienced an expansion of interest into the arenas of politics, business and the environment. Digg represents the growth of the internet — different thoughts and ideas. There are different micro-communities following different information. We've seen that there's serious stuff and funny stuff. The comments around different issues represent what these issues are about.

Are sexist comments defined as hate speech?
They are evaluated on a case-by-case basis. Sexism in general is a cultural grey area. We have to ask ourselves: is this person using the First Amendment right to be offensive without crossing the line into a true bannable offense?

Kevin Rose says that he wants the site to be a useful source of information. Is there a discrepancy between the culture of the site as it exists and what the staff at Digg want it to be?
There are micro-ecosystems of snarky behavior on Digg. But nobody in the Digg office would say, "Yeah, that's awesome." I don't think I'd describe the culture of Digg as sexist. But there's the whole question of troll management: how do you manage this as you grow? There are pockets of groups that behave really badly, and the anonymity of the internet creates an acrimonious culture. The question is, what is our philosophy of community management? Are we hands-on or hands-off? The front page of Digg represents what people want to vote up; how much personality should we impose?

Does Digg plan to expand its community moderation team?
We're always looking into new tools to help the community policing team, like automating components of it. You need both the human and the automated elements.

What are the site's plans for the near future?
We want to create more experiences outside the homepage that make the site more relevant to particular people, so it's not just a one-size-fits-all homepage for 35 million users but one based on what information you're digging.