Reddit calls itself “the front page of the Internet.” But unlike a traditional newspaper’s front page, the stories and comments on the site aren’t picked by professional editors.
Instead, they’re submitted by everyday users and reviewed by an army of about 20,000 volunteer moderators across the site’s roughly 9,000 active, user-created forums, or subreddits, covering topics from science to soccer to nail art. They’re in charge of enforcing a handful of sitewide rules—no spam, no child pornography, no harassment or doxxing—and the individual policies of the subreddits they manage.
“Moderators are essential to Reddit,” wrote the company’s community manager Kristine Fasnacht to Fast Company in an email. “While we provide the platform and enforce rules to maintain the integrity of the site, they are the ones who work day in and day out to make subreddit communities function and thrive.”
But the relationship between Reddit’s volunteer moderators and corporate management has long been rocky. Moderators often complain there’s little they can do about issues like spotty communication from Reddit headquarters, abrupt policy changes, and antiquated moderation software, yet a quickly organized moderator protest over a popular employee’s firing earlier this month brought international media coverage and led to the resignation of interim CEO Ellen Pao.
While the protests, which temporarily shut the virtual doors of dozens of popular subreddits, drew plenty of attention, moderators say their role typically involves little corporate intrigue. Instead, they spend hours a day fielding user questions through the site’s modmail system and culling off-topic or offensive posts, hoping to make the site a better place and, perhaps, make an online name for themselves.
“It’s a bit like why people pick up litter off the street, or go into politics, or try to tend their front lawn for whoever’s in the neighborhood,” says Daniel Allen, a Chicago-area designer who until recently moderated several large subreddits under the name solidwhetstone. “It’s sort of like this: if you don’t contribute to improving communities, and you want to consume or enjoy good content, you’re kind of expecting something that you’re not putting into it.”
Allen says he got his start on the /r/Chicago subreddit and began to moderate other subreddits when they asked for experienced volunteers.
“Kind of the way that it works on Reddit is it’s very much a trust-and-reputation-based structure, so if you do a good job on a subreddit and foster the growth of your community and other people start to take notice, then they might offer you a mod position elsewhere,” he says.
He says he often focused personally on content standards for the subreddits he moderated: polling users on what kinds of material they wanted to see, posting guidelines, and removing content that broke the rules.
“Without guidelines, once a community gets to 50,000 people, the quality of the content starts to decline drastically,” says Allen, who watched the Chicago forum grow from about 5,000 subscribers to more than 70,000. The /r/Art subreddit, which he also moderated, grew from 50,000 members to more than 3 million on his watch.
Users who don’t agree with existing moderators’ guidelines—or the lack thereof—can, and often do, start their own rival subreddits. The cannabis-focused subreddit /r/trees, for instance, famously spun off from /r/marijuana after such a dispute, and when fans of non-intoxicating varieties of trees, like oaks and maples, found the name taken, they jokingly gave their own subreddit “for all things dendrologic” the name /r/marijuanaenthusiasts.
On some subreddits, growth without strict guidelines just leads to generic posts like references to popular memes, but others can take a nastier turn. On the Chicago page, for instance, Allen says he and other moderators patrolled comment threads for racist commentary of a type they had seen on other local sites.
“There are Chicago newspaper websites that have comment sections that are full of hate speech, and we wanted the Reddit community to be something different,” he says. “We banned them. We silenced them. We removed their comments. We told them to go away.”
Trolls are a notorious problem on Reddit, just as they are on many Internet forums, and moderators are typically the ones forced to deal with them.
“They can wreak havoc on our threads and really mess with people’s heads,” writes the lead moderator of the /r/sex subreddit, who uses the name Maxxters. “I don’t think most people realize what little it takes to seriously damage someone because of a way you respond to their question or sexual information they’ve divulged.”
Users can flag inappropriate or spammy posts, and moderators can remove them from the site and ban repeat offenders from their subreddit. Many moderators use a scripting tool called AutoModerator—initially created as a third-party extension, and later made an official part of the site after its developer Chad Birch was hired by Reddit—that lets them define certain automated filtering rules, but posts often require human review to see if they violate the complex rules of individual subreddits.
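The kind of automated rule described above can be approximated in a few lines. The sketch below is purely illustrative: AutoModerator itself is configured with declarative rules on Reddit rather than code like this, and the rule names, patterns, and actions here are invented for the example.

```python
# Illustrative sketch only: a toy version of the kind of automated
# filtering rule a moderator might define. Not AutoModerator's real
# syntax; rule names and patterns are hypothetical.
import re

RULES = [
    # Each rule pairs a pattern with the action the moderators want taken.
    {"name": "no-spam", "pattern": re.compile(r"buy now|click here", re.I), "action": "remove"},
    {"name": "no-harassment", "pattern": re.compile(r"\bkill yourself\b", re.I), "action": "remove_and_report"},
]

def check_post(text):
    """Return (rule name, action) for the first rule a post trips, or None."""
    for rule in RULES:
        if rule["pattern"].search(text):
            return rule["name"], rule["action"]
    return None  # post passes the automated filters; humans review the rest

print(check_post("BUY NOW!! Limited offer"))  # the spam rule fires
print(check_post("Here's my data visualization"))  # passes; prints None
```

Filters like this catch the unambiguous cases, which is why, as moderators note, posts that brush up against a subreddit’s more nuanced rules still require human review.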
“You can call an action an abomination,” wrote /r/Christianity subreddit moderator RevMelissa in deleting one post. “You are not allowed to call a person an abomination, as that is a personal attack.”
An ordained minister, she leads an online ministry called Fig Tree Christian that takes prayer requests and holds Bible discussions through its own subreddit. Fig Tree is part of the Disciples of Christ denomination, but on the general Christianity subreddit, she and the other moderators work to enforce the forum’s rules of civility, not the doctrine of a particular church.
“It’s a difficult sub to moderate because Christianity is defined differently depending on your denomination or sect,” she wrote in a Reddit private message. “Some people want moderation to run down certain denominational lines, and they get very frustrated when it doesn’t.”
But the forum’s intended to be welcoming to members of all beliefs, she writes.
“I believe all people deserve a place where they can feel safe and can connect with others,” she writes. “/r/Christianity stands apart from other subs as just that. I want to keep it safe.”
That’s a sentiment echoed by moderators from across the site.
“That’s the main job, really helping shape the community to be a friendly, welcoming and useful space,” says Randal Olson, an artificial intelligence researcher and head moderator, under the name rhiever, of /r/DataisBeautiful. “For me it’s more community building—I love the fact that we have this massive community that’s focused on data analysis and visualization.”
He does have to deal with spammers, and with racists posting dubious data about human genetics, but also gets a unique look at developments in the field.
“Most or maybe getting close to all of the data visualizations that come out on the web end up on /r/DataisBeautiful,” says Olson, who usually spends a few hours a day reviewing posts. “It really helps me keep up with what’s going on, who’s talking about what, what’s the latest cool data visualization.”
While moderators continue to contribute their time and energy to the site, many wish Reddit’s management would take steps to make their lives easier, like clarifying the company’s own rules on acceptable conduct and improving the aging suite of tools used to filter posts and communicate with rank-and-file users.
“Modmail is one of the most unpleasant, confusing, and downright frustrating things to use, and we have to rely on it to communicate directly with subscribers,” writes user K_Lobstah, who moderates a number of prominent subreddits. “One of the most prevalent problems for which moderators have zero solutions is ban evasion—when a subscriber receives a ban and just makes a new account in the next two minutes to continue doing whatever got them banned in the first place.”
Moderators’ dissatisfaction became more apparent on July 2, when dozens of popular subreddits effectively shut down after Reddit’s surprise firing of communications director Victoria Taylor. Taylor worked extensively with moderators of the popular IAmA subreddit, helping them host “Ask Me Anything” question-and-answer sessions with public figures from President Barack Obama to Parks and Recreation star Amy Poehler.
Upon her sudden departure, moderators at IAmA said they were given little information by Reddit about transition plans or even a reliable way to contact scheduled AMA guests. IAmA’s moderators made the page invite-only to regroup, and other subreddits’ moderators followed to protest what they saw as the latest evidence of Reddit’s neglect of the site’s volunteers.
“The shutdown was about communication and better mod tools,” wrote the moderators of the 9-million-subscriber subreddit AskReddit. “This was about problems moderators have been complaining about for years.”
Days before her departure, Pao apologized on behalf of Reddit’s managers, vowing that better communication and improved moderator software were in the works.
“We haven’t communicated well, and we have surprised moderators and the community with big changes,” she wrote on the site. “We have apologized and made promises to you, the moderators and the community, over many years, but time and again, we haven’t delivered on them.”
Reddit has since said it assigned AutoModerator creator Birch and community manager Fasnacht to focus on improving tools for moderators, though both have acknowledged the company doesn’t have the ability to make sweeping updates overnight, especially to the creakiest parts of the site’s infrastructure.
“Modmail was written a long time ago as an extension of the current user to user messaging system, which itself was an extension of the current commenting system,” Fasnacht wrote in her emailed statement. “As a result, it’s connected pretty strongly with other parts of the system and is difficult to make changes to.”
The company also continues to wrestle with the ongoing problem of better defining sitewide content standards. It’s taken steps this year to ban revenge porn and, under a new harassment policy, purge subreddits, such as /r/FatPeopleHate, that were essentially serving as nests for trolls.
Those changes come at a time when many online publications are moving away from the anything-goes, free-speech absolutism of the early Internet toward something more civil—and more welcoming to the advertisers keeping their sites afloat. Earlier this month, Gawker CEO Nick Denton removed a story alleging that a married, male publishing executive attempted to hire a male prostitute, citing both “the 2015 editorial mandate to do stories that inspire ‘pride’” and “business concerns.”
Denton’s decision wasn’t without consequence: two high-ranking editors left the company over the retraction and broader concerns about editorial independence. But imposing such policy changes is certainly still easier at a publishing organization like Gawker than at a user-driven site like Reddit. Reddit’s moderators are unlikely to accept any new sweeping mandates about what constitutes acceptable discourse or relish being caught between angry users and site management, and paid administrators likely lack the resources to enforce any radical new standards on their own.
Any changes the company does make or suggest inevitably bring apprehension to moderators and users afraid their communities will suddenly be on the wrong side of a new policy. The site’s paid administrators are ultimately in the same position as many of its volunteer moderators—trying to govern communities where they quickly discover disagreement is the rule, not the exception.
“At first I thought /r/Christianity was the next evolution in church, but that could never be the case,” writes RevMelissa. “To be a church you have to have community and doctrine. That particular sub would never agree on doctrine. It just couldn’t happen. But they are the most ecumenical (hoity toity word for multiple faith traditions getting along) I have ever found.”