How Wikipedia’s volunteers became the web’s best weapon against misinformation

In the Facebook era, the volunteer editors behind the archaic-looking website have built Wikipedia into a formidable force for truth.

[Source Images: Flickr user Gage Skidmore (Sanders, Trump, Warren); Sean Gladwell/Getty Images (newsprint)]

By Alex Pasternack | Long read

For a few minutes near the end of his first presidential debate, Mike Bloomberg was dead. At 9:38 p.m. Eastern time, a Wikipedia user named DQUACK02 added some text to the Wikipedia page for the former Democratic presidential candidate and New York City mayor:

“death_date   = {{Death date and age|2020|02|19|1942|02|14}}; |death_place  = [[Las Vegas, Nevada]], U.S.; |death_cause = [[Getting stabbed by Warren, Biden and Sanders]].”

Within three minutes, another user named Cgmusselman had reverted the page back. By then the inevitable screenshots and joke tweets had already begun to spread. It was an obvious hoax, and a rather cartoonish example of Wikipedia at its worst—the reason why many people still believe it can’t be trusted: Anyone can edit it! But it was also Wikipedia at its best: Anyone can also edit an edit!

“Most of these edits are small improvements to phrasing or content, a few are masterpieces, and some are vandalism,” says Cgmusselman, who is Charley Musselman, a 73-year-old retired physicist from Massachusetts who happened to notice Bloomberg’s demise while double-checking the age of his senator and his then-preferred candidate, Elizabeth Warren. (“She is three years, two months younger than I am,” he reports.)

Cgmusselman isn’t among the experienced minority of editors who tend to patrol the front lines of Wikipedia’s war on misinformation—his hundreds of edits have mostly involved copy editing. But like those other editors, he has put his faith in the power of the crowd to be fair and honest. “Weight of sincerity, truth, and goodwill will bit by bit bury falsehood and malice,” he told me by email.

Amid the chaos of partisan battles, epistemic crises, and state-sponsored propaganda, it’s nice to think that good-hearted people who care about a shared reality could defeat all the b.s. out there. And there’s so much of it. If 2016 was the debut of a new kind of information war, this year promises to be something like a darker, more expensive sequel. Yet while places like Facebook, YouTube, and Twitter struggle to fend off a barrage of false content with their scattershot mix of policies, fact-checkers, and algorithms, one of the web’s most robust weapons against misinformation is an archaic-looking website written by anyone with an internet connection, and moderated by a largely anonymous crew of volunteers.

“I think there’s a part of that that is encouraging, that says that a radically open, collaborative worldwide project can build one of the most trusted sites on the internet,” says Ryan Merkley, the chief of staff at the Wikimedia Foundation, the 400-person nonprofit that provides support to Wikipedia’s community of editors.

“There’s another piece of that that is quite sad,” he adds, “because it’s clear that part of being one of the most trusted sites on the internet is because everything else has collapsed around us.”

Lessons from the internet’s knowledge bank

Wikipedia is not immune to the manipulation that spreads elsewhere online, but it has proven to be a largely dependable resource—not only for the topics you’d find in an old leather-bound encyclopedia, but for news and controversial current events, too. Twenty years after it sputtered onto the web, it’s now a de facto pillar in our fact-checking infrastructure. Its pages often top Google search results and feed the knowledge panels that appear alongside them. Big Tech’s own efforts to stop misinformation also rely upon Wikipedia: YouTube viewers searching for videos about the moon landing conspiracy may see links to Wikipedia pages debunking those theories, while Facebook has experimented with showing users links to the encyclopedia when they view posts from dubious websites.

Against the fevered backdrop of elections and the Twitter-speed torrent of news, when the tiniest digital snippet can shape Americans’ political thinking, Wikipedia’s lessons in protecting the truth are only growing more valuable.

“I don’t think it’s ever been more important for people to have reliable access to knowledge, to make choices about their lives,” says Merkley, who is also a 2020 Berkman fellow researching misinformation. “Whether it’s about who you vote for or how you respond to climate change, it matters a lot. And getting it wrong will have potentially catastrophic effects for our families and everyday folks, for your health and the way we live.”

Wikipedia’s scope is immense—in January, the entry on Maria Elise Turner Lauder, a Canadian teacher, linguist, and philanthropist, became the English edition’s six-millionth article—but unlike parts of the web where toxic information tends to spread, the encyclopedia has one big advantage: Its goal is not to “scale.” It’s not selling anything, not incentivizing engagement, not trying to get you to spend more time on it. Thanks to donations from thousands of donors around the world, there are no advertisers or investors to please, no algorithms to gather data or stir up emotions or personalize pages; everyone sees the same thing. That philanthropic spirit drives Wikipedia’s volunteers, too, who come to the website not to share memes or jokes or even discuss the news but, marvelously, to build a reliable account of reality.

“It is the realization of a great, perennial dream,” Musselman told me, “pointed to by Babel, Alexandria, and the Hitchhiker’s Guide: All knowledge within reach.”

There is still a lot to do to get there. As many of the site’s own editors readily admit in dozens of forums, the community is plagued by problems with diversity and harassment. It’s thought that only about 20% of the editing community is female, and only about 18% of Wikipedia’s biographical articles are about women. The bias and blind spots that can result from those workplace issues are harmful to an encyclopedia that’s meant to be for everyone. Localization is also a concern given Wikipedia’s goal to make knowledge available to the whole world: The encyclopedia currently exists in 299 languages, but the English version still far outpaces the others, comprising 12% of the project’s total articles.

The community has also struggled to retain new blood. Editors often accuse each other of bias, and some argue that its political pages exhibit a center-left bent, though recent research suggests that the community’s devotion to its editorial policies washes that out over time. Less-experienced editors can also be turned off by aggressive veterans who spout Wikipedia’s sometimes arcane rules to make their case, especially around the encyclopedia’s more controversial political pages.

“I have seen some become solid contributors, but it seems like a lot of them, especially [those] who try to jump into this really sensitive area, get overambitious, are swatted down, claim left-wing bias, and leave,” says Muboshgu, an administrator and one of Wikipedia’s trusted editors, who asked not to be identified for fear of harassment. But around some sensitive articles, a muscular approach to newbies can be hard to shake, editors like him argue: There are just too many trolls, paid hacks, propagandists, and partisans on Wikipedia to let their guard down.

Muboshgu, who in real life works as a psychologist in the Bay Area, has spent over a decade and thousands of edits battling misinformation across U.S. political pages. These days, keeping entries like these clear of falsehoods is harder against a noisy backdrop of distrust and manipulation, in which “various politicians are working very hard to discredit the media.”

“The biggest threat,” he says, “is that we lose sight of what’s actually true.”

A Byzantine body of rules

By the time you finish reading this paragraph, people from around the world will have changed about 100 things on Wikipedia. Some of these people may be logged in, identifiable by a username and a profile, which gives them permission to edit all but the encyclopedia’s most sensitive pages. Many more editors will be anonymous, identified only by an IP address. Some may be on the site to promote themselves, their idea, their company, their client. A rogue editor might want to “kill” a presidential candidate, or add a few words here and there to sow doubt about his record, or embellish it. Maybe they support the candidate’s campaign, or maybe they’re on its payroll; perhaps—in the case of a mysterious user who has closely tended to Pete Buttigieg’s Wikipedia page—they are secretly Pete Buttigieg. (The Buttigieg campaign denied this.)

Despite the trolls and propagandists, the majority of errors, especially on controversial and highly trafficked pages, go away within minutes or hours, thanks to Wikipedia’s phalanx of devoted volunteers. (Out of Wikipedia’s 138 million registered users, about 138,000 have actively edited in the past month.) The site is self-governed according to a Byzantine body of rules that aim for courtesy and a “show your work” journalistic ethic built on accurate and balanced reporting. Vigilant community-built bots can alert Wikipedians to some basic suspicious behavior, and administrators can use restrictions to temporarily lock down the most vulnerable pages, keeping them safe from fly-by editors who are not logged in.
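The machinery behind that kind of patrolling is publicly accessible. As a rough sketch (an illustration, not the code of any actual Wikipedia bot), the snippet below pulls a handful of recent edits from the public MediaWiki API and asks the Wikimedia ORES service’s “damaging” model, which a number of anti-vandalism tools have drawn on, how likely each revision is to be harmful. The 0.7 alert threshold is an arbitrary assumption for the example, not a community standard.

```python
# Minimal sketch of a vandalism-watching helper: fetch recent edits from the
# MediaWiki API, then ask ORES's "damaging" model for a probability that each
# revision is damaging. Illustrative only; real bots use many more signals.
import requests

WIKI_API = "https://en.wikipedia.org/w/api.php"
ORES_API = "https://ores.wikimedia.org/v3/scores/enwiki/"

def recent_edits(limit=10):
    """Fetch the most recent edits to English Wikipedia."""
    params = {
        "action": "query",
        "list": "recentchanges",
        "rctype": "edit",
        "rcprop": "title|ids|user",
        "rclimit": limit,
        "format": "json",
    }
    resp = requests.get(WIKI_API, params=params, timeout=10)
    resp.raise_for_status()
    return resp.json()["query"]["recentchanges"]

def damaging_scores(rev_ids):
    """Return ORES 'damaging' probabilities (0.0-1.0) keyed by revision ID."""
    params = {"models": "damaging", "revids": "|".join(str(r) for r in rev_ids)}
    resp = requests.get(ORES_API, params=params, timeout=10)
    resp.raise_for_status()
    scores = resp.json()["enwiki"]["scores"]
    return {
        rev: scores[str(rev)]["damaging"]["score"]["probability"]["true"]
        for rev in rev_ids
        if "score" in scores.get(str(rev), {}).get("damaging", {})
    }

if __name__ == "__main__":
    edits = recent_edits()
    probs = damaging_scores([e["revid"] for e in edits])
    for e in edits:
        p = probs.get(e["revid"])
        if p is not None and p > 0.7:  # arbitrary threshold for this sketch
            print(f"Possible vandalism: {e['title']} (rev {e['revid']}, p={p:.2f})")
```

Real patrol tools layer far more signals on top of scores like these (editor reputation, edit size, page history), and human volunteers such as Cgmusselman still make the final call on whether to revert.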

But if you do log in and try to update an article on a divisive or newsworthy topic—think East Jerusalem, Bernie Sanders, Russian interference in the 2020 United States elections, the coronavirus—your edit will be closely scrutinized, perhaps reverted by another editor, and may become the subject of feverish debate. Behind each article is a talk page, a forum where editors hash out what an entry should or shouldn’t say. Here, veterans might lob thousands of words at each other at a time. Some make their cases with impressive rhetorical flourishes and others with exhaustive reference to the site’s policies, like NOR (“no original research”), NOTABLE (does a given topic merit its own page?), and BLP, which describes the stricter standards for biographies of living people. Veteran editors also keep track of topics that are prone to misinformation, while groups like Guerrilla Skepticism on Wikipedia, which organizes itself on a dedicated Facebook group, regularly patrol controversial pages about vaccines, aliens, and various forms of pseudoscience.

I asked Betty Wills, a television producer and grandmother from Texas who has made thousands of edits on Wikipedia under the name Atsme, which topics tended to be the most challenging when it came to battling falsities and half-truths.

“That’s an easy one,” she wrote in an email. “Anything and everything Trump. His name comes up in articles you wouldn’t imagine—including the article Fuck. LOL.”

Topics related to Trump frequently monopolize editors’ time. On his talk page, exasperated editors took the unusual step of adding a list of rules to the top, based on the community’s current consensus. “Do not include allegations of sexual misconduct in the lead section,” cautions one. Others include: “Omit mention of Trump’s alleged bathmophobia/fear of slopes,” “Do not include any paragraph regarding Trump’s mental health,” and “Do not call Trump a ‘liar’ in Wikipedia’s voice.” Instead, editors advise that this claim may be included in the article’s lede: “Trump has made many false or misleading statements during his campaign and presidency. The statements have been documented by fact-checkers, and the media have widely described the phenomenon as unprecedented in American politics.”

The president’s political rise has also coincided with an uptick in misinformation on Wikipedia’s political pages.

“We’ve had serious editorial challenges in the American Politics-related articles since the early days of the 2016 campaign,” says P., an editor who has spent years battling misinformation, and who asked that not even his username be used, out of fear of harassment. His patrols have included highly charged places—Gun Politics in the United States, Rudy Giuliani, the conspiracy theory perpetuated by President Trump that there was a government spy in his campaign. But like other longtime editors, he strives to leave his politics at the login screen, out of deference to the work.

“It’s like any other workplace that can be disrupted by a few ill-equipped or ill-informed colleagues. The problem here is that, of course, anyone can step up and edit,” he says.

Even in the most heated talk forums, Wikipedians aspire to AGF, or “assume good faith.” But when that and all else fails, editors can resort to what is effectively a growing body of case law and make their arguments to a last-resort Arbitration Committee, or ArbCom, a virtual court made up of 15 elected administrators. If an editor has repeatedly undone other editors’ work, is “disruptive,” or appears to be a troll, they may be deemed to be NOTHERE—“clearly not here to build an encyclopedia”—and could be blocked or even banned.

Much of this bad behavior is the result of what Wikipedia’s editors call COI, or conflict of interest editing, which has threatened the site’s integrity since its early years. The community has agreed to allow editors to be paid for their work, provided that they disclose their clients, and such editing is frowned upon for political or other sensitive topics. But the rule can be difficult to enforce. Propagandizing on Wikipedia pages has long been a cottage industry: According to an investigation by Ashley Feinberg published last year by HuffPost, Facebook, NBC, and Axios were among the companies that reportedly paid Fast Company‘s former head of digital Ed Sussman to “do damage control” for their Wikipedia pages. Sockpuppet accounts, a favorite tool among Wikipedia’s paid manipulators, are also rampant, and can lead administrators to suspend or ban users.

The war on truth

Wikipedia’s battle against misinformation relies upon one of its core tenets: Editors must back up every “fact” with a reliable source, or “RS.” The “truth” on Wikipedia isn’t based on firsthand experience or even common sense, but on what its rules call “reliable, third-party published sources.” Of course, like everything else, what counts as “reliable” is up for debate. “All Wikipedians do is argue about the quality of our sources,” Wikipedia cofounder Jimmy Wales joked at a recent panel on misinformation.

Around some of its most divisive U.S. topics, those arguments can reveal a troubling mix of divided information diets and a surfeit of supposedly credible media. “Misinformation comes at us from all directions, and that is an important factor to keep in mind when citing sources,” Wills says.

The problem, she says, is heightened by “clickbait” news media, and a tendency among some toward “inadvertent POV editing, regardless of political persuasion.”

Another problem is that sometimes even trusted sources can be spotty. “When misinformation does make its way into sources that are usually reliable, it can unfortunately end up on Wikipedia as well,” says Molly White, an administrator based in Boston who goes by the username GorillaWarfare.

To help editors keep sources straight, administrators maintain a running list of over two dozen “unreliable” sources that now includes Occupy Democrats, the British tabloid The Daily Mail, and Breitbart News, which has been criticized for inaccurate and incendiary reporting. By contrast, last year Facebook included Breitbart in a new section of its app devoted to “deeply-reported and well-sourced” journalism, with the goal of representing what CEO Mark Zuckerberg called “a diversity of views.”

Wikipedia takes another tack: Its editors also strive to include different viewpoints, but any assertions, quotations, or statistics must be backed up by reliable sources and presented in a neutral, balanced way. Another core principle, NPOV, or “neutral point of view,” means “representing fairly, proportionately, and, as far as possible, without editorial bias, all the significant views that have been published by reliable sources on a topic.”

Sometimes, editors might come to an agreement over how to achieve NPOV—how many words to devote to a given controversy, or whether a lawsuit should be mentioned in a living person’s biography—only for the debate to start up anew a few days later. During one prolonged edit battle in the article on the Trump-Ukraine scandal, Muboshgu repeatedly fended off efforts by other editors to include the name of the alleged whistleblower who first reported the phone call that sparked the impeachment inquiry.

“If the whistleblower wants to remain anonymous, they should remain anonymous,” he says. “Meanwhile, there’s zero confirmation that the person alleged to be the whistleblower actually is the whistleblower.”

As talk of impeachment ramped up through the summer and fall of 2019, editors also scrambled to respond to attacks on Joe Biden and his family, battling subtle edits and additions that implied corruption on the part of the presidential frontrunner. His son Hunter served for five years on the board of directors of Ukraine’s largest gas company, Burisma, during a period when prosecutors were probing the company. The investigation fizzled out, and later Vice President Biden pressured Ukraine to fire its top prosecutor, who was widely seen as failing to pursue corruption in the country.

On Hunter Biden’s talk page, where insistent editors linked to a range of “mainstream” outlets like ABC News and The New Yorker to make their corruption case, Muboshgu and his confederates repeatedly demonstrated that no reliable source could back up the allegations. He recalls “being asked over and over again to add Joe Biden’s comments bragging about getting the corrupt prosecutor fired, as though it proves that Joe and Hunter Biden were corrupt, not understanding or caring to understand that Biden [helped get] the prosecutor fired for not investigating Burisma.”

“We went through this several years ago with Hillary Clinton and her emails, and also Benghazi and the Clinton Foundation,” he says. Eventually, he and other administrators imposed a series of protections on Biden’s page, which he expects to extend past Election Day.

Partisans are a constant challenge to Wikipedia’s neutral description of the world, but worse are the trolls who come specifically to spread lies, Muboshgu says. “What concerns me is not just that people are listening to right-wing news media and taking what is said there as gospel,” he says. “It’s that the single-purpose accounts are coming here specifically with the intent of spreading misinformation relating to the election. I don’t know if these single-purpose accounts originate from somewhere in the U.S., the Internet Research Agency in Russia, or some other troll farm.”

The Wikimedia Foundation is monitoring state-sponsored information operations on the platform and investing in methods to identify and respond to them, Merkley says. “To date, we haven’t seen as much of that type of activity as some other platforms, but that doesn’t mean we won’t see more in the future, as Wikipedia is at the center of the global knowledge ecosystem.”

Hero of the information commons

The Wikipedia model has another not-so-secret advantage over the rest of that knowledge ecosystem, be it social media or the news media: Rather than millions of scattered, fleeting messages on a given topic, Wikipedia offers a single, updating page. On the English Wikipedia, the entry for, say, “Donald Trump” will look the same, regardless of where we are, who we are, and what other websites we’ve visited.

Unlike inscrutable personalized news feeds and private chats, this shared collection of information about a topic permits speedy collaboration and focuses editors’ capacities, says Brian Keegan, an assistant professor of social science at the University of Colorado Boulder who has researched Wikipedia’s response to breaking news. But it also keeps the spread of falsehoods to a minimum.

“Hyperpersonalized news feeds sit in opposition to information commons,” Keegan says. “Moderating the latter is easier because we’re all looking at the same thing, while the former is harder because we’re all sitting in our own proverbial Platonic caves making sense of different shadows. In other words, you can’t fact-check everything, and most low-virality messages circulate in their own filter bubbles where they’re unlikely to be challenged.”

The market-based social media model may be diametrically opposed to Wikipedia’s, but could Big Tech still borrow a page from its more transparent, bottom-up approach to moderating content? Twitter seems to think so. Its new prototype for a community feedback tool, called Community Notes, would let any user add context or clarification to suspicious tweets according to rigorous standards, “like Wikipedia.” In an email, a Twitter spokesperson clarified that the concept is at a very early stage.

Keegan likes the thought of Twitter cribbing from Wikipedia, but he is skeptical that a community like Wikipedia’s could be as effective on a platform like Twitter, given the sheer scale of the network. And it could be easy to game the system. “The costs for participating need to be sufficiently high to dissuade . . . inauthentic behavior,” he says.

Wikipedia’s committed volunteer community could also be hard to replicate, says White. “I’m perfectly willing to provide my labor for free on Wikipedia because the Wikimedia Foundation is a nonprofit organization with a noble purpose. If Google asked me to do something for them, I’d better be getting a paycheck.”

On a recent afternoon, I was also willing to donate my labor for a noble purpose, so I opened WikiLoop Battlefield, a community-built website that lets anyone review a random recent Wikipedia edit for possible vandalism or misinformation. The system depends on bots that scan new edits and score them according to how false or problematic they are likely to be.

I clicked through some of the newest edits on Wikipedia, small changes to pages like Environmental impact of mining, Prosecutor General’s Office of Azerbaijan, and He-Man’s Power Sword. As each entry popped up, I cautiously clicked “Not sure.”

Then the site prompted me with a recent edit on the page for Peter Strzok, the former FBI agent who opened the bureau’s Russia probe in 2016 and later became a target of “deep state” conspiracy theories. An anonymous user had added a sentence:

“A comprehensive review in February 2018 of Strzok’s messages by The Wall Street Journal concluded that “texts critical of Mr. Trump represent a fraction of the roughly 7,000 messages, which stretch across 384 pages and show no evidence of a conspiracy against Mr. Trump” This report of course is false as Mr. Strzok clearly attempted to undermine the American electorate with his resources at the department[.]”

I hesitated, figuring that some other more experienced editor would take care of this blatant bit of opinion, slipped like poison into our encyclopedia. And then I quickly clicked the red button at the bottom, “Should revert,” and saved my changes. The sentence was gone.

A few days later, a message popped up on my Wikipedia user page, from an administrator I’d never met.

“Congratulations,” it read. “You have been recognized as the weekly champion of counter-vandalism of WikiLoop Battlefield.”

For a moment I felt like a hero.


This story is part of our Hacking Democracy series, which examines the ways in which technology is eroding our elections and democratic institutions—and what’s been done to fix them. Read more here.

ABOUT THE AUTHOR

Alex Pasternack is a contributing editor at Fast Company who covers technology and science, and the founding editor of Vice's Motherboard. Reach him at apasternack@fastcompany.com and on Twitter at @pasternack.

