On November 8, 2016, the night of Donald Trump’s election, artist and programmer Justin Blinder was closely watching Wikipedia. He noticed that a section of Trump’s Wikipedia page had undergone some clever semantic editing. It summarized the revelation of an audio recording from 2005, in which Trump bragged to former Access Hollywood host Billy Bush about forcibly kissing and groping women. On election night, the entry was edited to suggest Trump had actually told Bush that “fame made women amenable to kissing and groping.” The word “forcibly” had also been removed.
Intrigued, Blinder began looking into other Wikipedia edits—some involving public figures and current events, others more random in nature. He started to realize that over time certain edits aggregate to “semantically erase and reshape important popular narratives.” He was also seeing that they were “obfuscating context, authorship, and power, often marginalizing voices from minority communities.”
Shortly thereafter, Blinder created Wikipedia Was Here, a website that reveals the real-time location and geographical context of anonymous edits to specific Wikipedia entries. The site geocodes the IP addresses of new Wikipedia edits, while showing corresponding visual imagery on Google Maps and Google Street View. It works by listening to the IRC channels on which Wikimedia broadcasts edits in real time. Each IRC message contains metadata about the edit: the article being edited, the date, the edit comment, and the username. For anonymous edits, that username is the editor's IP address.
“The application I built pulls in these IRC messages and filters only those that are anonymous,” Blinder explains. “Once it receives an edit, it tries to resolve a latitude and longitude using the IP address.”
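The pipeline Blinder describes can be sketched in a few lines. The message format below is a simplified approximation of a Wikimedia recent-changes IRC line (real messages also carry mIRC color codes and edit flags), and the field names are illustrative, not taken from Blinder's actual code:

```python
import ipaddress
import re

# Simplified pattern for a Wikimedia recent-changes IRC message:
# [[Article]] <diff URL> * <user> * (+/-bytes) <comment>
EDIT_RE = re.compile(
    r"\[\[(?P<title>[^\]]+)\]\]\s+(?P<url>\S+)\s+\*\s+"
    r"(?P<user>[^*]+?)\s+\*\s+\((?P<delta>[+-]\d+)\)\s*(?P<comment>.*)"
)

def is_anonymous(user: str) -> bool:
    """Anonymous Wikipedia edits are attributed to the editor's IP address,
    so a username that parses as an IP marks a logged-out edit."""
    try:
        ipaddress.ip_address(user)
        return True
    except ValueError:
        return False

def parse_edit(line: str):
    """Return edit metadata as a dict, or None if the line isn't an edit."""
    m = EDIT_RE.match(line)
    if m is None:
        return None
    edit = m.groupdict()
    edit["anonymous"] = is_anonymous(edit["user"])
    return edit

sample = ("[[Donald Trump]] https://en.wikipedia.org/w/index.php?diff=1&oldid=2 "
          "* 203.0.113.45 * (-9) reworded sentence")
edit = parse_edit(sample)
# edit["anonymous"] is True, so this edit would be kept and its IP geocoded
```

Once an edit survives this filter, the application would hand the IP to a geolocation lookup to resolve a latitude and longitude.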
Thinking back on his election night Wikipedia watch, Blinder was struck by the passive voice replacing the active voice on Trump’s page. It was a development he didn’t see on Hillary Clinton’s Wikipedia article, or on other government-related pages.
A Semantic War Of Attrition
“A lot of the more overt edits were being denied, but it kind of seemed like these smaller passive changes were more likely to get past the opposing editors,” says Blinder. “So there was this new narrative that was forming and evolving, and it was kind of like a semantic war of attrition that was playing out.”
Blinder soon noticed that a lot of the edits had usernames that were IP addresses. After some research, he learned that whenever a person edits an article without logging into a user account, Wikipedia records the person’s IP address as the username.
“Wikipedia actually has an article where they state you’re more likely to be anonymous if you have a username than if you use an anonymous account, because they do a really good job of blocking a lot of VPNs and Tor exit nodes,” Blinder explains. “So it seemed like these were fairly accurate IP addresses, and some of them were IPv4 and some were IPv6.”
Blinder began entering the different IP addresses into geolocation services like IP Location, which resolve an address to a latitude and longitude and show the result on a map. He plugged the coordinates into Google Street View, and the results were fascinating.
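The last hop, from coordinates to imagery, can be automated with Google's Maps URLs scheme, which accepts a `pano` action that opens the Street View panorama nearest to a viewpoint. A sketch (the coordinates here are just an example point in St. Louis, not one of Blinder's actual edits):

```python
from urllib.parse import urlencode

def street_view_url(lat: float, lon: float) -> str:
    """Build a Google Maps URL that opens a Street View panorama at the
    nearest available location to the given coordinates."""
    params = urlencode(
        {"api": 1, "map_action": "pano", "viewpoint": f"{lat},{lon}"},
        safe=",",  # keep the comma in "lat,lon" readable
    )
    return f"https://www.google.com/maps/@?{params}"

url = street_view_url(38.627, -90.1994)  # example point: St. Louis, Missouri
```

Opening the resulting URL in a browser drops the viewer into the panorama closest to the geocoded edit, which is how the disorienting country-concert and desert scenes described below surface.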
“I think one of the first ones was an edit to the ISIS entry, and it was in the middle of this country music concert in St. Louis, Missouri,” Blinder recalls. “You can zoom around and see all of these people; so when you try to contextualize it just based on the image you start to make assumptions about where you’re at. Another one was an edit to the James Comey article, and it was in the middle of the desert in Saudi Arabia.”
How Bias Informs Perspectives Around The Globe
Other Street View images were more banal. But most street views seemed to tell Blinder something about where these individuals came from, while raising interesting questions about what sort of geopolitical powers were at play in these locations. For instance, the edit to Comey’s page from the Saudi Arabian desert—made at a time when he and Trump were battling in the press—made Blinder realize that a lot of specifically American issues weren’t exactly localized. Did the Comey page editor, he wondered, know something we didn’t?
“There tends to be a global interest,” Blinder notes. “There are definitely a lot of edits being made for Russia, and those ones kind of resolve to right next to the Kremlin, so you definitely start to see this global interest habit.”
Blinder pondered whether edits of ongoing issues like the Syrian civil war and refugee crisis could be traced to American, Russian, or other interests. He even saw edits about African culture (on the Wikipedia page for “Lists of African Masks”) coming out of the American Midwest, and questioned whether institutionalized racism and biases informed these perspectives.
As for the project’s intersection with YouTube’s new policy of “Information Cues,” Blinder thinks that the company put a big target on Wikipedia’s back, without even the courtesy of asking. He believes that Wikipedia will look a lot more appealing to people hoping to propagate fake news and alternative facts.
“What I started to see is that the cumulative effects of these minor tweaks can really reshape a dominant narrative, especially within Wikipedia, and they also kind of fly under the radar,” says Blinder. “These subtleties and nuances of language are kind of being weaponized… This seems to be a more gray area.”