
YouTube’s Top Search Results For Las Vegas Are Already Littered With Conspiracy Theories

YouTube’s search results proclaim this week’s shooting was a “false flag.”

Matthew Helms, who worked as a medic the night of the shooting, visits a makeshift memorial for the victims of Sunday night’s mass shooting on the north end of the Las Vegas Strip, October 3, 2017, in Las Vegas, Nevada. [Photo: Drew Angerer/Getty Images]

Despite more than a year’s worth of hand-wringing about the preponderance of fake news and conspiracy theories online, such content is still all too easy to find, especially in the wake of emotionally searing events that expose our political divide. Yesterday, we wrote about how Facebook and Google surfaced “false flag” stories about the Las Vegas shooting just hours after the tragedy, later blaming the screw-up on glitches in their algorithms.



But on YouTube, such conspiracy theories remain all too prominent. Currently, if you search the platform for videos relating to Sunday night’s tragedy, the results are littered with conspiracy theories and fake news. Fast Company searched for “Las Vegas shooting” this morning and found that the fourth most popular result, with over 255,000 views, was “Proof Las Vegas Shooting Was a FALSE FLAG Attack,” a video posted by the End Times News Report. A few results down is another video, with over 507,000 views, called “The Las Vegas shooting – The Truth You’re Not Being Told About These Attacks,” by FullSpectrumSurvivor. Of the top 13 results, five were conspiracy theory videos.

A screenshot from the morning of 10/3/2017

It should be noted that these searches were performed in YouTube’s main search bar on its home page, and the first two results for my query were marked as “top news.” In its dedicated news section, the selections were much more curated and didn’t appear to include such fake news videos.

Still, this raises a major issue for platforms like YouTube, Google, and Facebook: when news breaks and millions of users increasingly turn to them for facts and verified information, those users end up encountering plenty of fake news and propagandistic content. Though the platforms insist they can be trusted to keep readers informed and that they want to filter out fake news, they are also committed to increasing user engagement. And, as we all know, what gets people to click is not necessarily what’s true.

The night of the shooting, false information about the shooter was posted to Facebook and then featured on the company’s “Safety Check” page. Ditto Google, which pushed threads from the notoriously troll-heavy 4chan into its top news section. Both companies apologized for these results, yet they blamed hiccups in the technology they created rather than the system underpinning it, which prioritizes sharing over context.

YouTube is now in a similar situation, although (thankfully) it has been able to keep conspiracy theories out of its news channels. All the same, these videos describing “false flags” and “what’s not being reported” are amassing millions of views, and they are very likely misleading hundreds of thousands of viewers who are simply searching for news about the shooting. This puts YouTube in a bind: either allow fake news to proliferate or risk censoring users. The company’s solution is to clearly mark verified news organizations as such, but is that really enough?

I reached out to the company, and a YouTube spokesperson gave me this statement:


When it comes to news, we have thousands of news publishers that present a variety of viewpoints available on our news channel, www.youtube.com/news. When a major event happens, these same news sources are presented on the YouTube home page under breaking news and featured in search results.

In the past, YouTube has taken a strong stand on certain types of content. Videos it deems extremist, for example, are subject to either a complete ban or a warning page (among other actions that hide these videos). Still, these moves apply to hateful content, not necessarily fake news or propaganda. Google, which owns YouTube, has recently tweaked its search algorithm to filter out “low quality” sites sharing false or offensive content.

But YouTube hasn’t yet adopted these kinds of measures, despite some frightening past examples. PizzaGate, the notorious conspiracy theory about an alleged Democrat-run sex ring operating out of a pizza parlor in Washington, D.C., gained steam thanks to popular YouTube videos. As a result, innocent people were harassed, and one of the theory’s true believers showed up at the pizzeria with a rifle. Similarly, the Sandy Hook massacre has long been a popular YouTube topic for conspiracy theorists claiming the shooting never happened. Just a month after the tragedy, one YouTube conspiracy video had already garnered 10 million views. Grieving parents of the victims have been harassed as a result of this misinformation, which continues to be shared.

With equally outlandish theories about Las Vegas making the rounds, we can sadly expect more of the same.

About the author

Cale is a Brooklyn-based reporter. He writes about business, technology, leadership, and anything else that piques his interest.
