YouTube rejected a proposal to prioritize vetted news sources after Parkland shooting: report

[Photos: Ben Sweet/Unsplash; 5187396/Pixabay; FreeCreativeStuff/Pixabay]

BY Melissa Locker | 2 minute read

Days after the mass murder at Marjory Stoneman Douglas High School in Parkland, Florida, a conspiracy video went viral on YouTube claiming that survivor David Hogg was a so-called crisis actor. The video was featured in YouTube’s trending section and promoted by the site. In the wake of that fiasco, YouTube said it would do better and start to fact-check videos it features.

Now, a new investigative report by Bloomberg found that in February 2018, after the crisis-actor video went viral on the site, YouTube’s own staff suggested “limiting recommendations on the page to vetted news sources” to help curb the spread of conspiracy theories and outright lies. According to Bloomberg, “YouTube management rejected the proposal,” and while its source didn’t know the exact reason for the rejection, the source noted, “YouTube was then intent on accelerating its viewing time for videos related to news.”

According to Bloomberg, YouTube executives had realized that “outrage equals attention,” and attention equaled engagement, that all-important metric that measures views, time spent, and interaction with online videos. The more time people spent on YouTube, the more valuable the company was to advertisers: per Bloomberg, the company currently rakes in sales of more than $16 billion a year. To encourage people to stay on the site, the company’s powerful artificial intelligence system recommended videos that furthered the goal of keeping people on YouTube, regardless of whether those videos were factual or, say, linked Hillary Clinton to a pedophile cult operating out of a pizza joint in Washington, D.C., or claimed that Parkland survivors were hired actors. According to Bloomberg’s reporting, YouTube leadership was “unable or unwilling to act” on even its own internal alarms about the videos it was promoting “for fear of throttling engagement.”

As Fast Company has reported, whenever YouTube is called out on the issue, it admits to the problem and says it will work to keep conspiracy videos from being promoted. However, it has yet to halt the flow of conspiracy theories and misinformation on the site, and such videos are still frequently surfaced via autoplay and its top recommendations. The problem doesn’t seem to be getting better.


We reached out to YouTube for comment, and the company provided this statement:

Over the past two years, our primary focus has been tackling some of the platform’s toughest content challenges, taking into account feedback and concerns from users, creators, advertisers, experts and employees. We’ve taken a number of significant steps, including updating our recommendations system to prevent the spread of harmful misinformation, improving the news experience on YouTube, bringing the number of people focused on content issues across Google to 10,000, investing in machine learning to be able to more quickly find and remove violative content, and reviewing and updating our policies — we made more than 30 policy updates in 2018 alone. And this is not the end: responsibility remains our number one priority.



ABOUT THE AUTHOR

Melissa Locker is a writer and world-renowned fish telepathist.

