One of the reasons I seldom let my young kids watch YouTube unsupervised is that I don’t trust its recommendation algorithms. Even when YouTube’s suggestions don’t completely run off the rails into unsafe territory, they can easily be overrun by toy unboxing videos and other assorted junk.
I’ve always wanted the separate YouTube Kids app to be a more curated experience. In a new report over at Bloomberg, Mark Bergen and Lucas Shaw say that’s exactly what YouTube was considering earlier this year as it faced a potential FTC settlement over violating children’s privacy laws:
“YouTube privately considered taking more control. Earlier this year, it assembled a team of more than 40 employees to brace for the FTC decision. The team was code-named Crosswalk — as in a way to guide kids across YouTube’s chaotic streets. Among its proposals was a radical one, at least by the standards of Silicon Valley: YouTube would screen every video aimed at kids under the age of 8 in its YouTube Kids app, ensuring that no untoward content crept into the feed of millions of tots around the world.”
Bloomberg’s sources say YouTube even drafted a press release in which CEO Susan Wojcicki announced that the Google subsidiary would hire human moderators to screen content, but the company changed its mind at the last minute. YouTube was wary of looking “too much like a media company,” Bergen and Shaw report, and also felt that strict curation would be at odds with the spirit of the site. The latter rationale lines up with what YouTube told me earlier this year about how it approaches the YouTube Kids app.
It’s worth noting that, in addition to letting parents disable recommendations and screen videos themselves, the YouTube Kids app offers a small number of human-curated “Collections” from trusted partners. Still, these collections exist largely to inform YouTube’s algorithms rather than to serve as the focus of the Kids app. The Bloomberg story shows how close YouTube came to taking the opposite approach.