
How YouTube is trying to fix its Kids app without ruining it

YouTube is committed to the free-form, user-generated nature of its platform. It’s tough to reconcile that with a version that’s perfectly safe for kids.

[Photo: Hal Gatewood/Unsplash]

By Jared Newman

Malik Ducard, YouTube’s global head of family and learning, likes to tell a story about the time the YouTube Kids app clicked for him personally.

It was just before the app’s official launch in 2015, and his youngest son was playing with an early version during a family brunch. Unlike the regular YouTube app, YouTube Kids has a simple, cartoonish design that young children can easily navigate, along with various filtering tools and timers for parents. After the app streamed a clip of Phineas and Ferb, YouTube’s recommendation algorithm served a video on how to draw the characters. The restaurant produced some crayons and a paper placemat, and an impromptu YouTube art lesson broke out.

“It was great, because it was him engaging the app, and viewing the app, but him also being active with the app,” Ducard says. “With YouTube Kids, we are as proud of the engagement that we see as we are the action–people putting down the app and going out and playing.”

As a parent of young children, I find it hard to reconcile that rosy vision with my own YouTube Kids experience, in which the app is easily overrun by toy unboxing videos and other assorted junk food. Meanwhile, I’m always worried that its recommendations will travel down a dangerous path toward inappropriate videos that somehow slipped past YouTube’s automated filters, as has been known to happen.

Ducard’s excitement and my wariness have one thing in common, though: They’re both rooted in YouTube’s use of algorithms to determine what the app recommends and what appears in search results. That’s a starkly different approach to wrangling kid-friendly content than the more human-intensive curation adopted by Amazon for its FreeTime Unlimited service.

Over the past four years, machine-driven recommendations have been the source of numerous YouTube Kids-related scandals, from violent videos in search results to videos depicting self-harm. That track record might explain why the recent “Momo challenge” hoax–in which a ghastly creature was supposedly popping into children’s YouTube videos to encourage suicide–seemed credible enough to go viral, even though there was nothing to it.

But for folks like Ducard, those algorithms are also YouTube’s greatest source of potential as an educational tool, allowing kids to learn about things they’d never encounter in traditional media. That’s why YouTube Kids isn’t abandoning them, and why giving kids access to the app may always feel a bit risky.

Beyond Hollywood

For Ducard, belief in YouTube’s algorithms is entwined with his own life experiences.

Growing up in the Bronx, Ducard was a natural storyteller who made and narrated his own flipbooks for friends during school recess, then went on to make his own movies on a camcorder in high school. As an undergraduate at Columbia, he became president of the university’s student TV station, where he co-produced a show about black culture and entertainment. Later, Ducard started working in Hollywood on content acquisitions for MGM, Lionsgate, and Paramount, helping to cement some of Paramount’s first streaming deals.

Although Ducard says he enjoyed working in Hollywood, it was also an industry where you had to know someone to get your ideas seen. When Robert Kyncl, Netflix’s former head of acquisitions, moved over to YouTube in 2011, he offered Ducard a job leading the site’s family and learning initiatives. Ducard says the idea of a more democratic platform resonated.

“There are entire formats and categories of content on YouTube that would have just never ever existed in traditional media,” he says. “And that’s okay, there’s a place for everything, but I moved over to YouTube for that idea, and that opportunity to be part of a platform with that potential for creators.”

One of the things Ducard gets most passionate about is figuring out how to introduce concepts like gender balance and ethnicity balance into YouTube Kids’ recommendations. The algorithm is an opportunity to surface perspectives that children might otherwise not come across, and figuring out the best way to do that, he says, is one of YouTube Kids’ biggest responsibilities.

“When I think of TV that I grew up with, it was maybe not as diverse as the world I actually lived in,” Ducard says. “And I think there’s a great opportunity to do right by that.”

The case against whitelisting

Hearing all of this, I started to wonder why YouTube Kids doesn’t just put humans in charge of the curation. YouTube’s algorithms have caused a lot of damage in other areas–most notably, by recommending conspiracy theories, political extremism, and even, as the New York Times recently reported, the kind of videos of children that pedophiles might want to watch–but video for kids is arguably the one area where abandoning them makes the most sense. Doing so would eliminate any risk of surfacing inappropriate content, and could allow the app to become a kind of highlight reel for a diverse range of videos from across YouTube proper. (According to Bloomberg, some people inside YouTube even advocated for this approach, unsuccessfully.)

Alicia Blum-Ross, YouTube’s global public policy lead for kids and families, counters that without machine-driven recommendations, YouTube Kids wouldn’t be able to catch all the edge cases that drive people to the app in the first place.

As an example, she once interviewed a Brazilian and Portuguese family living in London, and found that they watched videos in Portuguese on YouTube Kids so the children could learn to speak the same language as their relatives. She also talked to a family that would watch hair braiding videos at the end of the day, which ended up becoming a bonding experience. “Would a whitelisted version have French-braiding hair? How long would your list have to be to think of all those different use cases?” Blum-Ross asks.

Tweaking the algorithm

In lieu of changing the fundamental way that it operates, YouTube has bolted more layers of parental control onto YouTube Kids, which is officially the only way that children under 13 are supposed to access the site. (Unlike regular YouTube, the YouTube Kids app doesn’t allow comments or video uploads, doesn’t show interest-based ads, and requires explicit permission from parents to meet federal guidelines on collecting data from children.)

For instance, parents can now set up individual profiles for their kids, each with their own viewing preferences and age ranges. And last year, YouTube Kids added an option to disable search and recommendations entirely, allowing parents to either whitelist individual videos and channels on their own, or pick from a small number of “Collections” from trustworthy sources like PBS Kids. The result is a pretty intensive setup process, involving tutorial videos, explanatory graphics, and an eventual decision about whether or not to trust YouTube’s algorithms.

It’s possible to restrict YouTube Kids to curated content, though it requires some work on the part of a parent. [Photo: courtesy of YouTube]

“Every household is different, from one block to the next, one country to the next, one child to the next, which is why we have the ability for parents to set up different [versions of] YouTube Kids, even within one device for different kids,” Ducard says. “So what we try to do is give parents the choice.”


To YouTube’s credit, the process doesn’t overtly steer parents in any particular direction. But it seems to me that the bulk of YouTube’s attention is on the algorithmic side, improving and pruning the recommendations to reflect the potential that folks like Ducard see in the site.

That said, Blum-Ross argues that the split between machine and human curation isn’t as binary as it might seem. Now that YouTube has been creating “Collections” of pre-approved videos on the non-algorithmic side, for instance, the company has started training its algorithms to identify similar content, in hopes that higher-quality videos will come to the surface.

“They’re not totally independent of one another,” Blum-Ross says of YouTube Kids’ human and algorithmic sides. “That would be silly of us, right? We’ve spent so much time working with those third-party collections, we want to make sure we’re getting as much value from that as we can.”

Blum-Ross, who joined YouTube from the academic world last year, is also involved in an effort to predict future issues before they become scandals. One major area of focus right now is to make the app more palatable for kids ages 8 to 12, so they don’t jump over to regular YouTube before they’re ready.

“Obviously, sometimes kids in that age range really want to be getting access to more adult content,” she says. “And how we can continue to have a safe environment that at the same time respects that desire to explore and experiment a little bit online–that’s an area that we’ll continue to think about in the coming year.”

An unending work in progress

In talking to Ducard and Blum-Ross, I do get the sense they mean well. Although YouTube is sometimes criticized in the kids space as a soulless machine pumping out junk content, clearly there are people at the company who are invested in doing better.

YouTube also stresses that it has consulted with outside experts along the way. Alice Cahn, a former executive director of children’s programming at PBS, says she was impressed with the depth of questions that YouTube asked her as a paid consultant. She also sees a parallel between YouTube and cable in its formative years, noting that cable networks once relegated children’s programming to early mornings and grew less kid-friendly as the day went on. Just as those networks eventually addressed the issue with round-the-clock children’s channels, Cahn believes YouTube will become better at screening what it’s showing to kids.

“The whole issue of what kids can see and what’s available on the app, I think that needs to constantly be reviewed,” Cahn says. “What I would like to see is a commitment–and so far I’ve seen it–to that constant questioning, to that research.”

But in talking to folks outside of YouTube, it’s also clear how much work still has to be done. Jill Murphy, the vice president and editor-in-chief at Common Sense Media, a content review site for parents, says YouTube is easily the number one pain point for parents whose kids are 12 and under. Parents still spend a lot of time fending off toy unboxing videos and other low-quality content, and the content comes so quickly–thanks to auto-play and the short nature of most videos–that it’s hard for even the most attentive parents to keep up.

“I think they made a positive step trying to put YouTube Kids together, but it is still algorithmically generated, and so we don’t really feel that an algorithm is the best way to determine what’s quality content for kids,” she says.

Josh Golin, the executive director of the advocacy group Campaign for a Commercial-Free Childhood, shares that concern. Golin has been a frequent critic of YouTube Kids, raising concerns about low-quality videos and inappropriate search results in 2015, and accusing YouTube of violating children’s privacy laws last year by failing to steer kids away from the standard YouTube app. Within YouTube Kids, he doesn’t believe Google has done enough to police videos that effectively serve as marketing, such as people tasting different kinds of Oreos or Coke flavors. (A quick search shows that such videos are still readily available, even though Golin started complaining about the issue three years ago.)

“That’s part of what the YouTube experience is about: We’re no longer tied to the 46 channels on cable, and we have access to the entire world,” he says. “It’s not clear to me that’s what’s best for children, as opposed to having some sort of walled garden, whether it’s on YouTube or within YouTube Kids, where everything has been vetted, where there’s no improper recommendations.”

Given that YouTube has no intention of building that walled garden, the tension between YouTube Kids’ danger and its potential is unlikely to go away, and the site will remain a target of critics for the foreseeable future. But Ducard says he doesn’t mind. At the very least, it shows that people are invested in making YouTube better.

“It is actually okay when people care about, ultimately, kids,” he says. “We really embrace it all and are open for feedback, and that’s why you see an app today that’s constantly evolved, and improved.”



ABOUT THE AUTHOR

Jared Newman covers apps and technology from his remote Cincinnati outpost. He also writes two newsletters, Cord Cutter Weekly and Advisorator.

