
How human curation came back to clean up AI’s messes

After years of investment in algorithmic recommendations, major tech companies are rediscovering the human touch.

[Photo: billyhoiler/iStock]

By Jared Newman | 7 minute read

This article is part of Fast Company’s editorial series The New Rules of AI. More than 60 years into the era of artificial intelligence, the world’s largest technology companies are just beginning to crack open what’s possible with AI—and grapple with how it might change our future.


Last month, HBO launched a handy website for figuring out which of its original shows to watch. But instead of using computer algorithms to sort through its vast catalog, the company hired people to make the case for each series in video testimonials.

The site, called “Recommended by Humans,” is more marketing stunt than product strategy—HBO has no plans to offer an app-based version for mobile or TV devices, and it declined to say how humans might factor into its forthcoming HBO Max service. But its very existence was a statement about our current moment in tech. We’ve realized that recommendation algorithms aren’t as infallible as tech companies once made them out to be, and that handing the curation job to people still has value. Perhaps it’s no coincidence that Netflix began testing a section for human curation in its own apps a couple of weeks after HBO’s website launched.

HBO and Netflix aren’t alone in rediscovering the human touch. Google, for one, recently rehired Krishna Bharat, the inventor of Google News, which helped launch the algorithmic curation era when it debuted in 2002. During his absence from Google, Bharat had criticized the company for not vetting the sources of top stories.

Even with algorithmic recommendations, tech companies still rely on humans to train their machine learning models and moderate questionable content (sometimes at great personal cost to the moderators). What we’re seeing now is a bigger push for humans to handle the curation themselves. Still, it’s unclear if the companies involved see this as a permanent investment or just a stopgap until algorithms become better tastemakers on their own.

How human curation came back

The idea of tech companies warming to human curation has come around before, most notably with the launch of Apple Music in 2015. Jimmy Iovine, who was the head of Apple Music at the time, said algorithms alone couldn’t handle the “emotional task” of choosing the right song at the right moment, so the company enlisted DJs and celebrity musicians to run the Beats 1 radio station and hired humans to create playlists. Apple also began hiring editors for its burgeoning news efforts around the same time, and Twitter was preparing a human-curated news feature codenamed Project Lightning, which would eventually become Twitter Moments.

But even if the trend was already percolating four years ago, it’s taken on greater urgency as tech companies face a backlash over the negative effects of their products. Both Facebook and Google have acknowledged that their algorithmic recommendations have played a role in spreading misinformation and surfacing inappropriate content for young viewers. And on a less societally catastrophic level, critics of Netflix have started wondering whether the service’s steely algorithms are failing to sell new audiences on good shows.

Introducing humans to the mix is an obvious way to compensate for algorithms’ shortcomings. With Facebook’s forthcoming News Tab feature, for instance, The Information reports that human editors have instructions to avoid promoting stories that aim to polarize readers and to prioritize those with on-the-record sources. Those kinds of inferences might be difficult for an algorithm to make because it doesn’t actually understand the meaning of its source material.

“It’s going to be a long time before machine learning—or whatever you want to call these algorithms—can understand the meaning of a statement,” says Jean-Louis Gassée, a venture capitalist and former Apple executive who’s called for more human curation in tech products.

Apple’s editorial App Store recommendations are another example of where algorithms fall short. While a recommendation engine might be able to suggest apps based on your previous behavior, it can’t explain what it’s like to use a particular app or provide a point of view on why one is better than others. With human editors, Apple has been scattering recommendations across the home screen, category pages, and even search results pages, all with descriptions of why each app is worth your time.

Michael Bhaskar, author of the book Curation: The Power of Selection in a World of Excess, says this kind of storytelling is something that algorithms will have little chance of replicating.

“I think you need to have machine-driven stuff, just because the sets of information and media are so large. But then, people like people,” says Bhaskar.


Scaling the human element

The problem with most of the human curation we’re seeing now is that it tends to be an either-or proposition. Facebook’s News Tab may have a section of top stories picked by journalists, but algorithms will still run other sections. The App Store has lots of great editorial picks, but if you look for something off the beaten path, like a Markdown editor, you’re on your own. The reason tech companies embraced recommendation algorithms in the first place is that they allow for infinite personalization and the processing of vast amounts of data at little cost.

Human effort doesn’t scale that way: Even Apple isn’t going to hire enough seasoned writers to deal with a million or more apps. So its revival of people-centric curation could just be a stopgap until algorithms improve. YouTube told me earlier this year that the Collections in its YouTube Kids app exist in part to help the company’s algorithms discern quality programming from junk food. Facebook could have similar plans for News Tab, which is already the company’s second attempt at having humans curate top stories. (A few years ago, the company fired the human editors that had been selecting stories for its “trending news” sidebar amid accusations of liberal bias. Algorithms then ran the section until 2018, when Facebook shut it down entirely.)

Still, it’s possible to imagine a model in which human and algorithmic curators coexist and even help one another. A good example is the news app Flipboard, which uses algorithms for personalization but involves humans in several steps along the way.

To start with, Flipboard’s own users act as curators, adding articles to their own digital magazines for other users to read. Those curators, in turn, help Flipboard’s algorithms decide which stories and sources to recommend when users search for a given topic.

But instead of stopping there, Flipboard also employs a team of human editors to make fine-grained adjustments to the output of each topic. For example, if someone looks up boating or cars as a topic on Flipboard, the algorithm might try to push out a lot of stories on accidents or crime because their sensational nature tends to get the most clicks. Human editors can then deprioritize those kinds of stories in favor of ones that are more rewarding reads.

“We have models that are trying to learn from our community curation, and that is actually the output for our editorial and topic curation teams, who have all the final say on everything,” says Arnie Bhadury, a machine learning engineer at Flipboard. “Building tools like that help us scale up human curation.”
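To make that division of labor concrete, here’s a minimal sketch in Python of how a hybrid pipeline along the lines Bhadury describes might work. Everything in it is hypothetical—the story fields, the penalty weights, and the EDITORIAL_RULES table are invented for illustration, not drawn from Flipboard’s actual system: an algorithm scores stories for a topic, and editor-authored rules then deprioritize, say, sensational accident coverage under a topic like boating.

```python
# Hypothetical sketch of a hybrid curation pipeline: an algorithmic ranker
# scores stories, then human-authored editorial rules adjust the ranking.
# Field names, weights, and rules are invented for illustration only.

from dataclasses import dataclass, field


@dataclass
class Story:
    title: str
    topic: str
    click_score: float                    # engagement prediction from the algorithm
    tags: set = field(default_factory=set)


# Editor-defined adjustments per topic: which tags to deprioritize, and by how much.
EDITORIAL_RULES = {
    "boating": {"accident": -0.5, "crime": -0.4},
    "cars": {"crash": -0.5},
}


def rank_stories(stories, topic):
    """Rank stories for a topic: algorithmic score plus editorial penalties."""
    rules = EDITORIAL_RULES.get(topic, {})

    def adjusted_score(story):
        penalty = sum(rules.get(tag, 0.0) for tag in story.tags)
        return story.click_score + penalty

    return sorted(stories, key=adjusted_score, reverse=True)


if __name__ == "__main__":
    stories = [
        Story("Yacht collision injures three", "boating", 0.90, {"accident"}),
        Story("Beginner's guide to sailboat maintenance", "boating", 0.60, {"how-to"}),
        Story("The best lakes for weekend boating", "boating", 0.55, {"travel"}),
    ]
    for s in rank_stories(stories, "boating"):
        print(s.title)  # the accident story drops below the more rewarding reads
```

In a real system, those editorial adjustments would more likely feed back into model training, as Bhadury suggests, rather than sit as hard-coded overrides; the sketch only illustrates the principle of editors getting the final say on top of an algorithmic ranking.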

The alternative approach is to invest more in human curation on its own merits, whether it scales or not. With Beats 1, for instance, Apple isn’t just putting qualified humans in charge; it’s talking up the fact that they’re renowned DJs or musicians and letting those curators speak about the songs they’re playing as on-air personalities. Bhaskar, the book author, says that element of human curation is irreplaceable.

“The thing you can never get from an algorithm is that it doesn’t have a story behind it,” he says.

All of which makes me wonder: What if Apple News or Facebook’s News Tab didn’t just recommend stories, but also had mastheads and allowed its editors to publicly explain their story selections? What if Netflix’s Collections had input from directors or critics? What if YouTube Kids offered a “best of” selection for children’s programming, with lead-ins or PSAs by popular characters? Why not have Beats 2 through 5?

None of those things would replace the kind of granular recommendations that an algorithm can provide, nor would they be nearly as cheap to produce. But if they were compelling enough, they would also provide a level of trust, accountability, and personal connection that you don’t get from a cold and manipulative algorithm. Amid our current tech backlash, those might be precious resources.



ABOUT THE AUTHOR

Jared Newman covers apps and technology from his remote Cincinnati outpost. He also writes two newsletters, Cord Cutter Weekly and Advisorator.
