Parents: AI bots will want to be friends with your kids. We shouldn’t let them

OpenAI CEO Sam Altman is worried about your kids having more AI friends than human ones. You should be too.

[Photo: bruce mars/Unsplash]

By Ainsley Harris | 3 minute read

Children might soon have more AI friends than human ones, according to OpenAI cofounder and CEO Sam Altman. 

“A thing someone said to me recently that stuck with me is that they’re pretty sure their kids are going to have more AI friends than human friends,” Altman told Stripe cofounder and CEO Patrick Collison during a video interview at the Sohn Conference earlier this week. “And I don’t know what the consequences are going to be.” 

Somewhat surprisingly, Altman added, people seem to have a hard time distinguishing between humans and AI, even when the AI is far from sophisticated. “Whatever the circuits in our brain are that crave social interaction seem satisfiable for some people in some cases with an AI friend,” Altman said. “And so how we’re gonna handle that I think is tricky.”

Discussions like this should be setting off alarm bells for parents—alarm bells far louder than the ones that welcomed the viral rise of OpenAI’s ChatGPT chatbot late last fall. Then, parents and educators discovered that students were using ChatGPT to cheat or bluff their way through homework assignments and exams. Some school districts, like New York City’s, went so far as to ban ChatGPT. 

Friendships between kids and AI might seem benign, by comparison. Indeed, AI-generated “friends” have a certain appeal. In a world of overloaded schedules and competitive extracurriculars, AI friends are available on demand. They can replicate themselves, so that no one feels left out. They can be programmed to encourage, rather than bully; to commiserate, rather than critique. Sure, they aren’t real friends—but what’s the problem with that, exactly? (Especially if you get the sense that your child’s experiences at recess are less Barney and Friends and more Lord of the Flies.)

But the harm of AI friendship could cut far deeper than the current angst around social media’s effect on children. Commercial imperatives underpin platforms like Facebook and TikTok, where the user is the product. Instagram, try as it might, will always be fighting an uphill battle against content that damages teenage girls’ body image, because that same content is lucrative fodder for likes, which translate into dollars. But even if the platforms are commercial, the relationships they play host to can be real. As a result, many parents have rightly concluded that they can dull the sting of social media’s commercial logic by carefully monitoring who their children follow and communicate with. 

Generative AI chatbots throw that logic out the window. In an AI friendship paradigm, there is no avoiding the economics. Either you’re paying for the friendship or you’re the product. The conversation might be cute, but the relationship is ultimately commercial. That bot, after all, has been created by some corporate entity, which inevitably has its own agenda, whether monopolizing your affection, your time, or your wallet (this is right around the corner: Snapchat already has its own AI bot). Today, we project our human expectations onto AI friends. What if, in the future, the reverse is true, rendering even our human relationships transactional in nature? This is dangerous territory, with vast implications. It’s not a throwaway line for Twitter groupies. 

Human friendships, at their best, are rooted in the human conception of love. Before we learn to walk, to talk, and to query chatbots, we first learn to love in deceptively simple ways. When babies cry, parents soothe them, offering a snuggle and a lullaby. When babies are hungry, parents feed them. Our earliest understanding of the world is built on touch and sound and comfort, and we bring that understanding to our relationships. 

AI friends know nothing of these mysteries. They can only echo the human stories that they’ve “read.” 

My kids are too young to use tools like ChatGPT, or seek out companionship in the form of an AI boyfriend or girlfriend for $1 per minute. But they are old enough to be making their first friends. Those friendships are often contentious, especially when there’s only one toy to share. But amid the tears, I see them learning the beauty of experiencing moments of joy with other humans. And I wouldn’t trade those moments for all the supercomputers in the world.  


ABOUT THE AUTHOR

Ainsley Harris is a senior writer at Fast Company. She has written about technology, innovation, and finance for the past 10 years, including four cover stories.

