The case against teaching kids to be polite to Alexa

When parents tell kids to respect AI assistants, what kind of future are we preparing them for?

Should children be polite to virtual assistants?


It’s a simple question. And for most parents and child development experts, the answer is simple, too: Yes, of course they should. Nobody wants to hear children rudely barking orders at, or verbally abusing, an adult voice.

But teaching kids to say “please” and “thank you” to Alexa and Google Assistant may have unintended consequences and raise other questions that aren’t so simple.

Millions of parents have suddenly been forced to grapple with this new parental conundrum. Already, 20 percent of U.S. households have some kind of smart speaker, according to comScore. And Juniper Research says that figure will rise to 55 percent within four years.

Today’s toddlers are the first generation who will grow up with no memory of a world before artificial intelligence devices were ubiquitous in homes. Parents are justifiably concerned about how these gadgets affect their children. One concern is manners. According to the U.K. research organization Childwise, children almost never say “please” or “thank you” to virtual assistant appliances (unlike adults, who often do).

Parents aren’t happy. But at least two companies are trying to help: Amazon and Google.


Amazon’s Echo Dot Kids Edition is accompanied by a raft of parental controls, children’s content, and features that encourage good manners. [Photo: courtesy of Amazon]
In April, Amazon introduced a politeness feature for its Alexa virtual assistant, along with a colorful line of Echo Dot devices just for kids. The manners feature, called Magic Word, is part of FreeTime, a wider range of child-specific features and content. It’s designed to encourage children to say “please” and “thank you” when speaking to the Alexa assistant. After consulting outside child development experts, Amazon decided on positive reinforcement, with no “penalty” when a child is rude. For example, when a child says “please” in a request, Alexa might respond with “Thanks for asking so nicely.” Alexa replies to “Thank you” with “You’re welcome” or something similar. But if a child doesn’t say “please” or “thank you,” there’s no consequence.

An Amazon spokesperson told me that parents had requested help with reinforcing polite speech when their kids talk to Alexa. The company says it’s “still super early days” with the Magic Word feature, and expects to make future improvements based on customer feedback.

At its I/O conference in May, Google introduced something similar for its Google Home product, as well as some third-party smart speakers that support Google Assistant. Google’s politeness feature–part of Family Link, a bundle of kid-friendly capabilities–is called Pretty Please. Like Magic Word, Pretty Please uses positive reinforcement and reciprocal niceties to encourage manners in children. But Pretty Please goes further, giving parents the option to counter impolite demands with the phrase “Say the magic word!”

Pretty Please is optional and configurable and has to be turned on by parents for each specific child, whom the Assistant identifies using voice recognition technology. It will be rolled out over the summer, according to Google, which did not respond to my request for input on this article.

Both Amazon and Google are doing it right, according to Sheryl Berlin Brahnam, a professor in Missouri State University’s Management and Information Technology department. She has no problem with parents asking their children to be polite when talking with conversational agents, to say “thank you” and “please” when appropriate. And it’s “absolutely essential that agents reply in kind,” she adds. “All speaking beings should exhibit good manners; it makes for pleasant and friendly exchanges.”


However, while Brahnam finds subtle nudges toward politeness acceptable, she believes that it’s wrong for virtual assistants to “enforce polite behavior by being overly manipulative or by being punitive in any way whatsoever.”

Which brings us to an earlier attempt at using AI to enforce manners. Toy giant Mattel planned AI etiquette features for its short-lived kids’ virtual assistant, called Aristotle. Introduced a year and a half ago, the smart speaker would read bedtime stories, soothe crying babies at night, and teach toddlers basics such as the alphabet. To enforce manners, Aristotle would refuse to go along with children’s requests unless they said “please.”

Mattel’s now-defunct Aristotle product would have handled several tasks normally performed by parents, including an insistence on saying “please” and “thank you.” [Photo: courtesy of Mattel]
A petition organized by the nonprofit Campaign for a Commercial-Free Childhood pressured Mattel into canceling the product, based on concerns about the infringement of children’s privacy and about the outsourcing of some parental care to an information appliance.

These initiatives from Amazon, Google, and Mattel were designed to help parents teach children good manners. But in the process, what are children learning about their relationship to intelligent machines?

The courtesy conundrum

What is the purpose of manners and etiquette, anyway? A passage on the Child Development Institute website outlines why it’s important to teach manners to children. “Good manners convey a sense of respect for the sensibilities of other people,” it reads. “When you say ‘thank you,’ you’re taking the time to make the other person feel appreciated. Saying ‘please’ respects a person’s right not to do what you’ve asked.”


By extending these human social norms to software and cloud services, are we teaching children that machines have sensibilities to be considered the same way we consider human feelings? Being polite to a piece of technology may suggest that AI assistants are capable of feeling appreciated or unappreciated; that machines have rights; and that one of these rights is the right to refuse our requests.

In teaching children to treat machines like people, we may also be treating people like machines. Telling kids to say “please” and “thank you” to software, knowing that no feelings are involved, could be construed as telling them to run their courtesy routines automatically, regardless of meaning, effect, or purpose.

That’s not the intent, of course. Parents want to teach the habit of good manners, so that when children do interact with other people, they’ll be polite. But this teaching is applied irrationally and unevenly.

For example, parents also want their kids to be polite when writing to people, and will insist that they do so. Yet there’s no demand by parents to make kids say “please” and “thank you” when searching on Google–or when typing a request or query to Google Assistant on a smartphone app.

Parents presumably aren’t bothered when their child types “tell me the weather” in the Google Assistant app. But when a kid says those exact words out loud to Google Home, the rules seem to be different. These queries interact with the exact same servers and databases, yet the spoken words demand etiquette in ways the written words do not.


People, including children, speak to objects all the time. A child struggling to open a jar of peanut butter might say: “Come on, open!” Parents are unlikely to insist on a “please” or “thank you.” Yet that jar of peanut butter is exactly as sentient as Alexa: it has the same degree of feelings, the same amount of authority, and deserves the same level of respect or deference. In both cases, the child is speaking, and so presumably habits of speech are being formed.

The difference, of course, is that Alexa can simulate or fake human thought and speech. Peanut butter cannot. And Alexa is listening.

So what happens when virtual assistant appliances start watching as well? The next generation of smart speakers and other assistant appliances, such as Amazon’s Echo Show, has cameras and screens.

[Photo: courtesy of Amazon]
Will parents request additional features that encourage children to mind manners that can be seen, rather than heard? When children are alone in a room with a smart speaker, for example, will parents want that appliance to insist on being politely greeted when the child enters the room? Will virtual assistants badger kids to keep their elbows off the table, chew with their mouths shut, and not rudely point at the smart speaker?

When future virtual assistants gain the ability to act preemptively and, say, remind a child that she has a karate class in an hour, will the parents–or the assistant itself–insist that the child respond? “Emma! Alexa is talking to you!”


If the answer to these questions is no, then why the double standard with spoken manners? And if the answer is yes, then what kind of world are we creating?

Politeness features aim to help kids learn good manners. But the unintended consequences might be to teach kids that intelligent machines are more or less the same as people, or even that they’re authority figures that should legitimately scrutinize human behavior. From infancy, children could get used to the idea that AI can refuse to comply with instructions unless human behavior is judged acceptable by the machines.

We may be accidentally creating a lifelong intuition in children that software has feelings that can be hurt, that it’s an intelligent being to be respected–or even an authority to be obeyed.

Josh Golin, executive director of the Campaign for a Commercial-Free Childhood, flatly told me: “I recommend that parents not utilize the politeness features.” He wonders: “What are the implications of children forming friendships with, and imbuing human characteristics into, a machine [built by companies] whose business models are all about creating dependence?”

“We don’t want children to view Alexa as a friend or as having human characteristics, and forming attachments that will make developing skepticism later on more difficult,” Golin adds.


Teaching more than manners

The world is changing. And parents need to prepare kids for the world they’ll actually live in. We need to teach them the old things, like good manners, and the new things, like the truth about AI.

“Being able to identify what makes humans different than machines is going to be a very important skill as AI devices infiltrate more and more aspects of our lives,” Golin says.

For starters, kids need to learn that Amazon Echos and Google Homes do not fit in the same category as mom and dad, but in the same category as TVs and toasters.

This parental challenge is part of a much larger one: preparing kids to cope with a world of digital illusions–the fake, the phony, and the virtual. Today’s toddlers will grow up in a world of deepfake videos, computer-generated Instagram personalities, holographic celebrities and virtual reality.

Preparing kids for the future means more than mere manners. It means teaching them to appreciate the difference between real human people and mere machines designed to create the illusion of humanity.


Kids need to learn that other people want and deserve our politeness and courtesy. And that appliances don’t.