As AI voices become more human, will stereotypes follow?

[Image: local_doctor/Adobe Stock]

BY Janko Roettgers · 5 minute read

Google has worked hard to make its new Gemini AI assistant sound more human, but could that lead to people projecting race and gender biases on what they hear?

When Google prepared to give its new Gemini AI assistant the ability to speak, the company decided to name its 10 voice options after celestial bodies. Voices like Orbit, Vega, and Pegasus aren’t just a nod to the constellation that inspired Gemini’s name, but also a way to sidestep preconceived notions around gender.

“We wanted to avoid gendered voices,” explains Françoise Beaufays, Google’s senior director of speech for Gemini Live. “If you look at the settings, we never make statements about gender.”

It’s a laudable approach, but it doesn’t stop Gemini’s users from anthropomorphizing the AI assistant in their mind’s eye. And as AI assistants increasingly sound like humans, one has to wonder: Are out-of-this-world names really enough to keep us from projecting our own biases about race and gender onto them?

Trying to avoid the Alexa trap

Ever since tech companies launched their first voice assistants, they have grappled with gender stereotypes. The first versions of Alexa, Siri, and the Google Assistant all used female-sounding voices by default, leading to criticism that these assistants were playing into existing notions around women as subservient helpers.

“Many of these [assistants] are not necessarily breaking free of the external stereotypes that exist in our society,” says Nicol Turner Lee, director of the Brookings Institution’s Center for Technology Innovation. “They’re just replicating some of the very same issues that we have.”

Assistants like Alexa have long told their users that they have no gender, but the general public clearly perceives them as female—and also has a sense of how problematic that is. After Amazon released its Echo smart speaker in 2014, far fewer parents named their baby girls Alexa, in part to avoid a name associated with getting ordered around.

ABOUT THE AUTHOR

Janko Roettgers is a San Francisco-based reporter who has written for Variety, Protocol, and Gigaom, among other publications.
