
The world’s first genderless AI voice is here. Listen now

Siri, Cortana, Alexa, and Google Assistant all default to female voices. But what if AI assistants thought beyond gender norms?

[Image: Virtue]

Voice assistants like Apple’s Siri and Amazon’s Alexa default to female voices rather than male ones. You can change this in the settings and choose a male speaker, of course, but the fact that the technology industry has chosen a woman, by default, to be our always-on-demand personal assistant speaks volumes about our assumptions as a society: Women are expected to carry the psychic burden of schedules, birthdays, and phone numbers; they are the more caregiving sex; they should nurture and serve. Besides, who wants to ask a man for directions? He’ll never pull over at a gas station if he’s lost!


But what many people–myself included–have missed in the gender criticism of personal assistants is that the framing was binary to begin with, even though so much of the world identifies outside that schema. This oversight is exactly what Q is trying to fix. Q, which claims to be the world’s first genderless voice for AI systems, was developed by the creative studio Virtue Nordic and the human rights festival Copenhagen Pride, in conjunction with social scientist Julie Carpenter. The project had no client; it was born from a design exploration inside Virtue Nordic and snowballed from there.

To its creators, Q solves a very real problem that happens when technology fails to represent everyone. “Based on what we know about some other technologies that are communication mediums, we do understand that social representation–or omission of social representation in media–is important in influencing social values,” writes Carpenter over email. “There is a circle of influence between society, the people who develop the technology and people using the technology…” In other words, because Siri cannot be gender neutral, she reinforces a dated tradition of gender norms.

Voice assistants are often gender-specific for a reason. Companies test these computer voices on users and listen to the results of those tests. In Amazon’s tests, users preferred a female voice for Alexa over a male one, and that relatively small sample was extrapolated to represent Alexa for everyone. Research has also shown that men and women alike describe female voices as more “welcoming” and “understanding” than male voices, and it’s easy to see why any company would want those qualities in its always-listening voice assistant. But these companies and researchers tested only male and female voices, and testing a narrow set of options on a limited number of users isn’t the best way to build representational technology.

[Image: Virtue]
To develop Q, creators Emil Asmussen and Ryan Sherman from Virtue Nordic sampled several real voices from non-binary people, combined them digitally, and created one master voice that cruises between 145 Hz and 175 Hz, right in a sweet spot between male- and female-normative vocal ranges.
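The idea of a pitch “sweet spot” can be sketched in code. This is a minimal, illustrative example–not Q’s actual pipeline–that crudely estimates a signal’s fundamental frequency by autocorrelation and checks whether it lands in the 145–175 Hz band the article describes; the sample rate, lag bounds, and synthetic test tone are all assumptions for the sketch.

```python
import math

SAMPLE_RATE = 16000          # Hz; an assumption for this example
NEUTRAL_BAND = (145.0, 175.0)  # the range cited in the article

def estimate_f0(samples, sample_rate):
    """Crude autocorrelation pitch estimate over plausible vocal lags."""
    min_lag = int(sample_rate / 300)  # highest pitch considered: 300 Hz
    max_lag = int(sample_rate / 80)   # lowest pitch considered: 80 Hz
    best_lag, best_corr = min_lag, float("-inf")
    for lag in range(min_lag, max_lag + 1):
        corr = sum(samples[i] * samples[i + lag]
                   for i in range(len(samples) - lag))
        if corr > best_corr:
            best_corr, best_lag = corr, lag
    return sample_rate / best_lag

def in_neutral_band(f0, band=NEUTRAL_BAND):
    """True when a fundamental frequency sits inside the target band."""
    return band[0] <= f0 <= band[1]

# A synthetic 160 Hz tone stands in for a voice sample.
tone = [math.sin(2 * math.pi * 160 * t / SAMPLE_RATE)
        for t in range(2048)]
f0 = estimate_f0(tone, SAMPLE_RATE)
print(round(f0), in_neutral_band(f0))  # → 160 True
```

Real voices, of course, vary in pitch from moment to moment; a production system would track pitch over time rather than reduce a voice to a single number.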

To the developers, it was important that Q wasn’t just designed as non-binary, but actually perceived by users as non-binary, too. Throughout development, the voice was tested on more than 4,600 people identifying as non-binary from Denmark, the U.K., and Venezuela, who rated the voice on a scale of 1 to 5, with 1 being “male” and 5 being “female.” The creators kept tuning the voice with more feedback until it was consistently rated as gender-neutral.
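The rating loop described above amounts to simple arithmetic: average the listener scores and check how close they sit to the scale’s midpoint. This sketch uses hypothetical numbers, not the project’s data, and the 0.25 tolerance is an assumption–the article doesn’t state what threshold counted as “gender-neutral.”

```python
# Listeners rate a voice from 1 ("male") to 5 ("female"); the voice
# counts as gender-neutral when the mean rating sits near the midpoint, 3.

def mean_rating(ratings):
    return sum(ratings) / len(ratings)

def reads_as_neutral(ratings, midpoint=3.0, tolerance=0.25):
    # tolerance is an assumption; the article doesn't state a threshold
    return abs(mean_rating(ratings) - midpoint) <= tolerance

ratings = [3, 2, 4, 3, 3, 4, 2, 3]  # hypothetical listener scores
print(mean_rating(ratings), reads_as_neutral(ratings))  # → 3.0 True
```

In practice you would also care about the spread of ratings, since a voice rated half “1” and half “5” averages to 3 without actually sounding neutral to anyone.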

Now that Q is being promoted publicly, Asmussen and Sherman say they have received interest from companies in the tech industry that might want to adopt Q in their platforms. “The dream is that it’s implemented as a third option for Siri and Alexa,” the duo writes over email. “We’re inviting the tech firms to collaborate with us. There’s no price tag on Q.”


Hopefully that dream comes true. Today, your assistant is a man or a woman. But computer software and AIs have no gender in the first place; they’re just code. So if we insist on making them sound human, perhaps it’s most fitting that our virtual assistants mirror non-binary voices, too. From this perspective, adopting a gender-neutral AI voice is not even a question of civil rights or representation; it’s just good design.


About the author

Mark Wilson is a senior writer at Fast Company who has written about design, technology, and culture for almost 15 years. His work has appeared at Gizmodo, Kotaku, PopMech, PopSci, Esquire, American Photo, and Lucky Peach.
