Bank of America is about to launch its new virtual assistant. Like most other AI-powered bots, the program will use natural language processing so that the bank’s customers can ask it to perform tasks. The new app is called “Erica,” and it will be there for all your banking needs: another gendered, computer-operated voice to serve you.
It has become a depressing trope: businesses adopt new human-leaning technologies, and those technologies end up reinforcing ingrained stereotypes. Erica, of course, joins a list of female-voiced assistants that includes Apple’s Siri, Amazon’s Alexa, and Microsoft’s Cortana.
As we’ve written recently, there is a growing movement in the artificial intelligence space to combat bias: that is, to build self-learning computer programs that don’t create feedback loops based on the presuppositions of the people who originally coded them. One way to do this is to create consumer-facing programs that don’t feed stereotypes.
Which is to say: Why is Erica not Eric? Perhaps because the people who create technology, and the consumers who use it, have come to expect subservient technology to be gendered. Maybe it’s time to expect more.