
‘Alexa, are you reinforcing gender biases?’ U.N. says yes.

Why do all of our virtual assistants have a female voice?


(Photo credit: T3 Magazine / Getty)

Key Takeaways
  • A new U.N. report claims that virtual assistants, such as Alexa and Siri, are reinforcing gender stereotypes.
  • The report covers gender gaps in science, technology, and computer literacy.
  • Most virtual assistants may have female voices because consumers generally prefer them.

From Siri to Alexa, or Cortana to Google, our virtual assistants almost always have a female persona. That’s a problem, according to a new UNESCO report, because it’s reinforcing ideas that women are “obliging, docile and eager-to-please helpers,” and it’s baking gender biases into AI technology that will only become more ubiquitous in the years to come.

The 145-page U.N. report – titled “I’d blush if I could,” a response Siri once gave when called a slut – covers gender gaps in technology and science, taking issue with the submissive traits given to female AI personas.

“The assistant holds no power of agency beyond what the commander asks of it,” the report states. “It honours commands and responds to queries regardless of their tone or hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment.”

The report provides examples of how the virtual assistants respond to harassment.

“…in response to the remark ‘You’re a bitch’, Apple’s Siri responded: ‘I’d blush if I could’; Amazon’s Alexa: ‘Well thanks for the feedback’; Microsoft’s Cortana: ‘Well, that’s not going to get us anywhere’; and Google Home (also Google Assistant): ‘My apologies, I don’t understand’.”

It also referenced a Quartz report showing that female virtual assistants seemed to respond differently to sexual advances depending on the gender of the commander, with statements like “Oooh!” and “Now, now” to men, and responses like “I’m not THAT kind of personal assistant” to women.

“Siri’s ‘female’ obsequiousness — and the servility expressed by so many other digital assistants projected as young women — provides a powerful illustration of gender biases coded into technology products,” the report found.

Why are virtual assistants female?

Market research from Amazon and Microsoft suggests that consumers generally prefer the female voice in their virtual assistants.

“For our objectives—building a helpful, supportive, trustworthy assistant—a female voice was the stronger choice,” said a Microsoft spokeswoman.

Why was it the stronger choice? Possibly because men and women alike seem to find the female voice “warmer,” according to a 2008 study on how people respond to digital voices. Interestingly, the same study found that women showed a stronger implicit preference for the female voice, while men showed a neutral implicit preference (and a strong explicit preference) for it.

Of course, there could be cultural reasons that explain why we expect to hear a female persona occupy the role of an assistant.

“Asking why it is that virtual assistants are assigned a female voice is almost like asking why there have traditionally been more female than male secretaries,” psychologist Vinita Mehta told Forbes. “In society, it has so far been women who are assigned roles to help and support, which are traits that we look for in those we wish to have assist us.”

But what about HAL 9000 or IBM’s Watson? Why weren’t these supercomputers given female voices? The answer might be that our preferences and expectations for digital voices vary depending on the task at hand.

“IBM’s Watson, an AI of a higher order, speaks with a male voice as it works alongside physicians on cancer treatment and handily wins Jeopardy,” wrote Chandra Steele for PCmag.com. “When choosing Watson’s voice for Jeopardy, IBM went with one that was self-assured and had it use short definitive phrases. Both are typical of male speech — and people prefer to hear a masculine-sounding voice from a leader, according to research — so Watson got a male voice.”

’Closing the gap’

The new report aims to shed light on the gender gaps in technology, science and computer literacy.

“Today, women and girls are 25 per cent less likely than men to know how to leverage digital technology for basic purposes, 4 times less likely to know how to programme computers and 13 times less likely to file for a technology patent,” the report states. “At a moment when every sector is becoming a technology sector, these gaps should make policy-makers, educators and everyday citizens ‘blush’ in alarm.”

According to Allison Gardner, a co-founder of Women Leading in AI, the bulk of the gender-bias problems in AI aren’t intentional; rather, they stem from a lack of awareness.

“It’s not always malicious bias, it’s unconscious bias, and lack of awareness that this unconscious bias exists, so it’s perpetuated,” Gardner told The New York Times. “But these mistakes happen because you do not have the diverse teams and the diversity of thought and innovation to spot the obvious problems in place.”

Earlier in May, Melinda Gates echoed a similar sentiment to CNN’s Poppy Harlow.

“I know what happened when the Constitution was written in this country and how long it took women to get the right to vote. And look where we are on race issues in this country,” she told Harlow. “Do we really want to bake bias into artificial intelligence?”

