Siri and Alexa reinforce harmful gender bias, says United Nations

Voice assistants, used to answer commands and perform tasks, tend to have female voices by default

Voice assistants like Siri, Alexa and the Google Assistant are reinforcing harmful gender bias among their users, according to the United Nations.

A report published this week by Unesco (the United Nations Educational, Scientific and Cultural Organization) claims that the responses given by these assistants, which it describes as submissive and occasionally flirty, reinforce the idea that women are subservient.

The title of the report, 'I'd Blush If I Could', is the response Apple's Siri voice assistant, which is female by default in the US, used to give when told it is a "bitch" or a "slut." Although that response has since been changed, the report says "the assistant's submissiveness in the face of gender abuse remains unchanged since the technology's wide release in 2011."



Unesco adds: "Siri's 'female' obsequiousness - and the servility expressed by so many other digital assistants projected as young women - provides a powerful illustration of gender biases coded into technology products, pervasive in the technology sector and apparent in digital skills education."

Apple's Siri, Amazon's Alexa, Microsoft's Cortana and the Google Assistant all launched with a female voice by default. Some now offer alternatives, including male and, in some cases, gender-neutral voices, though Cortana and Alexa still offer only a female voice. When Apple launched its first international versions of Siri, the British version defaulted to a male voice.

The Google Assistant's voice can be changed, but the default is female (Photo: Google)

The worldwide default for Alexa and the Google Assistant is still young and female, but with the Google Assistant, the company has assigned its male and female voices to colors instead of names — John Legend notwithstanding.

The Unesco report said: "Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like 'hey' or 'OK'".

Referencing how voice assistants will answer your question no matter how you phrase it, and in any tone of voice, the report adds: "The assistant holds no power of agency beyond what the commander asks of it. It honors commands and responds to queries regardless of their tone or hostility. In many communities, this reinforced commonly held gender biases that women are subservient and tolerant of poor treatment."

In 2018, Google updated its Assistant to recognize and respond positively when a user says "please" and "thank you." Although a useful reminder for all users to be courteous, the feature, called Pretty Please, was seen primarily as a way to encourage children to be polite, whether they are speaking to a person or to a thing.



The report raises concerns about how the assistants' failure to correct users who speak to them rudely will affect how future generations speak to each other, as well as to computers. "If gendered technologies like Siri and Alexa deflect rather than directly confront verbal abuse (as they do today), users will likely come to see this as standard as well," it added.

Instead of correcting users when they make a sexist comment or issue a demand they know cannot be met, such as asking the assistant to make them a sandwich, the artificial intelligence tends to offer a soft reply or attempt humor. When asked to "make me a sandwich," for example, Siri says: "I can't. I don't have any condiments."

Alexa flatly says: "Okay, you're a sandwich." The Google Assistant says: "I must warn you, it's not a reversible spell, but I'll try." Ask again shortly afterwards and it replies: "But I just made you into a sandwich. Does that mean sandwiches can talk?"

Suggesting the issue could lie with firms like Apple, Amazon and Google being staffed mostly by men, the report continues: "Companies...staffed by overwhelmingly male engineering teams, have built A.I. systems that cause their feminized digital assistants to greet verbal abuse with catch-me-if-you-can flirtation."

In its conclusion, the report offers a number of recommendations for how to move forward. These include funding studies to identify the types, dimensions and severity of gender bias expressed by digital assistants and other AI products. The report hopes to "shine a light into the black boxes of AI engines to understand...how voice assistants are gendered and why their output sometimes reproduces stereotypes harmful to women and girls."

