Amazon's Alexa and Apple's Siri are SEXIST because their female voices reinforce the idea that women are 'submissive', the UN claims
- AI-powered voice assistants with female voices may not be the norm in the future
- Research showed that the voices used by AI bots reinforce the idea that women are 'submissive'
- The UNESCO report, called 'I would blush if I could', has called for change
'Alexa' is a cry widely heard in many homes, and the smart speaker has become a household staple, along with Apple's Siri.
But AI-powered voice assistants with female voices may not be the norm in the future because they encourage harmful gender bias.
A UN study found that the voices used by smart speakers reinforce the idea that women are 'submissive' because they are portrayed as 'obliging and eager to please'.
It also criticized the way in which female AIs respond to gender-based insults with 'deflecting, lacklustre or apologetic responses'.

The UNESCO report, titled 'I would blush if I could', calls on technology companies to stop making voice assistants female by default and to hire more women to work on them.
Prompts for using Amazon's Alexa personal assistant on display at an Amazon 'experience center' in Vallejo, California
The title is derived from a standard response from Siri, Apple's voice assistant, and is what the automated voice said in response to being called a 'b**ch'.
The 146-page report reads: 'Companies such as Apple and Amazon, staffed by overwhelmingly male technical teams, have built AI systems that cause their feminized digital assistants to greet verbal abuse with catch-me-if-you-can flirtation.
'Because the voice of most voice assistants is female, it sends a signal that women are docile helpers, available at the touch of a button or with a blunt voice command such as 'hey' or 'OK'.
'The assistant holds no power beyond what the commander asks of it. It honors commands and responds to queries regardless of their tone or hostility.
'In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment.'

About 100 million smart speakers were sold worldwide in 2018, according to research firm Canalys.
The UNESCO report, titled 'I would blush if I could', calls on technology companies to stop making voice assistants female by default
These voice assistants handle one billion tasks a month, from playing music and reporting the weather to offering recipes for a home-cooked meal.
The report calls on developers to create a neutral machine gender for voice assistants, and even goes so far as to suggest that they be programmed to rebuff and discourage gender-based insults.
Research firm Gartner predicts that by 2020 some people will have more conversations with their voice assistant than with their spouse.
The report therefore suggested that devices should announce at the start of every interaction that they are not human.
Scientists, sound designers and linguists are currently working to create a genderless digital voice named Q.
The makers said on their website: 'As society continues to break down the gender binary, recognizing those who identify as neither male nor female, the technology we create should follow.'
The UNESCO report concluded that more women are needed in the technology field, and that future smart speakers should be developed so that the machines do not respond playfully to abuse or insults.