
Apple will no longer make Siri's voice female by default

funnywithfierce

You Tried It!

Apple's Siri will soon stop defaulting to a female-sounding voice.
The company said Wednesday that its mobile devices will ask users to pick from a range of voices when they set up the virtual assistant.

"Apple currently allows users to pick between a male and female voice as well as six different accents including American, British, Indian and Irish, but defaults to a female voice for US devices. (In some countries, such as the UK, Siri defaults to a male voice.)

Apple (AAPL) also said it will add two new voices to Siri.

"We're excited to introduce two new Siri voices for English speakers and the option for Siri users to select the voice they want when they set up their device," Apple said in a statement.

"This is a continuation of Apple's long-standing commitment to diversity and inclusion, and products and services that are designed to better reflect the diversity of the world we live in."

The changes to Siri are already available in Apple's iOS 14.5 beta, and will take effect for those setting up a new Apple device when the software update rolls out more widely later this year.

Gender stereotypes among voice assistants such as Siri, Amazon's Alexa and the Google Assistant have long been a concern.

In 2019, a United Nations report warned that voice assistants perpetuate the idea that "women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command."

Most virtual assistants still default to a female voice, though users have the option to switch.

Source: Apple will no longer make Siri's voice female by default
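For anyone curious what that voice catalog looks like on the developer side: Siri's own voices aren't exposed to third-party apps, but Apple's public AVSpeechSynthesis API surfaces the same sort of catalog, with a language/accent code per voice and (on iOS 13 and later) a gender attribute. A minimal Swift sketch, not Apple's Siri setup flow, that lists the installed English voices and speaks with an explicitly chosen one instead of the default:

import AVFoundation

// List every installed English text-to-speech voice with its reported gender.
let englishVoices = AVSpeechSynthesisVoice.speechVoices()
    .filter { $0.language.hasPrefix("en") }

for voice in englishVoices {
    // voice.gender is .male, .female, or .unspecified (iOS 13+).
    let gender: String
    switch voice.gender {
    case .male:   gender = "male"
    case .female: gender = "female"
    default:      gender = "unspecified"
    }
    print("\(voice.name) [\(voice.language)] - \(gender)")
}

// Speak with an explicitly chosen voice rather than the system default.
// In a real app, keep the synthesizer alive (e.g. as a property); speech
// stops if it is deallocated mid-utterance.
let synthesizer = AVSpeechSynthesizer()
let utterance = AVSpeechUtterance(string: "You picked this voice on purpose.")
utterance.voice = englishVoices.first { $0.gender == .male } ?? englishVoices.first
synthesizer.speak(utterance)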
 

BklynFinest

There's no crying. There's no crying in baseball
"Gender stereotypes among voice assistants such as Siri, Amazon's Alexa and the Google Assistant have long been a concern."

Really? :rolleyes:
 

Rumplestiltskin

Knee-Jerk Contrarian
"Gender stereotypes among voice assistants such as Siri, Amazon's Alexa and the Google Assistant have long been a concern."

Really? :rolleyes:

They phrased it incorrectly. They meant to say “sex stereotypes,” not “gender.” All the personal assistant voices default to female because the tech bros' only experience with women is their mommies taking care of them. But now all the tech companies are staffed by autogynephilic sissy pδrn addicts, so they want “gender neutral” voices that no one else is going to be comfortable with at all.
 

luckygirl93

Caramel Goddess
Basically, gay men who pretend like they're trans have made them change it in hopes that straight men will start finding deep voices attractive. All of these changes are just them trying to make straight men see them as women so that they will find them sexy.
 

Cass S

General Manager
Basically, gay men who pretend like they're trans have made them change it in hopes that straight men will start finding deep voices attractive. All of these changes are just them trying to make straight men see them as women so that they will find them sexy.
They are literally just offering you a choice like they do in other countries. In the UK the default is a male voice. When I set up my new iPhone a week ago, it gave me options for what I wanted Siri to sound like. It's part of the setup and not deep.
 

O.o

Black Women Disproportionately @ Risk For Homicide

Can they please add a Black man with some bass in his voice?

Or a Black woman like Issa Rae?
 

O.o

Black Women Disproportionately @ Risk For Homicide
Can they add a Jamaican man like Buju?

Or a fake Jamaican accent like Bushmaster from Luke Cage.
 

Surreal

⭐ ⭐ ⭐
Even computers want to be transgender now.

You miss the point. The stereotype is making a virtual assistant sound like a woman by default, as if to say assistants must be women by default: assistants who work for you, serve you, and do your bidding. It must be a woman's role. It's more than a voice, because all these AIs are given female names by default too, so they were always planned to represent women. Y'all don't get it. Some people must have complained for them to stop putting women in these stereotypical roles.

It's even more obvious what's going on here when you realize most of these AIs are made and mostly programmed by men. So men are presenting these things as women because that's what they want: a woman to serve them and do everything they say. I myself have noticed that a lot of the robots and AIs being made lately are presented as women.


Here's further reading:

"Why do most virtual assistants that are powered by artificial intelligence — like Apple’s Siri and Amazon’s Alexa system — by default have female names, female voices and often a submissive or even flirtatious style?

The problem, according to a new report released this week by Unesco, stems from a lack of diversity within the industry that is reinforcing problematic gender stereotypes.

“Obedient and obliging machines that pretend to be women are entering our homes, cars and offices,” Saniye Gulser Corat, Unesco’s director for gender equality, said in a statement. “The world needs to pay much closer attention to how, when and whether A.I. technologies are gendered and, crucially, who is gendering them.”

One particularly worrying reflection of this is the “deflecting, lackluster or apologetic responses” that these assistants give to insults.

The report borrows its title — “I’d Blush if I Could” — from a standard response from Siri, the Apple voice assistant, when a user hurled a gendered expletive at it. When a user tells Alexa, “You’re hot,” her typical response has been a cheery, “That’s nice of you to say!”

Siri’s response was recently altered to a more flattened “I don’t know how to respond to that,” but the report suggests that the technology remains gender biased, arguing that the problem starts with engineering teams that are staffed overwhelmingly by men.

“Siri’s ‘female’ obsequiousness — and the servility expressed by so many other digital assistants projected as young women — provides a powerful illustration of gender biases coded into technology products,” the report found.

Amazon’s Alexa, named for the ancient library of Alexandria, is unmistakably female. Microsoft’s Cortana was named after an A.I. character in the Halo video game franchise that projects itself as a sensuous, unclothed woman. Apple’s Siri is a Norse name that means “beautiful woman who leads you to victory.” The Google Assistant system, also known as Google Home, has a gender-neutral name, but the default voice is female.

Baked into their humanized personalities, though, are generations of problematic perceptions of women. These assistants are putting a stamp on society as they become common in homes across the world, and can influence interactions with real women, the report warns. As the report puts it, “The more that culture teaches people to equate women with assistants, the more real women will be seen as assistants — and penalized for not being assistant-like.”

Apple and Google declined to comment on the report. Amazon did not immediately respond to requests for comment.

The publication — the first to offer United Nations recommendations regarding the gendering of A.I. technologies — urged tech companies and governments to stop making digital assistants female by default and to explore developing a gender-neutral voice assistant, among other guidance.

The systems are a reflection of broader gender disparities within the technology and A.I. sectors, Unesco noted in the report, which was released in conjunction with the government of Germany and the Equals Skills Coalition, which promotes gender balance in the technology sector.

Women are grossly underrepresented in artificial intelligence, making up 12 percent of A.I. researchers and 6 percent of software developers in the field.

The report noted that technology companies justify the use of female voices by pointing to studies that showed consumers preferred female voices to male ones. But lost in that conversation is research showing that people like the sound of a male voice when it is making authoritative statements, but a female voice when it is being “helpful,” further perpetuating stereotypes.

Experts say bias baked into A.I. and broader disparities within the programming field are not new — pointing to an inadvertently sexist hiring tool developed by Amazon and facial recognition technology that misidentified black faces as examples.

“It’s not always malicious bias, it’s unconscious bias, and lack of awareness that this unconscious bias exists, so it’s perpetuated,” said Allison Gardner, a co-founder of Women Leading in A.I. “But these mistakes happen because you do not have the diverse teams and the diversity of thought and innovation to spot the obvious problems in place.”

But the report offers guidance for education and steps to address the issues, which equality advocates have long pushed for.

Dr. Gardner’s organization works to bring women working in A.I. together with business leaders and politicians to discuss the ethics, bias and potential for legislative frameworks to develop the industry in a way that is more representative.

The group has published its own list of recommendations for building inclusive artificial intelligence, among them establishing a regulatory body to audit algorithms, investigate complaints and ensure bias is taken into account in the development of new technology.

“We need to change things now, because these things are being implemented now,” Dr. Gardner said, pointing to the rapid spread of A.I.-powered virtual assistants. “We are writing the standards now that will be used in the future.”

Dr. Gardner said that changes are also needed in education, because the bias was a symptom of systemic underrepresentation within a male-dominated field.

“The whole structure of the subject area of computer science has been designed to be male-centric, right down to the very semantics we use,” she said.

Although women now have more opportunities in computer science, more are disappearing from the field as they advance in their careers, a trend known as the “leaky pipeline” phenomenon.

“I would say they are actually being forced out by a rather female-unfriendly environment and culture,” Dr. Gardner said. “It’s the culture that needs to change.”


(source)
 

rasclautmangos

General Manager
You can tell when people just don't have sh!t else to worry about lol

Just say Siri has more voices to choose from now and go, all this extra ish is exhausting.
 

driveinsaturday

Team Owner
I applaud this. Even Bixby on Samsung (Bixby is the name of a male butler) defaults to female.

This all comes down to stereotypes of women as servants and low-level assistants. If people all want to choose female voices, that's fine, but it's great that Apple doesn't want to be part of reinforcing sexism.
 
