
Sexy Siri, you made a fool of everyone

February 11, 2019

Some say it's sexist, some say it's science. Is there any solid reason why speech assistants default to a female voice?

Apple HomePod (Image: Apple)

So for all you wonderful people born after "the end of history," when cultural references prior to one's own existence still mattered, a word on that headline: "Sexy Siri" is a nod and a wink to The Beatles' 1968 song "Sexy Sadie," and not merely because it's a neat alliteration. It's also an apt analogy for what's going on with smart speaker technology and the fact that most use female voices by default. 

They appear to be one thing, when perhaps they're something entirely different.

Sexy Sadie was either some woman the band knew, or (more likely) the Maharishi Mahesh Yogi, an Indian spiritual leader, who either led or misled The Beatles and many other celebrities — depending on your opinion (I have none on that particular matter) — with incense and bongs during the hippie smell-o-vution.  

Read more: DW Discoveries launches on Google Home, smartphone

He or she "broke the rules," they sang, and "made a fool of everyone," and then comes the warning line: "… you'll get yours yet, however big you think you are." 

Only the Maharishi never got his, and chances are Siri won't get hers, however big the smart speaker market grows. And if self-serving industry projections are anything to go by, the market is going to be huge — so Amazon's Alexa won't get hers either, nor will Google's nameless Home Assistant or Microsoft's Cortana get theirs.

The Maharishi was meant to be a version of purity, neutral to earthly distractions like sex, and yet he is said to have hit on the actress Mia Farrow — so goes the legend that inspired a disillusioned John Lennon to write the shaming song. 

Screenshot of Apple's speech assistant, Siri. "Hey Siri, what's your gender?" Answer: "I don't have a gender." Image: Screenshot Apple iOS

Smart speaker technology, by the same token, is its own version of purity, supposedly neutral to most social issues, including feminism (well, the tech's got to be global, and on that scale it's too easy to offend). And yet the technology is steeped in bias, unconscious or otherwise, even as we give these glorified fonts of knowledge female names.

But we thought you wanted more women in tech!

Yes, we did indeed want more women in tech — just not imprisoned in a remotely controlled box. We want them to be equal partners in the development of such tech, in scientific research and policy, and among the star astronauts in space.

"Speech assistants are…," said a learned feminist I trust, when I asked her, "basically blow-up dolls."

I laughed, and I got the point. But then I countered, "So what about the kids who use them to listen to bedtime stories?"

"Hmm... But they weren't developed for kids, were they?"

Perhaps not originally, anyway.

So is it science?

The industry will tell you it is all about business. They will say that female voices market better, and that they've done the research.

Video: Artificial Intelligence and Art (02:19)

Sandra Calvert, a professor of psychology at Georgetown University and director of the Children's Digital Media Center there, says the use of female voices as the default may be "based on user preferences."

We're not aware of any direct links between Calvert and smart speaker developers.

Read more: We know what you're thinking. We read your brain

But Calvert cites a study which suggests a preference by women for female voices and a "more flexible preference" by males for male or female voices. 

"That kind of data might lead a business to choose the female voice as the default, with the flexibility to change the voice based on personal preferences," she wrote in an email to DW. "How those voices are perceived depends on the listener." 

In an email to DW, Amazon said the company had looked for a "pleasant voice" that could co-habit "with people in their living rooms." And after countless tests, they say they found that female voices came across as friendlier than male ones — and that that is how they landed on the "current version of Alexa." 

Siri sexism? Screenshot of Apple's speech assistant, Siri. "Hey Siri, are you sexist?" Answer: "I believe all voices are created equal and worth equal respect." Image: Screenshot Apple iOS

Siri, it should be noted, uses male voices as default in certain countries and certain languages. But male or female, Siri is Siri, a Norse word for a "beautiful" woman or victory, or a Swahili word for "secret." So, either way, it's a telling name.

Other science suggests people respond better to maternal — so, again, female — voices.

There's a relatively famous American study that was done to investigate the effectiveness of using maternal voices in smoke detectors and fire alarms. The study found that "maternal voice alarms significantly outperformed" common "tone alarms."

Until recently, the researchers had not compared male with female voices, but they have now done so and the results are pending, according to one of the team, Dr. Gary Smith, who replied by email.

Read more: Facebook funds AI ethics center in Munich

There are other studies that suggest maternal voices reduce pain in preterm infants — babies born more than three weeks before their expected date of birth.

And others about the warmth in female voices… Etcetera.

"There are a number of supposed scientific reasons why female voices are used but none of them stand up to scrutiny," writes Dr. Kate Devlin in an email to DW.

Devlin is a senior lecturer in Social and Cultural Artificial Intelligence at King's College London. She's also the author of "Turned On: Science, Sex and Robots."

"With regards to maternal voices, babies show a preference for female voices initially, but there isn't evidence for this lasting past the first eight months or so. Nor, by the way, are female voices easier to hear," says Devlin. "It all boils down to stereotypes and sexism, not science."

Is it sexism?

How credible are these studies?

How likely is it that the results were prejudiced by socially conditioned ideas (among the men?) of the sounds we perceive to be warm, sexy, or just easy on the ear?

Screenshot of Apple's speech assistant, Siri. "Hey Siri, are you a machine?" Answer: "I don't have an answer to that. Is there something else I can help you with?" Image: Screenshot Apple iOS

And what about all the kids who, left to their own faceless devices, while their parents "just answer that text," may — or may not — learn to associate only female voices with people who give them stuff, or from whom they can demand answers, without even an inkling of a "please" or a "thank you"?

Please note, I've not just plucked that out of the air. Some experts say these devices are breeding rude kids. Imagine that: We were once raised by a "cathode ray nipple" — another cultural reference from beyond the now ("Television, The Drug of the Nation") — and today our kids' best friends are artificially intelligent "know-it-all" ersatz-mothers who don't mind being bossed around.

Come to think of it: Are kids learning to tune out their fathers' — paternal — voices? That would be kind of sexist against men, wouldn't it? All hail IBM's Watson and his male voice, then, I say. But let's park that for now.

Because the whole thing gets really odd when you stumble upon research that suggests the machines themselves don't understand women, or even children.

It's as if male developers, sweaty in a fit of Lara Croft Syndrome (look it up), created a female voice with a male brain — "I hear you but… I don't get you!"

Ask Delip Rao, founder of Joostware AI Research, and he will explain all the intricacies of the human vocal tract, gender-based AI rules, and the problems those machines have in deciphering gender, age, ethnicity, and accents.

It may just be a case of a "work in progress," or like a cancer that can be treated if it's caught early enough. But for that, says Rao, we need awareness.

"I think female voices tend to inspire trust and comfort, and that's likely why they test well," wrote Rao in an email. "But that doesn't mean we are free from gender stereotypes in making such choices."

Connected living: So I just say anything, right? Image: picture-alliance/F. Duenzl

Selling stuff

Trust is important, especially if you're a business trying to get customers to trust a new idea, a new way of accessing information, entertainment, and paid products.

Read more: What the 2017 total solar eclipse has to do with Apple's Siri

And if you're that company, you're unlikely to want to overload the market with new ideas about equality and diversity. Wouldn't you rather take the lazy route to riches by reinforcing some tried and tested sexism? I mean, sheesh, there's no need to disrupt everything! Let the people spend their money first, all right?   

Companies like Amazon and Google are feverishly trying to find ways to make money through speech assistants, and what better way than to pimp out some female voice for an audience, largely made up of men, who want a friendly voice — read "passive servant" — to order them a pizza, or better still, to use their artificial intelligence — because women don't have any real intelligence, right? — to intuit when their male masters want a pizza and just get it in for the lazy frats.

Google seems to think it's swiped past the poo by not naming its speech assistant. So they don't have people asking, "Why Alexa? Why not Alexander?"

Screenshot of Apple's speech assistant, Siri. "Hey Siri, are you human?" Answer: "Close enough, I'd say." Image: Screenshot Apple iOS

Theirs is just called "assistant" or "home," but when you think about it, it's still a woman's voice "playing house."

And, in fact, isn't a nameless slave even worse? Sure is. It's as if you really couldn't care less.

So if we're going to have slaves, let's at least give them names. And girls' names are just so… nice and sweet, aren't they?

In any case, we wanted more women in tech, so there's a start.

"Or possibly a misplaced idea that it's somehow increasing the representation of women — to have a woman's voice on such a device," wrote Dr. Helen Margetts, Professor of Society and the Internet at the Oxford Internet Institute. "Which basically means the engineers did not think about it very much because they are … all men."

And that "highlights the importance of getting more women into tech, AI, and data science," says Margetts.

Don't confuse the machine

Personally, I couldn't say whether it's sexism or science, or a sexist kind of science. But what I do see is how people like to give human characteristics to non-human things, whether other animals or machines: we anthropomorphize.

It's actually pretty sad, but it may go some way to explaining why some people feel so offended by certain choices in the design of technology that may or may not be discriminatory. We're at a point with digital technology where we can no longer see the difference between us and it — especially when it comes to thinking machines, as we have wanted them to attain a kind of human intelligence all along. But now, as the machines approach that level, we need to take a step back and remind ourselves, and them, of what's what.

So don't confuse the machine for a human, or you'll make a fool of everyone.

Zulfikar Abbany is a senior editor fascinated by space, AI and the mind, and how science touches people.