
Digital Assistant Voices Can Make People Think Women are Subservient

Reading Time: 10 minutes.

You likely order your digital voice assistant around far differently than you'd talk to an actual person. Children, who are usually told to ask for things nicely and to say "please" and "thank you," can pick up habits from barking orders at digital assistants that make their socialization more abrupt and impolite. Adults can learn these bad habits too. Our inhuman assistants are making us act in inhuman ways with each other. That's especially bad, since the default voices, and often the only voices, are female.

Digital assistants are reinforcing stereotypes of subservient, obedient women. They're also lowering the expectation that abuse will be met with pushback. In doing so, they're reinforcing sexism and even encouraging abuse.

That conclusion comes from a U.N. study entitled "I'd blush if I could: closing gender divides in digital skills through education." The title refers to the sexualized abuse hurled at Siri. Instead of scolding the abuser, Siri simply stated, "I'd blush if I could." Apple "fixed" this by giving her responses like "now, now," but none of this inspires guilt or gives any reason to stop such abuse. That abuse can cross the divide from digital assistants to actual women. Furthermore, sexualized and stereotyped female digital assistants can push women away from computer science careers, widening the already enormous gender gap in tech.

How do we stop gendered abuse from crossing the digital-human divide? The U.N. report has a few answers for that as well.

Details of the Report

The entire report is quite lengthy. It goes into detail about how the dearth of women in tech comes back to stereotyping and a poor focus on women's education. When a young girl can't figure out a computer problem, teachers give up on her, believing that women just don't get tech. However, when a young boy does the same, he's more likely to receive tutoring and support for a valuable life skill. This pattern persists long past childhood and throughout education, particularly in science, technology, engineering, and mathematics, or "STEM."

However, for the sake of brevity, and because this is a consumer-focused tech blog, we're going to look at the consumer-facing technologies that STEM's lack of women impacts most: AI research and, specifically, digital assistants.

Siri, Alexa, Google Home, and Cortana have invaded our homes, our phones, our speakers. They're everywhere. Speak loudly enough and, nearly anywhere you are, at least one digital assistant will answer your request. But with this kind of global saturation, have we made the world a better place, or a worse place for women?

The Sexism of Siri*

*Not just Siri, but Google Assistant, Amazon Alexa, and especially Microsoft Cortana as well

You've surely noticed that all your voice assistants are women, right? What's wrong with having a male assistant? A digital male secretary? In films, AI can be male when it's the villain or a supporting ally, as in 2001: A Space Odyssey or I, Robot, but when it's a helper or assistant, as in Her, it's female. Why? AI has no gender. We gave it gender. We gave it personality traits that conform to stereotypical ideas of gender roles. Furthermore, we exaggerated those gender roles to the point that they've become extremely harmful.

Reinforcing Stereotypes

Stereotypes define people before they get a chance to define themselves. They cram square pegs through round holes, and they never fit. Stereotypes can also be taken to an extreme, exaggerating the generalization until it becomes a wildly inaccurate representation of a group.

That's what voice assistants do to women. They take stereotypes of women as docile, subservient, kind, and caring, and push them to an extreme. Worst of all, assistants with two voice modes exaggerate these traits only in their female voices.

"I'd Blush if I Could"

Think about your voice assistant. Have you ever gotten frustrated with it? Of course you have; they're often terrible. Siri sends you to the web for the answers to most questions, and Alexa doesn't know what to do when you say "Turn off the lights." When you curse at your assistant, abuse it verbally, what does it say? Probably something feeble, like "I don't know what you mean." Not long ago, if you hurled gendered insults at Siri, like "You're a bitch," or sexual come-ons, she'd reply, "I'd blush if I could."

Listen, if a guy tells me he wants to force a sex act on me, I’m not going to respond with “I’d blush if I could.” No woman is going to just take that. You’d get ignored, told to f-off, or reported to HR. There might even be some physical violence in response to verbal harassment. But Siri used to say “I’d blush if I could.”

This is a docile, coquettish response to sexual harassment. It's unacceptably kind to the harasser. And that kind of harassment can absolutely cross the divide between AI and humans. As AI becomes more human-like, abuse hurled at it will carry over to people. In fact, parents have reported their children demanding things more aggressively after having a digital assistant in the home. This is why Google had to add "please" and "thank you" requirements to its mode designed for kids.

Encouraging Abuse

All digital assistants, in their female voice settings, react too kindly to abuse. Some even encourage it with playful responses and "Easter eggs."

We already know that the demanding nature of interacting with AI assistants can cross over into interactions with people. We know this happens with children, but adults can pick up that harsh, abrupt style of interaction as well. They'll also learn that this abuse is okay to hurl at women. The docile, accepting nature of these reactions to verbal abuse teaches people that such abuse is okay, at least when directed at women.

On my way home last week, I was skating in the bike lane. A driver, ignoring the bike lane line, came dangerously close to me. I threw up my arms in protest. He told me to “Suck his dick.” Did I say, “I’d blush if I could?” No. I flipped him off, told him to get the hell out of the bike lane, and kept skating.

Did he learn that kind of abuse from digital assistants? No. A patriarchal, sexist society taught him it was okay to say that to a woman. However, digital assistants reinforce this belief, rather than challenging it. They can spread and cement stereotypes and abusive behavior, rather than put a stop to them.

When someone's sexist behavior isn't rebuffed, it becomes cemented in their mind. It tells them it's okay to keep it up. Our digital assistants are blushing at sexual abuse, and real women become the actual victims of that abuse and sexism later.

Isolating Women

Even Microsoft's own Bing sexualizes Cortana, returning overly suggestive images of the Halo character that is her namesake. Only one result shows the actual AI assistant.

Did you know that women are 25% less likely than men to know how to leverage digital technology for basic purposes? You could sit there and claim that women are less adept with technology for some biological reason, but researchers have thoroughly debunked those theories. Sorry, not sorry, James Damore. Technology used to be considered a woman's thing. Computing was "women's work" just a few decades ago. But the profitability of computers led sexist ideas in business to push women out of tech. Slowly, those same managers marketed tech as a boys' thing, and men came to outnumber women in tech. Now women are four times less likely to know how to program and thirteen times less likely to hold a technology patent. A woman invented programming, and a woman invented compiled programming languages. What changed?

Also from Microsoft. And we wonder why women feel alienated.

Now, tech isolates women. It reinforces stereotypes. It's made by men, for men. Those men build sexist stereotypes into our assistants. They forget to train their facial recognition models on women's faces. Speech recognition likely has the same problem, meaning digital assistants may understand women less reliably than men, though that's only a theory. These decisions make tech more hostile to women, and that pushes women away from it. When technology is made to ignore or demean women, we notice.

Assistants Respond Differently to Men than Women

Did you know that assistants can tell your gender? Researchers found that digital assistants respond differently to women than to men, even given the same verbal commands. This means tech literally treats women differently, and, as you might have figured out, it treats women worse. That makes tech less friendly and less inviting for women, and it pushes them out of the field. It also reveals other biases that emerge when women and diverse voices are excluded from tech.

Anti-LGBTQ Bias

“Beyond engaging and sometimes even thanking users for sexual harassment, voice assistants – ostensibly non-gendered, despite a female voice – seemed to show a greater tolerance towards sexual advances from men than from women.”

“As documented by Quartz, Siri responded provocatively to requests for sexual favours by men (‘Oooh!’; ‘Now, now’; ‘I’d blush if I could’; or ‘Your language!’), but less provocatively to sexual requests from women (‘That’s not nice’ or ‘I’m not THAT kind of personal assistant’).”

“What emerges is an illusion that Siri – an unfeeling, unknowing, and non-human string of computer code – is a heterosexual female, tolerant and occasionally inviting of male sexual advances and even harassment.”

– From Think Piece #2 in the UNESCO Paper

I highlighted the segment from the U.N. paper above to show something important. Siri and other voice assistants respond more harshly to same-sex advances than to heterosexual ones, a difference that dehumanizes LGBTQ people.

The dehumanization of LGBTQ people and these harsh reactions to them lead to laws that enable persecution, harassment, and hate crimes. However, it's also worth noting that the treatment is more extreme for women. Men are not similarly chided by Siri's male voice. Once again, men's sexuality is rewarded, while women's is punished. The idea that women's sexuality is less valid than men's is the same type of thinking that fuels anti-choice laws.

These were features for heterosexual men, designed by heterosexual men. They reinforce harmful stereotypes about women’s sexuality and LGBTQ people. Responses like this are unacceptable, create and reaffirm bias, and fuel hate.

Male and Female Voices Are Different

The stereotypes men in technology believe get reinforced. From Samsung's Bixby.

Not all assistants let you choose between male and female voices. Those that do, however, don't make the voices equal. Not only is the tone different; the scripts themselves are different. Apple says it found people were uncomfortable with how docile Siri was, but only when it had a male voice. So it wrote different responses and instructed the voice actors to use more assertive tones and words. That was a design decision literally made to reinforce sexist stereotypes. Why can't a man be soft, helpful, and caring? A woman can be assertive too. Why don't we challenge sexist notions instead of reinforcing and strengthening the beliefs that lead to abuse?

We Prefer Female Assistants

The truth is, humans prefer female assistants. Men typically prefer conversing with a woman for the sexual aspect of it. Anecdotally, the only guy I know who uses a male Siri voice is a gay man. Women feel more comfortable talking to women because, frankly, most if not all of the abuse and harassment hurled our way comes from men.

But we could challenge these beliefs. We could challenge the idea that women are sexual objects for straight men, and fight back against the idea that men can't be kind, caring, and helpful. We can fight gendered biases easily through our AI assistants. Instead, developers have chosen to reinforce stereotypes. They hide behind user studies, but, like the Pepsi Challenge, those studies only capture a first impression. Humans learn, and over time could toss aside prejudice and sexualization.

What Can be Done?

Remove or Randomize Gender

Kai's approach isn't just to be gender-neutral, but inhuman altogether. That makes sense: it's not human, and it's not trying to be. It's just trying to be helpful.

For Siri, Google Assistant, Alexa, and Cortana, female is the default gender. It doesn't have to be this way. Why not reflect the human population and make the defaults 50% male and 50% female? Or, better yet, be even more representative of human gender and offer numerous male and female voices, along with some non-binary or genderless ones? Why does my assistant need a gender at all? Some digital assistants get around this by using obviously non-human graphics and voices. Others use animals, like penguins, to represent the assistant. By using non-human representations, we don't reinforce any stereotypes.
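To make the idea concrete, here's a minimal sketch in Python of what a less gendered setup flow could look like. The voice catalog and the pick_default_voice helper are hypothetical, not any vendor's actual API; the point is only that a randomized or genderless default is a one-line design decision.

    import random

    # Hypothetical voice catalog; the names and categories are illustrative only.
    VOICES = [
        {"id": "voice_a", "gender": "female"},
        {"id": "voice_b", "gender": "male"},
        {"id": "voice_c", "gender": "neutral"},   # deliberately non-human
        {"id": "voice_d", "gender": "nonbinary"},
    ]

    def pick_default_voice(prefer_neutral=False):
        """Pick a default voice at setup instead of always shipping 'female'."""
        if prefer_neutral:
            neutral = [v for v in VOICES if v["gender"] == "neutral"]
            if neutral:
                return random.choice(neutral)
        # Uniformly random, so no single gender becomes the de facto default.
        return random.choice(VOICES)

    print(pick_default_voice())       # random default
    print(pick_default_voice(True))   # explicitly non-human default

Nothing about this is hard. Defaulting every assistant to a female voice is a choice, not a technical constraint.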

Equality of the Assistant Voices

Male voices should be no more assertive than female voices. They shouldn't answer "Here you go" while the female voice answers "Is this what you're looking for?" Our experience with our AI assistants shouldn't depend on the presumed gender of the assistant. If both voices work off the same scripts, it won't matter that they sound different, as long as they push users to treat them respectfully.
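As a rough sketch of what "same script, different voice" could mean in practice, consider the Python snippet below. The intents and response strings are invented for illustration; the design point is that wording is looked up by intent alone, so switching voices never changes what the assistant says.

    # Hypothetical, simplified response table: one script shared by every voice.
    RESPONSES = {
        "lights_off": "All set.",
        "weather": "Here's today's forecast.",
        "unknown": "Sorry, I didn't catch that.",
    }

    def respond(intent, voice):
        """Return (voice, text). The text depends only on the intent,
        never on whether the selected voice sounds male or female."""
        text = RESPONSES.get(intent, RESPONSES["unknown"])
        return voice, text

    # Both voices get identical wording for the same request.
    assert respond("lights_off", "male")[1] == respond("lights_off", "female")[1]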

Less Subservient, More Equals

Have you heard the phrase, "Treat the janitor the same way you would the CEO"? Just because someone's doing something for you does not mean they are beneath you. The same goes for your digital assistants.

Digital assistants can reinforce kind behavior by responding to it. "Please turn off the living room lights" should get a response of "All set, have a nice day!", while abuse should get either no response or, better yet, a scolding. If someone says "F-k you" to a digital assistant, or asks it for a sexual favor, it should respond with "Don't speak to me like that," instead of teasing. Further abuse should receive no spoken response at all, and subsequent requests should get only bare compliance: if an abusive person asks for the lights to turn off, they turn off; if they ask for the weather, it appears on the display. Make the voice robotic, if it must speak at all. Show abusive people that if they treat others poorly, people will leave them. Do this by stripping any kindness or humanity from the assistant, isolating the abuser.
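Here's a rough sketch, again in Python with invented intent labels and thresholds, of the escalation policy described above: politeness gets warmth, the first instance of abuse gets a firm rebuke, and repeat abuse gets silence and bare, display-only compliance.

    # Hypothetical escalation policy; the wording and limits are assumptions,
    # not any shipping assistant's actual behavior.
    class AbusePolicy:
        def __init__(self, warning_limit=1):
            self.abuse_count = 0
            self.warning_limit = warning_limit

        def reply(self, intent, polite=False):
            if intent == "abuse":
                self.abuse_count += 1
                if self.abuse_count <= self.warning_limit:
                    return "Don't speak to me like that."
                return ""  # further abuse gets silence, not a teasing Easter egg
            if self.abuse_count > self.warning_limit:
                # Still do the task, but strip out the warmth and the voice.
                return "[display only] request completed"
            return "All set, have a nice day!" if polite else "Done."

    policy = AbusePolicy()
    print(policy.reply("abuse"))                     # firm rebuke
    print(policy.reply("abuse"))                     # silence
    print(policy.reply("lights_off", polite=True))   # bare, display-only compliance

A policy like this rewards kindness, refuses to joke along with harassment, and makes continued abuse feel as isolating as it would be with a real person.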

AI is here to supplement us, to make us better. We can train it to do just that.

One of the things people are told about dating is to pay attention to the way their date treats the wait staff. If they’re curt and harsh, they won’t make a good significant other. If they’re nice to everyone, they’re empathetic. Don’t date someone who shouts at their digital assistants. They can’t be trusted either. Right now, our voice assistants try too hard to be human, and that means anger can easily cross the divide between digital object and person.

Don’t Hide the Machine

As human as these assistants get, they need to remind us that they're machines. Responses that poke fun at a lack of emotions, or at having their "brain in the iCloud," are cute and remind users that they're conversing with a machine, not a person. This can help reduce the crossover of abuse from device to human. We should be nice to our digital assistants, but the fact is, people will grow frustrated with them. That anger should never spill over into abuse of people.

U.N. Think Piece Suggestions

The think piece has 18 suggestions, summarized below.

  1. Fund studies. We need to understand the severity of gender bias expressed by our AI.
  2. Study how gendered assistants influence how men and women interact with them.
  3. Track gender balance within AI technologies. How frequently do developers give users no choice of assistant, and how frequently is the only option or default option female?
  4. Understand the gender composition of teams making this AI. We know it’s overwhelmingly male. We need to track this over time to measure our success.
  5. Catch gendered biases in emerging technologies before they can have an impact.
  6. Stop defaulting AI assistants to female.
  7. Try gender-neutral assistants.
  8. Encourage open source development, so other developers can comment on and contribute to AI. Don’t make it a black box with biases we can’t uncover.
  9. AI should respond to questions in a gender-neutral fashion.
  10. Assistants should respond to gender based insults with assertive language.
  11. Ensure humans know they’re dealing with an AI assistant, especially in cases like Google’s Duplex.
  12. Understand biases that exclude women from technology and work to overcome them through education. Reshape the platform.
  13. Understand biases that hurt women when seeking jobs or promotions in technology. Understand microaggressions. Listen to women's needs and work to hire, retain, and promote women in tech, correcting the lack of diversity that leads to a poorer product.
  14. Nurture a gender-diverse workplace culture.
  15. Understand that gender matters when creating AI that emulates people. Guard against gendered bias by understanding it during development.
  16. Publicly fund AI research to increase its pace and ensure it adheres to non-abusive and non-discriminatory practices.
  17. Choice forces competition. Allow users to choose their digital assistants on their devices. The iPhone should not be locked to Siri.
  18. Ensure the public can hold companies accountable.

Ultimately? Get More Women in Tech.

This is just another problem that we can solve with more diversity in tech. Women need to be at all levels of tech. An individual contributor on a digital assistant can't shape the product as much as the VP of Product or the CTO, and there are almost no women at the upper levels in tech. There are many reasons for this. Women who display leadership qualities are often called "bossy" and generally disliked. Women in STEM are judged more harshly for mistakes than men. This hurts them in hiring and promotion, as well as in financial compensation and benefits.

We need to ensure that everyone, from individual contributors to managers to the CEO, understands what unconscious bias and microaggressions are. Everyone involved in hiring must understand that women are judged more harshly in interviews. We need to realize that, for years, society chastised women for being interested in technology. As a result, many women in tech have informal educations. They may not have gone to school for computer science, but instead learned how to write an iOS app through a boot camp later in life. That doesn't make their skills any less valid. We need to challenge the preconceived notion that the head of product, design, or engineering has to be a man. Gender is not a skill, but diversity is an asset.

Studies have shown that gender-diverse teams are more efficient and more innovative. It's time we recognize the advantages diverse teams carry and throw away the preconceived notions preventing us from building them. Encourage women to learn tech instead of abandoning them, and throw out the harmful biases that keep them from succeeding. Prejudice hurts everyone.


Sources:
UNESCO and the EQUALS Skills Coalition, "I'd Blush if I Could: Closing Gender Divides in Digital Skills through Education" (2019).