(Illustration by Andrey Suslov/iStock)

Artificial intelligence has a gender bias problem – just ask Siri

COMMENTARY | The gender bias in personal assistance devices is far more insidious than just the voice
by Rachel Adams | 18 Oct 2019

Suggest to Samsung’s Virtual Personal Assistant Bixby “Let’s talk dirty,” and the female voice will respond in a honeyed tone, “I don’t want to end up on Santa’s naughty list.”

Make the same suggestion to the program’s male voice and it replies, “I’ve read that soil erosion is a real dirt problem.”

In South Africa, where I live and conduct my research into gender biases in artificial intelligence, Samsung now offers Bixby in various voices depending on which language you choose. For American English, there’s Julia, Stephanie, Lisa and John. The voices of Julia, Lisa and Stephanie are coquettish and eager. John is clever and straightforward.

Virtual Personal Assistants – such as Bixby, Alexa (Amazon), Siri (Apple) and Cortana (Microsoft) – are at the cutting edge of marketable artificial intelligence (AI). AI refers to using technological systems to perform tasks that people usually would.

They function as applications on smart devices, responding to voice commands through natural language processing. Their ubiquity throughout the world is rapidly increasing. A recent UNESCO report estimated that as early as next year we will have more conversations with our virtual personal assistants than with our spouses.
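To make concrete what “responding to voice commands through natural language processing” looks like in practice, here is a deliberately minimal sketch. It is illustrative only, not Bixby’s, Siri’s or any other vendor’s actual code, and the names in it (CANNED_REPLIES, classify_intent, respond) are invented for the example: once an utterance is recognized and mapped to an intent, the reply returned is one a design team wrote in advance.

```python
# A minimal, purely illustrative sketch -- not any vendor's implementation.
# Names (CANNED_REPLIES, classify_intent, respond) are invented for this example.

CANNED_REPLIES = {
    # The "talk_dirty" text is Bixby's female-voice reply quoted above.
    "talk_dirty": "I don't want to end up on Santa's naughty list.",
    "set_appointment": "OK, I've added that to your calendar.",
}

def classify_intent(utterance: str) -> str:
    """Toy stand-in for the natural language processing step.
    Real assistants use trained models here, but the replies returned
    downstream are still authored by a design team."""
    text = utterance.lower()
    if "talk dirty" in text:
        return "talk_dirty"
    if "appointment" in text or "schedule" in text:
        return "set_appointment"
    return "unknown"

def respond(utterance: str) -> str:
    # Map the recognized intent to its pre-written reply.
    return CANNED_REPLIES.get(classify_intent(utterance),
                              "Sorry, I didn't catch that.")

print(respond("Let's talk dirty"))         # pre-written, designer-chosen reply
print(respond("Schedule an appointment"))  # pre-written, designer-chosen reply
```

The detail that matters for the argument below is that, however sophisticated the recognition step, the persona and its replies are product decisions made by the people who build these systems.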

Yet, as I’ve explored in my own research with Nora Ni Loideain from the Information Law and Policy Centre at the University of London, these technologies betray critical gender biases.

With their female names, voices and programmed flirtatiousness, the design of virtual personal assistants reproduces discriminatory stereotypes of the female secretary who, according to the gender stereotype, is often more than just a secretary to her male boss.

It also reinforces the role of women as secondary and submissive to men. These AI assistants operate on the command of their user. They have no right to refuse these commands. They are programmed only to obey. Arguably, they also raise expectations for how real women ought to behave.

The objective of these assistants is also to free their user from menial work such as making appointments and purchasing items online. This is problematic on at least two fronts: first, it suggests the user has more time for supposedly more important work; second, it makes a critical statement about the value of the kind of secretarial work performed, first by real women and now by digitalized women, in the digital future.

“What are you wearing?”

One of the more overt ways in which these biases are evident is the use of female names: Siri and Cortana, for instance. Siri is a Nordic name meaning “the beautiful woman that leads you to victory.”

Cortana takes its name (as well as its visuals and voice) from the game series Halo. In Halo, Cortana was created from a clone of the brain of a successful female scientist, joined to a transparent and highly sexualized female body. She functions as a fictional aide for gamers with her unassuming intelligence and mesmeric shape.

In addition to their female names, all of the virtual personal assistants on the market today come with a default female voice, which, like Bixby, is programmed to respond to all kinds of suggestive questions and comments. These questions include: “What are you wearing?” Siri’s response is: “Why would I be wearing anything?”

Alexa, meanwhile, quips: “They don’t make clothes for me,” and Cortana replies, “Just a little something I picked up in engineering.”

Bias and discrimination in AI

It is increasingly acknowledged that AI systems are often biased, particularly along race and gender lines. For example, the recruitment algorithm recently developed by Amazon to sort resumes for job applications displayed gender bias by downgrading resumes that contained the word “women” or referenced women’s colleges. Because the algorithm was trained on historical data reflecting the preferential recruitment of men, it ultimately could not be fixed and had to be dropped.
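A toy example helps show why a system trained this way cannot simply be patched. The sketch below is illustrative only: the miniature dataset is fabricated for the example and this is not Amazon’s system. It trains an off-the-shelf text classifier on hiring outcomes that were historically skewed against resumes mentioning women, and the model learns a negative weight for the token “women,” reproducing the bias in its training data rather than any signal about ability.

```python
# Illustrative sketch with fabricated toy data -- not Amazon's system.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Tiny, deliberately skewed "historical" hiring record (1 = hired).
resumes = [
    "captain chess club",                           # hired
    "software engineering lead",                    # hired
    "captain women's chess club",                   # not hired
    "software engineering lead, women's college",   # not hired
]
hired = [1, 1, 0, 0]

# Train a simple bag-of-words classifier on the biased outcomes.
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# The model assigns a negative weight to the token "women": it has
# absorbed the historical pattern, not anything about candidate ability.
weights = dict(zip(vectorizer.get_feature_names_out(), model.coef_[0]))
print(f"weight on 'women': {weights['women']:.2f}")  # negative
```

Deleting the offending token rarely fixes the problem, because the model can latch onto proxy terms that correlate with it, which is part of why such systems end up being dropped rather than repaired.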

As research has shown, there is a critical link between the development of AI systems which display gender biases and the lack of women on teams that design them.

But there is rather less recognition of the ways in which AI products incorporate stereotyped representations of gender within their very design. For AI Now, a leading research institute studying the social impact of AI, there is a clear connection between the male-dominated AI industry and the discriminatory systems and products it produces.

The role of researchers is to make visible these connections and to show the critical links between the representations of women, whether in cultural or technological products, and the treatment of women in the real world.

AI is the leading technology in the so-called Fourth Industrial Revolution, which refers to the technological advances – from biotechnology to AI and big data – that are rapidly reshaping the world as we know it. As we continue to engage with the promises and pitfalls this holds, it will become increasingly important to consider and address how the technologies driving these changes may affect women.

Rachel Adams is a research specialist at the Human Sciences Research Council.

Courtesy of The Conversation / INSP.ngo (This article is republished from The Conversation under a Creative Commons license.)


