The DePaulia

The Student Newspaper of DePaul University

Female digital assistants represent gender stereotypes


Asking a digital assistant to complete a task has become routine. A question about the weather? Ask Siri. Need a cup of coffee? Ask Alexa. Need directions? Ask the GPS. These types of requests are designed for the everyday user, whose life is made easier by being able to ask a handheld “digital assistant” any question that comes to mind.

Each assistant is designed for its own set of duties, but they share one commonality: they all come preset with female voices and names.

Created by major tech companies like Apple, Amazon and Microsoft, and built into navigation systems, these digital assistants have become a normalized part of everyday life.

But the notion of a preset feminine voice is unsettling when you are constantly asking it to complete tasks for you. And while users can change the voice setting on every service besides Alexa, it can be argued that the preset feminine voices and names reflect our society’s constructs around gender.

Feminine preset voices for digital assistants conform to gender roles so thoroughly that it rarely comes to a user’s immediate attention that the robotic voice has a female pitch, and that in itself should concern us as a society.

“I didn’t have any reaction to (Siri being female), even when I was aware,” senior Tom Fagan said. “I don’t know if this is oblivious or not, but I never really put much thought into it. I always pictured it as more of a robot voice, not to say I wasn’t aware it was female.”

It is worth considering how minuscule actions in everyday life can be representative of a bigger picture. Even technology is a medium where culture can be held accountable for deeper meanings.

“I’ve thought about this since Siri was introduced. I’ve heard some peers say they believe we use female voices because they are ‘softer,’ but I think it stems from the history of female house maids and stay-at-home housewives,” junior MacKenzie Carlock said. “When there are so many people who still believe that women should be the homemaker it is hard to believe that ideology didn’t carry over when creating voices for artificial intelligence.”

Blake Paxton, a gender and communication professor, explains how this can be an example of the gender division of labor, a sociological theory that associates men and women with specific jobs because of characteristics assumed to be innate to each gender.

“Female and only female voices on these systems is one little thing that we don’t always question,” Paxton said. “I think that it definitely perpetuates and leads to higher systemic inequality. Not that it is causing the inequality, but it’s a representation of things that are happening at the higher level.”

The fact that we as a population generally accept these assistants with feminine tones and names is representative of a binary culture. Women are automatically associated with the role of an assistant rather than an authority, and so the decision was made to give the robot a female voice.

“We want to make it a female voice because that is the ‘female way.’ They are the assistants, they are the help. They are the ones that are going to help answer our questions for us,” Paxton said. “We are feeding into that gender division of labor; a lot of secretaries and administrative assistants are women. It’s been constructed as women’s work, and making these voices female feeds into that dichotomy.”

While it would be incorrect to assume the creators of Siri, Alexa and Cortana set out to purposely further gender stereotypes, their decision to give a technological assistance program a female persona needs to be considered.

“If you asked the people who were making these decisions, what they would probably respond with is ‘it’s very marketable, women’s voices are soothing, we feel like the (consumer) demographics would be a lot more comfortable, it could be enticing for men,’” Paxton said. “They are not going to think into it.”

But in reality, women’s voices are criticized more harshly than men’s in the workforce.

Reports by National Public Radio (NPR) show how women are chastised more than men for speech tendencies such as vocal fry and upspeak. Vocal fry is a low, croaky tone; upspeak is a rising pitch at the end of sentences that can make the speaker sound unsure.

After being criticized for the same tendencies, NPR reporter Jessica Grose sought help from a vocal coach to sound more professional and decided to report on the phenomenon herself.

Penny Eckert, a professional linguist, made the point in an interview with Grose that while people are policing women’s voices, nobody is policing younger men’s language for the same tendencies.

A study from the University of Miami concluded that women who speak with these tendencies are perceived negatively.

“Women who speak in vocal fry are perceived as less attractive, less competent, less educated, less trustworthy and ultimately less hirable,” Casey A. Klofstad, corresponding author of the study, said. “Given this context, our findings suggest that young women would be best advised to avoid using vocal fry when trying to secure employment.”

So while women face harsher criticism in the workforce for these speaking tendencies, giving digital assistants feminine voices appears nonsensical; it furthers the point that these voices are feminine because of gender stereotypes.

“The choices that people are making about what voice to use or gender to represent says something about inequality or sexism that is still present in our country,” Paxton said.

And if the intention is not to associate women with the role of a personal digital assistant, the narrative should be flipped. Perhaps a male voice should be the preset, rather than an option, when the next digital assistant is created. Or, even better, programmers should consider giving the digital assistant a non-gendered robotic voice to match the nature of the program. At their core, assistants such as Siri and Alexa are robots meant to help their users with menial tasks. As a society working toward reducing gender roles, programmers should consider how feminine digital assistants reinforce gender stereotypes.

“I think if we had originally introduced these voices with robot names, like MX100 for example, and voices that were not clearly male or female rather than gendered names and voices then all would’ve been fine,” Carlock said. “It’s the fact that we have to stray away from something that has already been normalized that is the issue. People wouldn’t be happy about the change, thinking that it’s due to feminist sensitivity, as ridiculous as that sounds.”
