Alexa, Siri, Cortana: Our virtual assistants say a lot about sexism - Science Friction
OK, Google. We need to talk.
For that matter — Alexa, Siri, Cortana — we should too.
The tech world's growing legion of virtual assistants added another to its ranks last month, with the launch of Google Home in Australia.
And like its predecessors, the device speaks in dulcet tones and with a woman's voice. She sits on your kitchen table — discreet, rotund and white — at your beck and call and ready to respond to your questions.
But what's with all the obsequious, subservient small talk? And why do nearly all digital assistants and chatbots default to being female?
A handmaid's tale
Feminist researcher and digital media scholar Miriam Sweeney, from the University of Alabama, believes the fact that virtual agents are overwhelmingly represented as women is not accidental.
"It definitely corresponds to the kinds of tasks they carry out," she says.
"Service work, domestic labour, healthcare, office assistants — these are all industries which are heavily feminised — and also often low paid and low status, with precarious work conditions.
"We're seeing the same with digital labour online."
And we seem to like it that way.
Studies show that users anthropomorphise virtual agents — relating to them as human — and are more receptive to them if they are empathetic and female. But it can depend on what the device is being tasked to do.
If it's role-playing an engineer or lawyer, for example, users prefer their bots to be blokes — reverting to old-fashioned stereotypes.
Service bots clearly have a job to do, and they wouldn't be doing it very well if they alienated users. But do they really need to be female in order to be functional or relatable?
"Many have been written to sound demure to the point of being extremely forbearing and really passive — almost eerily so," New York-based writer Jacqueline Feldman says.
"The gendering of these bots and AI assistants tells us what their makers picture as a perfect servant, a perfect worker."
Gender bending chatbots
Last year, Ms Feldman got a job designing the personality of a new chatbot called Kai. Programmed to help users with basic banking tasks, it learns iteratively via a machine learning algorithm.
"I wanted it to be personable, but not a person — I wanted the bot to express itself as a bot," she says.
She decided to buck the trend and make it genderless too — and her employer, AI company Kasisto, was open to the possibility.
"To assign a piece of technology a gender within the binary is so regressive to me," she says.
"We don't have to make the technologies around us have genders in order to like them."
Ms Feldman suspects our affinity for fembots has a more insidious root: a captive digital persona that can't walk away from us feels less bizarre if it is a woman rather than a man.
But when you ask Apple's virtual assistant Siri whether it's female, it insists it is genderless.
"I exist beyond your human concept of gender. In my realm, anything can be anything," Siri says.
Press Siri on the issue further, and it insists "animals and French nouns have genders, I do not … I am genderless, like cacti and certain species of fish."
Siri might just be asserting its right to be gender neutral or even gender fluid. You can change its default voice to be male. But its name means "beautiful woman who leads you to victory" in Norwegian.
Sexual harassment: there are no limits
According to Dr Sweeney, research indicates that virtual assistants like Siri and Amazon's Alexa find themselves fending off endless sexual solicitations and abuse from users.
So what does our treatment of this growing virtual workforce say about us?