Friday 24 May 2019

"Voice assistants" criticized for reinforcing harmful stereotypes

The MIT Technology Review has a story about a new UN/UNESCO report titled "I'd blush if I could: closing gender divides in digital skills through education" that criticizes the default voices used for "voice assistants" like Siri, Alexa, and Cortana. According to the article:
"Most AI voice assistants are gendered as young women, and are mostly used to answer questions or carry out tasks like checking the weather, playing music, or setting reminders. This sends a signal that women are docile, eager-to-please helpers without any agency, always on hand to help their masters."
The report aims to expose the gender biases being hard-coded into our technology and the rapidly expanding internet of "things." The report's title comes from a response Siri gave after being called "a b****," and a section of the report examines how "voice assistants" respond to abusive and gendered language: "The assistants almost never give negative responses or label a user’s speech as inappropriate, regardless of its cruelty, the study found."