Researchers Who Think Voice Assistants Like Siri Perpetuate Gender ‘Stereotypes’ Have A Genderless Solution

Evie Fordham | Politics and Health Care Reporter

A group of researchers who believe tech’s current offering of mainly male and female voice assistants “perpetuates stereotypes” has put money and time into Q, a “genderless” voice assistant.

The artificial intelligence assistant uses a voice with a frequency of around 145 Hertz, which is believed to fall between the frequencies of typical male and female voices, according to Geek.com.
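The 145 Hertz figure is easiest to appreciate by ear. Below is a minimal Python sketch, not affiliated with the Q project, that synthesizes a one-second pure tone at that frequency and writes it to a WAV file; a real assistant’s voice is recorded speech whose fundamental frequency sits near that pitch, not a sine wave. The file name and duration here are arbitrary choices.

```python
# Illustrative only: generate a one-second sine tone at 145 Hz -- the
# approximate pitch reported for Q -- so the frequency can be heard.
# Uses only the standard-library wave module plus NumPy.
import wave

import numpy as np

SAMPLE_RATE = 44_100    # samples per second (CD quality)
FREQUENCY_HZ = 145.0    # reported pitch of the Q voice
DURATION_S = 1.0        # arbitrary clip length

# Sample a sine wave at the target frequency.
t = np.linspace(0.0, DURATION_S, int(SAMPLE_RATE * DURATION_S), endpoint=False)
samples = 0.5 * np.sin(2.0 * np.pi * FREQUENCY_HZ * t)  # half amplitude avoids clipping

# Convert to 16-bit PCM and write a mono WAV file.
pcm = (samples * 32767).astype(np.int16)
with wave.open("tone_145hz.wav", "wb") as f:
    f.setnchannels(1)            # mono
    f.setsampwidth(2)            # 2 bytes = 16-bit samples
    f.setframerate(SAMPLE_RATE)
    f.writeframes(pcm.tobytes())
```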

Q’s creators asked visitors to their website to share the voice assistant with tech companies like Twitter and Apple. The site’s “About” section states:

Technology companies often choose to gender technology believing it will make people more comfortable adopting it. Unfortunately this reinforces a binary perception of gender, and perpetuates stereotypes that many have fought hard to progress. As society continues to break down the gender binary, recognising [sic] those who neither identify as male nor female, the technology we create should follow.

Who are Q’s creators? The voice assistant is backed by a team including Copenhagen Pride and Vice’s creative agency Virtue. They unveiled Q at South by Southwest in Austin, Texas, on March 11, according to AdWeek. (RELATED: Tech Exec Hired By DNC After Embarrassing Email Leaks Leaves For Social Justice Org Run By Steve Jobs’s Widow)

A man uses ‘Siri’ on the new iPhone 4S after being one of the first customers in the Apple store in Covent Garden on October 14, 2011 in London, England. (Photo by Oli Scarff/Getty Images)

“It’s going to become an increasingly commonplace way for us to communicate with tech,” Project Q collaborator Julie Carpenter, a researcher with the Ethics and Emerging Sciences Group, said, according to WIRED. “Naming a home assistant Alexa, which sounds female, can be problematic for some people, because it reinforces this stereotype that females assist and support people in tasks.”

Follow Evie on Twitter @eviefordham.

Send tips to evie@dailycallernewsfoundation.org.

