Il genere digitale: identità e percezione di assistenti vocali e chatbot (Digital gender: identity and perception of voice assistants and chatbots)

Annapaola Vacanti
2024-01-01

Abstract

The contemporary professional landscape increasingly incorporates AI tools due to their accessibility and practical utility. Although often seen as neutral aids, these tools require careful examination to assess their fairness, value, and actual neutrality. From a gender studies perspective, AI exacerbates existing inequalities. Users tend to anthropomorphize AIs, often giving them nicknames. Voice assistants such as Siri, Alexa, and Google Assistant have traditionally had female voices, reflecting stereotypes of caregiving and assistance linked to traditional femininity. Recently, male voices have been introduced, and gender-neutral voices such as “Q” have emerged, created by blending voices across the gender spectrum to achieve a neutral frequency of 153 Hz. The evolving relationship between AI and gender studies is explored through a bibliographic review and case studies, revealing different design approaches in this field. While the gender debate around voice assistants is well studied, the gender identity attributed to chatbots such as ChatGPT, Copilot, and Bard is less explored. User biases and training-data biases contribute to the maintenance of gender binaries, with professional AI support perceived as male and voice assistants as female. The article presents the results of a study conducted in early 2024: an online survey explored public perceptions of the gender of common assistants, covering both voice assistants and chatbots. Data were analyzed by gender, age, and professional background. The discussion highlights differences in gender perception among user groups and offers insights into design choices that can influence and educate users about gender biases, promoting neutral and inclusive characteristics in AI technology.
Files in this record:
GUDHYPERHUMAN-compresso.pdf (authorized users only)
Type: Publisher's version
License: Publisher's copyright
Size: 475.98 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11578/347469