Behind The Feminine Facade: Gender Bias in Virtual Assistants and its Effect on Users

Author(s): Shaivya Singh, Anand Kumar, Shormita Bose
Subject(s): Gender Studies, Media studies, Psycholinguistics, Sociolinguistics, ICT Information and Communications Technologies
Published by: Transnational Press London
Keywords: virtual assistants; gender bias; stereotype; linguistic equity; digital interaction

Summary/Abstract: The paper examines gendered voice in virtual assistant responses. It investigates the gender biases these systems display by looking at popular assistants such as Siri, Alexa, Cortana, and Google Assistant, which are known for defaulting to female voices. Using both quantitative and qualitative methods, the study explores how gender bias in VA design affects user experience and examines how virtual assistants can be programmed and designed for inclusive interaction that challenges societal gender norms. Drawing on pragmatics, sociolinguistics, and gender studies, it analyzes how these assistants encode and reproduce gender bias, and it highlights the importance of linguistic equity in digital interaction.

  • Issue Year: 3/2024
  • Issue No: 4
  • Page Range: 351-356
  • Page Count: 6
  • Language: English