Virtual assistants like Siri or Alexa often use female voices, sparking debate about gender stereotypes. Studies have found that presenting digital assistants as female can reinforce traditional ideas of women as helpful, submissive, or subordinate. As early as 2019, UNESCO warned that predominantly female-voiced assistants could normalize outdated gender roles, making women appear naturally suited to assisting rather than leading.
In response to this concern, some companies introduced gender-neutral voices. Yet research on auditory perception shows that users often feel uncomfortable hearing voices that do not fit clearly into male or female categories, and this discomfort can limit how widely such products are adopted. Users tend to prefer voices that align with familiar identities, a sign of how deeply ingrained societal expectations shape even our interactions with technology.
While addressing gender stereotypes in technology is important, we cannot expect artificial intelligence alone to change deep-rooted societal norms. Overcoming stereotypes requires structural changes in society, from education to workplaces.
Today, users can choose from various voice options on some devices. Offering diverse choices is a necessary step forward, allowing individuals to select voices that align with their preferences and values. Artificial intelligence gives us an opportunity to rethink gender representation, but it is essential also to question the underlying societal issues. The feminist movement and ethics debates should prioritize urgent matters beyond digital voices, recognizing that true change must tackle deeper cultural and social foundations. As society moves forward, combining technological progress with broader social reform will be key to genuinely challenging and transforming gender stereotypes.
Associate Professor at the University of Granada