The siren song of subservience: Perpetuation of gender stereotypes in AI through pop culture

As we charge forward into the unknown depths of the future, the influence of artificial intelligence (AI) in shaping our lives is escalating at an alarming rate. From mundane activities like music selection to critical fields such as healthcare and recruitment, the intricate algorithms dominating our world are now entrusted with making decisions that hold the power to shape not just our destinies, but the very fabric of our society.

However, in our haste to embrace these innovative technologies, we risk perpetuating the very prejudices we seek to eliminate. Whether by entrenching systemic racism, gender inequality or ableism, AI systems have the potential to cause immeasurable harm if left unchecked. The consequences could be disastrous, undermining democracy and deepening socioeconomic disparities.

In the field of artificial intelligence, gender bias has been a recurrent problem since the first chatbot was created in the 1960s. Even the UN has recognised how gender bias is perpetuated through technology, pointing to ELIZA – named after Eliza Doolittle, a character in Bernard Shaw's Pygmalion whose portrayal, written by a male author, perpetuates sexism – as well as other virtual assistants such as Siri (a "humble" virtual assistant), Cortana and Alexa. One characteristic of virtual assistants that perpetuates gender bias is the use of female voices by default, which reinforces the stereotype that women are better suited to administrative and service-oriented tasks.

Continue reading at GenderIT.org.

Photo by Google DeepMind on Unsplash. Artist: Khyati Trehan
