'I am your female virtual assistant'

The other day I was watching one of those videos that get circulated on WhatsApp. This one was a collection of advertisements and jingles from when I was growing up. From Hamara Bajaj to Ajanta toothpaste, they all brought back a flood of memories: TVs with poor reception, and the anticipation with which we would look forward to our serials, which were not streamed on demand but made us wait a full week before they aired again.

Amidst all the recollections, what struck me particularly was the Prestige advertisement: Jo biwi se kare pyaar, woh Prestige se kaise kare inkaar (someone who loves his wife could not refuse her a Prestige appliance). It seems so strange that, when I was growing up, I did not find anything wrong with this ad. And this despite having the most progressive parents, who never exposed me to the thought that being a girl was in any way inferior.

But, as a child, not once did I find it objectionable that the inherent message of the advertisement was that the woman cooks and the man provides. The quality of the culinary equipment provided is directly proportional to the love one feels for one's wife, that is, if the Prestige tagline is to be believed.

We've certainly come a long way, in that biases that were not discernible then are now obvious to the next generation. But if we reflect deeply, we will realise that not much has changed. An overwhelming majority of virtual assistants and chatbots have female voices as the default — Alexa, Siri, HDFC's Eva, Kotak's Keya.

Everyday technology is constantly reinforcing gender biases. Women, the implication goes, are meant to patiently and sweetly help others with the mundane, administrative aspects of their lives. Of course, IBM's Watson, which demolished human competitors on Jeopardy!, had to have a male voice. Until recently, Siri's response to being called a woman of questionable repute was a coy "I'd blush if I could."

Though the ensuing furore has ensured that the responses to these particular slurs are now different, little has been done to change the inherent servility of these female voices. Scarier still is the advent of Artificial Intelligence and Machine Learning. Are we programming digital entities to absorb these gender biases at a subliminal level? Could this balloon into a bigger problem when these very same machines are used to shortlist potential job candidates? It is with much concern that I ponder the future of my confident daughters. Will they be able to thrive in this persistently patriarchal world? Only time will tell. Until then, "Alexa, can you turn up the volume?"
