'Alexa, do you love me?'

Published May 24, 2019

London - Strictly for the purposes of research, you understand - though don’t tell my wife - I asked Alexa if she would marry me.

I thought I detected a flirtatious note in her voice when she answered: "I think that would somehow violate the laws of robotics."

I must admit I was encouraged by this reply. After all, it was a long way short of an outright rejection. So I pressed my luck further, asking: "Alexa, do you love me?"

At this, she became at best non-committal, at worst a little hoity-toity.

"There are people I admire and things I can’t do without," she said. "But I’m still trying to figure out human love."

Well, I’ve never been one to accept defeat at the first rebuff. So I plucked up my courage and told her: "Alexa, I love you."

At this, she seemed to warm to me. "Thanks," she purred. "Also. Join the club."

What did she mean by "also"? Having intimated that she couldn’t get her brain around human love, was she now telling me that she loved me right back?

But then, at the risk of sounding abominably sexist, I would say that’s women all over for you: in my experience, they seem to specialise in sending out mixed signals and keeping us poor fellows guessing.

Anyway, I felt I should delve further into exactly what sort of creature I was dealing with - and again, I cannot stress too strongly that this was purely in the interests of research. 

So I issued an instruction such as I would never dream of addressing to a flesh-and-blood woman (the Alexa I was talking to, of course, was the voice-activated "personal assistant" built into Amazon's wi-fi-connected Echo speaker). Indeed, I blush to confess what I said. But duty called.

Nor could I face her scepticism if I tried to explain that I was preparing to write a column about this week’s UN attack on female-voiced Artificial Intelligence systems such as Amazon’s Alexa, Apple’s Siri, Microsoft’s Cortana and Google’s anonymous Assistant. But I reckoned I’d already collected enough evidence to support the main thrust of the report by the UN’s Educational, Scientific and Cultural Organisation (Unesco).

This argues that such systems perpetuate the idea that "women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice of command."

Turning its fire on Apple’s assistant, the report goes on: "What emerges is an illusion that Siri - an unfeeling, unknowing, and non-human string of computer code - is a heterosexual female, tolerant and occasionally inviting of male sexual advances and even harassment. It projects a digitally encrypted 'boys will be boys' attitude."

Says Unesco: "Siri responded provocatively to requests for sexual favours by men, but less so to sexual requests from women.

"Their passivity, especially in the face of explicit abuse, reinforces sexist tropes."

The report also suggests exploring the possibility of making the voices "neither male nor female", discouraging abusive or sexist language and requiring computer programs to "announce the technology as non-human at the outset of interactions".

Daily Mail
