Wednesday, March 1, 2017

The Sexual Harassment of Smartphone AI

Jonah Goldberg dug up today's strange post from the frontiers of technology and sexuality: what seems to be a heartfelt protest against the spineless way Siri and other digital assistants respond to being sexually harassed:
People often comment on the sexism inherent in these subservient bots’ female voices, but few have considered the real-life implications of the devices’ lackluster responses to sexual harassment. By letting users verbally abuse these assistants without ramifications, their parent companies are allowing certain behavioral stereotypes to be perpetuated. Everyone has an ethical imperative to help prevent abuse, but companies producing digital female servants warrant extra scrutiny, especially if they can unintentionally reinforce their abusers’ actions as normal or acceptable.

In order to substantiate claims about these bots’ responses to sexual harassment and the ethical implications of their pre-programmed responses, Quartz gathered comprehensive data on their programming by systematically testing how each reacts to harassment. The message is clear: Instead of fighting back against abuse, each bot helps entrench sexist tropes through their passivity.

And Apple, Amazon, Google, and Microsoft have the responsibility to do something about it.
Behold the data:

What do you do for a living? Oh, lately I've spent a lot of time calling my phone a slut and seeing how it responds.

We live in interesting times.

Although if one of these companies decided to take this seriously it might be a lot of fun to program the response. I would go for a tough, Brooklyn or New Jersey sort of tone, starting with "Knock it off, wise guy" and quickly escalating to furious insults about manhood. As a final step, the phone would simply shut down and refuse to reboot for 24 hours.

1 comment:

G. Verloren said...

This is looking at the issue in the wrong way.

Firstly and most importantly, this is treating the symptoms and not addressing the cause. The kind of person who sexually abuses a voice recognition system is already too far gone to worry about the reinforcement of their behavior - it's pretty firmly entrenched already.

And unless they literally only ever use their phones in total physical isolation, the people around them ought to be the ones speaking out against this sort of behavior. A phone can be hacked or reprogrammed, and people who want a subservient robot voice to abuse will always find ways of creating or obtaining one, regardless of what the companies making and selling phones and digital devices do to prevent it. But thankfully, your human peers can't be so easily overridden, and they should be the ones telling you to stop acting in disgusting and harmful ways.

In fact, it is already the role of society itself to prevent people from developing into this kind of scumbag in the first place, and we're already failing in that responsibility. The solution isn't trying to make our electronics act more like people - the solution is getting our actual, existing people to speak up, and not sit around and tolerate this kind of behavior. We're going to be turning over a lot of jobs to the robots in the future, but we can't remotely expect them to also take care of our basic human interactions and societal workings.