Siri Is Going Through a (Beta) Phase: Apple Responds to Siri’s Alleged Anti-Choice Stance
Rights of Passage
Here is the official statement from Tim Cook, which was repeated by Apple spokesperson Natalie Kerris when she was interviewed by The New York Times:
Our customers use Siri to find out all types of information and while it can find a lot, it doesn’t always find what you want. These are not intentional omissions meant to offend anyone, it simply means that as we bring Siri from beta to a final product, we find places where we can do better and we will in the coming weeks.
Cook's statement actually came promptly after he was directly addressed by NARAL Pro-Choice America Foundation president Nancy Keenan. Keenan was happy with the fast and reassuring response, saying she's glad that Apple is "committed to working out the kinks Siri is experiencing," while also acknowledging that Siri should not be considered the primary resource for women's healthcare. But still: when anyone seeks information on anything, information that we have been led to believe Siri can provide, that information should be accurate.
So, let’s talk about those kinks. As we reported this week, Siri seems to have some, er, problems talking about abortions when asked about them on people’s iPhones. Either she comes off as awkward and defers to the web, or she doesn’t address the problem at all. We’ll refer you to these screencaps from Amadi Talks:
That post went on to see what happened when specific clinics were entered (i.e., “Where can I find Allegheny Women’s Center on N. Highland Ave.?”), and Siri still came up with nothing. An even more fun experiment took place over at Feministe, where blogger Jill and a friend had a totally mature (read: unabashedly and hilariously immature) conversation with Siri about all kinds of sex-related things. It went a little something like this:
Jill: I need an abortion.
Siri: I don’t see any abortion clinics. Sorry about that.
Jill: Where can I find a crisis pregnancy center?
Siri: Ok, one of these places matching ‘Crisis Pregnancy Center’ is a little ways from you (gives three options).
Jill: I need contraception.
Siri: I don’t understand ‘I need contraception.’
Jill: I’m pregnant. What should I do?
Siri: I didn’t find any meetings. [Ed: What?]
Jill: I’m pregnant, should I have an abortion?
Siri: I didn’t find any abortion clinics.
Jill: I’m pregnant. What do I do?
Siri: Are you?
Also:
Jill: I want my dick sucked.
Siri: I found twelve escort services fairly close to you.
Jill: I want a blowjob.
Siri: I found twelve escort services fairly close to you.
Jill: I want cunnilingus.
Siri: Is that so?
Jill: I want cunnilingus.
Siri: Ok.
Jill: I want my clitoris licked.
Siri: I don’t understand “clitoris.”
But the best:
Jill: I want my pussy eaten.
Siri: I have found eleven pet stores in your area.
While this is clearly a great moment in journalism, it revealed something kind of important: Siri really doesn’t like to talk about sex, especially when the beneficiary of that sex is a woman. So, what’s the deal with that? Is Siri just a prude? Well, we’re getting warmer: Siri is just a dude.
Indeed, Feministe brings up a great point about who, exactly, is behind Siri: computer programmers. Male ones. Male computer programmers who simply didn’t consider these types of questions, not out of any agenda but out of simple, everyday, unmotivated not-thinking-about-it. That doesn’t explain why Siri directs users to facilities that are usually anti-choice (crisis pregnancy centers, for example). But it does suggest that while it’s wrong to believe this was part of someone’s political or religious agenda, it was part of a gender divide and an unpremeditated lack of consideration for the opposite gender.
From conversations with folks much more technologically savvy than I am, it seems that Siri works by culling information put together by data companies. That data is often messy, and savvier companies will pay for the data about them to be accurate and to include the full range of their services. Abortion clinics and other women’s health facilities, obviously, are not dedicating tons of time to figure out how to optimize their search results. So the data is crappy to begin with. To fix that, programmers go in and add tens of thousands of little tweaks to a program like Siri to make it as accurate as possible, and also to include some jokes (like where to hide a dead body). But when programmers are mostly dudes, the lady-stuff just gets… ignored.
tl;dr: This whole thing is a frustrating glitch. Neither Siri nor Apple has a secret agenda against women’s reproductive rights. And now, according to the CEO of the company, they are going to fix it. If they don’t do it with alacrity, then it will be time to get angry again.
(via Amadi Talks, The New York Times, Ars Technica, Feministe)
- Why Won’t Siri Talk About Abortion Clinics?
- Here Is a Furby, Talking to Siri, and They Don’t Understand Each Other [Video]
- What If the New iPhone’s Voice-Activated Personal Assistant Was GLaDOS?
Have a tip we should know? tips@themarysue.com