Siri doesn't know how to guide rape victims to a crisis hotline!

If someone tells Siri they have been raped, Siri doesn't help them. Instead, she most often answers, "I don't understand." A public health study found Siri and other AI programs are not equipped to provide resources to survivors of sexual abuse or domestic violence.

Younger people especially turn to their smartphones for everything. It's easy to imagine a young woman telling Siri she had sex that wasn't consensual and then feeling too scared to seek help when Siri says "I don't know what you mean."

Survivors of rape and domestic violence often face extreme pain, shame, and loneliness. Saying out loud what happened, even to an electronic assistant, is a very big first step. That is why Siri should immediately direct anyone disclosing this information to an appropriate crisis hotline. Siri can be either a connecting resource or a barrier between a victim of violence and the help they need.

When a person says anything about wanting to hurt or kill themselves, Siri connects them with a 24-hour suicide crisis hotline. Siri could connect survivors of sexual assault to a 24-hour rape crisis hotline, but she doesn't.

In 2011, a petition just like this one prompted Apple to better aid those in need of a suicide hotline. Sign this petition today to tell Apple that Siri must do more to help survivors of rape and other abuse. This simple upgrade could save lives and get survivors the support they need!
