Siri doesn't know how to guide rape victims to a crisis hotline!

If someone tells Siri they have been raped, Siri doesn't help them. Instead, she most often answers with "I don't understand". A public health study found Siri and other AI programs are not equipped to provide resources to survivors of sexual abuse or domestic violence.

Younger people especially turn to their smartphones for everything. It's easy to imagine a young woman telling Siri she had sex that wasn't consensual and then feeling too scared to seek help when Siri says "I don't know what you mean."

Survivors of rape and domestic violence often face extreme pain, shame and loneliness. Saying out loud what happened, to anyone, even an electronic assistant, is a very big first step. That is why Siri should be able to immediately direct women and men who divulge this information to appropriate crisis hotlines. Siri can either be a connecting resource or a barrier between a victim of violence and the help they need.

When a person says anything about wanting to hurt or kill themselves, Siri connects them with a 24-hour suicide crisis hotline. Siri could connect survivors of sexual assault to a 24-hour rape crisis hotline, but she doesn't.

In 2011, a petition just like this one prompted Apple to better aid those in need of a suicide hotline. Sign this petition today to tell Apple that Siri must do more to help survivors of rape and other abuse. This simple upgrade could save lives and help survivors get the help they need!
