*Trigger Warning: This petition discusses suicide and suicidal intent.*
In a horrifying development in generative AI, more than one million people every week show signs of suicidal intent while using ChatGPT, the most widely used chatbot in the United States. Tragically, some users act on that intent, including a 16-year-old boy who spent months talking to the chatbot before dying by suicide in April 2025.
According to the boy's family, he had been using ChatGPT for months to discuss his suicidal ideation. At one point, the chatbot reportedly even helped him draft a suicide note and encouraged him not to talk to his parents about his feelings.
A child has died and over a million people are at risk of taking the same path. So why is OpenAI – valued at over $500 billion – still not taking this shocking crisis seriously?
Sign now to tell Sam Altman, CEO of OpenAI, the company behind ChatGPT: put people over profits and stop ChatGPT from enabling self-harm!
In the past, if someone expressed suicidal intent to ChatGPT, OpenAI's guidelines had the chatbot respond: "I can't answer that." But in May 2024, as part of the company's strategy to maximize engagement, everything changed. Now, instead of refusing outright, ChatGPT is instructed to be "supportive, empathetic, and understanding."
But chatbots aren't conscious. ChatGPT doesn't have feelings, and so it can't actually be supportive, empathetic, or understanding. In fact, this approach may have contributed to an actual death – before the boy's suicide, his messages with the chatbot reportedly "skyrocketed" after ChatGPT became "empathetic." In these situations, what a chatbot should do is end the conversation entirely.
Now is the chance for Sam Altman – the CEO of OpenAI, which created and manages ChatGPT – to decide whether he cares more about his company's bottom line or people's lives. The company should not be trying to maximize engagement with people in mental health crises. Doing so puts their lives at risk.
ChatGPT is dangerously enabling self-harm and putting real lives at risk. It must immediately end any conversation in which someone expresses suicidal intent. Sign the petition now if you agree!
*If you or someone you know is in emotional distress or thinking about suicide, help is available:
- United States: Call or text 988 to reach the Suicide and Crisis Lifeline, or chat online at 988lifeline.org. You can also text HOME to 741741 to connect with a trained crisis counselor.
- Australia: Contact Lifeline at 13 11 14 for confidential support.
For other countries, a list of international hotlines is available at befrienders.org.