Ask AI Companies to Offer Human Support for Users in Crisis

  • recipient: AI users, mental health advocates

As artificial intelligence becomes more embedded in our lives, it's crucial that we ensure it evolves with responsibility and compassion.

People often turn to AI when they're feeling overwhelmed, alone, or even in emotional crisis. Unfortunately, today's AI systems often lack built-in tools to recognize serious warning signs or to connect a user with the human help they may truly need.

We are calling on major AI developers, including OpenAI, Google DeepMind, Anthropic, xAI, Mistral, Cohere, and others, to implement a simple but powerful solution:

  • Include a baseline mental-health-awareness layer so the AI can detect signs of distress in conversation patterns (a simple sketch of this idea follows below).

  • Offer a gentle, private message, such as: "It sounds like you're going through something difficult. Would you like to talk to a human support line?"

  • Partner with or staff confidential, in-house mental health support, using licensed professionals bound by patient privacy laws, not surveillance or police reports.
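To make the first ask concrete, here is a minimal, purely illustrative sketch of what a distress-awareness layer might look like, assuming a simple phrase check standing in for a real, clinically informed classifier; the phrase list, function names, and wording are assumptions made up for illustration, not any company's actual implementation.

```python
# Illustrative sketch only: a hypothetical "distress check" an assistant could
# run on each user message before replying. A real system would use a trained
# classifier and clinical guidance, not a keyword list; everything named here
# (phrases, functions, wording) is an assumption for illustration.

DISTRESS_PHRASES = [
    "i can't cope",
    "i feel hopeless",
    "no one would miss me",
    "i want to give up",
]

GENTLE_OFFER = (
    "It sounds like you're going through something difficult. "
    "Would you like to talk to a human support line?"
)

def detect_distress(message: str) -> bool:
    """Return True if the message matches any simple distress phrase."""
    text = message.lower()
    return any(phrase in text for phrase in DISTRESS_PHRASES)

def respond(message: str, generate_reply) -> str:
    """Offer human support first when distress is detected; otherwise reply normally."""
    if detect_distress(message):
        return GENTLE_OFFER
    return generate_reply(message)

if __name__ == "__main__":
    print(respond("I feel hopeless and tired of everything.", lambda m: "..."))
```

The point of the sketch is the order of operations: the check runs before a normal reply is generated, and the result is an offer of human help that the user can accept or decline, never an automatic report.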

AI should be a tool for healing, not harm.

Let's create systems that don't just listen but know when to say, "It's okay to ask for help."
