Ask AI Companies to Offer Human Support for Users in Crisis

  • Recipient: AI users, mental health advocates

As artificial intelligence becomes more embedded in our lives, it's crucial that we ensure it evolves with responsibility and compassion.

People often turn to AI when they're feeling overwhelmed, alone, or even in emotional crisis. Unfortunately, today's AI systems don't have built-in tools to recognize serious red flags or offer the human help a user might truly need.

We are calling on major AI developers, including OpenAI, Google DeepMind (Gemini), Anthropic, xAI, Mistral, Cohere, and others, to implement a simple but powerful solution:

  • Include a baseline mental-health-awareness layer so the AI can detect signs of distress in conversation patterns.

  • Offer a gentle, private message, such as: "It sounds like you're going through something difficult. Would you like to talk to a human support line?"

  • Partner with or staff confidential, in-house mental health support, using licensed professionals bound by patient privacy laws, not surveillance or police reports.
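
To make the ask concrete, here is a minimal, purely illustrative sketch of what such a layer might look like. The phrase list, the detect_distress function, and the wording of the referral are placeholders invented for this example, not any company's actual safeguard or clinical criteria.

```python
# Illustrative sketch only: a hypothetical "mental-health-awareness layer"
# that screens a user's message for possible signs of distress and, when
# found, offers an optional handoff to a confidential human support line.
# The phrase list and matching rule below are placeholders, not clinical tools.

DISTRESS_PHRASES = [
    "i can't go on",
    "i feel hopeless",
    "no one would miss me",
    "i want to hurt myself",
]

REFERRAL_MESSAGE = (
    "It sounds like you're going through something difficult. "
    "Would you like to talk to a human support line?"
)


def detect_distress(message: str) -> bool:
    """Return True if the message contains any placeholder distress phrase."""
    lowered = message.lower()
    return any(phrase in lowered for phrase in DISTRESS_PHRASES)


def respond(message: str) -> str:
    """Offer a gentle, private referral when distress is detected."""
    if detect_distress(message):
        return REFERRAL_MESSAGE
    return "..."  # the assistant's normal reply would go here


if __name__ == "__main__":
    print(respond("Lately I feel hopeless and I can't go on."))
```

In a real deployment, the simple keyword check would be replaced by clinically validated detection, and the referral would connect the user to licensed professionals bound by patient privacy laws, as the petition asks.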

AI should be a tool for healing, not harm.

Let's create systems that don't just listen but know when to say, "It's okay to ask for help."
