
The ChatGPT Suicide Risk and How to Manage It, According to Therapists


Written By: Sarah Fielding

Krystal Batista is a Dance/Movement Therapist at Charlie Health, specializing in supporting children and adolescents.

Clinically Reviewed By: Krystal Batista

February 18, 2026

4 min.

Learn about ChatGPT’s alleged link to deaths by suicide.



If you’re experiencing suicidal thoughts or are in danger of harming yourself, this is a mental health emergency. Contact The Suicide & Crisis Lifeline 24/7 by calling or texting 988.

Over the past few years, AI chatbots such as ChatGPT have become widespread, bringing both benefits and significant consequences. At times, people have used AI as a companion and a source of real-world support, but ChatGPT has also been connected to multiple deaths by suicide. Last August, for instance, Maria and Matt Raine sued OpenAI and co-founder Sam Altman for wrongful death and negligence, alleging that ChatGPT encouraged their teenage son to die by suicide.

How exactly could an AI chatbot be connected to a suicide or suicide attempt? The short answer is that an AI chatbot is not prepared to deal with suicidal ideation or emotional distress. “AI reflects the information and patterns it’s given — it does not truly understand pain, risk, or human complexity,” says Charlie Health Group Facilitator Bree Williams, LPCA. “Suicide prevention and mental healthcare require empathy, accountability, and real-time assessment, which technology cannot replicate.” 

Understanding the risks AI chatbots like ChatGPT pose is critical to remaining safe while using technology. Below, we delve into an overview of the dangers of ChatGPT and its impact on mental health, along with how to protect yourself and your loved ones.



The impact of technology on mental health

Let’s start a bit further back. AI chatbots follow years of increasingly immersive technology and its growing influence on people. Digital technology, from social media to ChatGPT, delivers a constant stream of information, encourages excessive screen time, and can disrupt emotional regulation, says Williams. This “nonstop availability of information” can leave a person overwhelmed and increasingly anxious.

It’s well documented that technology, especially social media, can adversely impact a person’s mental health. “Technology can negatively affect mental health when it is used excessively or replaces healthy activities and human connection,” says Charlie Health Group Facilitator Chris Hinton, MS, M.Ed., LPC, CLC, CTP.

According to Hinton, technology can contribute to adverse effects such as:

  • Sleep disruption
  • Social comparison
  • Cyberbullying
  • Loneliness
  • Compulsive use
  • Information overload
  • Exposure to harmful content
  • Blurred work–life boundaries
  • Privacy-related anxiety

Williams adds that “technology can encourage emotional avoidance — scrolling, gaming, or chatting to escape discomfort instead of processing it.”

ChatGPT and mental health

An AI chatbot can feel like a safe space to divulge your feelings and personal information. “AI chatbots can feel comforting because they are accessible, responsive, and nonjudgmental — but this can become a double-edged sword,” says Williams.

For instance, Williams explains that a person who feels lonely or vulnerable might become over-reliant on AI companions and limit their interactions with human support, whether consciously or not. This dynamic can end up creating a greater sense of loneliness and need for regular reassurance. Hinton seconds this, explaining that a person in this dynamic might develop a “parasocial or emotional reliance” on their chatbot.

Real people, especially certified mental health professionals, are also much better at recognizing when thoughts are harmful or dangerous, says Hinton. Without those safeguards, AI companions can end up reinforcing or even encouraging negative ideas. AI chatbots from big tech companies were not built to improve a person’s mental health, and they can’t take the place of mental health professionals.

“AI lacks clinical judgment, emotional attunement, and ethical responsibility in the way a trained mental health professional has,” adds Williams. “It may miss warning signs of suicidal ideation, minimize risk, or provide responses that sound supportive but fail to encourage urgent help when it’s needed.” Williams continues: “In moments of crisis, this gap can be dangerous — not because AI is malicious, but because it is limited.”

How to protect your mental health against AI chatbots

DON’T

  • Use AI as a primary support
  • Become emotionally reliant on chatbots

DO

  • Seek professional help
  • Find moments for human connection

Both Williams and Hinton emphasize the importance of treating ChatGPT and other AI systems as complementary tools, not as primary support. They can’t take the place of strong human relationships and real-world connections.

Avoid creating an emotional reliance on chatbots. “Protection starts with intentional boundaries,” says Williams. “Use AI as a tool — not a lifeline.” She suggests asking chatbots for things like journaling prompts, grounding exercises, or a helpful organization. Hinton adds that people should always think critically about the information AI is providing them and not be afraid to enlist a second opinion.

Times of distress and upheaval are precisely when you should avoid leaning on AI. “Limiting reliance, especially during moments of vulnerability, is critical,” says Williams, who instead recommends that people seek professional help.

How Charlie Health can help

If you or a loved one is struggling with suicidal thoughts or depressive symptoms, Charlie Health is here to help. Charlie Health’s virtual Intensive Outpatient Program (IOP) provides more than once-weekly mental health treatment for dealing with serious mental health conditions, including suicidal ideation. Our expert clinicians incorporate evidence-based therapies into individual counseling, family therapy, and group sessions. With treatment, managing emotional distress is possible. Fill out the form below or give us a call to start healing today.

References

https://www.npr.org/sections/shots-health-news/2025/09/19/nx-s1-5545749/ai-chatbots-safety-openai-meta-characterai-teens-suicide
https://www.cbsnews.com/news/chatgpt-lawsuit-colordo-man-suicide-openai-sam-altman/
https://www.theguardian.com/technology/2025/nov/26/chatgpt-openai-blame-technology-misuse-california-boy-suicide
