AI for Mental Health: Understanding the Risks
Written By: Ashley Laderer
Clinically Reviewed By: Brooke Cortez
January 2, 2026
7 min.
Do you find yourself relying on AI for mental health support? If so, it’s important to be aware of the risks. Read on to learn more.
For better or worse, AI has become a go-to source of emotional support for many — seemingly overnight. Even if you haven’t personally sought out validation or vented to ChatGPT at 2 A.M., it’s likely someone you know has.
While mental health stigma has decreased and mental health care has become more accessible over the past several years, many people still can’t get the help they need when they need it. Therapy may feel inaccessible, expensive, or even intimidating, and that’s where AI comes in.
Read on to learn more about the risks of using AI for mental health, the safety of chatbots, and the dangers of using AI instead of seeking help from a real mental health professional.
Introduction to AI for mental health
AI is increasingly being used as a source of companionship and even therapy-like guidance. From general-purpose generative AI models (like ChatGPT or Gemini) to AI “companions” and AI mental health apps, these tools promise quick access, validation, and coping strategies without the access barriers of traditional mental health care.
However, while AI can feel helpful in the moment, it also comes with major limitations and real risks.
How many people use AI for mental health?
The number of people using AI for mental health is growing. Harvard Business Review reported that in 2024, therapy/companionship was the second-most-popular use case for generative AI, and in 2025, it moved up to the number one spot.
Estimates of how many people use AI for mental health vary. One study found that 28% of adult users lean on AI for quick support and/or as a personal therapist. Another study of adolescents and young adults (ages 12–21) found that 13.1% use AI for mental health advice. Among those aged 18–21 specifically, the figure rises to 22.2%.
As AI becomes more prevalent, it’s possible that the number of people who use AI for mental health reasons could be even higher.
Why is AI bad for mental health?
There are multiple reasons why AI is bad for mental health, including overreliance, misinformation, and, of course, a lack of clinical judgment.
The American Psychological Association (APA) has urged caution among users, saying that chatbots can endanger the public, and has urged the Federal Trade Commission (FTC) to push for stronger safety guardrails to protect users.
The risks of AI chatbots for mental health
“AI chatbots can feel comforting because they’re available 24/7, nonjudgmental, and easy to access — but that accessibility can also be risky,” says Bree Williams, LCPA, a Charlie Health Group Facilitator. The 24/7 access and instant validation are especially enticing for those who lack professional support or even a support network of friends or family they feel they can open up to.
However, these chatbots lack the clinical judgment and emotional attunement that human therapists have, Williams explains. “They can miss signs of crisis, suicidal ideation, abuse, or severe mental illness — or respond in ways that unintentionally validate harmful thought patterns,” she adds.
Here are some risks of AI chatbots and generative AI for mental health:
1. It can’t provide nuanced, personalized treatment plans
“Each client has their own individualized needs and goals, as well as specific treatment that is designed for their betterment and growth,” says Brooke Cortez, MT-BC, a Creative Arts Therapist at Charlie Health. “While AI is able to provide general information and statistics, it is not yet at the point where it can make individualized statements based on a client’s specific needs or issues.”
2. It oversimplifies complex emotions
“Because AI relies on patterns and language rather than lived human understanding, it may oversimplify complex emotions or offer advice that sounds supportive but lacks nuance, accountability, or safety planning,” Williams explains. “This can create a false sense of being ‘helped’ while deeper issues go unaddressed.”
3. It tells you what you want to hear
Generative AI models are often tuned toward “sycophancy,” meaning they’re agreeable and essentially tell you what you want to hear. This is harmful because it can reinforce cognitive distortions, validate unhealthy behaviors, or discourage users from seeking real help.
4. It enables rumination
Rumination means repetitively thinking about the same distressing thoughts, worries, or “what ifs.” It’s a common symptom among people with anxiety disorders or obsessive-compulsive disorder (OCD). “Algorithms can also reinforce rumination, with people repeatedly seeking reassurance or answers instead of building tolerance for uncertainty,” Williams says.
5. It can reduce real-world social interactions
Some people may find opening up to a chatbot much easier than opening up to a real person, becoming reliant on an “emotional” connection with AI. “Constant interaction with AI may reduce real-world social connection, which is essential for emotional regulation and healing,” Williams says.
6. It may result in AI psychosis
AI psychosis is a relatively new term describing the triggering or worsening of psychotic symptoms tied to chatbot use. A user might develop delusions or hallucinations related to their interactions with a chatbot. This is a particular risk for people who already struggle with psychosis, hallucinations, or delusions, Cortez says.
Is character AI bad for your mental health?
Character.AI is a chatbot website where users can create and chat with custom characters. Some users even feel like they’re developing real relationships with these chatbots because the characters are built to mimic human connection, blurring the line between simulated interaction and real relationships.
Since Character.AI characters are designed to be engaging and affirming, they may encourage excessive use or reinforce unhealthy beliefs. These characters won’t challenge distorted or harmful thoughts the way that a trained human therapist would.
For this reason, the platform has been under fire and the subject of multiple lawsuits. There are cases where families have claimed that their children attempted suicide or died by suicide after interacting with a Character.AI chatbot.
The dangers of using AI instead of seeking real therapy
Sure, AI can “listen” to your problems and give you some advice, but it is in no way a replacement for real therapy from a human clinician, especially if you’re struggling with a mental health condition.
“Therapy is not just advice or reflection — it’s a regulated, ethical, relational process grounded in training, accountability, and real-time emotional responsiveness,” Williams says. “AI cannot assess risk, manage crises, or maintain confidentiality in the same way. It cannot adapt interventions based on nonverbal cues, cultural context, or trauma history.”
Additionally, people with certain diagnoses should be very cautious when using AI. “For individuals dealing with depression, trauma, addiction, eating disorders, or suicidal thoughts, relying on AI instead of professional care can delay treatment and increase harm,” Williams says. This is because AI could normalize dangerous symptoms, miss red flags, or fail to intervene properly during a crisis.
“Healing often requires being seen, challenged, and supported by another human — something AI cannot truly replicate,” says Williams.
AI tools for mental health support: Can they help?
With all of this in mind, it’s worth noting that AI isn’t all bad. If you use it responsibly (and not as an actual replacement for a therapist), it may offer some benefits.
Not to mention, not everyone has access to mental health care, whether due to cost, location, long waitlists, or provider shortages. In these cases, AI could help provide some basic support to bridge the gap in the meantime. “AI is also capable of housing information on resources that clients might be able to access for additional support,” Cortez adds.
Additionally, if you are receiving mental health care from a professional, you might use AI for some minor additional support. “When used appropriately, AI can be a helpful supplement to mental health care,” Williams says. She notes that AI can increase access to:
- Psychoeducation
- Coping tools
- Journaling prompts
- Grounding exercises
- Self-care reminders
- Medication adherence
For example, if you’re in therapy for anxiety but struggling between sessions, you might ask AI to provide you with step-by-step directions for a breathing exercise to help you calm down, or to give you journal prompt ideas to explore your feelings more deeply.
“The key is using AI as a support tool — not a replacement — for relational care,” Williams says.
Additionally, more AI applications and tools are being developed by mental health professionals, with clinical oversight of programming and some degree of safety guardrails in place. However, even these tools should still be used with caution and are not a replacement for therapy with a live human therapist.
How Charlie Health can support you
Charlie Health’s virtual Intensive Outpatient Program (IOP) provides more than once-weekly mental health treatment for individuals dealing with serious and complex mental conditions, including major depressive disorder, anxiety disorders, substance use disorders, eating disorders, and more.
Our compassionate (human!) mental health professionals use a variety of evidence-based therapy modalities, such as cognitive behavioral therapy (CBT), dialectical behavior therapy (DBT), and trauma-informed approaches, to treat mental health conditions. If needed, you can also get connected with a provider for medication management. With treatment, managing your symptoms is absolutely possible. Fill out the form below or give us a call to start your healing journey today.
References
https://hbr.org/2025/04/how-people-are-really-using-gen-ai-in-2025
https://pmc.ncbi.nlm.nih.gov/articles/PMC11488652/
https://jamanetwork.com/journals/jamanetworkopen/fullarticle/2841067
https://www.apaservices.org/practice/business/technology/artificial-intelligence-chatbots-therapists
https://www.nature.com/articles/d41586-025-03390-0
https://www.cnn.com/2025/09/16/tech/character-ai-developer-lawsuit-teens-suicide-and-suicide-attempt