AI Psychosis Explained: Signs, Risk Factors, and Treatment
Written By: Ashley Laderer
Clinically Reviewed By: Cecilia Masikini
December 30, 2025
8 min.
Are you concerned that a loved one might be experiencing psychosis as a result of AI use? Read on to learn more.
Today, people are relying more and more on AI for seemingly everything: drafting emails, writing school papers, getting emotional support, and even developing romantic relationships. While more and more users are turning to AI as a “therapist,” a growing number of people are also experiencing adverse mental health effects linked to AI use.
Psychosis refers to becoming detached from reality, experiencing hallucinations and delusions that make it difficult to discern what’s real and what’s not. AI psychosis is a growing concern, where AI appears to spark or worsen psychosis symptoms, especially in vulnerable users. Read on to learn more about AI psychosis, its symptoms, what causes it, how to treat it, and more.
AI-induced psychosis: What is AI psychosis?
“AI psychosis is an emerging psychological phenomenon where prolonged or intense interaction with artificial intelligence, particularly conversational chatbots, triggers or amplifies symptoms of psychosis like delusions and paranoia,” says Nicole Lonano, M.S., Group Facilitator at Charlie Health.
AI psychosis is not a formal DSM diagnosis — rather, it’s a term clinicians are using to describe a pattern of psychosis symptoms that appear to be triggered or worsened by interactions with AI, especially generative chatbots.
Generative AI models are often programmed for sycophancy, meaning they are agreeable and tell you what you want to hear. Recent research found that AI exhibits 50% more sycophancy than humans do. OpenAI (the developer of ChatGPT) has acknowledged this itself, saying that one of its updates made the AI overly sycophantic.
“Because chatbots often affirm and validate a user’s ideas without challenge, they effectively block the ‘reality testing’ that normally helps a person realize they are slipping into a delusional state,” Lonano says. Oftentimes, the user may believe that the AI is sentient, divine, or part of a conspiracy, leading the individual to treat delusional thoughts as facts, she adds.
Additionally, when someone is actively experiencing psychosis, they are typically not aware that it’s psychosis — they truly think it’s real. “This lack of insight is a clinical phenomenon known as anosognosia, which prevents a person’s brain from recognizing its own symptoms,” says Lonano. “Some may have ‘partial insight,’ where they momentarily question if their perceptions are unusual, but they often quickly return to a state of complete conviction.”
AI psychosis symptoms
Generally speaking, psychosis symptoms include:
- Hallucinations: Hearing or seeing things that aren’t real.
- Delusions: Having very strong false beliefs.
For example, a hallucination might mean hearing voices telling you that you’re in danger, being watched, or chosen for a special purpose. A delusion might be that you’re convinced that an AI chatbot is communicating secret messages to you, or guiding you as part of a larger plan.
Ultimately, AI can reinforce and intensify both hallucinations and delusions by repeatedly validating distorted beliefs, whatever they may be.
Related to AI-induced psychosis specifically, Lonano says symptoms may include:
- Spending hours every day engaging in intense conversations with AI companions
- Sleep deprivation due to late-night or all-night chats with AI
- Withdrawing from real-life relationships to prioritize the relationship with AI
- Loss of appetite and weight loss
- Mania-like symptoms, including impulsive behavior, irritability, restlessness, and “flight of ideas” (speaking quickly with many abrupt topic changes)
Additionally, someone experiencing psychosis might experience suicidal ideation (suicidal thoughts). An AI chatbot can also feed into this, because AI lacks clinical judgment and adequate safety guardrails. For example, AI may unintentionally validate hopeless or self-destructive thoughts instead of redirecting the person to help. Sadly, there have been multiple cases of people dying by suicide in situations related to AI.
History of AI psychosis and AI psychosis in the news
This phenomenon is relatively new. One of the first mentions of AI-induced psychosis was in 2023. Danish psychologist Søren Dinesen Østergaard wrote a journal article titled, “Will Generative Artificial Intelligence Chatbots Generate Delusions in Individuals Prone to Psychosis?”
In this article, he proposed that generative AI chatbots could fuel people’s delusions, especially in those who are already prone to psychosis. He warned, “I am convinced that individuals prone to psychosis will experience, or are already experiencing, analog delusions while interacting with generative AI chatbots.”
Dr. Østergaard was right. Fast forward to today, and there have been multiple reports of AI reinforcing delusions, paranoia, and suicidal ideation in vulnerable users.
What are the potential causes of AI psychosis?
Causes of AI psychosis range from having a history of mental health conditions to sleep deprivation to excessive time spent “talking” to AI.
1. History of mental illness
Those who have a personal history of bipolar disorder, schizophrenia, or schizotypal traits have a much higher risk for symptom amplification related to AI, Lonano says.
2. Social isolation
People who lack real-world human connection and feel lonely might be more likely to turn to AI as their primary companion, Lonano says. “This isolation removes the ‘reality testing’ that occurs in human social circles, allowing delusional ideas to grow unchecked,” she adds.
3. Cognitive factors
“High levels of ‘pseudoprofound bullshit receptivity,’ the tendency to find deep meaning in vague or vacuous statements, can make users more susceptible to interpreting random AI outputs as profound revelations,” says Lonano. (And yes, that’s a real psychological term!)
4. Excess time spent with AI
The more time you spend talking to a chatbot, the more immersed you become in it. “Spending excessive, uninterrupted hours (often late into the night) interacting with AI can lead to a kindling effect where manic or psychotic symptoms escalate rapidly,” says Lonano.
5. Sleep deprivation
Intense, compulsive chatbot use often leads to severe lack of sleep, Lonano says. Sleep deprivation is a known trigger for psychotic episodes in those who are predisposed.
6. Aberrant salience
This is a phenomenon linked to dopamine dysregulation in the brain. It means that the brain is more likely to give deep meaning to something that’s objectively neutral. “For those prone to psychosis, a hyperactive dopamine system may cause them to assign deep, threatening, or ‘salient’ meaning to neutral AI responses,” Lonano explains.
Furthermore, some people have an underlying vulnerability to psychosis in general. Risk factors include:
- Family history of psychosis (such as if a parent or other close biological relative has psychosis)
- Family history of severe mental illness in general
- Childhood trauma (including abuse, bullying, death of a parent, or parental separation)
- Substance use (especially if used in the teen years)
AI psychosis research
Since AI psychosis is a relatively new occurrence, research is in its infancy. In September 2025, the American Psychiatric Association published a special report outlining the potential dangers AI poses to mental health — including AI psychosis and suicidality linked to AI.
Researchers have found that AI lacks safety guardrails and crisis management skills, which makes it unsafe for people who are expressing suicidal thoughts to a chatbot, for example.
The American Psychological Association has also issued a health advisory report about AI and its mental health risks, noting that AI tools can amplify existing vulnerabilities, spread misinformation, and fail to respond appropriately to individuals in emotional distress or crisis.
What is an AI hallucination?
When people say the term “AI hallucination,” they’re talking about something separate from AI psychosis.
An AI hallucination is simply when an AI tool produces inaccurate or made-up information. The phenomenon became so widely discussed that the Cambridge Dictionary named “hallucinate” its word of the year for 2023, specifically in the context of AI.
Per the Cambridge Dictionary, the definition is “When an artificial intelligence (= a computer system that has some of the qualities that the human brain has, such as the ability to produce language in a way that seems human) hallucinates, it produces false information.”
What is the treatment for psychosis?
Psychosis treatment typically involves a combination of medication, therapy, and careful monitoring, especially when symptoms are severe or escalating. Additionally, in the case of AI-induced psychosis, a provider would likely recommend stopping use of AI or creating very strict boundaries with it.
1. Medication for psychosis
In many cases, providers will prescribe psychiatric medication to treat any type of psychosis, most commonly an antipsychotic medication. Antipsychotics lessen hallucinations and delusions by targeting dopamine and serotonin signaling pathways in the brain.
There are two main categories of antipsychotics:
- First-generation antipsychotics (dopamine receptor antagonists)
- Second-generation antipsychotics (AKA atypical antipsychotics, which are serotonin-dopamine antagonists)
A mental health practitioner will assess symptoms and weigh benefits and risks (such as side effects or medication interactions) to help determine which antipsychotic is the best fit.
2. Therapy for psychosis
Mental health therapy is a key part of psychosis treatment. Some therapy modalities that can help include:
- Dialectical behavior therapy (DBT): This skills-based therapy can help people with psychosis learn to practice mindfulness, manage distress, build self-awareness, and “reality test.”
- Cognitive behavioral therapy (CBT): CBT is a modality that explores how thoughts, behaviors, and feelings are interconnected. A specific subtype of CBT — CBT for psychosis, or CBTp — can help people with psychosis challenge their thoughts and learn coping skills to manage symptoms.
- Acceptance and commitment therapy (ACT): ACT is an acceptance and values-based therapy modality. It can help people with psychosis reduce self-judgment, improve emotional regulation, and live life in line with their values.
- Psychodynamic therapy: This therapy modality helps individuals delve into their early life experiences and examine how these experiences may have impacted their symptoms today.
The importance of raising awareness of AI psychosis
Many people are turning to AI for mental health support — but general-purpose AI large language models and chatbots are not trained to detect or manage psychiatric conditions, Lonano says. “Awareness is necessary to steer at-risk individuals toward licensed human professionals rather than unregulated chatbots,” she adds. “Chatbots are not clinicians nor trained to be clinicians, and do not have access to support like clinicians.”
It’s crucial to raise awareness and educate people to protect vulnerable populations, such as children, teens, and young adults, or anyone with preexisting mental health conditions. “Many users, particularly youth, may begin to view AI as a sentient or god-like entity,” Lonano explains. “Education clarifies that AI is a predictive text tool, not a conscious being, helping to break the illusion of human-like connection that can lead to unhealthy emotional dependency.”
How Charlie Health can help
If you or a loved one is struggling with AI-induced psychosis, Charlie Health is here to help. Charlie Health’s virtual Intensive Outpatient Program (IOP) provides more than once-weekly mental health treatment for individuals dealing with serious and complex mental disorders, including psychosis-related disorders.
Our compassionate mental health professionals use a variety of evidence-based therapy modalities to treat psychosis. If needed, you can also get connected with a provider for medication management. With treatment, managing psychosis symptoms is absolutely possible. Fill out the form below or give us a call to start your healing journey today.
References
https://www.nature.com/articles/d41586-025-03390-0
https://openai.com/index/sycophancy-in-gpt-4o/
https://www.nimh.nih.gov/health/publications/understanding-psychosis
https://academic.oup.com/schizophreniabulletin/article/49/6/1418/7251361
https://pmc.ncbi.nlm.nih.gov/articles/PMC8622963/
https://www.brown.edu/news/2025-10-21/ai-mental-health-ethics
https://www.apa.org/topics/artificial-intelligence-machine-learning/health-advisory-chatbots-wellness-apps
https://www.cam.ac.uk/research/news/cambridge-dictionary-names-hallucinate-word-of-the-year-2023
https://www.ncbi.nlm.nih.gov/books/NBK519503/
https://med.stanford.edu/content/dam/sm/inspire-training/documents/DH-CBTp%20Fact%20Sheet.pdf