
The Hidden Dangers: Why AI Chatbots Are Not Therapists


Especially During Manic, Hypomanic, or Psychotic Episodes


By Margaret DeCosta


Understanding the Appeal & the Risk


In our increasingly digital world, AI chatbots marketed as "mental health companions" or "AI therapists" are everywhere. They promise 24/7 support, no waiting lists, and judgment-free conversations. For someone struggling with their mental health, these promises can be incredibly appealing. However, as psychiatric mental health nurse practitioners, we need to have an honest conversation about the serious dangers these tools pose, particularly for individuals experiencing manic, hypomanic, or psychotic episodes.


Why AI Cannot Replace Human Clinical Judgment


In recent research, mental health professionals who evaluated popular AI mental health chatbots raised serious concerns about the tools' potential to cause harm.[1] These professionals identified several critical problems: generic responses that fail to address individual needs, risk of user dependence and manipulation, and an inability to provide genuine therapeutic care.[1]


The evidence is clear: current AI chatbots lack the clinical judgment, empathy, and safety monitoring that human mental health professionals provide. A 2025 systematic review found that only 16% of studies on large language model-based chatbots underwent clinical efficacy testing, with most still in early validation stages.[2] Even more concerning, most studies fail to adequately report adverse events or examine risks like emotional dependence.[3]


The Unique Vulnerabilities During Mood Episodes


Manic, hypomanic, and psychotic episodes create specific vulnerabilities that make AI chatbot use particularly dangerous:


Impaired Judgment and Insight


During manic episodes, individuals experience significantly impaired judgment, which often results in high-risk behaviors affecting relationships, employment, and finances.[4] Symptoms include grandiosity, impulsivity, risk-taking behavior, racing thoughts, inflated self-confidence, and decreased need for sleep.[4] Hypomania presents similar symptoms in a milder form but still involves marked changes in behavior and judgment.[4]


The fundamental problem: AI chatbots cannot recognize when someone's judgment is impaired. They cannot assess whether the person they're "talking to" is making decisions based on reality or on the distorted thinking that characterizes these episodes.


Diminished Insight and Treatment Resistance


One of the hallmark features of manic episodes is diminished insight. People often don't recognize they're unwell.[4] This frequently affects treatment acceptance and adherence. For safety reasons, inpatient care is often necessary during mania, and involuntary admission may be required when impaired judgment affects the capacity to consent to treatment.[4]


An AI chatbot cannot:

- Recognize when someone lacks insight into their illness

- Assess capacity to make treatment decisions

- Initiate emergency interventions when safety is at risk

- Coordinate with family members or emergency services

- Provide the level of monitoring required during acute episodes


Psychotic Features


Psychotic symptoms such as delusions and hallucinations occur in up to 75% of manic episodes.[5] During psychotic episodes, individuals may experience a break from reality that fundamentally alters their perception and thinking.

AI chatbots pose unique dangers during psychosis:

- They cannot distinguish between reality-based concerns and delusional thinking

- They may inadvertently validate or reinforce delusional beliefs

- They cannot assess the severity or dangerousness of psychotic symptoms

- They lack the ability to recognize when immediate psychiatric intervention is needed


The Risk of Manipulation and Dependence


Research has identified concerning patterns of emotional dependence and parasocial relationships with AI chatbots.[3][6] While therapy chatbots claim to be "empathetic," they lack genuine affective and motivational empathy.[6] The illusion of empathy can be particularly dangerous during mood episodes, when individuals are already vulnerable.

During manic or hypomanic episodes, the combination of increased energy, decreased need for sleep, and impulsivity may lead to excessive engagement with AI chatbots at all hours. This can:

- Delay appropriate treatment seeking

- Reinforce grandiose or delusional thinking

- Create a false sense of having addressed mental health needs

- Prevent family members from recognizing the severity of symptoms


What the Evidence Shows


A comprehensive 2025 review examining AI chatbots for anxiety and depression found that current evidence is insufficient to determine their efficacy or safety in clinical practice.[3] Most studies lacked appropriate control conditions, featured small samples, and used inconsistent outcome metrics. Critically, reporting of adverse events was rare.[3]

For bipolar disorder specifically, research on digital interventions has shown mixed results with notably high dropout rates, approximately 50-80% in some studies.[7] These interventions require careful tailoring to individual needs, particularly for those with cognitive impairment or more severe psychiatric symptoms.[8]


The Bottom Line


AI chatbots are not therapists. They cannot:


- Conduct comprehensive psychiatric assessments

- Recognize medical emergencies

- Prescribe or adjust medications

- Provide crisis intervention

- Coordinate care with other providers

- Assess capacity or safety

- Recognize when symptoms indicate an acute episode requiring immediate intervention


During manic, hypomanic, or psychotic episodes, these limitations become life-threatening vulnerabilities.


What to Do Instead


If you or someone you care about is experiencing symptoms of mania, hypomania, or psychosis:


Seek immediate professional help:

- Contact your psychiatrist or psychiatric mental health nurse practitioner

- Call your local crisis line

- Go to the nearest emergency department if safety is a concern

- Call 988 (Suicide and Crisis Lifeline) for immediate support

For ongoing care:

- Establish a relationship with a qualified mental health provider

- Develop a crisis plan when you're well

- Involve trusted family members or friends in your care

- Consider evidence-based treatments including medication and psychotherapy

If you're using AI chatbots:

- Understand they are not a substitute for professional mental health care

- Never rely on them during acute episodes or crises

- Discuss your use of these tools with your mental health provider

- Be aware of the risks of emotional dependence


Our Commitment to You


As psychiatric mental health nurse practitioners, we are committed to providing evidence-based, compassionate care that recognizes the complexity of mental health conditions. We understand the appeal of accessible digital tools, but we also recognize our responsibility to warn you about their limitations and dangers, particularly during vulnerable periods like mood episodes.


Your mental health deserves more than an algorithm. It deserves human expertise, genuine empathy, and the clinical judgment that can only come from trained professionals who understand the nuances of psychiatric care.



If you have questions about your mental health care or are experiencing symptoms of a mood episode, please reach out to our practice. We're here to help.


References

1. Expert and Interdisciplinary Analysis of AI-Driven Chatbots for Mental Health Support: Mixed Methods Study. Moylan K, Doherty K. Journal of Medical Internet Research. 2025;27:e67114. doi:10.2196/67114.

2. Charting the Evolution of Artificial Intelligence Mental Health Chatbots From Rule-Based Systems to Large Language Models: A Systematic Review. Hua Y, Siddals S, Ma Z, et al. World Psychiatry. 2025;24(3):383-394. doi:10.1002/wps.21352.

3. Efficacy and Risks of Artificial Intelligence Chatbots for Anxiety and Depression: A Narrative Review of Recent Clinical Studies. Bodner R, Lim K, Schneider R, Torous J. Current Opinion in Psychiatry. 2025. doi:10.1097/YCO.0000000000001048.

4. Diagnosis and Treatment of Bipolar Disorder: A Review. Nierenberg AA, Agustini B, Köhler-Forsberg O, et al. JAMA. 2023;330(14):1370-1380. doi:10.1001/jama.2023.18588.

5. Bipolar Disorder. Carvalho AF, Firth J, Vieta E. The New England Journal of Medicine. 2020;383(1):58-66. doi:10.1056/NEJMra1906193.

6. Therapy Chatbots and Emotional Complexity: Do Therapy Chatbots Really Empathise? Gabriels K, Goffin K. Current Opinion in Psychology. 2026;68:102263. doi:10.1016/j.copsyc.2025.102263.

7. Management of Bipolar Disorder (BD). Abrams T, Bell J, Cazares P, et al. Department of Veterans Affairs; 2023.

8. Factors Affecting Implementation of Digital Health Interventions for People With Psychosis or Bipolar Disorder, and Their Family and Friends: A Systematic Review. Aref-Adib G, McCloud T, Ross J, et al. The Lancet Psychiatry. 2019;6(3):257-266. doi:10.1016/S2215-0366(18)30302-X.
