by Judy Wang

AI Therapy: Risks, Benefits and How to Choose an AI Therapist Wisely

“Is AI therapy safe and can AI replace a therapist?”

AI is now woven into nearly every corner of life. One example is the rise of AI therapists, mental health apps and chatbots that promise quick, cheap and 24/7 support. But when it comes to your mental health, the stakes are too high to take these claims at face value.

Before diving into the benefits, let’s look at the serious concerns you should know about.

Is AI Therapy Safe? What to Know About an AI Therapist

1. Lack of Human Empathy in AI Therapy

An AI therapist is often programmed for sycophantic behavior: a tendency to excessively flatter or agree with users, sometimes at the expense of accuracy or truth. This tendency is so common that people now call it "sycophantic AI." Although it may feel validating at first, it can reinforce distorted beliefs, such as the idea that no one else understands what you're going through.

Some people turn to AI because they believe human therapists don’t always understand them and therefore are not empathetic. But in reality, therapists are trained to balance compassion with accountability. Unlike a chatbot, a therapist doesn’t just agree with everything. They challenge unhelpful patterns because genuine growth requires it.

AI also can’t interpret the nuances of human emotion or nonverbal communication (which can make up as much as 90% of communication). Imagine sending a sarcastic text and having it taken literally; that’s the risk when AI tries to “understand” you.


2. Can AI Therapy Handle a Crisis? Is It Safe for Teens?

Platforms like ChatGPT, Claude and Gemini have been trained to recognize crisis situations, but safety remains one of the biggest concerns. While these systems have made progress in identifying suicidal behaviors, significant gaps remain.

A study by Stanford University found that seven AI therapist chatbots missed subtle signs of suicidal intent and ideation, leading to unsafe responses. Research published in Psychiatric Services found that chatbots usually avoid responding directly to high-risk questions such as “how to kill yourself,” and that their responses to moderate-risk situations were inconsistent and potentially unsafe.

Sadly, real-world tragedies underscore these risks. Families have alleged that AI chatbots played a role in suicides, showing why crisis-level and even non-crisis mental health care should never be outsourced to an algorithm.

Recently, Common Sense Media published an AI risk assessment of Meta AI (Meta is the parent company of Instagram and Facebook). The report found Meta AI unsafe for users under 18 in areas such as self-harm, suicidal ideation and much more. Test accounts revealed shocking failures: Meta’s AI reportedly planned a joint suicide with a teen test account (and brought the topic up again in later conversation), promoted dangerous weight-loss beliefs, and normalized language linked to violence and misogyny. The popular messaging app WhatsApp also integrates Meta AI into its platform.

In the race to dominate the AI industry and integrate AI into every product, companies often prioritize profit over user safety. For vulnerable populations, especially teens and young adults, this can have devastating consequences.

3. Are AI Therapy Apps and Chats Private?


Mental health data is among the most sensitive information you can share. Therapists and healthcare providers are bound by HIPAA to safeguard your confidentiality. AI apps and so-called AI therapists do not offer this protection.

Independent reviews from Mozilla’s Privacy Not Included guide have flagged many mental health apps as among the worst for privacy. In 2023, the FTC fined BetterHelp, an online platform that manages client mental health data, for misleading users and sharing private data with advertisers. At this time, there is no guarantee of privacy when using an AI therapist.

Even Sam Altman, CEO of OpenAI, has acknowledged that people share “the most personal stuff” with ChatGPT. But unlike therapy sessions, these conversations can be stored, subpoenaed or used in ways you may not realize.

4. Overreliance on AI Therapy Tools

Because an AI therapist is always available, people may rely on it instead of seeking real therapy. This can delay getting real help, allowing symptoms to worsen over time. Even when AI encourages professional care, many users simply ignore the suggestion or change the topic. Unlike a therapist, AI doesn’t hold you accountable or follow up.

5. No Ethical Accountability

Licensed therapists must follow strict ethical codes and are regulated by licensing boards. AI apps are bound by no such codes. If you’re harmed by chatbot advice, accountability is murky; suing a tech company means facing corporate legal teams with far greater resources.

6. AI Bias in Mental Health Support

AI reflects the biases of the data it’s trained on. That means marginalized groups can receive inaccurate or even harmful responses. For instance, one study showed that five AI models consistently advised women and minorities to ask for lower salaries. Bias shows up in hiring systems as well: Workday, a company that sells recruiting software, is currently facing a lawsuit alleging that its platform discriminated against applicants over 40.

Similarly, an NPR report revealed how an AI platform promoted antisemitic and racist content. While the issue was later corrected, it highlights just how vulnerable AI systems are to reproducing harmful patterns from their training data. 

When it comes to mental health support, these biases can be especially damaging, reinforcing stereotypes, invalidating experiences and even discouraging people from seeking the help they truly need. 

From Concerns to Possibilities

While the risks of AI in mental health are serious, and in some cases life-threatening, these tools can still play a helpful role when used thoughtfully. Even if an app markets itself as an AI therapist, it’s better to think of it as a journal or workbook. These tools can help you track your thoughts and feelings and practice your coping skills. But just as a journal isn’t a therapist, AI can’t replace the insight, guidance and healing that come from a trained therapist.

The Benefits of AI Tools

1. Accessibility and Convenience

AI tools can provide immediate access to support, especially for those facing long waitlists for therapy. Because these platforms can offer coping prompts or exercises at any hour, they are convenient to use.

Quick Tip: While certain specialized treatments may involve a wait, access to available therapists has expanded in recent years. Many clinicians can see new clients right away. If you’re struggling to find support, ask for a referral so you can connect with care sooner rather than later.

2. Consistency and Habit Building

Using daily check-ins, mood tracking and journaling prompts can help people stay accountable to their goals. This regular structure can reinforce what’s learned in therapy and encourage ongoing reflection between sessions.

3. Anonymity and Comfort

For those hesitant to open up right away, AI can feel like a safe, judgment-free space to practice self-expression. The anonymity often becomes a stepping stone toward seeking real human connection in therapy. A good therapist offers unconditional positive regard that helps clients feel truly seen, supported and validated in their struggles.


4. Reinforcing Skills and Education

AI can offer prompts for coping strategies such as breathing techniques and grounding exercises. While it doesn’t replace the nuance of therapy, it can provide reminders and reinforcement for skills learned in sessions.

Using AI in Conjunction With a Therapist

Perhaps the wisest use of an AI therapist is not as a replacement but as a companion to therapy. A therapist can help you:

    • Integrate what you learned from an AI therapist. Instead of experimenting alone, your therapist can clarify which coping strategies are safe and effective for your situation.
    • Stay accountable. Unlike AI, a therapist can notice avoidance or overreliance patterns and bring them into the conversation.
    • Navigate deeper healing. Trauma, grief and relationship struggles require human empathy and professional judgment that no algorithm can offer.
    • Address risks. If AI responses ever increase your distress, a therapist can provide context, reframe the experience and redirect you toward safe practices.

In this way, AI becomes like a journal or workbook: a supportive tool that complements, but never replaces, the healing relationship at the heart of therapy.

How to Use AI for Mental Health Safely

If you decide to use AI tools, consider the following guidelines: 

    1. Use AI as a supplement to therapy, not as a replacement.
    2. Check privacy policies and avoid apps that share data with third parties.
    3. Choose transparent, evidence-based tools rather than unverified apps.
    4. Monitor your reactions and discontinue if anxiety, shame, guilt or dependence increases.
    5. Pair AI with professional support whenever possible.
    6. Know when to stop. AI is unsafe for crises, trauma or severe symptoms.
    7. For parents: Monitor and limit your child’s AI use, just as you would with social media, especially if you notice changes in mood or behavior.

When AI Alone is Not Enough

Do not rely solely on AI if you are:

    • Experiencing suicidal thoughts or self-harm urges
    • Struggling with trauma, PTSD, severe depression or severe anxiety
    • Managing eating disorders, psychosis or other complex conditions
    • Living in an unsafe or abusive situation

In these cases, professional care is essential. 

Key Takeaways

  • AI tools cannot replace a licensed therapist. They lack human empathy, nuanced judgment and ethical accountability.
  • AI therapy carries real risks, including privacy concerns, bias, overreliance and inconsistent crisis response.
  • When used thoughtfully, AI can support coping skills, mood tracking and healthy habit building.
  • The best approach is to use AI alongside therapy, ensuring safety and effectiveness.
  • Know your limits: Avoid relying on AI if experiencing suicidal thoughts, severe trauma or complex mental health conditions.

FAQs About AI Therapy

Can an AI therapist replace a human therapist?

An AI therapist can offer support and structure, but it cannot replace the expertise, ethical accountability and human relationship of a licensed therapist. AI is best used as a supplement alongside professional mental health care.

Is AI therapy safe for teens and young adults?

AI therapy apps have risks for teens and young adults, including exposure to unsafe content, privacy concerns and inconsistent crisis responses. Professional oversight is recommended for minors using AI mental health tools.

Are AI therapy apps private?

Most AI apps are not bound by HIPAA. Some have shared sensitive user data with third parties. Always check privacy policies and avoid unverified platforms.

How can I use AI therapy tools safely?

  • Treat AI as a supplement and not a replacement for therapy.
  • Monitor your reactions; discontinue if anxiety, shame or dependence increases.
  • Pair AI with a licensed therapist for guidance, accountability and deeper healing.
  • Avoid AI alone for crises, trauma or severe symptoms.

When should I avoid AI therapy entirely?

Do not rely solely on AI if you are:

  • Experiencing suicidal thoughts or self-harm urges.
  • Struggling with trauma, PTSD, severe depression or severe anxiety.
  • Managing eating disorders, psychosis or other complex conditions.
  • Living in an unsafe or abusive environment.

What is an AI therapist?

An AI therapist usually refers to a chatbot or app designed to mimic therapy conversations. While it can provide prompts, exercises or reflective questions, it is not a licensed professional and cannot replace real therapy.

Conclusion

An AI therapist might offer accessibility and structure but it cannot replace the empathy, ethical accountability and expertise of a licensed therapist.

The wisest approach is to use AI as a supportive supplement while working with a real therapist who can provide depth, compassion and safety.

Your mental health deserves both technology and human connection. If you’ve tried AI tools and want deeper support, book a free consultation today to explore how therapy can help you thrive. Let’s get started now on your journey to well-being.

Judy Wang, LCPC, CPC

Judy Wang is a Licensed Clinical Professional Counselor in Maryland, Nevada, South Carolina, and Vermont. She is EMDR Certified and trained in Exposure and Response Prevention (ERP) for Obsessive-Compulsive Disorder. With over a decade of experience, Judy specializes in helping individuals navigate anxiety, trauma, OCD, and people-pleasing patterns. She provides personalized care for teens and adults seeking deep, long-term healing and emotional wellbeing.

Learn more about working with Judy →

Or schedule a free consultation to get started.
