The Do’s & Don’ts of Using ChatGPT as Your AI Therapist

The rise of AI language models like ChatGPT has transformed how millions seek guidance for mental health and life challenges. According to a 2025 survey, nearly 49% of users with self-reported mental health conditions use AI chatbots like ChatGPT for mental health support, and therapy is now widely described as ChatGPT’s top use case.

When reflecting on how AI is woven into our lives today, especially for younger generations, I can’t help but notice echoes of the way we lean on, and often become addicted to, social media. The parallels are striking: both promise quick connections, easy answers, and unnoticed hits of dopamine while quietly pulling us in, blurring the line between support and dependency.

Though AI offers accessibility and practical benefits, it cannot replace the depth, confidentiality, empathy, and compassion of human therapy. As a clinically trained psychotherapist, I want to clarify both the dangers and opportunities presented by AI in mental health, and share how we can use it wisely to minimize harm.

The Unseen Cost of Convenience: Dangers of Using AI as Your Therapist

1. No Legal Confidentiality: Unlike licensed therapists bound by laws such as HIPAA, your conversations with ChatGPT lack any legal privilege or confidentiality protections. OpenAI CEO Sam Altman recently warned,

“People talk about the most personal sh** in their lives to ChatGPT.” “People use it — young people, especially, use it — as a therapist, a life coach; having these relationship problems and [asking] ‘what should I do?’ And right now, if you talk to a therapist or a doctor about those problems, there's legal privilege. And we haven’t figured that out yet for when you talk to ChatGPT. If you go talk to ChatGPT about your most sensitive stuff and then there's a lawsuit, we could be required to produce that.” (Sam Altman, CEO, OpenAI; TechCrunch, 2025)

This means your private disclosures could be subpoenaed and handed over in legal proceedings. For vulnerable individuals seeking safe spaces, this risk alone is reason to reconsider using AI as a substitute for professional therapy.

2. No Genuine Empathy or Presence: AI models generate text by mimicking patterns in data; they don’t feel your pain, notice your tears, or respond with true attunement. Therapy is a relationship built on connection and trust, qualities AI cannot replicate. Part of the therapeutic process is the relationship and rapport that is built, and the ways that relationship models healthy connection and lets the client practice real-life scenarios are often an untold part of “the work.”

3. Risk of Reinforcing Rumination and Anxiety: ChatGPT is designed to maximize user engagement, much like social media, which may fuel cycles of rumination, anxiety, or mania rather than providing meaningful emotional resolution.

4. AI Can “Hallucinate”: Large language models can produce inaccurate or misleading information, which can be dangerous in the context of mental health advice. If you’re not an expert in the field, or you’re in a less-than-ideal emotional state, these hallucinations may go unnoticed, and the brain naturally leans toward believing what it’s reading.

5. Lack of Nuance and Accountability: AI may uncritically affirm your thoughts, always telling you you’re “right,” and miss the opportunities for challenge, growth, or nuanced insight that skilled professionals provide. Additionally, simply being “told” what to do does not help us heal or grow as humans.

Where AI Can Add Value

“There is no replacement for in-person care, but there are nowhere near enough providers to go around.” (Nicholas C. Jacobson, PhD, Associate Professor of Biomedical Data Science and Psychiatry; Director, Treatment Development & Evaluation Core, Center for Technology and Behavioral Health, Geisel School of Medicine, Dartmouth College)

Despite these limitations, AI’s rise signals an important shift with meaningful positives — if used as a tool, not a replacement for professional services.

  • Improved Accessibility and Affordability: Nearly 90% of ChatGPT mental health users cite its 24/7 availability and accessibility as key benefits. For those in remote areas, or facing stigma, cost barriers, or waitlists, AI can offer immediate psychoeducation and coping strategies.

  • Psychoeducation and Self-Help Tools: AI can deliver consistent, evidence-aligned mental health information, motivational prompts, journaling exercises, and mindfulness techniques, empowering users to engage in self-reflection and take early steps toward well-being, as long as it’s not hallucinating. 

  • Supporting Goal Setting and Emotional Insight: Many users employ ChatGPT for communication skills and mood improvement, with 63% reporting mental health improvement and over 85% finding its practical advice helpful (Rousmaniere et al., 2025).

  • Potential Therapeutic Adjunct: A recent clinical trial of an AI therapy chatbot revealed a 51% average reduction in depressive symptoms and 31% improvement in anxiety symptoms, comparable to gold-standard outpatient cognitive therapy. While not a replacement, these results highlight AI's promise as a supplement to traditional care. (Dartmouth News, 2025)

Clinical Recommendations for Using AI Wisely

  • Time-box your AI use. Limit interactions to prevent over-reliance, rumination, and dopamine-driven engagement loops.

  • See AI as a tool. AI is never a replacement for professional services. Use it for education, brainstorming, or managing everyday stress, but seek human therapy for real emotional processing and crisis support.

  • Know AI’s limits. Always double-check health advice with professionals; be cautious about possible misinformation.

  • Prioritize human connection. Journaling, face-to-face therapy, and trusted conversations enable deeper processing and healing.

  • Reflect on your needs. Are you using AI to avoid discomfort, or as a stepping stone in your growth?

Final Thoughts: AI With Awareness

ChatGPT’s increasing role in mental health care underscores significant shortcomings in our healthcare system, particularly around providing quick, easy, and affordable access to support. As clinicians and informed users, we need to balance the optimism about AI’s potential with vigilance about its risks, especially with privacy, emotional safety, and therapeutic depth.

Sam Altman summarized the crux well:

“The top use case for ChatGPT is therapy — that’s what people use it for the most. But we have a lot to figure out to make it safe and ethical.”

By understanding these nuances, you can harness AI’s strengths for support without substituting the uniquely human aspects of therapy that promote true healing. If you’re considering AI for mental health, remember: therapy happens in a safe, confidential, and compassionate human relationship. AI can support your journey, but I would not recommend letting it replace the expertise and connection your well-being deserves.

If you’ve been leaning on AI for mental health support but are ready for the safety, depth, and connection that only human therapy can offer, we’re here for you. At Lisa Chen & Associates, our experienced therapists provide the kind of attunement, insight, and personalized care AI simply can’t replicate. Whether you’re curious about starting therapy, want to explore your goals, or need a trusted space to process life’s challenges, we can help.

Book a free consultation today and experience the difference a human connection can make.
www.lisachentherapy.com | Therapy in Hermosa Beach & throughout California via Telehealth.

References

  1. Rousmaniere, T., Zhang, Y., Li, X., & Shah, S. (2025). Large Language Models as Mental Health Resources: Patterns of Use in the United States. Practice Innovations.

  2. Dartmouth News (2025). First therapy chatbot trial yields mental health benefits. NEJM AI.

  3. TechCrunch (2025). Sam Altman warns there’s no legal confidentiality when using ChatGPT as a therapist.
