Can ChatGPT Be Your Therapist? The Promise and Perils of AI in Mental Health Support

[Image: a woman in conversation with a humanoid robot representing ChatGPT, symbolising the emotional complexities and mental health risks of using AI as a substitute for therapy.]

Hi! My name is Andy Selway-Woolley (he/him), and I'm a Clinical Solution-Focused Hypnotherapist. I run HeadFirst Hypnotherapy, based in Upper Heyford (near Bicester), Oxfordshire, and also work online nationally.

With artificial intelligence (AI) tools like ChatGPT becoming increasingly accessible, many people are turning to them for emotional support, self-help, and even therapeutic advice. While there are some clear benefits to using AI in mental health contexts, there are also serious concerns. The question remains: can ChatGPT—or any AI—safely take on the role of a therapist?

The Promise of AI in Mental Health

Let’s start with the positives. ChatGPT and similar models offer immediate, anonymous, and non-judgemental responses 24/7. For those experiencing loneliness, stigma, or anxiety about reaching out to a real person, AI can feel like a low-pressure first step. Some research has shown potential for chatbots to reduce symptoms of anxiety and depression in the short term. For example, a randomised controlled trial by Fitzpatrick, Darcy and Vierhile (2017) found that the chatbot Woebot significantly reduced symptoms of depression in college students over a two-week period.

AI tools can also support mental health literacy, guiding users to better understand their emotions, identify helpful strategies, or signpost them to services. A review by Vaidyam et al. (2019) highlighted how conversational agents can assist in monitoring symptoms and encouraging treatment adherence. In these cases, AI acts as a complementary tool, not a replacement.

The Dangers of Over-Reliance

Despite these advantages, experts are quick to caution against confusing AI support with therapy. AI cannot form a genuine therapeutic relationship—something consistently shown to be central to therapeutic change (Wampold, 2015). ChatGPT doesn’t “know” you, nor can it detect non-verbal cues, tone of voice, or risk indicators such as suicidal ideation in the nuanced way a trained human can.

A 2023 study by Minielly, Hranchuk and Longstaff warns that over-relying on AI in vulnerable moments may lead to false reassurance or inappropriate advice, particularly when users interpret the output as professional guidance. Even though platforms like ChatGPT include disclaimers, many users still assign human-like intelligence or empathy to the responses, which can foster misplaced trust (Nass & Moon, 2000).

Crucially, AI lacks professional accountability. If a therapist gives harmful advice, they can be reported, regulated, and sanctioned. An AI chatbot cannot be held to these same standards. Nor does it have the capability to engage in safeguarding or crisis intervention, a critical limitation when someone is in distress or at risk of harm.

Ethical and Clinical Boundaries

There is also an ethical dimension. Luxton (2020) raised concerns that AI-driven mental health tools risk blurring the boundaries between informational support and clinical treatment, potentially misleading users into thinking they are receiving therapy when they are not.

Moreover, data privacy remains a key issue. While OpenAI has made strides in improving user data handling, any use of AI for discussing sensitive topics like trauma, abuse, or mental illness raises questions about confidentiality and informed consent (WHO, 2021).

So, Can ChatGPT Be Part of the Picture?

Perhaps the most responsible answer is: yes, with clear limits. AI can provide a stepping stone—a way to reflect, explore feelings, or gain general knowledge. But it should not replace therapy, especially for those experiencing complex emotional struggles, trauma, or mental health diagnoses. As Wampold (2015) emphasises, healing often occurs through relational depth, attunement, and trust—qualities no chatbot can replicate.

If you're using ChatGPT to support your wellbeing, that’s okay. Just make sure it’s part of a wider toolkit. Reach out to a qualified therapist, talk to your GP, or contact support organisations if you’re struggling.

References

Fitzpatrick, K. K., Darcy, A., & Vierhile, M. (2017). Delivering cognitive behavior therapy to young adults with symptoms of depression and anxiety using a fully automated conversational agent (Woebot): A randomized controlled trial. JMIR Mental Health, 4(2), e19. https://doi.org/10.2196/mental.7785

Luxton, D. D. (2020). Ethical issues in AI-based psychotherapy. In Artificial Intelligence in Behavioral and Mental Health Care (pp. 139–153). Elsevier.

Minielly, N., Hranchuk, K., & Longstaff, H. (2023). Therapy bots and the illusion of intimacy: Ethical considerations in AI mental health tools. AI & Society, 38, 133–145. https://doi.org/10.1007/s00146-021-01206-2

Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56(1), 81–103.

Vaidyam, A. N., Wisniewski, H., Halamka, J. D., Kashavan, M., & Torous, J. B. (2019). Chatbots and conversational agents in mental health: A review of the psychiatric landscape. The Canadian Journal of Psychiatry, 64(7), 456–464.

Wampold, B. E. (2015). How important are the common factors in psychotherapy? An update. World Psychiatry, 14(3), 270–277. https://doi.org/10.1002/wps.20238

World Health Organization. (2021). Ethics and governance of artificial intelligence for health: WHO guidance. https://www.who.int/publications/i/item/9789240029200


This blog is for general informational purposes only and does not constitute therapeutic advice, diagnosis, or a substitute for professional mental health care. The content is based on published research and reflects current academic and clinical perspectives on the role of artificial intelligence in mental health support. Readers are encouraged to seek guidance from a qualified therapist or medical professional for personalised support. This article does not imply or promote the use of AI as a replacement for regulated therapy.

The author is a qualified therapist; however, this blog is not a therapy session, nor does reading it establish a therapeutic relationship.

If you are in emotional distress or experiencing a mental health crisis, please contact your GP, a mental health professional, or an appropriate emergency service.


Book your 60-minute Initial Consultation today and start your journey towards better well-being.

In person at my Oxfordshire therapy room (Upper Heyford, near Bicester) or online via Zoom.

[Image: Andy Selway-Woolley, Solution-Focused Hypnotherapist, smiling and wearing a purple hoodie and cap, with a bookcase behind him.]

My name is Andy Selway-Woolley (he/him) and I am a fully qualified Clinical Solution Focused Hypnotherapist and Psychotherapist. I run HeadFirst Hypnotherapy®, based in Upper Heyford (near Bicester), Oxfordshire. 

I assist people in regaining control of their lives by retraining their brains to overcome limiting thought patterns, master their emotions, and cultivate resilient behaviours for a brighter and more positive future. I know… It's awesome!

In a nutshell, I 'get you out of your own way'. Because, let's face it, a lot of us are.

Solution Focused Hypnotherapy taps into the power of your subconscious mind so you can move forward towards the life you’ve always wanted to live. It’s a quick, practical way to address thought patterns, emotions and behaviours that are holding you back in life.

I’m a registered and accredited member of the Complementary & Natural Healthcare Council (CNHC), Association for Solution Focused Hypnotherapy (AfSFH) and National Council for Hypnotherapy (NCH).

I serve the main towns and cities around Upper Heyford, including Bicester, Banbury, Oxford, Kidlington, Witney, Brackley, Charlbury, Northampton, Buckingham, Chipping Norton, Thame, Didcot, Abingdon, Henley-on-Thames, and Aylesbury, along with other local areas.

I also work nationally and offer remote hypnotherapy services online, so no matter where you're based, support is just a click away!
