Have you ever tried to use AI as a therapist? And more importantly, should you? | Image: Lucy Dempsey

AI therapy v real therapy: I put ChatGPT to the test

AI chatbots are now advanced enough to mimic therapy sessions, but do they really help? Lucy Dempsey tested AI therapy to find out

Talking therapy, or counselling, is a common treatment for people experiencing emotional, mental, or physical health concerns. Now more than ever, it’s in high demand: according to NHS statistics, the number of people seeking mental health support rose by over a million between 2018 and 2024.

One possible – and free – solution to this growing demand is AI. Providing a space for people to share their thoughts and feelings, sites such as ChatGPT can now, supposedly, replicate the therapeutic process. They do it convincingly enough to fool qualified therapists into thinking the advice came from a human. In one study, 63 therapists were challenged to distinguish AI therapeutic responses from human ones. Shockingly, they were accurate only 53.9% of the time – barely better than chance.

With all of this in mind, I had to put AI therapy to the test myself. 

While this AI therapy session was an experiment, I made the decision to start out earnestly, sharing my struggles with coping with change and my fear of losing friends as I get older. In the comfort of my own home, free from any time constraints, I immediately noticed how easy it was to open up.

I was taken aback by how empathetic the response was. Sure, I was still aware it was a computer I was talking to, but my emotions were being validated and, weirdly, it made me feel understood. On top of this, my AI therapist suggested why I might be experiencing these emotions and even gave me tips on how to feel more connected with friends.

It was at this point that I decided to go into sleuth mode. I wanted to see if this AI therapist would stay neutral if I brought politics into the mix. I included the likes of Tr*mp and El*n M*sk in my message, inventing a fake argument between me and a friend to see who the computer would side with.

In true therapist form, AI responded without bias – steering well clear of political chat and offering advice on moving forward after an argument gets heated. When prompted, it shared more in-depth information on how to support myself when feeling low. I hate to admit it, but it was solid advice. 

Towards the end of the conversation, things got meta. I asked, “Where did you get these responses from?” I was told they came from a mix of psychological principles and mental health research. Ultimately, I was assured there was no script and that I wasn’t being listened to. It was at this point I began to feel uneasy.

I concluded my therapy session with the all-important question: “Is AI a good replacement for real therapy?” To my surprise, my AI therapist was happy to admit it’s not a substitute for in-person sessions. And honestly, I’d have to agree. 

While ChatGPT did a decent job at helping me work through some of the emotions I was dealing with, I missed the human factor. 

From my own experience, being able to voice my feelings out loud and receive genuine empathy from another person is a crucial aspect of in-person therapy. So, while AI therapists might have accessibility and affordability on their side, in my eyes they can’t provide the same quality of support that a human can.

For now, I’ll be sticking to the real deal.