It feels like everyone’s talking about AI these days — chatbots, smart assistants, algorithms that seem to know what we want before we do. And it’s slowly finding its way into the world of mental health too.
There are now apps that offer therapy-style chats, AI tools that track moods, and platforms that suggest coping techniques when you’re struggling. It’s fascinating, and in some ways, it’s hopeful — technology making mental health support more accessible, especially for people who might not otherwise reach out for help.
But as a counsellor, I find myself wondering: what happens to the human side of therapy in all this?
Isn’t human connection... well, an important part of being human?
I see human connection, the relationship and the trust, as an integral part of therapy and of mental health itself. I just wonder where it will leave us if we start to rely on AI, technology and ChatGPT for our mental health.
The helpful side
Let’s be fair: there’s real potential here. I’m not going to say there’s no benefit, because there is.
AI tools can make therapy and mental health support more accessible. They can offer something when services are stretched, or when people need a check-in between sessions. They can help with journaling, tracking emotions, spotting patterns, or even reminding someone to take a breath when life feels overwhelming.
And for therapists, AI might even take away some of the admin load, letting us focus more fully on the person in front of us. That part I can get behind. Yes... I use it for creating tools, managing my diary, my website and business admin, and it’s an absolute timesaver for proofreading and for creating an understandable blog post that makes sense (taking out all my waffle, shortening and summarising, reshuffling, creating my titles and adding in my SEO words... etc.).
However, just to make it clear: it is ABSOLUTELY NOT used for any kind of client information, notes or details (hello... GDPR and privacy).
Where things get complicated
But… here’s the thing.
AI can do a lot, but it can’t do this...
It can’t offer presence.
It can’t sit with silence, or notice the way someone’s breathing changes when something painful comes up.
It can’t sense the tremor in a voice that says, “This is hard for me to talk about.”
It can't see the first tear rolling down someone's cheek.
It doesn’t hold space, or gently reflect something back so that it lands differently.
It doesn’t feel the moment when someone lets go, or smiles for the first time in weeks, or realises something that’s been sitting quietly underneath for years.
It can’t offer you consistency, a seat, a cold glass of water or a cup of tea, fidget tools, a tissue...
That’s what makes therapy human. It’s not just about talking — it’s about being witnessed and seen.
Being met where you are, without judgement, by another person who’s really there.
AI might be able to mimic words, but it can’t offer attunement. It can’t feel empathy.
...And it can’t build the kind of trust that helps someone feel safe enough to change.
...That’s the heartbeat of therapy — not the information we share, but the connection that holds it all together.
On another note, there’s also the issue of privacy. Emotional data is deeply personal — and when it’s stored, analysed or shared by AI systems, it raises big ethical questions about confidentiality and safety. AI is always learning, which I find scary. But what exactly is it learning, and where is all the information you share (which can be deep and personal) actually going?
Can you trust and build a relationship with AI?
A closing thought
AI might offer knowledge, tools, and data — but only humans can offer compassion.
Therapy is about being seen, heard, and understood — not by an algorithm, but by another person.