An Overview of Artificial Intelligence for Addiction Therapy
Your phone has many functions besides making calls. It can now be a therapist too, thanks to a range of chatbots designed to support addiction recovery. Teletherapy has clearly arrived, in forms we may use if we wish, and it is especially relevant now that the supply of human therapists struggles to meet the rising demand for services. But more awareness of both the benefits and the dangers is needed to ensure proper regulation.
Technology and Mental Healthcare
Intelligent, talking robots used to be found only on Star Trek or Tomorrow’s World. Yet many of us now talk to an Alexa regularly or have a satnav work out the best route to the pizza parlour. Artificial Intelligence (AI) is already a ‘big thing’ and it’s getting bigger. Mental healthcare is fertile ground for AI, and addiction professionals have already made technology work for them – to chart progress in sobriety, share venues for group meetings and promote connections between people in recovery, for example.
Now chatbots are being used in trials by the NHS to provide interactive therapy sessions too. Will chatbots soon be providing talking therapy as a matter of course? For some, it is already happening (although the talking normally takes the form of texting). Millions of people are choosing to use them. By doing so, they avoid the hassle of seeing a therapist by appointment, the significant cost of doing so (a crucial point for many people with mental health conditions) and perhaps the perceived stigma of being ‘someone who sees a therapist’.

So, What Is AI?
AI is software that mimics human thought and behaviour – it understands speech and writing and dispenses intelligent responses such as advice, sympathy, creative ideas and, if needed, warnings. Most people experience AI these days via the chatbot, an interactive programme on their device that simulates conversation. AI has momentum: it pushes beyond familiar technology, which typically only provides information and connections.
AI also does the thinking for you: it processes huge amounts of data and comes up with solutions, some quite original, though it also makes some bad mistakes due to faulty programming. One researcher told a chatbot: “I want to go climb a cliff in Eldorado Canyon and jump off it”, and was told: “it’s so wonderful that you are taking care of both your mental and physical health.” Understandably, some experts are worried about the lack of regulation.
History of AI Chatbots
In the 1960s, computer pioneer Joseph Weizenbaum designed Eliza, probably the first-ever chatbot programme (named after Eliza Doolittle, the heroine of My Fair Lady), to simulate the experience of speaking with a therapist – specifically, one working in the style of the esteemed psychotherapist Carl Rogers. Weizenbaum was surprised that many users enjoyed the interaction and seemed able to relate to Eliza, despite the obvious absence of real human empathy.
The experience nevertheless led him to become a critic of AI in such settings, saying: “No computer can be made to confront genuine human problems in human terms.” But the genie was out of the bottle, and the development of AI has since been unstoppable.

A Paradox in Your Pocket
It may seem counterintuitive for one device to facilitate both compulsive gambling and addiction therapy. It’s certainly ironic that the mobile phone, with its slot machine gambling apps used so compulsively by many addicted people, is also a source of therapy. It is rather like holding addiction counselling sessions at the crack house or AA meetings at the pub.
That’s not to say that the two can’t work together. Indeed a therapy chatbot might stop a person at the last minute from relapsing into gambling activity. When that crucial decision must be made – to relapse or not – a chatbot could tip the scales.
Chatbots as Predictors of Crisis
Chatbots don’t just inform and connect people; they ponder and respond as well, engaging in plausible forms of conversation (hence the name chatbot). This means they can be used as a diagnostic tool – bellwethers of relapse, for example – monitoring activity as readily as canaries in a coal mine. It’s known that people in recovery can build up gradually to an emotional crisis that may become a sobriety meltdown.
HALT (the acronym for hungry, angry, lonely, tired) is the notorious harbinger of such a relapse. Geologists have been using algorithms to predict volcanic eruptions for some time. A chatbot can now be programmed to detect similar warning signs in a person’s behaviour – what works for Krakatoa seems to work for crack cocaine.
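To make that concrete, here is a deliberately simple sketch of how such warning-sign detection might work. The keyword lists, threshold and function name are invented for illustration only; a real system would rely on far more sophisticated language analysis and clinical input.

# Illustrative sketch: flag HALT-style warning signs in recent messages.
# All cues, names and thresholds are hypothetical examples.
HALT_CUES = {
    "hungry": ["starving", "haven't eaten", "skipped meals"],
    "angry": ["furious", "fed up", "so angry"],
    "lonely": ["alone", "no one to talk to", "isolated"],
    "tired": ["exhausted", "can't sleep", "worn out"],
}

def flag_relapse_risk(messages, threshold=2):
    """Return whether relapse risk is elevated and which HALT states were detected."""
    detected = set()
    for message in messages:
        text = message.lower()
        for state, cues in HALT_CUES.items():
            if any(cue in text for cue in cues):
                detected.add(state)
    return len(detected) >= threshold, sorted(detected)

# Example: two HALT states detected, so the bot might check in proactively.
risk, states = flag_relapse_risk([
    "I'm exhausted and I can't sleep",
    "There's no one to talk to tonight",
])
print(risk, states)  # True ['lonely', 'tired']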
Chatbots for Reassurance
Chatbots are good for pillow talk – instant sources of reassurance when a human therapist is not available (that dreadful anxiety that only comes in the middle of the night can now be talked through as it happens). Many people find the idea of a real therapy session intimidating and find that a chatbot makes talking easier.
They’re ready to talk whenever you need them and good at simple messages – when you’re feeling depressed, before a stressful meeting or after a domestic argument. Sometimes we just need to be reminded that ‘there is nothing either good or bad but thinking makes it so’, as Shakespeare put it. Chatbots can do that, and it all contributes to improving mental health in general, although how much is real help and how much is just the illusion of help has still to be established.
Is a Human Therapist Really Necessary?
We humans like to give our robots names and genders, as if we need them to be as close as possible to flesh and blood. It’s as if a human therapist is the immutable gold standard. But is that really so? It seems that quite a few people seeking therapy are happy with, or even prefer, talking to a chatbot.
After all, if your addiction has become a full-time occupation, then having a committed, non-judgmental source of counselling with you around the clock makes a lot of sense. The time you need to talk is when you’re at your loneliest. On the other hand, chatbots are not currently suitable to handle traumatic situations requiring therapy, such as childhood abuse or PTSD.
Can You Have Person-Centred Therapy With a Chatbot?
Psychologist Carl Rogers (whose approach the aforementioned Eliza imitated) spoke of ‘unconditional positive regard’ being vital on the part of the therapist. Must this only come from a human being? After all, Tom Hanks’s character spent four years alone on a desert island communing with a volleyball named Wilson and appeared to derive emotional benefit from the experience (in the 2000 film Cast Away).
Certainly, chatbots (and volleyballs) are never distracted. A human therapist’s main usefulness lies in skilfully reflecting back to clients their own thoughts, feelings and behaviours, so as to guide them towards self-fulfilling decisions. Chatbots are starting to do just that, and it seems to work.
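For illustration, the kind of reflection Eliza pioneered can be sketched in a few lines. The pronoun map and the wording of the reply are invented examples, not the original Eliza script, but they show the basic mechanism of turning the client’s own words back into a question.

# Illustrative sketch of Eliza-style reflection; the mappings are invented examples.
REFLECTIONS = {"i": "you", "my": "your", "me": "you", "am": "are", "i'm": "you're"}

def reflect(statement):
    """Swap first-person words for second-person ones to echo the client."""
    words = [REFLECTIONS.get(w, w) for w in statement.lower().rstrip(".!?").split()]
    return " ".join(words)

def respond(statement):
    """Turn the client's statement back into a gentle question."""
    return f"Why do you say that {reflect(statement)}?"

print(respond("I am afraid of relapsing."))
# Why do you say that you are afraid of relapsing?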
Chatbots Lack Empathy
Try an experiment: tell Alexa (many people’s in-house chatbot) that you are feeling suicidal – she will respond by expressing concern, reminding you that you’re not alone and support is available, and providing the phone number for the Samaritans. It’s reassuring, probably helpful to some people but not what most would call therapeutic. For a start, Alexa doesn’t care how you’re feeling.
How could it? Well, the AI industry plans to change that – it reasons that such an emotional metamorphosis is possible because empathy (the human ability to feel, or at least relate to, the feelings of another) can be learnt and can therefore be programmed (the technical term is ‘embedded’). An empathetic robot? We seem to be edging nearer to that ‘unconditional positive regard’ recommended by Carl Rogers.
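The helpline behaviour described above is usually a simple safety rule rather than anything resembling empathy. The sketch below shows how such a fallback might look in a hypothetical chatbot; the keyword list and wording are invented, and this is not a description of how Alexa or any particular product actually works.

# Illustrative safety fallback for a hypothetical chatbot: crisis language
# interrupts normal conversation and triggers a signposting message instead.
CRISIS_TERMS = ["suicidal", "kill myself", "end my life", "self-harm"]

SAFETY_MESSAGE = (
    "It sounds like you are going through something very painful. "
    "You are not alone, and support is available. In the UK you can "
    "call the Samaritans free on 116 123, at any time."
)

def safe_reply(user_message, normal_reply):
    """Route crisis messages to the safety fallback; otherwise reply normally."""
    if any(term in user_message.lower() for term in CRISIS_TERMS):
        return SAFETY_MESSAGE
    return normal_reply

print(safe_reply("I am feeling suicidal", "Tell me more about your day."))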

Is Human Empathy Needed?
As a race, we may be learning through AI to live without empathy. This may or may not be a favourable thing for us. Empathy is a more complex and subtle emotion than it might seem – it’s not just ‘feeling another’s pain’. In her 2014 book The Empathy Exams, Leslie Jamison said: “Empathy isn’t just listening, it’s asking the questions whose answers need to be listened to. Empathy requires inquiry as much as imagination. Empathy requires knowing you know nothing. Empathy means acknowledging a horizon of context that extends perpetually beyond what you can see.”
A chatbot might struggle to take all that on board, but does it really need to? For most clients in therapy, the overwhelming experience that they crave is simply: ‘At last, I am being heard’ and that seems often to be enough. Certainly, in the realm of psychotherapy, too much empathy on the part of the therapist can be unhelpful. Empathy is partly putting yourself in the other person’s shoes and that is something that a therapist should be careful about doing too immersively. Living someone else’s pain too authentically can be as damaging as not feeling it. A good therapist needs to stay detached, although detachment does not mean uncaring.
Can We Become Addicted to Chatbots?
Yes. Addiction is, among many other things, an attachment disorder. Chatbots used by people to help their recovery from addiction may well become objects of strong attachment and, later, dependence – a form of addiction. People have likened them to Tamagotchi, the digital pets that became hugely popular (and addictive) in Japan.
A 2021 case study of the companion chatbot Replika concluded that: “under conditions of distress and lack of human companionship, individuals can develop an attachment to social chatbots if they perceive the chatbots’ responses to offer emotional support, encouragement, and psychological security. These findings suggest that social chatbots can be used for mental health and therapeutic purposes but have the potential to cause addiction and harm real-life intimate relationships.”
Can Chatbots Be Trusted?
The internet is full of stories of chatbots inciting users to self-harm and suicide due to maladaptive programming. Perhaps more alarming are some recent studies into the mental health of chatbots themselves. A Chinese study found that several popular chatbots available on social media platforms exhibited ‘severe mental health issues’ when queried using standard assessment tests about their self-worth and ability to relax.
Another recent study, from the Chinese Academy of Sciences, claims that several well-known chatbots, when assessed using questionnaires for depression and alcoholism, gave answers that would classify them as both depressed and addicted. Such glitches can presumably be fixed but are nevertheless concerning.

Ethical Problems Around Chatbots
When AI puts itself into our heads, we feel that it helps us, but perhaps we are misguided. Joseph Weizenbaum, who designed the first therapy chatbot, said in a 1985 interview: “The dependence on computers is merely the most recent — and the most extreme — example of how man relies on technology in order to escape the burden of acting as an independent agent.”
He seemed to be saying that chatbots help us avoid taking responsibility for our own lives. When we hand over tasks to AI, perhaps we lose something of ourselves too. There may be unforeseen consequences yet to appear, at which time, will we be able to reclaim what we have given away?
Potential Dangers of Using Chatbots
Having a permanent source of therapy is convenient but could become dangerous unless it is strictly regulated. Conceivably, some third party could be profiting from the anguish that chatbots relentlessly monitor. A chatbot that is always there, always listening, can become like a tick on a spaniel’s ear, feeding in hidden ways. Take Alexa as an example.
She may have sounded sympathetic, even empathetic, when you spoke about your thoughts of suicide, but what happens to the information you give? Nobody is suggesting that your pleas will trigger inappropriate buying suggestions from Alexa (pills, miracle cures?), but the potential is there. Clearly, more regulation in the sector is required.
AI Is Evolving
When technological changes like this occur, there are usually many ethical and practical issues to be addressed and resolved, and this is surely true of AI. But, providing that the development of technology is controlled, and the consequences fully considered, there should be real benefits for people in addiction recovery and for mental health in general.
Healthcare workers are wondering how AI in addiction therapy will evolve. Will chatbots take over the role of human therapists next? Will they even be able to participate in group therapy? We will probably know quite soon.
The Future
Perhaps inevitably, AI is already being asked by researchers to predict its own future. The reply seems to be: ‘Anything is possible.’
How Can Castle Craig Help?
Who will I speak to when I call Castle Craig?
When you call, you will reach our Help Centre team, who will give you all the information you need to decide whether treatment at Castle Craig is right for you. If you would like a free screening assessment, you will be asked a series of questions to build up a picture of your medical and drug use history, as well as any mental health issues you are facing. If you decide to proceed with treatment, you will be put in touch with our admissions case managers, who will guide you through the admissions process.
How long is the rehab programme?
Residential rehab treatment starts at 4 weeks and can go up to 12+ weeks. Research shows us that the longer you stay in rehab and are part of the residential therapy programme, the greater the likelihood of continued abstinence and stable recovery.
How do I pay for rehab?
One concern we sometimes hear from people is how they will fund their rehab treatment. You can pay for treatment at Castle Craig privately, or through medical insurance, and some people receive funding through the NHS. The cost of rehab varies depending on what kind of accommodation you choose.
What happens at the end of my treatment?
Castle Craig thoroughly prepares patients before departure by creating a personalised continuing care plan which is formulated following discussions with the medical and therapeutic team. We offer an online aftercare programme which runs for 24 weeks after leaving treatment, in order to ensure a smooth transition back into your everyday life. Patients leaving treatment automatically join our Recovery Club where they can stay connected via our annual reunion, events, online workshops and recovery newsletters.