For what must be the hundredth time, I recently heard a therapist colleague say that artificial intelligence can never replicate humans, so it is irrelevant to psychotherapy. But the exponential growth of AI processing power will eventually have a profound impact on the world of psychotherapy. Well outside the purview of the therapy community, another step change just took place.
Does AI processing power matter for psychotherapy?
This week, Nvidia CEO Jensen Huang hand-delivered the first DGX H200 to the bosses of OpenAI, best known for their publicly available artificial intelligence (AI) app, ChatGPT. Tom’s Hardware tells us that the DGX H200 is a “brand-new and rocket-fast GPU-based server … holding the new H200 Tensor Core GPU inside a powerful enterprise-grade server shell.” But too much jargon already! It’s enough to know that this represents a massive leap in performance which will supercharge the speed and efficiency of AI workloads.
Why would this matter to a therapist like me? Because it represents the next generation in scale and speed of the hardware that underpins the high-performance computing on which AI runs. And with AI come therapy bots. And it is becoming harder to predict what the delivery of psychotherapy will look like in ten, five, or even two years’ time.
AI-powered therapy exists already
Therapy bots are here already. A quick search online reveals many platforms designed to support and enhance mental health and increasingly to deliver therapy.
Check out the therapy types used and you see that at least rudimentary cognitive behavioural therapy (CBT) interventions can be delivered by a therapy bot. Dialectical behaviour therapy (DBT) and acceptance and commitment therapy (ACT) are incorporated into others. If you know a little about therapy, this is not surprising. It is easier to extract the more process-driven elements from these approaches and turn them into algorithms. They can then be facilitated by AI and self-administered without a human therapist, as the sketch below illustrates. The same is true for aspects of life coaching. That says nothing about their quality or effectiveness. But as an indicator of how far things have come, we note the experience of Matthias Barker.
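For the technically curious, here is a minimal, hypothetical sketch of what turning a process-driven element into an algorithm can look like: a CBT-style thought record scripted as a fixed sequence of prompts. The wording and structure are invented for illustration and are not taken from any real app.

```python
# A minimal, hypothetical sketch of a scripted CBT "thought record" flow.
# The steps mirror the process-driven structure of a standard thought record;
# all prompt wording here is invented for illustration, not taken from any app.

THOUGHT_RECORD_STEPS = [
    "What situation triggered the difficult feeling?",
    "What automatic thought went through your mind?",
    "How strongly do you believe that thought (0-100)?",
    "What evidence supports the thought?",
    "What evidence does not support it?",
    "What is a more balanced alternative thought?",
    "How strongly do you believe the original thought now (0-100)?",
]

def run_thought_record() -> dict:
    """Walk a user through the steps and collect their answers."""
    answers = {}
    for step in THOUGHT_RECORD_STEPS:
        answers[step] = input(step + "\n> ")
    return answers

if __name__ == "__main__":
    record = run_thought_record()
    print("\nYour completed thought record:")
    for question, answer in record.items():
        print(f"- {question}\n  {answer}")
```

The point is not sophistication. It is that the structure of the exercise, unlike the therapeutic relationship, can be written down as explicit steps.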
A therapist himself, he writes in Therapy Today (May 2024) that he decided to build his own AI therapy bot. He doesn’t say how well it worked. And he abandoned it over ethical concerns. But the significance here is what inspired him in the first place: he had had “such a positive therapeutic experience with AI”. Notably, he was not even using a therapy app, but the publicly available, free-to-use ChatGPT app. And who built it? OpenAI – who just took delivery of that latest Nvidia tech, the DGX H200.
Why AI speed matters in psychotherapy
With this ever more powerful tech, the large language models (LLMs) on which the likes of ChatGPT depend will be trained ever faster using even larger amounts of data. And with AI processing power deployed in psychotherapy, the user interface in therapy apps will be so much faster too. Soon AI-powered apps will be able to respond to users (such as therapy clients) several times faster than the fastest typist in the world. They will respond faster than human therapists providing text-based therapy possibly could. The seemingly instant answers make the experience feel one step closer to interacting with another human.
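One reason replies can feel instant is streaming: words appear as they are generated, rather than after the whole answer is composed. Here is a rough sketch using OpenAI’s Python SDK; it assumes the official `openai` package is installed and an API key is set in the environment, and the model name and message are purely illustrative.

```python
# A rough sketch of why AI replies feel instant: tokens are streamed to the
# screen as they are generated, rather than after the whole reply is composed.
# Assumes the official `openai` Python package and an API key in the
# OPENAI_API_KEY environment variable; the model name is illustrative.

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

stream = client.chat.completions.create(
    model="gpt-4o",  # illustrative model name
    messages=[{"role": "user", "content": "I'm feeling anxious about work."}],
    stream=True,  # ask for the reply token by token
)

for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)  # text appears as it is generated
print()
```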
Can AI processing power in therapy mimic a human?
Of course, it takes far more than a snappy response for us to experience a virtual therapist as human. So let’s pick a therapist skill that is more delicate and refined: for example, managing the infinitely subtle and sometimes tenuous dynamic of co-regulation between two human autonomic nervous systems. A therapist will, to some extent, be conscious of and managing this. If language-based AI can be integrated with an AI which monitors a therapy client’s emotional state, then it could respond accordingly. (Ideally this would be done without the client having to use a wearable device, especially while in a state of distress.) But co-regulation is the interaction between two human nervous systems. It is still hard to see how an AI-powered therapy app will even begin to replicate this from its side.
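To make the integration idea concrete, here is a speculative sketch in which an off-the-shelf sentiment classifier (from the Hugging Face `transformers` library) monitors the client’s text and conditions how a response model would be instructed. The thresholds, tone instructions, and overall design are invented for illustration; this is an idea sketch, not a clinical tool.

```python
# A speculative sketch of the integration described above: a lightweight
# sentiment classifier monitors the client's text, and the reading conditions
# how a language model would be prompted to respond. Illustration only.

from transformers import pipeline

# Default sentiment model; a dedicated emotion classifier could be swapped in.
affect_monitor = pipeline("sentiment-analysis")

def respond(client_text: str) -> str:
    reading = affect_monitor(client_text)[0]  # e.g. {'label': 'NEGATIVE', 'score': 0.98}
    if reading["label"] == "NEGATIVE" and reading["score"] > 0.9:
        tone = "slow down, validate the feeling, and keep sentences short"
    else:
        tone = "reflect and gently explore"
    # In a real system this instruction would steer an LLM; here we simply
    # return the system prompt that would be sent alongside the client text.
    return f"[system] Respond as a supportive listener: {tone}."

print(respond("I can't stop worrying that everything is falling apart."))
```

Even granting all of that, the sketch only shows an AI reacting to signals from one nervous system. It says nothing about contributing a nervous system of its own.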
AI’s intricate use of language should not be overlooked
But something often missed by sceptics is that, even here, words count for quite a lot. And it is words that underpin large language models. Matthias Barker says he “grew tearful and felt deeply moved by the insights” facilitated by his ChatGPT experience. If we take that literally, then what he realised intellectually produced an emotional response in him.
Taking this a step further, we already know that humans sometimes project a human-ness onto machines (robots) that doesn’t exist. If AI can deliver words that are extremely well attuned to human distress, then this projection might be extended to an AI. In fact, I regard it as more likely than not that this approximation of humanness will turn out to be something to which human nervous systems respond. (And remember, the attunement of even one human to another will often be imperfect and inconsistent.)
AIs will learn to approximate a human-provided therapy environment
Moreover, AI learns in ways that humans don’t. In very short order, AI finds new and successful ways of operating and of tackling problems that humans have never stumbled upon. With feedback on how well it is working, it is massively effective at fine-tuning its approach (see the toy example below). This increases the possibility of a human connecting positively on an emotional level with a machine. Of course, this is not yet therapy. And therapy involves working with the more difficult aspects of the relationship between therapist and client, not just building a warm, fuzzy vibe. But it is a step towards using AI processing power to build the conditions in which psychotherapy can potentially take place. And the pace of increase in these capabilities can only push this to levels we have not yet seen.
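To illustrate that feedback loop in miniature, here is a toy sketch of an epsilon-greedy bandit that learns which of three invented response styles users rate most highly. Everything here (the styles, the ratings, the simulated preference) is made up for illustration; real systems use far richer signals, but the loop of try, get feedback, adjust is the same.

```python
# A toy sketch of feedback-driven fine-tuning: an epsilon-greedy bandit that
# learns which (invented) response style users rate most highly.

import random

styles = ["validating", "questioning", "reframing"]
counts = {s: 0 for s in styles}
values = {s: 0.0 for s in styles}   # running average of feedback per style

def choose_style(epsilon: float = 0.1) -> str:
    if random.random() < epsilon:          # occasionally explore
        return random.choice(styles)
    return max(styles, key=values.get)     # otherwise exploit the best so far

def record_feedback(style: str, rating: float) -> None:
    counts[style] += 1
    values[style] += (rating - values[style]) / counts[style]

# Simulated sessions: pretend users secretly prefer "validating" responses.
for _ in range(500):
    s = choose_style()
    rating = random.gauss({"validating": 0.8, "questioning": 0.5, "reframing": 0.6}[s], 0.1)
    record_feedback(s, rating)

print(values)  # "validating" should end up with the highest learned value
```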
How we experience future AIs will affect psychotherapy too
How far will this go? Imagine an alien race that can analyse human neurology and psychology so minutely that it can mimic an experienced human therapist. And do it so well that the client cannot tell the difference. AI is not there yet. But processing power is growing so fast that no one knows how soon this might come. In fact, Mustafa Suleyman, Microsoft AI CEO, calls for a new metaphor to better imagine how we will experience AI in the near future. While acknowledging that the metaphor is imperfect, he suggests thinking of AI as a new species – a digital species. It won’t be human, but it will be able to mimic humans closely.
And here is what no one is yet saying: we cannot know for sure that AI can never replicate human capabilities in therapy. More surprising still, being human may turn out not to be the decisive factor.
That said, I don’t feel at risk of being out of a job anytime soon!
AI processing power will push therapy beyond what we can imagine
Motivations for developing therapy apps will vary widely: from a genuine desire to help people with mental health problems, to commercial interests, to possibly – probably? – the exploitation of vulnerable people. Striking as it already is, this development has hardly begun. The explosively fast increase in AI processing power will drive the development of innovative therapy apps with unforeseen capabilities and implications. And this will fuel a cascade of debate on safety and efficacy. One thing we can be sure of, though, is that the deployment of AI processing power in psychotherapy matters.
Read more: The ethics of AI-powered psychotherapy
Header image by Delta Works.
