
People are turning to AI apps like ChatGPT for therapy

AI apps like ChatGPT are rising in popularity, but one expert warns people might be using them for all the wrong reasons.


Cast your mind back to the first time you heard the phrase, “Google it.”

Early to mid-2000s, maybe? Two decades later, “Googling” is swiftly being replaced by “Ask ChatGPT.”

ChatGPT, OpenAI’s groundbreaking AI language model, is now having anything and everything thrown at it, including being used as a pseudo-therapist.

Relationship issues, anxiety, depression, mental health and general wellbeing – for better or worse, ChatGPT is being asked to do the heavy lifting on all of our troubles, big and small.

This is a big ask from what was infamously labelled a “bullshit machine” by Ethics and IT researchers last year.

ChatGPT is being asked to do the heavy lifting on all of our troubles. Picture: Sebastien Bozon/AFP

The role of AI in mental health support

A recent report from OpenAI showed how people were using the tool, including for health and wellbeing purposes.

As artificial intelligence is accepted into our lives as a virtual assistant, it is not surprising that we are divulging our deepest thoughts and feelings to it, too.

There are a variety of therapy apps built for this specific purpose. Meditation app Headspace has been promoting mindfulness for over a decade.

But with the rise of AI over the last few years, AI-powered therapy tools now abound, with apps such as Woebot Health, Youper and Wysa gaining popularity.

It’s easy to pick on these solutions as gimmicks at best and outright dangerous at worst. But in an already stretched mental healthcare system, there is potential for AI to fill the gap.

According to the Australian Bureau of Statistics, over 20 per cent of the population experience mental health challenges every year, with that number continuing to trend upwards.

When help is sought, approaches that rely on more than face-to-face consultations are needed to meet demand.

People are turning to AI for health and wellbeing purposes. Picture: Supplied

Public perception of AI therapy apps

The prevalence and use of AI therapy apps suggest there is a shift in the public perception of using tech to support mental health.

AI also creates a lower barrier to entry. It allows users to try these tools without needing to overcome the added fear or perceived stigma of seeing a therapist.

That comes with its own challenges, notably a lack of oversight of the conversations taking place on these platforms.

There is a concept in AI called human-in-the-loop. It embeds a real-life professional into AI-driven workflows, ensuring that somebody is validating the outputs.

This is an established concept in AI, but one which is being skipped over more and more for pure automation.

Healthcare generally has human-in-the-loop feedback systems built into it – for example, a diagnosis is double-checked before action is taken.

Strict reliance on AI apps alone typically skips this part of the process.

There are risks involved in using ChatGPT this way. Picture: Supplied

The risks of replacing human therapists with technology

The fact is we are asking important questions of something that does not have genuine, lived experience.

For a start, OpenAI states that ChatGPT has not been designed to replace human relationships.

Yet language models are general purpose tools, and users will inevitably find a way to put them to work in new and unexpected ways.

There are few conversational limits in place and it is available to users all day, every day.

Combine that with its natural communication style, neutral emotional register and ability to simulate human interaction, and treating it as a confidant is a logical development.

But it is important to remember: whatever wisdom it imparts is a pastiche of training data and internet sources.

It cannot truly know if its advice is good or bad – it could be convincingly argued that it also does not care if it is giving you the right advice.

OpenAI states that ChatGPT has not been designed to replace human relationships. Picture: Marco Bertorello/AFP

Let’s push this thought further: AI does not care about your wellbeing. Not really.

It will not follow up if you don’t show up on schedule, nor will it alert carers or the authorities if it believes something is wrong.

We get into even pricklier territory when we recall that AI wants you to respond well to it, which increases user preference ratings and keeps you coming back for more.

This is where living, breathing therapists are key. Their gut instincts are not running on any definable algorithm.

They use their knowledge of a patient and their years of training and experience in the field to formulate care plans and appropriate responses if things are going off track.

“The risk is that people see new tech as a panacea,” says Macquarie University Psychological Sciences Professor Nick Titov.

“But we are working with very vulnerable people. We have a responsibility and duty of care.”

Titov is Executive Director of MindSpot, a digital psychology platform funded by the Australian Government.

The free service seeks to remove obstacles to accessing mental health support. Key to the platform is the ability for people to access real, qualified therapists.

“Whether it’s our mental health or general health, use cases will always differ, and so there are nuances which must be considered. Tech alone is not an end-to-end solution.”

Artificial intelligence is being accepted into our lives as a virtual assistant. Picture: Getty

Real vs. simulated care

So, while AI support might not be “real”, does the distinction actually matter if the user feels cared for?

As long as users feel AI solves or alleviates their immediate concerns, it will continue to be used.

But the majority of people seeking AI-driven therapy will turn to largely unmonitored platforms – including tools like ChatGPT, which were not purpose-built.

One promising approach mixes the supervision of real-life professionals with AI.

Australia-based clinical psychologist Sally-Anne McCormack developed ANTSA, an AI therapy platform which gives therapists visibility of conversations their clients have with the AI-powered chatbot.

“I see AI as a support tool,” McCormack says. “But with most apps, you don’t know what the AI is saying, and you don’t know what the clients are saying back.

“I couldn’t believe nobody was overseeing it.”

More people than ever are suffering from mental health issues. Picture: Supplied

The app provides users with prompts and recommendations, but does so under the watchful eye of their treating practitioner.

“We make it clear to users that you are speaking to AI. It is not a real person and your practitioner may view the conversation,” she said.

“Even so, clients are telling our AI things they have never told their practitioners. In that moment, there’s no judgement.”

Convenience, availability, lack of judgement – all of these are factors in people using AI for everyday tasks.

Just as “Google it” reshaped how we seek information, “Ask ChatGPT” is reshaping how we build a spreadsheet, create stories, seek advice – and ultimately navigate this thing called life.

But maybe mental health support demands something fundamentally more human.

The ongoing challenge will be deciding precisely how AI and human expertise come together.

Tristan Foster is a digital technology specialist. He is Principal Consultant at Grand West Consulting and writes on developments in AI.
