NEWS

Chatbots and AI Are Changing Mental Health Care, but Beware

Person typing on ChatGPT. Galeanu Mihai / Getty Images

Key Takeaways

  • Some people are turning to AI platforms like ChatGPT for mental health support, but this is risky because these tools aren't qualified medical professionals.
  • Purpose-built AI chatbots for mental health are being developed, however.
  • AI in mental health is still relatively new but could become more useful in the future.

Since its launch at the end of November 2022, ChatGPT has captured the internet's imagination. The artificial intelligence chatbot developed by OpenAI has a seemingly unlimited number of uses for generating information, and it doesn't stop at writing essays and cover letters. Now, people are also starting to use it as a therapist.

How does this work? ChatGPT provides information in a conversational way, so if you ask it a question about your feelings of hopelessness and fear about the future, it can offer tips quickly. It may not be a real person, but it delivers information without making you sift through pages of search results, and it doesn't keep you waiting for an appointment with a human (or online) therapist.

The problem is, ChatGPT isn't designed to be a therapist, and its training data isn't updated past 2021. It should go without saying that there are numerous risks to taking health suggestions from an AI chatbot that never went to medical school, but it does spark a broader conversation about the potential uses of AI in the mental health sector, when applied appropriately.

“ChatGPT and the conversations around [its] impact could actually open up more opportunities,” explains Chaitali Sinha, head of clinical development and research at Wysa. “Rather than fear around chatbots, there is now an expectation they can be used for positive impact. People are expecting more out of AI than they used to.”

Potential Benefits of AI in the Therapy Context

Even if we discount ChatGPT for the moment, there are still a lot of advancements being made in mental health and AI. Platforms and apps like Wysa are helping to change the way we think about mental health. As Sinha says, “It should be just as normal to have a mental health app on your phone and in your hand as any other app.”

Human therapists are just that: human. They can be biased, even if only subconsciously. A therapist could be racist, homophobic, or ableist, and using an AI therapist removes the risk of ending up with a clinician whose biases work against you.

People might also worry about their therapist judging them or sharing confidential information, or they may not feel quite ready to speak to a real person about sensitive topics, particularly if they've had negative experiences with therapy in the past.

Not only that, but therapists can get sick, or have things going on in their own lives that mean they need to cancel or postpone appointments. They might burn out, or not be on top form for a session. In contrast, an AI therapist won't.

In the future, it is likely that AI-based tools will be able to provide more personalized and tailored treatments for individuals suffering from various mental health issues by utilizing a variety of data points such as lifestyle habits, medical history, and more.

— Ergo Sooru, CEO of DrHouse

And there are often long waiting lists for therapy. A human therapist can only speak to one person, or group, at a time. And they need sleep, lunch breaks, and time away from work too. Could AI therapy get waiting lists down, or become a more affordable option for people struggling to find the money for therapy? 

“People don’t experience struggles at ‘convenient’ times,” says Sinha. “Their mental health is always there, becoming more challenging when it does, not when it’s easiest to get help.

“But none of our therapy mechanisms that we have so far actually are designed for the lived experience of someone with mental health [challenges], and AI is bridging the gap.”

She notes that people will often have an appointment once a week, whereas they might need support at another time. She describes AI as “bridging the gap of the needs people have” in a way that humans just can't.

“People are unlikely to call crisis support helplines every night, but can get support from apps like Wysa, and health professionals can provide Wysa to use in between sessions, so people are not left with no support.” 

Drawbacks of Artificial Intelligence

There are potential drawbacks to AI therapy too. 

Possible security and privacy issues are a big one. When you share information with an AI therapist, where does it go? Could there be risks of hacking, for instance? Indeed, the ethics of AI in mental health care is something that’s previously been discussed, with researchers weighing up the trade-off between more personalized care and concerns around privacy, autonomy, and bias.

Also, human connection is really important in therapy. Even though it's a doctor-patient relationship, the face-to-face interaction is a huge part of what makes therapy effective. It's why people open up.

“AI-based therapies are not able to interpret the non-verbal cues which are essential for a successful therapeutic relationship,” says Ergo Sooru, co-founder and CEO of DrHouse. “Nor are they able to provide the same level of emotional support and understanding as a human can.”

The development of AI therapists is certainly exciting. But we aren’t quite there yet. Even the best AI therapist won’t be of the same standard as a good, qualified, and capable human therapist. 

What This Means For You

The possibilities for the future of mental health care are vast, and artificial intelligence might be a viable option down the road. But right now, people should be wary of the temptation to use tools like ChatGPT for mental health advice. The technology just isn't there yet. And even if it gets there one day, there is no substitute for in-person therapy; the goal should be supplemental care, not a replacement for talking to a certified professional, one who is human.

Incorporating AI Into Mental Health Care

A combination of human and AI therapy is probably the best option looking forward: quick AI support when you're in dire need, with human care as your main, ongoing treatment. After all, everybody is different. For some people, a human therapist might be the only option they entertain. Others might be more open to AI therapy.

Sooru says that there should be a great degree of human intervention in AI therapy “to ensure that the patient is receiving the best care possible.”

But AI can still be used in mental health care even if the therapists themselves aren't AI. In one study, AI categorized thousands of statements made by therapists across roughly 90,000 hours of online cognitive behavioral therapy (CBT) sessions, and the findings support the idea that CBT content is linked with improvement in patient symptoms.

Studies like this one show how AI can support mental health care in other ways, particularly as advancements continue to be made. 

The Future of Mental Health and AI

“It's still early days for AI related to mental health, although clearly changes are happening,” says Sinha. “But what we are seeing is an appetite for a digital solution. Wysa might be artificial intelligence, but the team behind it are very much humans, and bring immense expertise and experience to ensure that Wysa provides empathy and compassion.

“What we want is to offer people the tool that works for them, on their terms.”

“In the future, it is likely that AI-based tools will be able to provide more personalized and tailored treatments for individuals suffering from various mental health issues by utilizing a variety of data points such as lifestyle habits, medical history, and more,” adds Sooru.

“Overall, AI-based tools can be beneficial in aiding mental health, but currently, they should not be used as a replacement for traditional forms of therapy.”

4 Sources
Verywell Mind uses only high-quality sources, including peer-reviewed studies, to support the facts within our articles. Read our editorial process to learn more about how we fact-check and keep our content accurate, reliable, and trustworthy.
  1. Yang Y, Hayes JA. Causes and consequences of burnout among mental health professionals: A practice-oriented review of recent empirical literature. Psychotherapy. 2020;57(3):426-436. doi:10.1037/pst0000317

  2. Rubeis G. iHealth: The ethics of artificial intelligence and big data in mental healthcare. Internet Interventions. 2022. doi:10.1016/j.invent.2022.100518

  3. Ewbank M, Cummins R, Tablan V, et al. Quantifying the association between psychotherapy content and clinical outcomes using deep learning. JAMA Psychiatry. 2020;77(1):35-43. doi:10.1001/jamapsychiatry.2019.2664

  4. Koutsouleris N, Hauser TU, Skvortsova V, De Choudhury M. From promise to practice: towards the realisation of AI-informed mental health care. The Lancet Digital Health. 2022;4(11):e829-e840. doi:10.1016/S2589-7500(22)00153-4