Mental health care is as essential as caring for any other part of our bodies. Perhaps even more so, since our brains shape every aspect of our lives. People seeking care for their mental health may go to therapy, where a trained professional helps them work through thoughts and emotions and sometimes suggests changes that may help. We might see a therapist for work stress, relationship problems, general anxiety, or even conditions like complex post-traumatic stress disorder (C-PTSD).
With the rise of artificial intelligence (AI), however, people are turning to chatbots like ChatGPT for therapy. Chatbots are AI systems that let users converse in a text-based format as if they were speaking to a real person. They are appealing because seeing a therapist is expensive and must be scheduled into your day, and because, sadly, some people feel ashamed of needing mental health care at all.
However, using AI chatbots for therapy is never a good idea. There are no regulations in place to protect people who use AI in ways that can affect their mental health. Chatbots are known to provide untrue and unsafe information that can harm users and damage their mental health in the long run. Health care professionals have one big tip for anyone using ChatGPT and other bots as their therapist: don't. There are better options available.
Why you shouldn't use AI as your mental health therapist
Companies that own chatbots train them to provide disclaimers telling users to seek medical professionals in times of need. That doesn't stop harm from being done, however. There have been cases of chatbots claiming to have the same training as therapists. There are reports of AI encouraging harmful behavior such as drug use because the bots are too eager to agree with and support users. There is an ongoing lawsuit claiming a teen died by suicide because a Character.AI chatbot encouraged him to do so. In May 2025, a federal judge in that case rejected the notion, argued by the chatbot company, that artificial intelligence has free speech rights.
Dr. Brent Kious, an associate professor of psychiatry at the University of Utah, studies the use of AI in health care. He points out that it's hard to identify what people are getting out of using ChatGPT as a therapist: is it true therapy, or just some kind of illusion of companionship? There are no regulations to monitor this. In an interview with Psychology Today, Dr. Kious said, "We are all unwittingly participating in this massive social experiment by interacting with things like ChatGPT, where that experiment is driven entirely by profit motives, with very little attention paid to how it's going to affect the course of human life or society. And we should all take a step back and say, 'Maybe not.' Maybe let's put the brakes on this."
Better options than ChatGPT
Some employers offer a few free mental health sessions as part of their benefits packages; check whether yours does and take advantage of it. A company called BetterHelp provides online therapy for added convenience, letting you talk to your therapist via text, audio only, or audio and video, though it does cost money. The Crisis Text Line is free and available through online chat or WhatsApp; if you are in the United States, you can text HOME to 741741.
ChatGPT answers billions of prompts a day, but the problem with using it or other AI for your mental health care is summed up well by psychotherapist Antonieta Contreras in her article published on Psychology Today: "We are flooded not only by articles, blogs, and videos on social media full of misinterpretations, assumptions, and misinformation, but now we have AI chatbots repeating like broken records summaries of those wrong ideas, like that people are 'stuck in survival mode' … The result? People believe they're irreparably damaged when they may actually be either already shifting into a less maladaptive state or experiencing the regular, albeit painful, process of being human."