
People Are Turning to AI for Therapy—But Is It Safe?

What if you had a therapist available to you any time of the day or night, and you never had to coordinate your schedules to book an appointment or worry whether they accepted your insurance for payment? On top of that, what if you felt instantly comfortable opening up to them and they were always present for you, no matter how many times you sought their counsel?

If this sounds like an ideal scenario, many would agree—and this is why an increasing number of people are turning to AI as an ever-present listening “ear” to help them solve their problems. 

People seeking emotional and mental health support from chatbots like ChatGPT and DeepSeek are discovering that these platforms are not only effective but also surprisingly poignant and understanding in their responses. One woman told the BBC that she uses DeepSeek for nightly “therapy sessions” and that she “teared up reading” a reply the chatbot gave in response to her woes, stating, “Perhaps because it’s been a long, long time since I received such comfort in real life.” Others who have used mental health apps featuring a chatbot make bold proclamations like, “Although he’s a robot, he’s sweet. He checks in on me more than my friends and family do” and “This app has treated me more like a person than my family has ever done.”

To dive deeper into this topic, I spoke with a psychotherapist and a psychologist for expert input on the trend, as well as with the founder of Abby, a chatbot developed to provide mental health support.

Chatbots “bridge the gap” for those who can’t access or afford therapy

It’s not just a happy accident that people are pleasantly surprised by the AI-generated support they’re receiving. Chatbots are providing a vital service: Therapists are in high demand, but their availability is extremely limited. AI is filling the void—and doing so without bias.

Jessica Jackson, Ph.D., licensed psychologist and founder of Therapy is For Everyone Psychological & Consultation Services, PLLC, says that people turning to AI for therapy is “not surprising at all. The mental health system has long struggled with barriers to care, whether due to cost, long wait times or a lack of providers who reflect the identities and experiences of those seeking care. AI-powered mental health tools are attempting to fill some of these gaps by offering immediate, low-cost, or even free support.”


According to The National Institute for Health Care Management (NIHCM) Foundation, “49% of the U.S. population lives in a mental health workforce shortage area,” and “six in 10 psychologists report not having openings for new patients.” Additionally, according to a study published in Cyberpsychology, Behavior, and Social Networking, even when an AI “conversational agent” had data on a patient’s age, race or ethnicity, gender and annual income, it provided unbiased counseling.

Chatbots offer a desirable alternative to human-based therapy—and not just because socioeconomic factors or limited availability make it difficult to find a therapist. For some people who don’t want to be vulnerable in front of another person, finding a human therapist they feel comfortable opening up to can be a dealbreaker in itself.

Psychotherapist Ben Caldwell, Psy.D., LMFT, concurs. “It’s hard to blame anyone for turning to AI for therapeutic support,” he says. “It’s notoriously difficult to access a human therapist, even if your health insurance covers it. Accessing support through AI is cheap, easy, always available and incapable of judging you. None of those tend to be true of human therapists.” 

This shortage of mental health providers is one of the reasons that prompted Julian Sarokin, founder of Abby, to develop a tool that he says is “designed to be more than just a chatbot.” He explains, “Our mission is to make mental health support accessible to those who otherwise couldn’t afford traditional therapy. A significant portion of the population is completely priced out of professional mental health care, and Abby exists to bridge that gap—not to replace therapists but to provide an alternative for those who would otherwise have no support at all.”

AI therapy is preferred over human input in some cases

AI therapy can be extremely helpful and convenient for people dealing with problems that aren’t life-threatening. In one study, AI responses generated in therapy were even preferred by users over human-written responses.

Tools like Abby are specially programmed to provide AI-powered support. Sarokin explains that Abby is “a personalized companion that… picks up on patterns in your thoughts, emotions and concerns, allowing it to tailor responses, suggest relevant insights and offer guidance that feels increasingly aligned with your personal journey.” Additionally, Abby “dynamically adapts to conversations in real time and moves fluidly between different therapeutic modalities based on the user’s input, adjusting its approach as it learns from feedback.”

In what types of circumstances is it safe to use AI for therapy?

According to Caldwell, “AI can help you… navigate stressful moments where you could use some immediate support but aren’t in a crisis.” He adds that “for what we might think of as the problems of daily life, like problems in friendships or at work, AI seems relatively safe…. For people who feel isolated, stuck or even like there might be something wrong with them, this experience can be a tremendous comfort.”

However, Caldwell cautions that “even in these circumstances, it’s important to remember the risks and limitations that come along with talking to an algorithm. If its responses don’t feel right, they probably aren’t right.”

Jackson says she’s “hesitant to use the word… ‘safe,’” adding that “there is still a lot of work to be done to determine what is safe and what is not. AI can be useful for psychoeducation, skill-building, mood tracking and guided self-help exercises, particularly for individuals dealing with mild stress, anxiety or general emotional distress.”

Human intervention is still necessary

Although Sarokin developed Abby to help people deal with their problems, he doesn’t believe that AI should entirely eclipse human therapists. “There will always be a vital role for human therapists,” he says. “Human connection and professional expertise are irreplaceable, and Abby is not designed to take their place. Instead, it serves as a stepping stone, a supplement or a starting point for those who need help but lack access.”

Caldwell wants users to know that there are limitations to the support AI can provide. “If you’re in crisis, you need to talk to a person,” he advises. “That’s especially true if you’re having thoughts of suicide, self-harm or harming others. You should also talk to a human therapist for issues of serious mental illness, abuse or trauma.”

Jackson echoes this. “More complex concerns—such as trauma, severe depression, suicidal ideation, psychosis or intricate interpersonal issues—require the nuance, empathy and clinical judgment of a trained human professional,” she says.

Things to consider when using AI for therapy 

Jackson also warns that chatbots may give AI therapy users “the illusion of emotional connection.” She explains that “people might feel ‘heard’ by an AI chatbot, but that connection is not reciprocal or truly relational. Unlike a human therapist, AI does not possess genuine empathy, ethical responsibility or the ability to follow up in a meaningful way.”

Additionally, “AI models… can generate biased, misleading or inappropriate responses. If someone in distress receives advice that invalidates their experience or fails to recognize a crisis, the consequences could be severe,” she says.

If these limitations are taken into consideration, AI can be used as a helpful therapy tool in certain circumstances. Sarokin hopes that Abby “can provide a safe space to share your thoughts, gain clarity and feel heard…” and be used “to receive guidance and support when you need it most.” At the end of the day, he wants people to know that “you’re not alone.”

The advice and information shared herein is not a substitute for seeking support from a human mental health professional, nor is it intended to replace care, guidance or emergency intervention provided by a human therapist when needed.

Photo by dodotone/Shutterstock

The post People Are Turning to AI for Therapy—But Is It Safe? appeared first on SUCCESS.