
Could your next therapist be a chatbot?

Reviewed by Stephanie Steinman, PhD, CSAC

[Image: a pair of hands typing on a keyboard, one human and one robotic]

After an especially hard day of parenting, I collapse into a chair and log in to Woebot, an app driven by artificial intelligence (AI) that bills itself as “your personal mental health ally.”

The app gives me three starting options, and I choose “Get help with a problem.”

“Tell me more!” says a chat bubble. (The program used in this type of app is known as a chatbot.)

“My daughters are fighting, and I’m stressed.”

Woebot gives me another set of options: Work on the stress part of my problem, or work on the relationship part. They’re so intertwined, I’m not sure how to separate them.

I choose “relationship,” and a list of eight emotions pops up, with corresponding emojis. I pick “angry.” Woebot suggests some breathing exercises, then checks in to ask what I’m thinking.

“I feel like a bad mother,” I type.

“Okay, do you think there may be any cognitive distortions in this thought?” Woebot asks.

After Googling “cognitive distortions,” I spend a few more minutes selecting answers, then give up and close the app. The whole thing feels like a choose-your-own-adventure novel from my childhood. If I pick the right combination of answers, will I feel better? I can’t tell.

Putting a bandage on the mental health crisis

The scarcity of mental health resources in America drew widespread attention during the COVID-19 pandemic, but the problem started long before.

A 2009 study found that 77% of US counties had a severe shortage of mental health professionals, with more than half of those counties’ needs going unmet.1 The pandemic only made this deficit worse.2 Add the recent surge in mental health awareness—a silver lining of the past few years—and you have the recipe for a critical shortage.

This shortage is especially evident in American schools, where young people have borne a significant share of the pandemic’s burden.3 Zach, a certified school counselor in Pennsylvania, notes that the American School Counselor Association recommends one school counselor for every 250 students.4 His state’s average ratio is 360 to 1, while the national ratio is closer to 430 to 1.

Zach sees the real-life consequences of those numbers every day. “I’ve had countless conversations with families who either get put on months-long waiting lists or give up their search for services because it’s simply too difficult,” he says.

Enter the chatbot. Designed with young adults in mind, free AI apps like Woebot are meant to help address our chronically short supply of professional mental health care.5 Zach understands why this technology might appeal to his students, despite its flaws.

“I think many young adults would be more likely to use tech when thinking of mental health,” he says, “whether for the convenience of not having to leave home or to avoid the stigma that unfortunately still remains around therapy and counseling.”

The basics of mental health AI chatbots

Woebot and its main competitor, Wysa, have clinicians and other mental health professionals involved in their development. Both apps claim to deliver cognitive behavioral therapy (CBT), an approach that focuses on the relationship between thought patterns, feelings, and behaviors.

The backbone of AI chatbots is a technique called natural language processing (NLP).6 NLP lets the software interpret the phrases and words of everyday speech and build responses to them. As more users contribute data in the form of conversations, the AI keeps learning and improving.
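To make that concrete, here’s a minimal, purely illustrative sketch in Python. It is not Woebot’s or Wysa’s actual code; the sample phrases, intent labels, and scripted replies are all invented for the example. It shows the basic loop such an app might follow: classify what kind of problem a message describes, then route the user to a scripted response.

```python
from collections import Counter

# Toy training data: user phrases labeled with a hypothetical "intent."
# Real apps learn from far larger logs of anonymized conversations.
TRAINING_DATA = [
    ("my kids are fighting and i am stressed", "stress"),
    ("i feel so anxious about work", "stress"),
    ("i feel like a bad mother", "negative_self_talk"),
    ("i am a failure at everything", "negative_self_talk"),
    ("i am angry at my sister", "relationship"),
    ("my partner and i keep arguing", "relationship"),
]

# Scripted replies, one per intent, echoing the CBT-style prompts above.
RESPONSES = {
    "stress": "That sounds stressful. Want to try a breathing exercise?",
    "negative_self_talk": "Do you think there may be a cognitive distortion in that thought?",
    "relationship": "Tell me more about this relationship.",
}

def tokenize(text):
    return text.lower().split()

def train(data):
    """Count how often each word appears under each intent (a bag-of-words model)."""
    counts = {}
    for phrase, intent in data:
        counts.setdefault(intent, Counter()).update(tokenize(phrase))
    return counts

def classify(message, counts):
    """Score each intent by word overlap with the message; return the best match."""
    words = tokenize(message)
    scores = {intent: sum(bucket[w] for w in words)
              for intent, bucket in counts.items()}
    return max(scores, key=scores.get)

counts = train(TRAINING_DATA)
intent = classify("I feel like a bad mom", counts)
print(RESPONSES[intent])  # routes to the "cognitive distortion" prompt
```

Production chatbots replace these toy word counts with large statistical language models trained on conversation data, which is also why they keep improving as more people chat with them.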

I found Woebot and Wysa pretty simple to use, and I appreciated some of their self-care suggestions. Both have exercises rooted in mindfulness, such as breathing and muscle relaxation. But the chat features felt stale, offering mostly prepopulated responses to generic questions about my initial “problem.”

Both apps include an SOS feature with ways to find emergency services, though the apps won’t contact those services for you. Woebot and Wysa are both free, but Wysa sells an upgrade that lets you meet with a coach for $29.99 a month, a feature the app promotes frequently.

Concerns about AI chatbots

So they’re easy on your budget and can even be fun, but do these apps help address the issues they’re designed to solve? Many experts are unconvinced. According to Emma Bedor Hiland, PhD, author of “Therapy Tech: The Digital Transformation of Mental Healthcare,” this technology has some serious hurdles to overcome.

Chatbots aren’t accessible to everyone. Recent estimates suggest that 22% of American households don’t have access to the internet.7 A smartphone is also required to use chatbot apps, and people living in rural communities—the same places where in-person mental health care is harder to come by—are less likely to have smartphones than city dwellers.8 “Even if a chatbot is created in a way that’s supposed to get basic cognitive behavioral therapy tools into people’s hands,” says Bedor Hiland, “we’re already ignoring the fact that there are people who don’t have access.”

Chatbot interactions aren’t protected by privacy laws. The Health Insurance Portability and Accountability Act of 1996 (HIPAA) is a federal law requiring the creation of national standards to protect sensitive patient health information from being disclosed without the patient’s consent or knowledge.9 Woebot, Wysa, and similar apps are considered “direct-to-consumer,” which means they’re not required to protect users’ medical information in the same way a mental health professional would. “The FDA [Food and Drug Administration] says if it’s direct-to-consumer and you can get it on the Apple App Store or Google Play app store, it’s not covered by HIPAA,” Bedor Hiland explains.

Chatbots haven’t been thoroughly studied. In a recent review of the scientific literature, researchers found that just 11 clinical trials involving chatbots had been conducted as of 2021. The results showed only minor improvements in depressive symptoms when chatbots were used, compared with traditional therapeutic interventions.10 The main clinical study supporting Woebot’s effectiveness at delivering CBT was conducted with a small, homogeneous group of Stanford University students.11

Where artificial intelligence falls short

Bedor Hiland believes the lack of humanity in AI is the core issue when it comes to mental health. The companies behind apps like Woebot and Wysa may have mental health professionals in senior programming roles or on their boards, but “these technologies are programmed with specific parameters, and if the information you share with an app is beyond what it’s technologically able to recognize, it can’t respond effectively,” she says.

Case in point: In 2018, the BBC ran a series of tests on Wysa and Woebot in which the apps failed to recognize the severity of statements users made about their mental health, including reports of child abuse.12 While errors like these are rare, they could be catastrophic for someone experiencing a true crisis.

The problems extend beyond chatbots to AI in general. Advocates and researchers have been sounding the alarm about racial disparities in everyday AI technologies like facial recognition—and in the US criminal justice system, where AI has been used to predict a person’s likelihood of reoffending and was found to be skewed against people of color, especially Black people.13

Is there hope for AI chatbots in mental health?

AI chatbots may not be a one-size-fits-all solution to our mental health crisis, but they can be beneficial when used in specific ways.

Chatbots can help reach more teens. Young people like the students at Zach’s school are impacted by the scarcity of school counselors, and many also face shortages of mental health care in their communities. The American Academy of Child & Adolescent Psychiatry reports there are only 11 child and adolescent psychiatrists for every 100,000 children in the United States.14 While experts work to close that gap, apps like Woebot and Wysa may be able to act as a safety net for a population more inclined to embrace the technology.

AI can be an effective training tool. When clinicians want to look critically at their own work with the goal of improving, they normally have to sort through their session transcripts by hand. Two companies are using AI-driven language analysis to automate this work and identify the language that works best in certain therapeutic scenarios.15 In addition, the Trevor Project, which offers mental health support for the LGBTQIA+ community, is using an AI chatbot to train counselors for a variety of scenarios.16

Chatbots can help you help yourself. “Chatbots are fundamentally about taking responsibility for yourself,” says Bedor Hiland. “We call that ‘responsibilization.’ It’s the idea that things once considered beyond your capacities—as a nonprofessional in the mental health care space, for example—are now within reach.” Having a greater sense of control over your own mental health can be empowering.

AI and the therapeutic alliance

Among the most valued aspects of psychotherapy is the “therapeutic alliance,” a term used to describe the bond between therapist and patient. In a recent study involving Wysa, researchers (including, it’s worth noting, two employed by the company) found that the therapeutic alliance was not only present between a client and the Wysa chatbot, but potentially as strong as the one formed with an in-person therapist.17

Even so, Zach isn’t convinced these apps can help the young adults he counsels. “Therapy takes work,” he says. “I’ve tried it several times and have found it difficult to stick to for a variety of reasons: time commitment, differences in personality, et cetera.” When it comes to apps, he says, “I think it’s possible young people will live with that in-the-moment help. But I don’t believe that real, tangible change and improvement can occur that way.”

Finding the right therapist is a crucial way to start enjoying the benefits of a therapeutic alliance. Visit our directory to look for a mental health professional near you.

The future of AI in mental health

However you feel about artificial intelligence, the technology is here to stay. More and more companies are using AI to streamline aspects of our daily lives, including mental health care. Bedor Hiland believes we can make strides toward more ethical and equitable AI if we examine it through a critical, realistic lens.

“My hope is that my research helps people become more aware about what technology can offer and what we imagine technology can do,” she says. “But also what its limits truly are.”


This month, therapist.com is exploring the future of digital mental health. What’s next for therapy, now that tech has truly entered the room? The first piece in this series takes a close look at virtual reality and its evolving role in mental health treatment, especially exposure therapy; and the final piece explores whether teletherapy has fulfilled its promise of making mental health care more accessible.


About the author

Amye Archer, MFA, is the author of “Fat Girl, Skinny” and the coeditor of “If I Don’t Make It, I Love You: Survivors in the Aftermath of School Shootings.” Her work has appeared in Creative Nonfiction magazine, Longreads, Brevity, and more. Her podcast, “Gen X, This Is Why,” reexamines media from the ’70s and ’80s. She holds a Master of Fine Arts in creative nonfiction and lives with her husband, twin daughters, and various pets in Pennsylvania.
