
The Rise of the Robot Therapist

6 min read
Liz Zarka
Original artwork by Ameena Golding

Angela* gave up on the idea of seeing a therapist face to face a while ago. As the sole income earner in her household, her busy schedule and a tight budget led her to download the AI-powered therapy apps Woebot and Wysa to help alleviate feelings of depression. “If my life were more stable, I’d love to see a human therapist—but until then, the tiny, portable ones will have to suffice,” the 27-year-old said.

In the face of a tightening market for traditional psychotherapy in the Bay Area — which we just wrote about in-depth — thousands of San Franciscans have downloaded mobile mental-health apps such as Woebot, Wysa and Youper to manage their symptoms of stress, anxiety and depression. Unlike previous kinds of e-therapy services that connected users to human beings on the other end of a text message or a video call, these newer apps offer interactive chat experiences with bots programmed with artificial intelligence and wired to respond to users on the basis of their language.


Like Angela, many people can’t afford or don’t have time to embark on what could be an exhausting search to find an in-person therapist. These apps, on the other hand, are free and ready to talk in a matter of minutes.

They are also modeled primarily on cognitive behavioral therapy, or CBT — one of the most thoroughly researched types of therapy, with a solid evidence base for treating common mental disorders such as anxiety and depression. Because the apps are accessible at all hours of the day, users can carry the skills they practice in-app — such as identifying irrational thought patterns, which experts call “cognitive distortions” — into daily life. Users are also encouraged to set mental-health goals, track their moods and analyze the sources of those moods.

The fact that more people are getting some form of help that would otherwise be unavailable is a good thing, but chatbot companies often obscure what their technology can accomplish.

The founders of these apps are adamant that they never intended for their products to completely replace traditional therapy in someone’s life or be crisis-intervention tools, but they are fully aware that the apps are often being used as such.


The big concern is that there is little clarity around whom the apps are targeted toward and what level of severity of anxiety, depression and addiction they can realistically help address. Individual users I spoke to said they are confused about what constitutes sufficient treatment for their mental-health concerns and discouraged when these apps fail to help them.

Experts agree that a robot, no matter how smart, simply can’t replace a human therapist’s ability to do such things as provide deep insight into past events, hold space, empathize and provide a nurturing relationship built on mutual trust.

“Even if AI gets to a level of sophistication where it can be a wholesale alternative to traditional mental-health treatments, I believe that to be successful, it will need to be part of an integrated care pathway,” said Jose Hamilton Vargas, CEO of Youper.

Alison Darcy, founder and CEO of Woebot, shared similar sentiments on a recent Reddit AMA: “The potential for AI to ever take the place of humans in this realm is, I think, massively overhyped.”

But beyond establishing that these apps could never fully replace human-to-human therapy, these founders haven’t displayed much consistency about what they should be used to do. Their marketing materials confound things even further.


For example, the Woebot website prominently features a customer review from a self-identified “life hacker” on its home page that reads, “In my first session with Woebot, I found it immediately helpful […] addressing my anxiety without another human’s help felt freeing.” You could argue that this kind of language discourages users from seeking traditional psychotherapy.

Worse yet, on the Wysa home page, the first thing visitors see is a quote from a 13-year-old survivor describing how the app helped her hold onto her life. Developers vehemently deny that chatbots should be used in cases of serious or life-threatening mental illness, yet their marketing materials suggest the opposite.

The websites associated with these companies promise that their apps will make users happier, less anxious and more productive—but it’s hard to tell if these claims are legitimate, given the lack of transparency about the kind of mental-health outcomes the chatbots are able to produce.


Youper and Wysa released their apps without publishing any peer-reviewed studies, though Woebot did issue one report on its outcomes, produced in partnership with the Department of Psychiatry and Behavioral Sciences at Stanford University in 2017. The study found that two weeks of using the text-based conversational agent was linked to a significant reduction in symptoms of depression and anxiety among individuals ages 18–28, as compared with a control group.

That seems promising, but the results are limited in their generalizability for several reasons. The sample included just 70 individuals (a small size for psychology studies of this kind), all of them undergraduate students — a group that may not resemble the end users most likely to download chatbot apps: people who are working full-time, paying bills and unable to afford traditional therapy.

Developers might try to skirt the issue of defining a clear audience and making more-precise efficacy claims by saying that, while some people will always need talk therapy to treat their mental illness, everybody can at least benefit somewhat from interacting with their app.

But unfortunately, for some we spoke with, chatbots exacerbated symptoms of mental illness.

“The lack of possible responses the bot gives and the limited words/ideas it can understand made me feel very conscious of the fact I was talking to a phone app, which brought a feeling of embarrassment and frustration with myself,” said chatbot user Angela.

They can also delay people from getting proper treatment. Rohit,* a San Francisco–based financial-tech employee, said he temporarily gave up on searching for a therapist when he started using Youper. Like Angela, he was attracted to the app because of its convenience. “Everything that I’ve been exposed to with technology has led me to believe that computers are really good at following a set of rules and doing that cheaply and more efficiently than humans could ever do it,” Rohit said. But after a month of using the app almost daily, his symptoms of anxiety still had not lifted. “I’m dying over here. It’s pretty scary, because at times I’m at a point of deep anxiety.”

Despite these shortcomings, Angela and Rohit both appreciate aspects of the chatbot experience. “This is useful,” Rohit said, waving his phone. “I wouldn’t call it a therapy bot, but I would call this a tool that people can use from day to day to make their low points more manageable.”

Nascent AI-powered chatbot services demonstrate the potential to be regular features in our journey to improved mental health—but going forward, companies should better understand and articulate the abilities and limitations of their technologies. Their survival as companies, and the survival of their users, depends on it.

*Last names have been removed to protect privacy around matters of mental health.




Last Update: February 16, 2019
