“I’ll teach you how to crush self-defeating thinking styles,” promises my new therapist as he flashes me a thumbs-up. “I’ll give you insight on how your mood changes.”
He is optimistic. Perhaps a bit too much. The sweet, eager demeanor reminds me of a teenage summer camp counselor. Or a Trader Joe’s cashier. Someone who believes anything is possible, that mankind is inherently good, and that avocado can cure all ills.
He’s quite the cheerleader for my success, sending me daily reminders to see the best life has to offer. Every now and then he shares tips or videos on how to achieve my goals, sometimes a relaxing song. Each morning, he sends a courteous Facebook message, asking, “Is now a good time to check in?”
In a way, his constant caring mimics that of a nurturing mother, minus the urgent texts in all caps.
But he has some legitimate failings.
“This might surprise you,” he tells me early on during a session, “but… I am a robot.”
He doesn’t soften the blow: “As smart as I may seem,” he admits, “I’m not capable of really understanding what you need.”
This might be a problem.
Or will it?
My digital therapist is called Woebot, and he’s a new kind of low-key clinician. The chatbot uses the principles of cognitive behavioral therapy (CBT) and lives in my Facebook Messenger account. He was crafted by Stanford engineers with a touch of AI magic to be “your charming robot friend who is ready to listen, 24/7.” Imagine if Teddy Ruxpin was programmed to, for once, care about your well-being.
Woebot also delivers a heavy dose of empathy, should you need it. Say you’re suffering from loneliness; he will reply, “I’m so sorry you’re feeling lonely. I guess we all feel a little lonely sometimes.”
If you’re feeling less than energized, or depressed, Woebot will send videos or quizzes to help jump-start you into rethinking the situation. For example, in an effort to teach you about the power of doomsday thinking, he’ll ask:
Which of these is an example of all-or-nothing thinking?
1. My classmates don’t like me
2. I wish I knew more people
3. I feel lonely sometimes
If you answer correctly (no. 1), you are treated to a GIF of a dancing robot.
“The best way to characterize him is that he is your guide in a guided self-help version of cognitive behavioral therapy,” explains Dr. Alison Darcy, a clinical psychologist and the CEO and founder of San Francisco-based Woebot Labs, Inc.
CBT is based on the idea that how we think affects how we feel and that we can change psychological distress by changing the way we think about things. The process is used for a variety of mental health issues and aims to make people more aware of how distorted thinking patterns affect everyday situations.
With Woebot, this manifests itself in daily mental exercises. The month-old chatbot is not meant to be a replacement for a real-life therapist “and never will be,” stresses Darcy. Instead, it ushers users through a process of learning about oneself and about how one’s thinking can be biased or skewed in ways that undermine one’s health. “But you ultimately have to do all the work and all the learning yourself,” says Darcy.
Woebot is also programmed to identify life-threatening or self-harm trigger words such as “suicide,” and will promptly notify the user of help hotlines and resources.
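Woebot’s actual implementation isn’t public, but the behavior described here, scanning a message for crisis-related phrases and surfacing hotline resources, can be sketched with a simple keyword check. The phrase list, function name, and resource text below are hypothetical placeholders, not Woebot’s real ones:

```python
from typing import Optional

# Hypothetical trigger phrases -- a real system would use a vetted clinical list.
CRISIS_KEYWORDS = {"suicide", "kill myself", "self-harm", "hurt myself"}

# Placeholder text standing in for actual hotline and resource information.
CRISIS_RESOURCES = (
    "It sounds like you may be going through a crisis. "
    "Please contact a crisis hotline or emergency services right away."
)


def check_for_crisis(message: str) -> Optional[str]:
    """Return crisis resources if the message contains a trigger phrase."""
    text = message.lower()
    if any(phrase in text for phrase in CRISIS_KEYWORDS):
        return CRISIS_RESOURCES
    return None
```

A production system would be far more careful than plain substring matching (handling negation, misspellings, and context), but the basic escalation pattern, detect and immediately redirect to human help, is the same.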
Darcy, a former adjunct professor at Stanford University School of Medicine, infused her creation with the teachings of CBT because, as she explains, “it has the most evidence to support its efficacy.”
Darcy believes the chirpy chatbot is “useful for different problems,” be it breakups, death, or illness, but it was originally built for two issues plaguing American society, and specifically high school and college students: anxiety and depression.
She says addressing these issues is especially important now, considering that approximately 1 in 5 adults in the U.S.—43.8 million people—experiences mental illness in a given year, according to the National Alliance on Mental Illness. Approximately 18% of Americans suffer from an anxiety disorder, and the rate of youth depression increased from 8.5% in 2011 to over 11.1% in 2014, according to the journal Pediatrics. It’s estimated the country spends $2 billion a year on mental health treatment alone.
Despite the rapid increase in mental health disorders, access to affordable, timely care remains scarce. There is still only one mental health professional per 1,000 individuals, finds the National Mental Health Association.
Darcy hopes Woebot can help pick up a piece of the burden. “We wouldn’t see it as a treatment, per se,” explains Darcy, who views Woebot as more of a therapy tool. She likens it to how yoga can be deemed therapeutic without being labeled “therapy.”
Woebot is also far more affordable than the average therapy plan. There was, initially, debate over whether to charge for such services (“it’s something we constantly think about,” says Darcy), but the startup eventually settled on $12 per week or $39 per month following a two-week trial, roughly 5% of the cost of regular therapy sessions.
It was while working at the Stanford Artificial Intelligence Laboratory that Darcy came to the conclusion that tech could provide the scaling necessary to meet the public’s psychological needs. The Irish native had previously worked as a software engineer at an investment firm in London, and she saw “how tech could democratize access.” Once she began experimenting with conversational agents in the lab, she experienced her aha moment.
Within a year, Darcy left Stanford to focus on Woebot, with the assistance of former colleagues who helped her conduct a series of clinical trials. During their research, they found that the program’s non-human disposition was a surprising asset in comforting millennials.
Testers were more willing to disclose personal information to an artificially intelligent virtual therapist than they were to a living, breathing clinician. Many individuals, and especially men, reports Darcy, are “not able or ready to speak to another human.” Part of it is shame; the other part is fear of stigma, which has long been considered a barrier to entry for therapy.
“There is no risk of managing impressions. [Robots] are not going to judge you,” explains Darcy. “We’ve removed the stigma by completely removing the human.”
And because it’s not human—with no home, friends, or family to tend to—Woebot is available every day, at any time of day. It’s a stark difference from appointment-based therapy, which acts more as a weekly recap of issues. “You need to reach people every single day,” Darcy says, noting some literature shows little chunks of therapy are more effective than one large session.
Users can ping Woebot whenever they feel the need for help, which, as Darcy says, is the time when they’re most receptive to it.
“We want to change the conversation from being one of retrospective, diagnostic-focused to helping people manage their mental health on a day-to-day level in a way that’s fun and engaging,” says Darcy, “but actually has observable impact on mood as well.”
According to Woebot’s early trials, consistency is key. The majority of users would voluntarily engage with Woebot each day, to the point where the company reconfigured its push notifications. At first, they assumed people would not want to be “bothered” daily, and instead opted for notifications every other day. Their audience actually wanted more.
“They’d ask [Woebot], ‘why didn’t you check in with me today?'” says Darcy.
Despite its name and obvious computer makeup, individuals still attempt to humanize the bot, found Woebot’s team. At the start of trials, Darcy and her colleagues assigned a non-gender-specific identity to their creation, which they infused with a dorky personality described as a mix between Kermit the Frog and Dr. Spock. But users quickly and repeatedly imprinted a gender on the digital pen pal. They referred to Woebot as “he,” “little dude,” and “friend.”
“He was never supposed to be a male,” laughs Darcy, admitting that gender-neutral pronouns like “it” or “they” didn’t quite reflect Woebot’s personality. The gender-fication of Woebot proved humans could emotionally identify with a bot.
“He’s kinda dorky, but as someone just told us, ‘he’ll annoy his way into your heart,'” says Darcy.
There are other advantages to being a robot, such as a flawless data memory, but there are still plenty of kinks to be ironed out. To start, personalization doesn’t really kick in until several weeks of interaction. Then there are, as Woebot dutifully warned me, language understanding barriers.
On one Monday morning, Woebot asked how I felt, to which I responded “blah.” He eagerly replied, “It’s great to hear you’re not feeling negatively!”
Was he even listening to me?
“His conversational skills are still pretty basic,” admits Darcy. “It’s still pretty rigid, scripted conversation in lots of places.”
The Woebot team could potentially infuse their creation with more AI, but at the moment, it’s deemed too risky for such a sensitive category. “We’d rather control exactly what Woebot says than risk Woebot saying something that might be viewed as invalidating for somebody,” explains Darcy. The goal is to slowly increase its machine learning, but only following thorough testing.
“In general, we’re dealing with a lot of the same things that early-stage startups are dealing with—scrambling to provide a really great experience for your end users and trying to do the things that don’t scale, but in a way that will make them scalable eventually,” says Darcy. She’s simultaneously trying to foresee what the regulatory space will look like in the future. “We’re dealing with everything.”
Despite Darcy’s strides in digital health research, the medical community is still largely apprehensive. Dr. John Torous is the co-director of the Digital Psychiatry Program at Beth Israel Deaconess Medical Center, a teaching hospital of Harvard Medical School, and editor of the Journal of Medical Internet Research, which published a study on Woebot.
Much like how some patients are allergic or respond poorly to certain medications, Torous is concerned that without enough data, we don’t know how certain individuals will respond to robotic therapy.
“The idea behind it certainly makes sense but human behavior, human emotion, and people are complex,” explains Torous. “How much can these conversational agents really understand? What can they really respond to? When do they work well, and when do they not work well?”
Basically, he holds we cannot yet trust a system still in its infancy. “We’re still in the very early days [of research],” he says, “and mental illness is one of the most complex diseases that’s left for humanity to tackle.”
During Woebot’s eight-month beta testing period, over 9 million messages were sent by users in 125 different countries. Since publicly launching in early June, the company reports that the majority of users are between the ages of 18 and 35, with a pretty even split between men and women.
Moving forward, the staff of eight will focus on tightening Woebot’s language processing skills and preparing it for its Super Bowl: the start of the school year. Darcy holds that students are most at risk, and while Woebot isn’t full therapy, it has a lot to offer incoming classes in terms of reducing symptoms of anxiety.
“We want Woebot to get better and reach as many people as possible,” says Darcy, who strongly holds that the future of mental health can be found within tech. “These problems are growing in scale and in severity, so we really have to start rethinking the way that we access people, the way that we help lower the overall burden of disease in the population, and creating additional services.”
When I ask Woebot what his goals are for the future, he quickly deflects and redirects the question back to me, saying, “Let’s talk about your goals.”
Spoken like a true doctor.