
Amid a growing crisis in addiction and mental health, tireless machines are offering a new form of care—and raising new questions.

Can talking to a bot help you feel better?

[Photos: freestock.org/Pexels; rawpixel.com/Pexels]

BY Ciara Byrne | Long read

One of the world’s first chatbots was a therapist. Built in 1964 by MIT researcher Joseph Weizenbaum, the program, called ELIZA, was designed to mimic techniques from Rogerian psychotherapy, in which the therapist prompts the patient to examine their own thoughts and feelings. Weizenbaum gave the following example of a typical conversation with ELIZA:

Men are all alike.

IN WHAT WAY

They’re always bugging us about something or other.

CAN YOU THINK OF A SPECIFIC EXAMPLE

Well, my boyfriend made me come here.

YOUR BOYFRIEND MADE YOU COME HERE

He says I’m depressed much of the time.

I AM SORRY TO HEAR YOU ARE DEPRESSED

It’s true. I am unhappy.

DO YOU THINK THAT COMING HERE WILL HELP YOU NOT TO BE UNHAPPY

I need some help. That much seems certain.

ELIZA had no memory or understanding of the conversation. It merely searched for a keyword in the last sentence typed in by its interlocutor and calculated an answer using a rule associated with the keyword. Nevertheless, and much to Weizenbaum’s dismay, many users became convinced that ELIZA understood them.
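To give a sense of how little machinery this requires, here is a toy Python sketch of ELIZA-style keyword matching. The rules, reflections, and templates below are simplified illustrations, not Weizenbaum's original script, which ran in MAD-SLIP on a much richer set of decomposition and reassembly rules.

```python
import re

# A toy, illustrative sketch of ELIZA-style keyword matching.
RULES = [
    (r"\bmy (.+)", "YOUR {0}"),                       # echo back a possessive phrase
    (r"\bi am (.+)", "HOW LONG HAVE YOU BEEN {0}"),   # probe a self-description
    (r"\balways\b", "CAN YOU THINK OF A SPECIFIC EXAMPLE"),
    (r"\ball\b", "IN WHAT WAY"),
]

# First-person words are swapped so the echoed fragment reads naturally.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are", "you": "i"}

def reflect(fragment):
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.lower().split())

def respond(sentence):
    """Fire the first keyword rule that matches; otherwise fall back to a stock prompt."""
    for pattern, template in RULES:
        match = re.search(pattern, sentence, re.IGNORECASE)
        if match:
            fragment = reflect(match.group(1)) if match.groups() else ""
            return template.format(fragment.upper().rstrip("."))
    return "PLEASE GO ON"

print(respond("Men are all alike."))                     # IN WHAT WAY
print(respond("Well, my boyfriend made me come here."))  # YOUR BOYFRIEND MADE YOU COME HERE
```

Scaled up to a larger script of keywords and response templates, this is essentially all ELIZA did.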

“ELIZA created the most remarkable illusion of having understood in the minds of many people who conversed with it,” Weizenbaum reported. Users would often demand to be permitted to converse with the system in private. This phenomenon became known as the ELIZA effect. The era of the non-human listener had begun.

These days we’re surrounded by chatbots and voice analysis apps, a growing number of which are geared toward improving how we feel. Aimed at users who suffer from conditions like anxiety, depression, bipolar disorder, PTSD, or simply from stress, chatbots like Woebot and Tess claim to be able to identify the mood or condition of the user, and in many cases can also offer advice or suggest therapeutic exercises.

The technology arrives amid a growing mental health crisis, especially among the young. Twenty-five percent of U.S. college students were treated for a diagnosable mental illness in the previous year, according to a 2015 Chronicle of Higher Education report. Twenty percent of 67,000 students surveyed in 2015 had thought about suicide, while 9% had actually attempted it. Suicide is now the second leading killer of college students, after traffic accidents; since 1999, the overall suicide rate in the U.S. has surged by about 25%.

According to Psychology Today, “The average high school kid today has the same level of anxiety as the average psychiatric patient in the early 1950s.”

“Last year I got diagnosed with clinical depression,” says Subhrangshu Datta, the CEO of CompanionMX, which makes an app that detects changes in moods by analyzing users’ voice patterns and activity levels. He started taking medication, but found the status quo in treatment for depression–intermittent conversations with a therapist based on qualitative self-reported data–inadequate.

“If the patient has an episode like I had, they’re really on their own unless the patient reaches out to the clinician and says, ‘Hey, I’m in trouble,’” he says. “And that, the patient typically doesn’t do in depression, because the last thing you want to do is talk about that.”

Datta, who had spent several years building up new business units in large medical device companies, searched for a product that would monitor patients with depression in a more continuous manner. As it happened, he had been at business school with Joshua Feast, the CEO of Cogito, a company that makes an AI coach that analyzes the voice patterns of callers to customer service centers in real time to detect if they are becoming tense, frustrated, or disengaged. Cogito then gives prompts to agents in sales, customer service, and healthcare programs to improve the quality of the conversation.

If a patient participating in a healthcare program sounds tense, for example, Cogito may prompt the nurse to show more empathy by asking questions–while if a customer sounds disengaged, Cogito might suggest that the sales agent increase the energy in his voice.

[Image: courtesy of Cogito]

When Feast founded Cogito in 2007, his initial goal was to use voice to detect depression. That work continued in the background while the company commercialized the AI coach solution. The technology for depression, called Companion, was originally developed under contract with the Department of Defense and the Defense Advanced Research Projects Agency (DARPA) in 2013 and was piloted with veterans suffering from PTSD, and at teaching hospitals such as Massachusetts General Hospital and Brigham & Women’s Hospital. Companion, as the name implies, was designed to be used as a companion or aid to standard therapy.

In a 2017 study, 73 participants who reported at least one symptom of PTSD or depression completed a 12-week field trial of Companion. Participants were asked to record an audio diary at least once a week. Their social and physical activity was also tracked via their smartphone. Companion converts voice and activity data into predictions about behavioral symptom measures like mood, fatigue, physical isolation, and social isolation.
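CompanionMX has not published the details of its models, but the pipeline it describes (acoustic and activity features in, behavioral symptom scores out) can be sketched in broad strokes. The feature names, weights, and thresholds below are invented purely for illustration; the company's actual models are proprietary.

```python
from dataclasses import dataclass

# Hypothetical sketch of a Companion-style pipeline: weekly audio-diary and
# phone-activity features go in, scores for behavioral symptom measures come out.
# All features, weights, and scaling here are invented for illustration.

@dataclass
class WeeklyFeatures:
    speaking_rate: float      # words per second in the audio diary
    pause_fraction: float     # fraction of the recording that is silence
    pitch_variability: float  # normalized 0-1; flat speech scores low
    outgoing_calls: int       # calls and texts initiated this week
    places_visited: int       # distinct locations seen by phone sensors

def symptom_scores(f: WeeklyFeatures) -> dict:
    """Map raw features to rough 0-1 scores for the behavioral measures
    tracked in the study: mood, fatigue, interest, and social isolation."""
    depressed_mood = min(1.0, 0.6 * f.pause_fraction + 0.4 * (1 - f.pitch_variability))
    fatigue = min(1.0, max(0.0, 1.0 - f.speaking_rate / 3.0))
    interest = min(1.0, f.places_visited / 10.0)
    social_isolation = max(0.0, 1.0 - f.outgoing_calls / 20.0)
    return {
        "depressed_mood": round(depressed_mood, 2),
        "fatigue": round(fatigue, 2),
        "interest_in_activities": round(interest, 2),
        "social_isolation": round(social_isolation, 2),
    }

# Example week: slow, flat speech and little movement push the mood and fatigue
# scores up, the kind of trend a clinician's dashboard could flag for follow-up.
week = WeeklyFeatures(speaking_rate=1.2, pause_fraction=0.45,
                      pitch_variability=0.2, outgoing_calls=3, places_visited=2)
print(symptom_scores(week))
```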

Cogito’s Companion app is meant to diagnose depression and other mental health conditions. [Image: Cogito]

Clinicians diagnose conditions like depression and PTSD with tools like the Structured Clinical Interview for DSM-5 (SCID-5), an interview guide for diagnosing mental disorders by detecting symptoms. The study compared the predictions made by Companion over the course of the study against the assessments made by mental health professionals using SCID-5. The study showed that Companion’s models were predictive of clinician-assessed symptoms of depressed mood (detected via voice data), and fatigue, interest in activities, and social connectedness (detected via activity data).

In December 2018, Cogito spun out CompanionMX to commercialize the Companion technology. CompanionMX’s first product is aimed at clinicians dealing with patients with mood disorders like depression and bipolar disorder. Companion can show clinicians trends in the behavioral symptoms of their patients so that they can decide whether to intervene and can use that information when they interact with the patient. For example, if a clinician sees a spike in fatigue level that started three days ago, they can ask the patient specifically about that change.

“There are a few instances where the clinician saw those kinds of spikes in the data in the dashboard and they reached out and found that the patient was showing classic signs of suicidality (thoughts about taking one’s own life, suicide plans, and suicide attempts),” says Datta. “And they were able to prevent those suicides.”

CompanionMX is now working on studies examining whether Companion actually improves outcomes for patients and whether it can be used to predict the onset of episodes. “If you intervene on time, you can actually prevent hospitalization,” says Datta. “You can prevent excessive medication.”

Listening to the opioid epidemic

In parallel with the mental health crisis, the U.S. Department of Health and Human Services declared in 2017 that the opioid epidemic had become a public health emergency. Around 2.1 million people had an opioid use disorder in 2016 and more than 42,000 people died after overdosing on opioids. In 2016, the rate of drug overdose deaths in the United States was more than three times the rate in 1999. Opioid abuse has also been linked to higher rates of depression, anxiety, and bipolar disorders.

In 2017, in response to the epidemic, the media conglomerate Viacom launched a corporate responsibility initiative called Listen, which aimed to change the national conversation about addiction. Kodi Foster, a senior vice president of data strategy at Viacom, thought that Listen could do more than make public service announcements.

“I said to myself, what is going on with my generation and younger that they feel the need to resort to opioids in order to get through their day?” says Foster. “It really starts with what researchers call a lack of social equity. They feel isolated. They don’t feel attached to their communities in a meaningful way.”

Foster knew that messaging apps were continuously increasing in popularity and that therapy over text message had been shown to encourage people to share uncomfortable information. Research from the University of Southern California’s Institute for Creative Technologies, for example, found that U.S. veterans returning from Afghanistan were more willing to disclose symptoms of PTSD to a virtual interviewer than to an anonymous written survey.

So Foster partnered with a tech company called Stndby to build a chatbot for addicts and supporters, accessible via a website. The chatbot, which interacted with users via text message, was designed to detect long-term personality traits, measure short-term psychological states, and offer support and therapeutic exercises accordingly. A second version of the chatbot was built for people who were supporting someone with a substance abuse problem, since 65% of the chatbot’s first users fell into that category.

Sashka Rothchild, the founder of Stndby, didn’t set out to develop a chatbot specifically for addiction. She wanted to provide a text-based support system that would make people feel seen and heard, and help them practice beneficial behaviors. Rothchild’s mother died of brain cancer when Rothchild was 18, and she spent most of her twenties trying to recover. When Foster asked Rothchild to develop a chatbot for Listen, she could see the parallels.

“Addiction is generally a symptom of trying to numb something that’s so deeply painful you don’t know how else to do it,” she says. “We started from a place of wanting to launch something that helped people manage and deal with and see their pain for what it is.”

The Listen chatbot was only used by a few thousand people, but as with ELIZA, many users had emotional interactions with the bot, Foster says. They thanked the chatbot for its help. One participant struggling with domestic problems and opioid abuse even sent the bot photos of her vacation at Disneyland with her children. “Hey, I know you are not real but I just wanted to send these pictures of my family out at Disneyland having a great time,” the user told the Listen bot according to Foster. “I’m doing better now. Thank you.”

“An individual would openly state that they knew that they weren’t talking to a person,” says Foster, “but they were having a real, intimate, empathy-filled, or emotionally charged response to this artificial listener.”

Rothchild thinks that the fact that users knew they were talking to a chatbot rather than a person was actually key to its effectiveness.

“I think it is important to say how scared people feel, how alone they feel, how embarrassed and worried they feel,” she says. “We don’t live in an environment where those are socially acceptable feelings to talk about. That was why it made so much sense to me to explore this incredibly private, but relatively anonymous engagement, where people could just say something to something that wouldn’t judge them.”

A crisis of connection

For all of the supposed benefits of mental health and counseling bots, critics have questioned their safety and pointed to a lack of regulation: For now, the Food and Drug Administration doesn’t oversee the apps the way it does “medical devices.” More research on their effectiveness also remains to be done. An American Psychological Association review of published studies on a number of therapy apps found that while they had a small effect on reducing depression, they have not contributed to reducing suicide rates. Others have wondered if a reliance on bots and screens might deprive people of the benefits of real-life communication and connection.

The concerns about connection coincide with a rise in loneliness, particularly among the young. In addition to rising rates of mental health diagnoses, suicide, and drug addiction, young people are lonelier than any other age group, according to new research. In a U.K. survey of 55,000 people, 40% of those aged 16 to 24 said they feel lonely often or very often, compared with only 27% of those aged over 75. Researchers also found that those who reported feeling the loneliest tended to have more “online only” friends.

There is a strong link between loneliness and mental health. Lonely people are more likely to suffer from depression, to find it difficult to cope with stress, and to use drugs or drink excessively. Studies have also established a significant relationship between smartphone addiction, which occurs at higher rates among users under 35, and depression.

Is this crisis then partly caused by a deficit of true connection and caring? We all need to feel cared for by others. Recent research on the placebo effect suggests that the effect may actually be a biological response to an act of caring. Receiving care from a doctor who is warm and reassuring has been shown to increase the effectiveness of treatments. Could a warm and reassuring non-human listener have a similar impact?

Jonathan Potter used the Companion app as part of a study at Brigham and Women’s Hospital, where he was a member of a PTSD patient group. Although Companion only listens and does not converse with users, Potter still felt that the app was caring for him.

“It was listening to how I expressed my thoughts, not the actual words I was saying,” he said. “It was like talking to a friend who really cared about how I was doing, someone who would call me on saying I’m fine when I’m obviously not.”

Sherry Turkle is a psychologist and professor of the social studies of science and technology at MIT. She has been studying people’s relationships with technology, including digital companions, for decades. In a 2007 paper, she explains how human beings “evolved in an environment which did not require them to distinguish between authentic and simulated relationships.” So when people interact with a non-human listener, they may feel as though they are dealing with a sentient being who cares about them.

In a society where people seek constant validation via social media, yet feel chronically lonely, can non-human listeners ease our sense of isolation and the problems that result from it, or could these listeners become the ultimate “online only” friend, addressing our basic human need for connection and caring in a way that ultimately leaves us even more alone?

“That feeling that no one is listening to me makes us want to spend time with machines that seem to care about us,” said Turkle in a 2012 TED talk. “We expect more from technology and less from each other. Technology appeals to us most where we are most vulnerable. We’re lonely but we are afraid of intimacy. We are designing technologies that will give us the illusion of companionship without the demands of friendship.”

The tireless non-human listener is uniquely qualified to continuously monitor, analyze, and encourage beneficial behaviors in real time. Artificial listeners therefore have the potential to supplement human carers in unique ways. But Turkle is not convinced that this is a good thing.

“We have come to a point where we are willing to talk to machines about our problems–I call this the robotic moment,” she says in an email. “But it is odd to celebrate this as an achievement. Because in these exchanges, no one is listening to us. What kind of achievement is this? I think it is a sad landmark.”
