According to reports released by the Trevor Project, the largest LGBTQ crisis intervention nonprofit in the country, 1.8 million LGBTQ youth in the U.S. seriously consider suicide every year. Within that group, a suicide attempt occurs every 45 seconds, and 54% of those young people report they don’t get the mental health support they need.
To help close that life-threatening gap, 700 Trevor Project counselors respond to LGBTQ youth, including those reporting suicidal thoughts, through the organization’s digital chat services, TrevorChat and TrevorText. Before they begin responding, counselors receive intensive training in handling such sensitive conversations. Now, in partnership with Google.org, the organization has developed a “crisis contact simulator” to streamline a core part of that training: role-playing. Staff members have traditionally played the roles of at-risk youth; joining them now is Riley, a chatbot-like tool that simulates a genderqueer person in crisis.
Riley is a genderqueer youth from North Carolina whose coming out to friends didn’t go well. They have a supportive family but are anxious about coming out to them, fearing they’ll be seen as a “freak” or kicked out of the house. Riley is not suicidal but has had suicidal thoughts in the past. The AI bot was trained on data including transcripts of past human-led role-plays and “a paragraph describing the most salient emotional and biographical details of Riley’s story,” says Dan Fichter, Trevor’s head of AI and engineering. The tool was continually tested, tweaked, and refined in collaboration with Google developers.
It’s designed to accommodate a wide variety of conversational inputs from trainees, who may not yet have mastered crisis management techniques. The Trevor Project’s method is to ask open-ended questions, then acknowledge the person’s feelings and validate their experiences. In debrief sessions, human staff members review the transcripts, assess how well the trainees responded, and point out where they can improve.
Overall, training totals about 40 hours and covers gender and sexuality issues, crises such as bullying and self-harm, clinical suicide risk assessment, and communication techniques. The Riley role-plays are only the first of many the trainees complete, and while the rest still take place with staff, Trevor is developing new bots that simulate characters with distinct issues and backgrounds, which could also become part of the training.
Counselors are trained to handle a wide range of issues. Some young people message when they are seriously considering killing themselves. Others reach out when they’re facing violence, or struggling to come out to their parents. Some are simply having a bad day. “No crisis is too large or too small,” Fichter says. The aim, he says, is for counselor intervention to give individuals “more autonomy and control over their experiences.”
Mental health problems in general have skyrocketed during the pandemic, due to close quarters, stress, and economic strains. For LGBTQ youth (ages 13 to 24), spending more time in lockdown with family means more time away from social connections such as friends and supportive adults at school, and outlets such as extracurriculars. Many may be cooped up with people who victimize or abuse them, or simply don’t recognize their identities.
At Trevor, requests for support have at times doubled pre-COVID-19 levels; in the year from August 2019 to July 2020, the group received more than 150,000 new calls, texts, and messages. The chatbot has helped support scale up during this period: Trevor is training a new volunteer cohort every month, with a goal of tripling its volunteer force this year. Volunteers who’ve signed up to work nights or weekends, whether because of busy schedules or to accommodate off-peak callers, can train with Riley at those same convenient times.
Since the tool’s launch, two cohorts have trained with it. Jen Carter, head of tech and volunteering at Google.org, says the technique of “using conversational AI to help train frontline counselors” is the first of its kind she knows of, and she hopes it can expand to similar internal work at other charitable organizations. “Emulating youth language really does feel genuine,” she says of Riley’s capabilities. “I think, now, the model might do a better job of that than the adult staff.”
If you or someone you know is having suicidal thoughts, contact the National Suicide Prevention Lifeline at 1-800-273-TALK or text “HOME” to 741-741 to reach the Crisis Text Line.