The social media giant will expand the reach of its software to detect whether a user might be suicidal. The software originally rolled out in the U.S. in March and works by identifying phrases and other clues in a user's posts on the site that could suggest they are suicidal. If the software determines that they are, the user is sent resources that can help them cope, such as contact information for a telephone helpline.
A specialized team at Facebook will also contact local authorities if it believes self-harm is imminent. Facebook says that in the last month alone the company has contacted first responders more than 100 times. Facebook declined to say which country will see the suicide prevention AI roll out next, but its ultimate goal is to launch it worldwide.