Artificial intelligence could help suicide hotlines respond better to callers, according to a new Montreal study. The algorithm detects whether a caller is anxious or angry, as well as how their emotions change over the course of the call.
“Our artificial intelligence (AI) model was trained with actors,” says Alaa Nfissi, an AI expert at Concordia University who presented his results at a conference in California last winter. “It subsequently identified the emotions of someone calling a suicide hotline with 74% accuracy.”
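For readers curious what training a model to hear emotion involves, the sketch below shows the general recipe used in speech emotion recognition, not the study’s actual system. Everything in it is an illustrative assumption: the label set, the MFCC voice features, and the support-vector classifier are common defaults in the field, and `clips` stands in for a hypothetical list of labeled recordings like the acted ones the model was trained on.

```python
# A minimal speech-emotion-recognition sketch (illustrative, not the study's model).
import numpy as np
import librosa                                   # audio loading and feature extraction
from sklearn.svm import SVC                      # simple stand-in classifier
from sklearn.model_selection import train_test_split

EMOTIONS = ["calm", "anxious", "angry", "sad"]   # assumed label set

def extract_features(path: str) -> np.ndarray:
    """Summarize one clip as its mean MFCCs, a standard voice-emotion feature."""
    y, sr = librosa.load(path, sr=16_000)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)
    return mfcc.mean(axis=1)                     # one fixed-length vector per clip

def train(clips):
    """clips: hypothetical list of (wav_path, emotion_label) pairs."""
    X = np.stack([extract_features(path) for path, _ in clips])
    y = np.array([EMOTIONS.index(label) for _, label in clips])
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
    clf = SVC().fit(X_tr, y_tr)
    print(f"Held-out accuracy: {clf.score(X_te, y_te):.0%}")
    return clf
```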
The information provided by the algorithm helps responders better identify the emotions of callers. “If you talk to someone in crisis, you’re not necessarily going to remember that, 10 minutes ago, the person was calmer or more anxious,” explains Brian Mishara, a psychologist at the Université du Québec à Montréal (UQAM) who founded Suicide Action Montréal 20 years ago and is a co-author of the study. “These are changes that can be gradual.”
If a suicidal person is very distressed or anxious, the responder may favor a calm tone of voice, “say things to encourage them to take their time, or try to make them feel more comfortable.”
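The change-tracking Mishara describes can also be sketched, again as an assumption-laden illustration rather than the study’s method: slide a window along the call audio, classify each window with the model from the sketch above, and keep the resulting timeline so a responder can see at a glance whether the caller was calmer ten minutes ago. The window and hop lengths here are arbitrary choices.

```python
def emotion_timeline(clf, path: str, win_s: float = 10.0, hop_s: float = 5.0):
    """Classify overlapping windows of a call to surface gradual emotion shifts.

    Builds on the sketch above (librosa, EMOTIONS, a trained clf).
    Returns a list of (seconds_into_call, emotion) pairs.
    """
    y, sr = librosa.load(path, sr=16_000)
    win, hop = int(win_s * sr), int(hop_s * sr)
    timeline = []
    for start in range(0, max(1, len(y) - win), hop):
        segment = y[start:start + win]
        mfcc = librosa.feature.mfcc(y=segment, sr=sr, n_mfcc=20).mean(axis=1)
        label = EMOTIONS[int(clf.predict(mfcc[None, :])[0])]
        timeline.append((start / sr, label))
    return timeline
```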
The goal is to help responders be empathetic.
Empathy is about identifying emotions and then expressing that you understand what the person is feeling. It’s difficult in a context where people don’t necessarily say how they feel. You have to detect emotions through words, through tone of voice.
Brian Mishara, psychologist
Will callers like having their emotions analyzed by an algorithm? “We haven’t had any negative feedback,” says Nfissi. “But initially, it could be used as a training tool for recognizing emotions rather than during interactions with callers.”
Responders are generally open to anything that can help them, Mishara says. “Studies have shown that they are often more negative in evaluating their work on crisis lines than the people who call. Callers often appreciate the work of the responders more than the responders themselves do,” he says.
Can telling a caller, “I think you’re upset,” make them angrier? “It depends on the case, but it can have a positive effect,” Mishara says. “Programs that teach children to recognize other people’s emotions, for example, can have a positive effect on parents when they hear their child say, ‘Mom, you’re really upset because I did this.’”
Could suicide risk be detected in text messages or social media? “There’s already research on this, and it may be that computers are often as good as humans at determining suicide risk,” Mishara says. “But you need a human to ask the question, to ask someone at risk what their intentions are.”
The AI algorithm was trained on English-language data. But since these relatively high success rates were achieved with only 12 hours of recordings, it should be straightforward to produce a French version, Nfissi said.
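One plausible reason a French version could come cheaply: MFCC-style voice features describe vocal qualities rather than vocabulary, so a pipeline like the sketch above could, in principle, be reused unchanged and simply retrained on a small labeled French corpus, on the order of the 12 hours used for English. The directory layout and file-naming scheme below are pure assumptions for illustration.

```python
from pathlib import Path

def load_clips(root: str):
    """Collect (path, label) pairs from files named like 'anxious_0042.wav'."""
    return [(str(p), p.stem.split("_")[0]) for p in Path(root).glob("*.wav")]

# Hypothetical French retraining, reusing train() from the first sketch.
clf_fr = train(load_clips("clips_fr/"))
```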
Learn more
- 32 per 100,000 people: rate of hospitalization for attempted suicide among men in Quebec in 2022 (source: INSPQ)
- 53 per 100,000 people: rate of hospitalization for attempted suicide among women in Quebec in 2022 (source: INSPQ)