
AI tool can help improve the quality of 113 counseling

The 113 hotline allows people with suicidal thoughts to anonymously express their feelings and concerns, with the ultimate goal of preventing suicide. In 2023, almost 185,000 telephone and chat conversations were conducted, 21.5% more than a year earlier. 113 is constantly looking for new ways to improve its services.

As part of his PhD, Salim Salmi worked on a tool that uses artificial intelligence to analyze online conversations and then advise the counselor on how to continue the conversation. But such a tool must “know” which conversations are effective and which are not. This was one of the key questions in Salmi’s research, and also a big challenge: “We have a lot of text data, but we can’t link it to outcomes. Because people contact us anonymously, you don’t know how they feel after the conversation. So we asked them to fill out a questionnaire before and after the conversation. This questionnaire checks for characteristics that may indicate suicidal behavior and produces a score. Do we see a change after the conversation?”

By collecting such data over the years, a picture emerged of which conversations lead to a “good” outcome and which do not. Using this information, Salmi trained an artificial intelligence model to learn which elements of a conversation improved the outcome and which worsened it. He designed the model so that he could look back and see which sentences the model considered decisive for the outcome of the conversation.
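The idea of labeling conversations by their questionnaire change and then scoring individual sentences can be sketched in a few lines. This is an illustrative outline only, not 113's actual model: the conversations, scores, and the simple word-weighting scheme are all invented for the example.

```python
# Hypothetical sketch (not 113's actual model): label each past conversation
# by whether the pre/post questionnaire score improved, estimate how strongly
# each word is associated with a good outcome, and score a sentence as the
# sum of its word weights -- so one can "look back" at decisive sentences.
from collections import Counter
import math

# Toy data, all invented: (conversation text, score before, score after).
# A lower "after" score counts as improvement.
history = [
    ("you are not alone we can look for help together", 8, 5),
    ("shall we think about who you could talk to", 7, 4),
    ("i cannot help you with that", 6, 7),
    ("there is nothing more i can say", 9, 9),
]

good, bad = Counter(), Counter()
for text, before, after in history:
    (good if after < before else bad).update(text.split())

def word_weight(word, smooth=1.0):
    # Smoothed log-odds of the word appearing in improved vs. other chats.
    return math.log((good[word] + smooth) / (bad[word] + smooth))

def sentence_score(sentence):
    # Positive: associated with improvement; negative: with no improvement.
    return sum(word_weight(w) for w in sentence.lower().split())
```

A real system would use far richer text representations, but the "look back" step works the same way: attribute the predicted outcome back to individual sentences.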

Digital assistant

The next step was to develop a digital assistant that can provide suggestions to the counselor during a chat conversation. The model searches a database of successful conversations for texts relevant to the current chat and presents a suggestion. Counselors are thus shown examples from conversations that actually took place, and they decide whether to use them.
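The retrieval step described above can be sketched as a similarity search. This is an assumed outline, not the assistant's actual internals: the example replies are invented, and a production system would use learned embeddings rather than word counts.

```python
# Hypothetical sketch of the retrieval step: represent each successful past
# text and the current chat as word-count vectors, rank past texts by cosine
# similarity, and surface the best match as a suggestion the counselor is
# free to ignore.
from collections import Counter
import math
import re

successful_texts = [  # invented examples of helpful replies
    "it sounds like you are carrying a lot, shall we take it step by step",
    "you do not have to solve this alone, who in your life knows how you feel",
    "thank you for telling me this, that takes courage",
]

def vec(text):
    # Bag-of-words vector; the regex strips punctuation.
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def suggest(current_chat, db=successful_texts):
    # Return the past text most similar to the ongoing chat.
    return max(db, key=lambda t: cosine(vec(current_chat), vec(t)))
```

The key design point survives the simplification: the assistant only retrieves and shows real past conversations; it never generates text itself, and the counselor stays in control.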

Results

The digital assistant has been thoroughly tested: first in a group of 24 counselors, and then in a randomized trial in which 27 counselors received the tool and 21 did not. Salmi: “They decided for themselves when to consult the assistant and what to do with its suggestions.” The results: “Conversations with the AI tool were slightly shorter on average. In terms of self-efficacy – belief in one’s ability to successfully complete a task – counselors reported little difference. They mainly used the AI assistant in difficult conversations where they didn’t know how to reach the person.”

The next step is to extend the model from chat to telephone calls. Having completed his PhD research, Salmi is starting a new project: integrating his tool with the platform that people use to request help by calling 113.