An AI chatbot helped increase the number of patients referred to NHS mental health services in England, and it worked particularly well for minority groups, a new study shows. Demand for such services has been rising, especially since the COVID-19 pandemic: in 2022, 4.6 million patients were referred to NHS mental health services, the highest number since records began, and the number of people coming into contact with these services keeps growing. According to the British Medical Association, however, neither the funding nor the number of specialists is sufficient to meet the rising demand.
The company Limbic set out to find out whether AI could lower the barrier to entry into care by helping patients receive help more quickly and efficiently. The study, published in the journal Nature Medicine, examined how the chatbot Limbic Access affected referrals to the NHS Talking Therapies for Anxiety and Depression programme, a range of evidence-based psychological therapies for adults with anxiety disorders, depression, or both.
The study analysed data from 129,400 people who visited the websites of 28 NHS talking-therapy services across England. Half of the services offered the chatbot on their website; the other half used other intake methods, such as web forms. Referrals to services using the Limbic chatbot rose by 15 percent over the three-month study period, compared with a 6 percent rise for services that did not use it. Referrals from minority groups, including ethnic and sexual minorities, rose markedly when the chatbot was available: about 179 percent for people who identify as non-binary, 39 percent for Asian patients and 40 percent for Black patients.
Crucially, the report’s authors say, the larger number of referrals did not lead to longer waiting times or fewer clinical assessments. The detailed information the chatbot collected cut the time clinicians spent assessing patients, while improving the quality of assessments and freeing up other resources. Still, an interactive chatbot and a static web form are very different ways of collecting information, stresses John Torous, director of the division of digital psychiatry at Beth Israel Deaconess Medical Center in Massachusetts.
Screening via chatbot
Visitors to the chatbot-enabled websites were greeted by a pop-up explaining that Limbic is a robot assistant designed to help them access psychological support. In an initial evidence-based screening, the chatbot asks a series of questions, including whether the patient has any long-term medical conditions or previous diagnoses from mental health professionals. It then asks further questions to measure symptoms of common mental health problems and anxiety, tailored to the symptoms most relevant to the patient’s difficulties.
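An adaptive screening flow of this kind could be sketched roughly as follows. Note that the question wording, the symptom areas, and the branching logic below are illustrative assumptions for the sketch, not Limbic’s actual clinical content:

```python
# Hypothetical sketch of an adaptive screening flow: fixed baseline
# questions, then follow-up questions chosen per flagged symptom area.
# All question text and area names are illustrative placeholders.

BASELINE_QUESTIONS = [
    "Do you have any long-term medical conditions?",
    "Have you previously received a diagnosis from a mental health professional?",
]

# Follow-up question modules keyed by symptom area (assumed structure).
FOLLOW_UP_MODULES = {
    "anxiety": ["Over the last two weeks, how often have you felt nervous or on edge?"],
    "depression": ["Over the last two weeks, how often have you felt down or hopeless?"],
}

def select_follow_ups(flagged_areas):
    """Return the follow-up questions for every flagged symptom area."""
    questions = []
    for area in flagged_areas:
        questions.extend(FOLLOW_UP_MODULES.get(area, []))
    return questions
```

The point of the tailoring step is that a patient only sees the symptom modules relevant to their earlier answers, rather than a single long form.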
The chatbot then uses the collected data to generate a detailed referral, which it files in the service’s electronic medical record. A staff member can access the referral and contact the patient within a few days to carry out an assessment and begin treatment. Limbic’s chatbot combines several types of AI model. The first uses natural language processing to analyse a patient’s typed answers and reply appropriately and empathetically. Probabilistic models then use the entered data to tailor the conversation to the patient’s most likely mental health problem. According to the report’s authors, these models classify eight common mental health problems with 93 percent accuracy.
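A two-stage pipeline of this kind, a language layer producing the reply plus a probabilistic layer scoring likely conditions, could look roughly like the sketch below. The study does not publish the model internals, so the condition categories and the keyword-based scoring here are stand-in assumptions, not Limbic’s actual model:

```python
# Hypothetical sketch of the probabilistic stage: score a patient's
# answers against eight common condition categories and pick the most
# likely one. Category names and keyword weights are illustrative only.

CONDITIONS = [
    "depression", "generalised anxiety", "social anxiety", "panic",
    "OCD", "PTSD", "health anxiety", "specific phobia",
]

# Toy keyword weights standing in for a trained probabilistic model.
KEYWORD_WEIGHTS = {
    "worry": {"generalised anxiety": 2.0},
    "hopeless": {"depression": 2.0},
    "panic": {"panic": 2.0},
}

def score_conditions(answers):
    """Return a normalised probability-like score for each condition."""
    scores = {c: 1.0 for c in CONDITIONS}  # uniform prior over categories
    for text in answers:
        for word, weights in KEYWORD_WEIGHTS.items():
            if word in text.lower():
                for cond, w in weights.items():
                    scores[cond] += w
    total = sum(scores.values())
    return {c: s / total for c, s in scores.items()}

def most_likely(answers):
    """Pick the highest-scoring condition to steer the next questions."""
    scores = score_conditions(answers)
    return max(scores, key=scores.get)
```

The output of such a stage would then steer which symptom questionnaire the chatbot presents next and what goes into the referral.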
Developing the field further
“In a way, this shows us where the field could be going – that it will become easier to reach people to study them, regardless of the technology,” says Torous, who was not involved in the study. “However, the question arises as to what kind of services we offer such people and how we allocate those services.” Patients who used the chatbot and gave Limbic positive feedback mostly mentioned its simplicity and convenience. They also reported that the referral gave them more hope of recovery or helped them feel they were not alone. Non-binary respondents were more likely than patients who identified as male or female to mention the chatbot’s non-human nature. They noted that interacting with the bot helped them avoid the feelings of judgment, stigma or fear that a conversation with a real person might trigger.
“The fact that people from gender, sexual and ethnic minorities, who are typically difficult to reach, are interacting with the bot relatively more often is a really exciting result,” says Ross Harper, founder and CEO of Limbic. He is also a co-author of the study. “This shows that in the right hands, AI can be a powerful tool for equality and inclusion.”
Addressing the growing skills shortage, Harper adds: “There aren’t enough mental health professionals, so we want to use AI to empower the professionals we have. This collaboration between human professionals and AI systems is where we can truly solve the mental health supply and demand imbalance.”