
4 Ways Artificial Intelligence Improves Mental Health Therapy – The European Sting

(Credit: Unsplash)

This article is brought to you thanks to the collaboration of The European Sting with the World Economic Forum.

Author: Kayleigh Bateman, Senior Editor, Educational Content


  • Mental health professionals use artificial intelligence (AI) to improve the accuracy of diagnoses and treatments.
  • Therapists are turning to AI to help them with stretched workloads.
  • 84% of psychologists have seen an increase in demand for treatment for anxiety.
  • The technology assists in the quality control of treatment and in the training of therapists.

Driven by a global pandemic, the healthcare industry is finding new ways to adapt quickly and safely. For many, technology has been the key.

In the mental health field, 84% of psychologists who treat anxiety disorders say there has been increased demand for treatment since the start of the pandemic, according to a survey by the American Psychological Association. This is up from 74% a year earlier.

AI is already used in many industries, and it is becoming clear that its use in mental health services could be a game-changer, providing more effective and personalized treatment plans. The technology not only gives better insight into the needs of patients, but also helps develop techniques and training for therapists.

Here are four ways AI has improved mental health therapy.

Mental health professionals are reporting an increased workload since the start of the pandemic as demand for therapy skyrockets. Image: American Psychological Association

1. Maintain high standards of therapy with quality control

With increased demand for services and stretched workloads, some mental health clinics are exploring automated ways to monitor quality control among therapists.

The mental health clinic Ieso uses AI to analyze the language used in its therapy sessions through natural language processing (NLP) – a technique in which machines process transcriptions. The clinic aims to provide therapists with better insight into their work, to ensure the delivery of high-quality care and to help trainees improve.

Tech companies have taken notice and are providing clinics with the tools to better understand the words spoken between therapists and clients. In the UK and US, software company Lyssn provides clinics and universities with technology designed to improve quality control and training.

2. Refine the diagnosis and appoint the right therapist

AI helps doctors detect mental illness earlier and make more precise choices in treatment plans.

Researchers believe the information gleaned from session data can make therapy more successful by helping to match potential clients with the right therapists and determine which type of therapy would work best for an individual.

“I think we’ll finally get more answers about which treatment techniques work best for which symptom combinations,” Jennifer Wild, a clinical psychologist at the University of Oxford, told MIT Technology Review.

Additionally, AI research can refine patient diagnoses into different disease subgroups to help physicians personalize treatment.

Using AI technology, therapists can sift through large amounts of data to identify family history, patient behaviors, and responses to previous treatments, making for a more accurate diagnosis and more informed decisions on treatment and choice of therapist.

Machine learning – a form of AI that uses algorithms to make decisions – is also harnessed to identify forms of post-traumatic stress disorder (PTSD) in veterans.

3. Monitor patient progress and modify treatment if necessary

Once a patient is paired with a therapist, it is necessary to monitor their progress and track improvements. AI can help identify when a change in treatment needs to take place or when it is time to bring in another therapist.

For example, Lyssn uses an algorithm to analyze statements between therapists and clients to reveal how much of a session is spent on constructive therapy versus general chatter, in order to identify improvements.

The Ieso team also studies utterances during sessions, focusing on the patients rather than the therapists. In a recent paper, the team identified “active change” responses spoken by clients, such as “I don’t want to live like this anymore”, as well as “change-exploring” responses, in which the client thinks about ways to move forward and make a change.

The team noted that the absence of such statements during treatment would be a warning sign that therapy was not working. AI transcripts may also open up opportunities to investigate the language used by successful therapists whose clients make such statements, and to train other therapists in this area.
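As a loose illustration of this kind of transcript analysis (not Ieso's or Lyssn's actual method, which relies on trained NLP models), a minimal sketch might scan client utterances for candidate "active change" phrases; the cue list and function name below are invented for the example:

```python
import re

# Invented cue phrases for illustration only; a production system would use
# a trained language model rather than a hand-written keyword list.
ACTIVE_CHANGE_CUES = [
    r"\bdon'?t want to live like this\b",
    r"\bi want to change\b",
    r"\bi need to stop\b",
]

def flag_active_change(utterances):
    """Return the client utterances that match an 'active change' cue."""
    flagged = []
    for text in utterances:
        lowered = text.lower()
        if any(re.search(pattern, lowered) for pattern in ACTIVE_CHANGE_CUES):
            flagged.append(text)
    return flagged

session = [
    "The weather made everything harder this week.",
    "I don't want to live like this anymore.",
    "Maybe I could try going for a walk each morning.",
]
print(flag_active_change(session))
```

A real system would also need to distinguish speakers, handle paraphrases, and score whole sessions rather than single lines, which is where the machine-learning approaches described above come in.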

4. Justify cognitive behavioral therapy (CBT) instead of drugs

The use of drugs as a treatment for mental health problems like depression has increased. The number of patients in England prescribed antidepressants in the third quarter of 2020-2021 was up 23% compared with the same quarter in 2015-2016, according to the NHS.

However, the UK’s National Institute for Health and Care Excellence (NICE) recently updated its guidelines to encourage the use of CBT before medication for mild depression.

AI can help validate CBT as a treatment, according to researchers at Ieso. In a JAMA Psychiatry article, researchers used AI to discern the sentences used in conversations between therapists and patients.

CBT aims to identify negative thought patterns and find ways to break them, which means therapists use statements to discuss methods of change and plans for the future. The researchers concluded that higher levels of CBT-related talk in sessions, as opposed to general chat, correlated with better recovery rates.

What is the Forum doing to ensure digital mental health security?

New ethical questions about the safety, effectiveness, fairness and sustainability of digital mental health care – online and through apps – are being raised around the world, and companies are being held to account for their creation and approval of services.

The Global Governance Toolkit for Digital Mental Health, launched by the World Economic Forum and Deloitte, provides governments, regulators and businesses with tools to protect personal data, ensure a high quality of service, and address security concerns related to the boom in digital mental and behavioral health care.

“People are turning to apps on their smartphones in an attempt to deal with a growing number of mental health issues. This toolkit will help to ensure their security and confidentiality.” – Arnaud Bernaert, Head of Shaping the Future of Health and Healthcare, World Economic Forum

Businesses can join the World Economic Forum to shape the future of mental health technologies responsibly, through its Platform for Shaping the Future of Health and Healthcare.

Learn more about our impact.

Improvements outside the clinic

Wearable technologies are another avenue where AI is improving mental health therapy.

In conjunction with clinic sessions, therapists use technologies such as the Fitbit to find ways to improve treatment. For example, mental health care providers can monitor a patient’s sleep patterns with a Fitbit instead of relying on the patient to provide accurate reports.

The long-term effectiveness of AI in treating mental health has yet to be thoroughly tested, but early results look promising.

While the use of AI within the mental health ecosystem offers opportunities to improve systems, it also opens up the potential for abuse and mistreatment. To guard against this risk, the World Economic Forum has launched a toolkit to provide governments, regulators and independent assurance organizations with the means to develop and adopt standards and policies that address ethical concerns related to the use of disruptive technologies in mental health.

“In mental health, trust is more than mitigating the risk of unethical and malicious use; it is working with communities to act responsibly,” wrote Stephanie Allen of Deloitte and Arnaud Bernaert, Head of Shaping the Future of Health and Healthcare at the Forum, in the toolkit report. “This is only the start of the journey – which will not be easy – but we have a clear medical, moral and economic imperative to do better.”