Is Artificial Intelligence the future of mental health care?
Isabel argues that artificial intelligence can be a helpful tool in mental health care
Not many people would argue if I said that we find ourselves in a youth mental health crisis. It is one that looks set to continue for quite a while without a clear-cut solution. There are so many factors at play, from urbanisation to technology to a lack of funding for services, not to mention neoliberalism. What if the solution lay not only in more funding and increased staffing for existing services, but also in artificial intelligence?
Mental health problems on the rise
According to the World Health Organisation (WHO), depression is the leading cause of disability worldwide, affecting roughly 300 million people at any one time. In Ireland, around 2,500 young people are on a Child and Adolescent Mental Health Service (CAMHS) waiting list, with several of these centres reporting a large number of unfilled posts. The National Suicide Research Foundation has shown that self-harm increased by 10% between 2007 and 2016, with rates peaking among females aged 15-19 and males aged 20-24. The question is, what can be done to support people who are stuck on waiting lists, or who don’t even make it as far as getting a referral?
What can artificial intelligence offer us?
When we think of Artificial Intelligence (AI), we most often think of robots, but this isn’t always the case. Personal assistants such as Alexa and Siri are slowly becoming everyday parts of our lives. If you have ever asked such a device a question, you will know it can be amusing to talk to, and after a while you stop feeling silly.
For mental health problems, there are anonymous helplines and text lines such as Childline, Samaritans and Pieta House. In this case, there are real human beings on the end of the line. But what if a chatbot could fulfil the same function? Woebot is a chatbot that can be accessed for free via an app. It is based on the principles of Cognitive Behavioural Therapy (CBT), and has been shown to reduce symptoms of depression and anxiety over a two-week period. It is anonymous and can be accessed 24/7. The fact that it is anonymous raises ethical considerations; however, it is not designed to replace existing services, nor is it a crisis service.
If Woebot detects crisis language, it offers a toolbox directing users to local services. Having used Woebot myself, I find it to be empathetic, since it “listens” no matter what. It gives you a place to vent and talk about everything that is going on for you. It asks questions about your mood and thoughts to get you to reflect and try to help yourself, and it offers specific tools suited to the emotion you are experiencing. You can’t always reach out to friends or family, and this is an amazing alternative. In a way it can be more helpful, since friends and family often don’t know how to react or what to say, and you might feel uncomfortable opening up about everything that is going on.
Since waiting lists for services are so long, this is also an excellent stopgap before you can see someone, or between appointments. Mental health problems will most likely continue to rise in the future, and hundreds of millions of people will not have access to trained therapists, or will be unable to afford one. This is where virtual assistants come into play.
Woebot makes CBT accessible to everyone, no matter where they live or what their income level is. Another advantage is that you can access Woebot when you need it, instead of having to hold out until your next appointment. The creators of Woebot have nonetheless made it clear that it is just a robot, and cannot replace human connection. Nor can it replace a medical professional, since it cannot prescribe medication or diagnose any mental health problem.
The role of technology in mental health care
Technology, and particularly social media, is often thought to be one of the root causes of the problem, but what if technology could also be part of the solution? There are hundreds of thousands of other mental health related apps, but their ability to help is questionable. Many have not been tested, though that is not to discount the usefulness of some.
Communities such as Big White Wall can help with loneliness or feeling misunderstood, and offer peer support. Turning to an app can also be less daunting than going to see your GP or talking to an adult, although you should always try to talk to a responsible adult. If you live in the countryside, or are unable to get to your GP, apps are a great alternative. Apps like Calm Harm or BlueIce can help you ride the wave of self-harm urges by offering different categories of activities, namely comfort, distract, express yourself, breathe and release. Calm Harm is based on Dialectical Behavioural Therapy and allows you to track self-harm urges. There are many other mental health apps out there that act as mood trackers, offer guided meditation, or help with anxiety or sleep problems. The NHS apps library is a good way of knowing whether or not an app is likely to help. Headspace is another app worth mentioning, offering daily meditation and motivational quotes.
Apps are also a cost-effective solution for the gap in mental health treatment. They can reach a wide population quickly and easily. Almost everyone owns a smartphone and has access to the internet. The digital world can be equalising, and can change the current system of care, which is heavily based on the pathway of going to your GP and then waiting months for a referral. It can offer treatment for all in the comfort of your own home and on your own terms, when you need it.