How AI apps improve mental health treatment

Last updated: Apr 26, 2023
Eugene Kruglik
Healthcare Development Expert
Yulia Baranchuk
Copywriter, Vention

Mental health is no longer taboo. We see more public figures like Ryan Reynolds, Selena Gomez, and even the tough guy Dwayne "The Rock" Johnson opening up about their mental health struggles. They’re not speaking up just for themselves: Mental Health America’s stats show that in the USA alone almost 20 percent of adults (nearly 50 million people) were diagnosed with a mental illness in 2022.

The good news doesn’t just lie in more public awareness of mental health issues. Emotional wellness is now regarded as a priority for companies, with all sorts of businesses including mental health support in their benefits packages. If you’re in the business of both wellness and software development, then the gathering steam of artificial intelligence should be top-of-mind.

A great example? Our client Dialogue, the largest telemedicine player in Canada, which together with Tictrac provides wellness offerings to employers and insurers with the help of AI components such as preliminary screening and first-pass diagnosis via AI chatbot.

Technologies like these that combine software and therapy can drastically change the way people support their mental health — and they’re at the very heart of a promising and likely very lucrative industry.

The skinny on AI mental health technologies

Machine learning models

ML can analyze huge volumes of complicated data, from clinical data sets and personalized patient histories to electronic health records (EHR). We can’t identify mental health conditions with a blood test (at least not yet), but machine learning algorithms can help doctors identify crucial behavioral biomarkers faster and with greater accuracy.

Depression, anxiety, and other mental illnesses vary from person to person in their symptoms; moreover, many disorders have multiple subtypes. By analyzing patient data automatically, ML can help therapists arrive at precise prescriptions and personalized treatment plans for a given subtype of mental illness.
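The idea of grouping patients by subtype from behavioral data can be sketched with a toy classifier. This is a minimal illustration only: the feature names, subtype labels, and numbers below are invented for the example, and real clinical systems use far richer models and validated data.

```python
import math

# Toy behavioral-feature vectors: (sleep hours, activity score, speech rate).
# All data and subtype labels here are illustrative, not clinical.
TRAINING = {
    "melancholic": [(4.5, 2.0, 0.6), (5.0, 1.5, 0.5)],
    "atypical":    [(9.5, 3.0, 0.9), (10.0, 2.5, 1.0)],
}

def centroid(points):
    """Average each feature dimension across a subtype's examples."""
    return tuple(sum(d) / len(points) for d in zip(*points))

CENTROIDS = {label: centroid(pts) for label, pts in TRAINING.items()}

def classify(sample):
    """Nearest-centroid rule: pick the subtype whose average profile is closest."""
    return min(CENTROIDS, key=lambda label: math.dist(sample, CENTROIDS[label]))

print(classify((4.8, 1.8, 0.55)))  # closest to the melancholic centroid
```

Production systems would replace the nearest-centroid rule with a trained model and clinically validated features, but the shape of the task (features in, subtype out) is the same.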

Check out Headspace Health (“Give stress a day off”), which has thus far received $215 million in funding, for a great example of how machine learning and mental health can be fused. In 2022, Headspace acquired Sayana, a personalized self-care company powered by machine learning.

Chatbots powered by AI algorithms

There are plenty of bots on the market — Wysa (“No stigma. No limits.”), Replika (“The AI companion who cares”), Woebot (“Small chats for big feelings”) — that can offer you an empathic friend to chat with whenever you like, but increasingly, there are bots with AI algorithms that can help deliver urgent mental healthcare on demand.

Of course, they can’t replace an empathetic human being, and they can’t read body language, but the AI and data-crunching capabilities of today’s chatbots are impressive: They can send supportive messages, remind you to take medications on time, analyze your mood, and even mimic a human therapist’s interactions with you, all around the clock.

Still, it’s debatable whether they should be used by patients with severe conditions, such as those experiencing suicidal thoughts, or at moments of crisis when a disease worsens. But chatbots are certainly gaining attention, as they can help people overcome the fear of judgment and the barrier of discussing their mental health problems with another human being.

ChatGPT

The release of ChatGPT by OpenAI has shaken the entire tech world with its ability to create human-like texts, write code, and even design graphics. One of the areas where it shows huge potential is mental health. Companies like Koko (“crowdsourced cognitive therapy”) and Doc.ai are experimenting with the integration of this technology into their apps and chatbots to provide mental health consults — and AI-powered software is driving a conversation about the intersection between narrative and mental health. As Doc’s CEO put it, “We are at an inflection point in the history of computational semiotics. Computers are telling us stories about ourselves.”

Keep in mind that the technology is young, and it’s restrained by thorny data privacy and cybersecurity challenges. Moreover, experts and developers engaged in the development of GPT models continue to tackle issues such as hate speech, harassment, violence, political content, adult content, spam, and deception. Koko, for example, recently experimented with ChatGPT conversing like a mental health counselor for about 4,000 people, without alerting the users that the answers were generated by AI. Lesson learned: If you are unleashing a chatbot, follow best practices for building chatbots and make it clear to the user from the start of the conversation that they’re talking to a bot.
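The disclosure lesson above is easy to enforce in code: make the bot’s first message an explicit AI disclosure and refuse to converse without it. The function names and messages below are illustrative, not from any real product.

```python
# Every session opens with a plain statement that the user is talking to a bot,
# before any other reply is generated.
DISCLOSURE = (
    "Hi! I'm an automated assistant, not a human counselor. "
    "If you're in crisis, please reach out to a human professional or a hotline."
)

def start_session():
    """Begin the transcript with the mandatory AI disclosure."""
    return [("bot", DISCLOSURE)]

def reply(transcript, user_message):
    """Append a turn, but only for sessions that began with the disclosure."""
    if not transcript or transcript[0] != ("bot", DISCLOSURE):
        raise ValueError("session must begin with the bot disclosure")
    transcript.append(("user", user_message))
    transcript.append(("bot", "Thanks for sharing. Tell me more."))
    return transcript
```

Centralizing the rule in `reply` means a session that skipped the disclosure fails loudly instead of silently impersonating a human.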

NLP

Speech disorders like derailment, incoherence, and single-word generation problems can signal schizophrenia, psychosis, or severe depression. Recent research on the Hebrew language showed that with the help of natural language processing (NLP), it’s possible to distinguish between healthy and disordered speech.

Given the ubiquity of social media and blogs, NLP gives therapists the opportunity to analyze their patients’ posts and articles with great accuracy, using AI algorithms to detect markers of suicidal thoughts or anxiety attacks. For insights on how that’s done, take a look at Hume AI, a research lab and technology company dedicated to emotional well-being that builds NLP-powered products capable of identifying an individual’s emotional state by analyzing their online activity. (And no need to mask your true feelings: This digital Sherlock knows when you’re lying or being sarcastic.)
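The marker-detection idea can be shown with a deliberately simplified sketch. Real systems like those described above use trained NLP models rather than keyword lists; the marker terms and scoring below are made up purely to illustrate the flow from text to a risk signal.

```python
# Illustrative-only marker list; a real system would use a trained model.
ANXIETY_MARKERS = {"panic", "can't breathe", "racing", "overwhelmed", "dread"}

def marker_score(post):
    """Return the fraction of known markers present in the post, plus the hits."""
    text = post.lower()
    hits = [m for m in ANXIETY_MARKERS if m in text]
    return len(hits) / len(ANXIETY_MARKERS), hits

score, hits = marker_score("Felt overwhelmed and full of dread all week.")
```

Even this crude version captures the shape of the task: scan a patient’s writing, surface the markers found, and hand a score to a clinician rather than making a diagnosis on its own.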

Computer vision

Gestures, facial expressions, poses, eye gaze, and movements also matter when it comes to AI and mental health assessment. By analyzing photos and videos, computer vision can help doctors identify whether a person has a mental disorder, distinguish depression from schizophrenia, and even gauge the severity of the disease. Computer vision can also aid in evaluating a patient’s progress in treatment.

Voice recognition

Voice markers of depression include softness, monotony, and an abundance of pauses and delays in speech: indicators that AI-powered voice biomarker technology can catch. Voice biomarkers are already used by insurance companies and hospitals during calls and online consultations.
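Two of the markers just mentioned, frequent pauses and monotonous pitch, can be sketched as simple statistics over audio frames. The frame values, thresholds, and function names below are invented for the example; production voice-biomarker systems extract far more sophisticated acoustic features.

```python
from statistics import pvariance

def pause_ratio(amplitudes, silence_threshold=0.05):
    """Share of frames whose amplitude falls below the silence threshold."""
    quiet = sum(1 for a in amplitudes if a < silence_threshold)
    return quiet / len(amplitudes)

def pitch_variance(pitches_hz):
    """Variance of pitch across voiced frames; low variance suggests monotony."""
    return pvariance(pitches_hz)

# Toy amplitude envelope: over half the frames are near-silent pauses.
amps = [0.4, 0.02, 0.01, 0.3, 0.02, 0.5, 0.01, 0.02]
print(pause_ratio(amps))  # 5 of 8 frames below threshold -> 0.625
```

A screening tool would track such statistics over time and flag shifts for a clinician, rather than diagnosing from any single call.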

Voice recognition can also be helpful for personal usage: Sonde (“Mental Fitness”), for example, encourages its users to keep a voice journal to identify their level of stress (and can crow about recent funding of $19.25 million). Another example is Kintsugi, a company built by two ambitious women that raised $20 million last year to help detect signs of clinical depression and anxiety using ML and voice biomarkers.

No matter the technology, safety needs to be top-of-mind. “The main direction of AI development is to improve its security, as well as the latest algorithms and models to reduce the line between human help and a machine,” says Eugene Kruglik, a healthtech delivery manager at Vention who’s worked on cutting-edge projects spanning digital pharmacy, electronic medical records (EMR), medical devices, mental health, and telemedicine.

AI benefits in mental health treatment

It takes two to therapy, so one of AI’s core goals is to enhance the collaboration between clinicians and patients, including between sessions. It can also help address the shortage of qualified mental health workers: In the US alone, over one-third of the population lacks stable access to mental healthcare services.

Personalization

AI can help doctors tailor treatment plans to each patient’s needs, which increases treatment efficiency. By analyzing questionnaires, voice and behavioral samples, and data from a patient’s social media accounts, clinicians are more likely to arrive at case-specific diagnostics, medication, and treatment plans, and it’s easier to make adjustments along the way.

A great example is Healthily (“Health questions answered”), the world's first medically approved self-care platform that matches a patient’s personal needs to the latest information from healthcare providers. The system detects potential risks for mental wellness and helps therapists create personalized treatment plans to better support long-term recovery.

Support for therapists

Accurate diagnostics is a crucial part of any treatment, especially when it comes to patients’ mental health issues. With the help of AI, therapists can analyze a huge amount of data faster and more efficiently and direct the time savings toward communicating with patients and zeroing in on accurate diagnoses.

For example, AI can be used for note-taking in psychotherapy (Mentalyc) or for triage and assessment (Limbic, which partners with the UK’s NHS).

Self-therapy

For some patients, it’s easier to share their inner thoughts with an AI chatbot than with a therapist. Non-medical chatbots like Replika can function as a confidante and help users explore their deepest feelings or even benefit from romantic companionship — an affordable, 24/7 option for people with depression or social anxiety.

Mind the gap

As is so often the case with emerging technologies, regulators are playing catch up. As the American Psychological Association notes, mental health apps in general are in a boom phase in terms of funding, which is great in terms of healthcare access and affordability, but has a worrying catch: There are likely 10,000-20,000 mental health apps currently on the market, but there’s no way of ensuring their safety, and not just from a clinical perspective.

Mental health apps “fail spectacularly,” some argue, when it comes to privacy and cybersecurity — and that’s not just a business issue but a moral one.

“We need to solve the ethical issues in AI in mental health by balancing the benefits of technology with maintaining the human connection and empathy that is essential for effective mental health treatment,” says Kruglik.
