By Sandy Sanbar

Use of AI in Mental Health

There is a shortage of mental health professionals in the U.S.


  • In 2022, the American Psychological Association stated that the demand for mental health services continues to increase while psychologists are struggling to provide needed care.

  • In 2023, according to the DHS, the psychiatrist workforce of about 45,580 had a shortage of 6%. The shortage is projected to reach about 9% in 2024 and 12% in 2025.

  • In December 2022, Congress passed the Mental Health Access Improvement Act (S.828/H.R.432), which was signed into law by President Biden. This law allows licensed professional counselors and marriage/family therapists to enroll as providers with Medicare. As of January 1, 2024, licensed professional counselors and marriage/family therapists are able to bill Medicare Part B and be reimbursed for approved services.

  • Another approach to meeting the shortage is artificial intelligence (AI), which may provide new tools to assist mental health professionals and decrease their burden.


Many AI technologies are now used in mental health.


Traditional Mental Therapy versus AI-Driven Therapy


The strength of traditional mental therapy lies in its human connection. The empathetic understanding, the nuances of human emotions, and the personalized feedback are irreplaceable. But traditional therapy often requires appointments and can sometimes be geographically or financially prohibitive. And human therapists can vary in their approaches and techniques.


In contrast, AI-driven therapy can simulate conversations, but genuine human empathy remains out of its reach. Being digital, however, AI-driven therapy offers 24/7 support, which breaks barriers of time and location, and it provides consistent, data-driven feedback and intervention.


Compared to human psychiatrists and psychologists, AI presently lacks the important human traits of imagination, insight, and empathy.


Examples of Symptom Checkers (or Medical Diagnosis Apps)


Symptom checkers are AI-based healthcare apps designed to improve and streamline tasks, thereby enabling doctors to work more efficiently. Patients use them online before seeing a doctor, and they are also in use at some major hospitals and doctors’ offices.


Mental Health Chatbots and Robots


These AI tools differ from symptom checkers. They are designed for non-emergency situations and are not meant for mental health crises.

Patients who are going through serious depressive episodes need immediate help. They should seek advice from mental health professionals straight away, instead of using an app.


History of Mental Health Chatbots - A Timeline


Mental health chatbots have developed over the past 73 years. Presently, there are numerous chatbots, and they are in use in every field of medicine.



Dr. Joseph Weizenbaum


In the 1960s, Dr. Joseph Weizenbaum, a computer scientist at M.I.T., created a computer program called Eliza. It was named after Eliza Doolittle, the heroine of George Bernard Shaw’s “Pygmalion.”


In 1956, the musical “My Fair Lady” was first staged on Broadway and ran successfully for 6 years.


In 1966, Dr. Weizenbaum’s Eliza made history by becoming the first-ever chatbot.


Eliza had no memory and little processing power, but it created a therapeutic illusion. It was designed to simulate Rogerian therapy, in which the patient directs the conversation and the therapist often reflects back what the patient says.

ELIZA ROGERIAN THERAPY EXAMPLE
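
The flavor of such an exchange can be reproduced with surprisingly little machinery. Below is a minimal Python sketch of Eliza-style Rogerian reflection; the rules and word substitutions are simplified illustrations for this article, not Weizenbaum’s actual script:

```python
import random
import re

# Swap first-person words for second-person ones so the echo sounds natural.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "mine": "yours", "am": "are"}

# Each rule pairs a keyword pattern with response templates that reuse the
# patient's own words (a simplified stand-in for Eliza's real rule script).
RULES = [
    (r"i feel (.*)", ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (r"i am (.*)", ["Why do you say you are {0}?"]),
    (r"(.*)\bmother\b(.*)", ["Tell me more about your mother."]),
    (r"(.*)", ["Please go on.", "Can you elaborate on that?"]),
]

def reflect(fragment: str) -> str:
    """Rephrase the patient's fragment from the therapist's point of view."""
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.split())

def eliza_reply(statement: str) -> str:
    """Return a Rogerian-style reply built from the patient's statement."""
    text = statement.lower().strip(" .!?")
    for pattern, templates in RULES:
        match = re.match(pattern, text)
        if match:
            reflected = [reflect(group) for group in match.groups()]
            return random.choice(templates).format(*reflected)
    return "Please go on."

print(eliza_reply("I feel anxious about my job."))
# -> e.g. "Why do you feel anxious about your job?"
```

Even rules this crude produce the echoing, patient-led dialogue that made Eliza feel like a therapist to its early users.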



Many people who tried the Eliza program found it both useful and captivating.

  • Dr. Weizenbaum’s own secretary asked him to leave the room so that she could spend time alone with Eliza.

Doctors saw it as a potentially transformative tool: the computer system could handle several hundred patients an hour.


In 1966, three psychiatrists, writing in The Journal of Nervous and Mental Disease, suggested that such a program could one day make computer-assisted therapy widely available where therapists were in short supply.

PARRY


In 1972, another program named PARRY was written by psychiatrist Kenneth Colby, then at Stanford University. Unlike Eliza, PARRY attempted to simulate a person with paranoid schizophrenia. Psychiatrists who were given transcripts of therapy sessions often couldn’t tell the difference between PARRY and human patients.


In 1972, PARRY and Eliza even met for a computer-to-computer therapy session.

WOEBOT


In 2017, Alison Darcy, a clinical research psychologist at Stanford, founded Woebot, which provides automated mental health support through a smartphone app. Its approach is based on cognitive behavioral therapy (CBT).


WYSA


In 2018, Wysa emerged as another popular mental health chatbot offering users emotional support. It uses evidence-based therapeutic techniques such as CBT.


New Model Faces of Chatbots


Chatbots have advanced to human-like faces and figures. There are also fully developed humanoid robots that walk and talk like humans.



Medical and Mental Health Chatbots


Medical and mental health chatbots are tools that augment doctors’ intelligence, and their use has become increasingly popular.


Some patients prefer to interact with chatbots rather than human therapists, since there is less stigma in asking a machine for help.


The benefits of chatbots are:


1. Availability around the clock.

2. Instant delivery of critical information.

3. Personalized care.

4. Data collection for future reference.

5. Improved patient engagement.

6. Cost-effectiveness.

7. Reduced wait times for patients and providers.

8. Improved accuracy and consistency of healthcare work.


Pearls About the Use of AI in Mental Health


Pearl #1


Besides 1:1 in-person meetings, some psychiatrists are using AI to draw on a variety of data sources. These include medical records, social media, and online searches, all of which aid in identifying behavioral changes indicative of mental health issues.


Pearl #2


Digital psychiatrists use AI to analyze conversational language patterns and behavior patterns on smartphones in order to detect underlying conditions (a simple sketch follows this list).

  • By looking at speech patterns and human movements, smartphones could pick up on subtle changes indicating the start or worsening of symptoms. Wearable devices, such as smart watches, may notice subtle physical changes long before patients themselves notice problems.

  • These devices could bring objective, real-time data into medical care.
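
As a concrete illustration of this kind of language analysis, here is a minimal Python sketch. The word lists and features below are hypothetical examples chosen for readability; they are not a validated clinical instrument:

```python
from dataclasses import dataclass

# Illustrative only: real systems use validated clinical instruments and far
# richer models. These word lists are hypothetical examples.
FIRST_PERSON = {"i", "me", "my", "mine", "myself"}
NEGATIVE_WORDS = {"sad", "tired", "hopeless", "alone", "worthless", "anxious"}

@dataclass
class SpeechFeatures:
    first_person_rate: float    # share of words that are first-person pronouns
    negative_word_rate: float   # share of words from a negative-emotion lexicon
    mean_message_length: float  # average words per message

def extract_features(messages: list[str]) -> SpeechFeatures:
    """Compute simple language markers from a user's recent messages."""
    words = [w.strip(".,!?").lower() for m in messages for w in m.split()]
    total = max(len(words), 1)
    return SpeechFeatures(
        first_person_rate=sum(w in FIRST_PERSON for w in words) / total,
        negative_word_rate=sum(w in NEGATIVE_WORDS for w in words) / total,
        mean_message_length=len(words) / max(len(messages), 1),
    )

features = extract_features([
    "I feel so tired and alone lately.",
    "My week was fine, I guess.",
])
print(features)
```

In practice, such features would be tracked over time, so that a shift from a patient’s own baseline, rather than any single message, is what draws attention.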



Pearl #3


AI-driven chatbots are designed to emulate human practitioners.

• The chatbots may recognize mental health issues sooner than humans and recommend suitable interventions that might otherwise be missed.

• Some believe that AI-driven therapy and support systems are the future of mental health.



Pearl #4


AI-based interventions bridge the gap caused by therapist shortages by extending mental healthcare to a broader population while preserving the choice of human therapy.



Pearl #5


AI-generated videos can help children with autism acquire essential skills in a controlled environment.

AI has also shown promise in assisting individuals with PTSD, anxiety and depression.



Pearl #6


AI can personalize treatment regimens by suggesting non-pharmaceutical alternatives tailored to individual profiles.





Pearl #7


One critical issue in mental health is predicting self-harm before it happens. AI has the ability to predict suicidal tendencies through data analysis, potentially outperforming human assessments and providing valuable data-driven insights.
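
To make the idea of data-driven risk prediction concrete, the following toy sketch trains a classifier on synthetic data. Everything here, including the features, the data, and the model, is hypothetical; a real system would require clinically validated inputs, ethical review, and a human clinician acting on every flag:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic stand-ins for patient features (e.g., questionnaire scores,
# prior visits, language markers): 200 patients, 3 features each.
X = rng.normal(size=(200, 3))

# Synthetic labels: 1 = later self-harm event, 0 = none (toy data only).
y = (X @ np.array([1.5, 1.0, 2.0]) + rng.normal(size=200) > 1.0).astype(int)

model = LogisticRegression().fit(X, y)

# Score a new patient: the output is a probability used to flag cases
# for clinician review, never an autonomous decision.
new_patient = np.array([[0.8, 1.2, 0.5]])
risk = model.predict_proba(new_patient)[0, 1]
print(f"Risk score for clinician review: {risk:.2f}")
```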



Pearl #8


Combining AI-generated diagnoses with those from human practitioners can result in a more accurate and unbiased assessment.

AI can mitigate human biases by employing impartial algorithms for diagnosis and treatment recommendations.


However, the inclusion of AI-driven mental healthcare raises concerns about the erosion of human elements, like empathy and trust, in the therapist-patient relationship.


In September 2023, Dr. Bobby Hoffman, a psychologist, titled his article with the question, “Is AI the Key to Improving Mental Healthcare Accessibility?” He stated:


  • AI can improve mental health diagnosis, therapy, and treatment, by increasing access and personalizing care.

  • AI automation can help address therapist shortages, extend care to more people, and reduce costs.

  • AI can assist people struggling with autism or PTSD, while mitigating biases and enhancing mental healthcare.


Ethical Considerations and Privacy Concerns


  • Data Security: As chatbots collect sensitive mental health data, ensuring its security becomes paramount. There's a need for stringent regulations to protect user data from breaches.

  • Transparency: Users must be made aware of how their data is used and for what purposes. Transparent practices are essential to build trust.

  • Boundaries: It's crucial to establish clear boundaries between AI-driven and human-driven care. Users should know when it's necessary to seek human intervention and not solely rely on AI.

  • Bias and Misdiagnosis: AI models can sometimes inherit biases present in their training data. This can lead to misdiagnosis or unsuitable therapeutic suggestions, underlining the importance of regular updates and checks.

