Analysis: Chatbots for mental health care are booming, but there's little proof that they help
The mental health app industry is booming, with many new apps claiming to use AI to provide care.

KFF Health News
In the last few years, between 10,000 and 20,000 apps have flooded the mental health market, promising to "disrupt" traditional therapy. The frenzy over ChatGPT and other AI innovations has fueled the idea that chatbots can provide mental health care.
The numbers help explain why: Pandemic stress drove millions of Americans to seek treatment, and the United States has a long-standing shortage of mental health professionals; more than half of counties have no psychiatrist. The Affordable Care Act also mandates that insurers cover mental health care at parity with physical care. The result is a huge gap between demand and supply.
Entrepreneurs see an opportunity. At the South by Southwest conference in March, where many health startups showcased their products, there was a near-religious belief that AI would rebuild health care, with apps and machines capable of diagnosing and treating every kind of illness and replacing doctors and nurses.
In mental health, though, the evidence is thin. The FDA has not examined many of the apps on the market, and few have been independently tested to show that they work. Though marketed to treat conditions such as anxiety, attention-deficit/hyperactivity disorder, and depression, or to predict suicidal tendencies, many warn users (in small print) that they are "not intended to be medical, behavioral health or other healthcare service" or "not an FDA cleared product."
The marketing is formidable, but there are many reasons to be cautious.
Decades ago, Joseph Weizenbaum predicted that AI could never make a good therapist, though it could be made to sound like one. His original AI program, built in the 1960s, was a simulated psychotherapist called ELIZA, which used word recognition and natural language processing to sound like a therapist.
A typical ELIZA response: "Do you believe that coming here can help you to not be unhappy?"
Even as ELIZA was hailed as a triumph of AI, Weizenbaum was terrified by it. I interviewed him once; he said students would interact with ELIZA as if it were a real therapist, even though he knew he had created "a trick."
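To get a sense of the kind of trick Weizenbaum was describing, here is a minimal sketch in Python of ELIZA-style keyword matching and pronoun reflection. It is not Weizenbaum's original code, and the patterns and responses are invented for illustration; it simply shows how a handful of canned templates can sound like a therapist.

```python
import random
import re

# Reflect first-person words back at the user ("i" -> "you", "my" -> "your").
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you", "myself": "yourself"}

# A few keyword patterns paired with therapist-sounding reply templates.
PATTERNS = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"because (.*)", re.I), "Is that the real reason?"),
]

# Fallbacks when no keyword matches.
DEFAULTS = ["Please tell me more.", "How does that make you feel?", "Can you elaborate on that?"]


def reflect(fragment: str) -> str:
    """Swap pronouns so the user's own words can be echoed back."""
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.lower().split())


def respond(statement: str) -> str:
    """Return a canned, therapist-like response based on simple keyword matching."""
    for pattern, template in PATTERNS:
        match = pattern.search(statement)
        if match:
            return template.format(reflect(match.group(1))) if "{0}" in template else template
    return random.choice(DEFAULTS)


if __name__ == "__main__":
    print(respond("I feel unhappy about coming here"))
    # Prints: "Why do you feel unhappy about coming here?"
```

The program has no model of unhappiness, or of anything else; it only rearranges the user's words. That is the gap Weizenbaum kept pointing to.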
He predicted the development of much more sophisticated programs like ChatGPT. But, he told me, "the experiences an artificial intelligence might have under such circumstances is not a human experience. The computer, for instance, will not experience loneliness as we know it."
The same is true of anxiety or ecstasy, emotions so neurologically complex that scientists have not been able to pinpoint their origins. Can a chatbot achieve transference, the empathic flow between doctor and patient that is essential to many kinds of therapy?
Bon Ku, director of the Health Design Lab at Thomas Jefferson University and a pioneer in medical innovation, said: "The core principle of medicine is a relationship between human and human -- and AI cannot love. I have a human psychotherapist, and that will never be replaced."
Ku would rather see AI used to reduce practitioners' record-keeping, data entry, and other administrative tasks, freeing up time for them to connect with patients.
Even if some mental health apps eventually prove worthwhile, they can also be harmful. Researchers have noted that users criticized these apps as "scripted" and lacking adaptability beyond textbook cases of mild anxiety or depression.
Insurance companies may be tempted to offer chatbots and apps to satisfy the mental health parity requirement; it would be cheaper and easier than maintaining a panel of human therapists.
The Department of Labor, perhaps recognizing the flood of AI entering the market, announced last year that it was stepping up efforts to ensure that insurers comply with the mental health parity requirement.
The FDA also said last year that it "intends to exercise enforcement discretion" over a range of mental health apps, which it will evaluate as medical devices. To date, none has been approved. Only a few have been granted the agency's breakthrough device designation, which expedites reviews and studies of devices that show promise.
These apps mostly offer what therapists call structured therapy, in which patients have a specific problem and the app responds with a workbook-like approach. Woebot, for example, combines mindfulness and self-care exercises (with responses written by teams of therapists) for postpartum depression. Wysa, another app that has received the breakthrough device designation, delivers cognitive behavioral therapy for anxiety, chronic pain, and depression.
It will take time to gather reliable scientific data on whether app-based treatments work. "The agency has very little data to draw conclusions from," said Kedar Mate of the Institute for Healthcare Improvement in Boston.
We don't yet know whether app-based mental health care is any better than Weizenbaum's ELIZA. AI will surely improve over time, but it is premature for insurers to claim that offering an app satisfies the mental health parity requirement.