Leading the Artificial Intelligence Revolution

Experts discuss psychiatry’s role in the advancement of AI technologies at the 2022 APA Annual Meeting.

CONFERENCE REPORTER

“Artificial intelligence (AI) is here to stay, and it’s really very important for the psychiatric field to take a leading role in developing it.”

P. Murali Doraiswamy, MBBS, FRCP, of the Duke University School of Medicine, discussed some of the latest developments in AI and its current and potential applications for psychiatry at the 2022 American Psychiatric Association (APA) Annual Meeting. He was joined in a panel discussion by Robert M. Califf, MD, MACC, commissioner of food and drugs for the US Food and Drug Administration (FDA), and moderator Samantha Boardman, MD, psychiatrist and author of Everyday Vitality: Turning Stress Into Strength.

“If I was a computer programmer describing the unmet needs in the mental health field, I would use a formula that would go something like this: 40, 40, 30, 0. This means only 40% of people who need care get access to the care they want; of those, only 40% get optimal care; of those, only 30% achieve remission; and 0% of people get preventive care in terms of resilience,” Doraiswamy said. “That’s the problem in this field—and the hope and promise is that AI and technology can help with some of this.”
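
Multiplied through, those figures imply that fewer than 5% of people who need care ultimately reach remission under optimal treatment. The following minimal sketch in Python, which simply multiplies the quoted 40/40/30/0 cascade and is not taken from the talk, illustrates the arithmetic:

access = 0.40       # 40% of people who need care get access to it
optimal = 0.40      # 40% of those with access receive optimal care
remission = 0.30    # 30% of those receiving optimal care achieve remission
preventive = 0.00   # 0% receive preventive care aimed at resilience

net_remission = access * optimal * remission
print(f"Share of people needing care who reach remission: {net_remission:.1%}")
# Prints: Share of people needing care who reach remission: 4.8%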

Doraiswamy noted that $4 billion has been invested in AI, an investment that has been doubling every 3 months for the past few years and is transforming multiple fields, including psychiatry. Digital health coaching, stigma reduction, triage and suicide prediction, clinical decision support, and therapeutic apps have already reshaped psychiatric practice, with therapeutic wearables, robots and virtual reality avatars, protocol standardization, quality control and practice management, and population forecasting on the horizon.

Mental health and wellness apps are particularly lucrative, with more than 5000 such apps currently on the market and around 250 million individuals having accessed them in the past 2 and a half years. Although some are FDA-approved (such as EndeavorRx for the treatment of pediatric attention-deficit/hyperactivity disorder [ADHD] and Somryst for the treatment of chronic insomnia), many are neither approved nor overseen by the FDA unless they make a disease claim. Many also provide inaccurate information, send data to third parties, and are based on “black box” algorithms rather than randomized controlled trials. Even so, recent data show that 82% of individuals believe robots can support their mental health better than humans can, as robots are unbiased, judgment-free, and able to provide quick answers.

At the same time, clinicians have cited a number of potential harms of AI, including diagnostic and treatment errors; a loss of creative clinical thinking; greater risk of dehumanizing patients and jeopardizing the therapeutic process because of a lack of empathy; and less privacy and more fatalism in general. Clinicians also worry that the way a machine reaches a diagnosis could become a “black box” process, and that, if administrators use the time saved to increase patient loads, AI could ultimately worsen clinician burnout.

Another potential concern for clinicians is job security in light of the fully automated psychiatrist, another form of AI in development. Doraiswamy emphasized that such a development should be regarded as an enhancement to psychiatric practice rather than a replacement. “This is not going to replace you,” Doraiswamy said. “This is a personal assistant that’s going to be sitting in the waiting room doing the initial interview intake, get all the intake ready, and then summarize this information and have it ready for you so that you can make a diagnostic assessment.”

Clinicians have also cited a number of benefits of AI, including better outcomes through more standardized protocols and the elimination of human error; less bias due to race or gender; and scalability of treatment. They say AI can also encourage patients to answer truthfully and accept support more objectively; use big data more efficiently than humans; provide practical guidance for trainees; and help elucidate etiologies of diseases that are currently not well understood.

The evidence for the predictive benefits of AI in psychiatry is also growing. Recent research suggests that AI may be able to detect Alzheimer disease 5 years before diagnosis and predict future depressive episodes and suicide risk, and that machine learning may be able to predict a mental health crisis using only data from electronic health records. “By and large, this shows you the promise,” Doraiswamy said. And, as Google-owned DeepMind recently released a general-purpose AI agent, “it’s probable that you could have 1 AI program that could help with all 159 diseases in the DSM-5,” Doraiswamy said.

To maximize the potential benefits of AI and ensure that psychiatry leads this revolution, Doraiswamy recommended that the field develop clinical practice guidelines; provide proper education and training; ensure that use cases are relevant and human-centered; advocate for equitable and accountable AI; integrate quality improvement methods into the workflow; create benchmarks for “trusted and safe” apps; and work with payers to develop appropriate reimbursement.

“We need to step back and acknowledge that digitization in our society and broad access to the internet are having profound effects that we don’t yet understand, and that as we develop technologies with the plasticity that these technologies have, as opposed to traditional medical devices, you can change the software very quickly,” Califf said. “It’s a tremendous potential benefit, but it also carries very specific risks that we need to be aware of.”

“Some very reasonable people might argue that AI and psychiatry don’t belong even in the same sentence—that AI should play no role whatsoever in mental health care, and that the psychotherapeutic relationship is sacrosanct,” Boardman concluded. “But in a world where there are so many unmet mental health needs, I think there’s a very good argument that AI can not only improve care and diagnostics and increase access, but also reduce stigma and even flag potential issues and symptoms before they appear and reduce burnout among professionals.”
