Artificial Intelligence Will Never Replace Psychiatry—But It Will Try to Destroy It

Key Takeaways

  • AI and LLMs can mimic human conversation and assist in diagnosing mental health conditions but cannot replace genuine human connections in therapy.
  • The rise of AI in mental health care may increase demand for human therapists, as technology-driven interactions can lead to isolation and declining mental health.

AI chatbots may assist in mental health, but they cannot replace the essential human connection provided by psychiatrists and therapists.

AFFIRMING PSYCHIATRY

In 1950, the brilliant British mathematician Alan Turing proposed a challenge to determine whether computers could think. In his version of the “imitation game,” human judges were to exchange written messages with either a person or a computer. If the judges could not reliably tell which was which, Turing reasoned, one could conclude that computers were as intelligent as humans.

These days, it is fast becoming a trivial thing for large language models (LLMs) like ChatGPT to pass a rigorous Turing test.1,2 And yet no one really cares if they do. This is because we have all experienced how well artificial intelligence (AI) LLMs can converse pretty much like humans. (The main difference between us seems to be that LLMs are significantly more cooperative than actual human beings.3) On the other hand, we are now obsessed with bigger questions: Are AI chatbots people too? Are LLMs conscious? Can we have human relationships with chatbots? Will these things replace our human friendships, romances, and psychotherapies?


Apologies for short-circuiting the suspense, but I would like to furnish the correct answers to these questions: no, no, no, no, no, and no. While I lack the space to explain my strong opinions on all of them, I would like to at least briefly discuss the last question: Will AI replace us as psychiatrists and psychotherapists?

Will AI Replace Us?

On one hand, it is easy to make the argument that it will, and many people do. Even now, there are confident voices telling us that AI is going to replace doctors, and already does a better job of diagnosis. As one characteristically enthusiastic YouTuber put it (to 192,000 views), “Doctors are TOAST: A.I. Coming WAY FASTER (than expected) To Replace.”4 Meanwhile, if people who have trouble finding romance can fall in love and have satisfying relationships with chatbots, why wouldn’t an AI-bot make a better therapist than a human? The media, at least, is pursuing such questions with hard-hitting stories such as, “My Couples Retreat With 3 AI Chatbots and the Humans Who Love Them.”5

More seriously, researchers have opined that AI chatbots may someday help ease the shortage of therapists and our severe undersupply of mental health treatment. At the same time, they warn us that there are deep, as yet unanswered questions about the ethics, practicality, and long-term effects of chatbot therapists.6,7 Nevertheless, one study found that LLMs were better at diagnosing posttraumatic stress disorder and depression than mental health professionals (though they were worse at diagnosing schizophrenia),8 while another recent trial found that experienced therapists were unable to tell the difference between transcripts of ChatGPT and human psychotherapists, and in fact rated the ChatGPT responses as more skillful.9 An earlier systematic review found that LLMs were quite adept at assessing depression, suicidality, and other mental states, as well as “generating empathic responses.” However, reviewers concluded that “there are currently no studies available that provide evaluative evidence to support the clinical use of LLMs.”10

Nevertheless, it appears that some people are not waiting around for the research to catch up to their needs. Already, a Reddit post entitled, “ChatGPT has helped me more than 15 years of therapy. No joke,” has received 2400 upvotes.11 Gen Z, we are told, is especially enamored with using ChatGPT for psychotherapy, many of its members finding AI to be the “perfect therapist.”12 A quick Google search reveals at least 2 ChatGPT therapists immediately available, while there are many other proprietary helpers, such as Therabot, which offers (among other things) “an AI companion for mental wellness.”13

Are AI Chatbots People Too?

Are psychiatrists and other mental health professionals gradually going extinct? Will AI quickly surpass us in our ability to diagnose and treat mental illnesses? If the vendors of technology have their way, it certainly will. Their tried-and-true model is one of disruption, in which new technologies undercut and overturn well-established industries, bringing massive profits in the process.14

Will they succeed in doing so? They will not. They cannot, for the following reason: The more AI replaces human relationships in other aspects of life, the more desperately people will need mental health care from flesh and blood human beings. And this is because AI replaces human relationships in just the same way that soda replaces water: Soda is far more pleasant and refreshing in the short run, and it is terrible for the body in the long run. AI chatbots are typically more agreeable, available, efficient, and responsive than human interlocutors. They are ready on demand, eager to help, and are supremely in-the-know. All of this makes them more appealing at first. But AI chatbots and other forms of digital relatedness do not ultimately give us what we need. Instead, they eat away at life-sustaining patterns of human relationship, leading to isolation, loneliness, emptiness, and declining mental health.

How do we know this? Because it is already happening, and we are already suffering the consequences. Virtual forms of relatedness cannot substitute for real-time, face-to-face human interactions, which involve layer upon layer of evolved unconscious and nonverbal communication. Human beings are becoming lonelier, more socially fragmented, more polarized, and more unhappy as technologically mediated forms of social interaction replace the traditional ones.15 Rates of depression, anxiety, and suicidality have increased dramatically, and continue to rise. While the correlations are striking, evidence for causation is steadily accumulating.16,17 Although technology is unlikely to be the only reason for the disintegration of our collective happiness and mental health, it is almost certain to make the short list. And the irony (lost on no one) is that the more extensively we become interconnected with people across the globe every day, the more systematically we are starved of actual wholesome, in-person, life-giving social connections.

Our Postrelational Future

As this process progresses, people are likely to lose the habits and skills that allow them to deeply connect with others during real-time, in-person interactions. People who are constantly distracted by online information are never going to learn how to sit with someone and have a long, absorbing conversation. Teens and young parents already receive hundreds of push notifications per day on average,18,19 notifications that are disruptive to both cognitive function and social interaction.20

Given the ubiquity of smartphones and other screens, rising generations are not going to routinely learn how to sustain a close and positive relationship through all the inevitable problems and distractions that tend to disrupt such relationships. Even today, how many people actually sit and talk with another person (even their own spouse) uninterrupted for 30 or 45 minutes at a time, much less do so on a regular basis? The kind of deep attention and empathy we offer in mental health care will become increasingly rare as technology takes its toll. And as it becomes rarer, it will become more and more valuable. AI and LLMs are not going to diminish the demand for human psychiatrists and psychotherapists; they are going to increase it wildly.

Concluding Thoughts

Long ago in the 1990s, when I was a psychiatry resident, I had the fantasy that psychiatrists would one day be replaced by vending machines. People would simply come up to the machine, swipe a credit card, select their symptoms from a menu, and have a medicine pop out of the bottom. I have never forgotten that image, or the questions it raised. Is this the true essence of functioning as a psychiatrist? Do we simply check off DSM symptoms, see what fits the diagnosis, and then prescribe the appropriate medication? Is that the basic service that we psychiatrists provide? And if so, couldn't a machine prescribe medications better than a human psychiatrist? Can't an AI-bot do a better job of interviewing a patient and making a diagnosis?

Maybe it can, at least hypothetically. But all of that reduces human patients to the level of content consumers and slowly kills them. People in distress will die without close and authentic human relationships, and chatbots will not keep them alive.21 What keeps our patients alive, every bit as much as medicines and accurate assessments, is a deep, caring bond with another human being who truly understands their plight and empathizes. And this is something an AI chatbot can never, ever replace.

Dr Morehead is a psychiatrist and director of training for the general psychiatry residency at Tufts Medical Center in Boston. He frequently speaks as an advocate for mental health and is author of Science Over Stigma: Education and Advocacy for Mental Health, published by the American Psychiatric Association. He can be reached at dmorehead@tuftsmedicalcenter.org.

References

1. Jones CR, Bergen BK. Large language models pass the Turing test. arXiv. 2025;2503.23674.

2. Biever C. ChatGPT broke the Turing test – the race is on for new ways to assess AI. Nature. 2023;619(7971):686-689.

3. Mei Q, Xie Y, Yuan W, Jackson MO. A Turing test of whether AI chatbots are behaviorally similar to humans. Proc Natl Acad Sci U S A. 2024;121(9):e2313925121.

4. Dhand S. Doctors ARE TOAST: A.I. Coming WAY FASTER (than expected) To Replace. YouTube. May 22, 2025. Accessed September 16, 2025. https://www.youtube.com/watch?v=UgGoMM2-3mA

5. Apple S. My couples retreat with 3 AI chatbots and the humans who love them. Wired. June 26, 2025. Accessed September 16, 2025. https://www.wired.com/story/couples-retreat-with-3-ai-chatbots-and-humans-who-love-them-replika-nomi-chatgpt/

6. Jurblum M, Selzer R. Potential promises and perils of artificial intelligence in psychotherapy–The AI Psychotherapist (APT). Australas Psychiatry. 2025;33(1):103-105.

7. Sedlakova J, Trachsel M. Conversational artificial intelligence in psychotherapy: a new therapeutic tool or agent? Am J Bioeth. 2023;23(5):4-13.

8. Levkovich I. Evaluating diagnostic accuracy and treatment efficacy in mental health: a comparative analysis of large language model tools and mental health professionals. Eur J Investig Health Psychol Educ. 2025;15(1):9.

9. Hatch SG, Goodman ZT, Vowels L, et al. When ELIZA meets therapists: a Turing test for the heart and mind. PLOS Mental Health. 2025;2(2):e0000145.

10. Guo Z, Lai A, Thygesen JH, et al. Large language models for mental health applications: systematic review. JMIR Ment Health. 2024;11:e57400.

11. ChatGPT has helped me more than 15 years of therapy. No joke. Reddit. Accessed September 16, 2025. https://www.reddit.com/r/ChatGPT/comments/1k1dxpp/chatgpt_has_helped_me_more_than_15_years_of/

12. Quiroz-Gutierrez M. Gen Z is increasingly turning to ChatGPT for affordable on-demand therapy, but licensed therapists say there are dangers many aren’t considering. Fortune. June 1, 2025. Accessed September 16, 2025. https://fortune.com/2025/06/01/ai-therapy-chatgpt-characterai-psychology-psychiatry/

13. Your AI companion for mental wellness. Soul Talk. Accessed September 16, 2025. https://talktothera.com/

14. Smith S, Pande V, Wolf D. Why tech must disrupt healthcare now. Raising Health Podcast. Spotify. October 8, 2024. Accessed September 16, 2025. https://open.spotify.com/episode/4sQOqHgigeAhhaRwJtY8BX

15. Sbarra DA, Briskin JL, Slatcher RB. Smartphones and close relationships: the case for an evolutionary mismatch. Perspect Psychol Sci. 2019;14(4):596-618.

16. Haidt J. The Anxious Generation: How The Great Rewiring of Childhood Is Causing an Epidemic of Mental Illness. Penguin; 2024.

17. Christakis D, Hale L. Handbook of Children and Screens: Digital Media, Development, and Well-Being from Birth Through Adolescence. Springer Nature; 2025.

18. Radesky J, Weeks HM, Schaller A, et al. Constant Companion: A Week in the Life of a Young Person's Smartphone Use. Common Sense; 2023. Accessed September 16, 2025. https://www.commonsensemedia.org/sites/default/files/research/report/2023-cs-smartphone-research-report_final-for-web.pdf

19. Munzer TG, Miller AL, Weeks HM, et al. Greater mobile device‐prompted phone pickups are associated with daily parent stress. Acta Paediatr. 2024;113(8):1868-1875.

20. Stothart C, Mitchum A, Yehnert C. The attentional cost of receiving a cell phone notification. J Exp Psychol Hum Percept Perform. 2015;41(4):893-897.

21. Hill K. A teen was suicidal. ChatGPT was the friend he confided in. New York Times. August 26, 2025. Accessed September 16, 2025. https://www.nytimes.com/2025/08/26/technology/chatgpt-openai-suicide.html
