Apple’s Siri, Amazon’s Alexa, and Google’s Assistant are taking over smartphones and smart homes. You’ve no doubt heard, seen, or used one, and likely, so have your patients. They’re great for checking the weather, asking niche trivia questions, trying out new recipes, shopping online, and . . . virtual therapy? Although conversational agent technology is advancing rapidly, these chatbots might not yet be fully equipped to help patients with clinical needs related to their mental health.
Chatbots for patients with mental health conditions
Among studies conducted so far, the most studied benefits of talking to a chatbot were educating patients and helping them stay adherent to their treatment plans. This finding makes sense, as chatbots can easily retrieve mental health-related information and offer alarms and reminders. But can they do even more, such as delivering psychotherapy or making a diagnosis?
In July 2018, we performed a systematic search of the literature in the PubMed, EmBase, PsycINFO, Cochrane, Web of Science, and IEEE Xplore databases and isolated 12 studies that used a chatbot in a mental health setting with individuals at risk for or with depression, anxiety, schizophrenia, bipolar disorder, or substance abuse disorder. The review was published in the Canadian Journal of Psychiatry in March 2019.1
Although interest in deploying chatbots in clinical contexts is expanding, most studies do not report outcomes or efficacy in a standardized way; instead, they employ custom scales or unique outcome measures, which makes it challenging to compare chatbot functionality across studies. Some studies conducted just a few years ago are unreproducible today because the pace of technological advancement has outstripped that of research.
Basic conversational inability
Empathy and relatability are crucial yet seldom considered cornerstones of patient interaction with chatbots, and matching the way humans communicate is paramount in the quest for humanoid chatbots. Most people use conversational fillers like “umm” and “ah” when speaking naturally, and the simple omission of these filler words and pauses negatively affects the perception of an already robotic-sounding chatbot.2 Along with varying common phrases, restoring these fillers might be what chatbots need to build more rapport with patients and, in general, any user. Chatbots also do not truly understand tone of voice or the emotional subtext of language. Because they are programmed to handle specific conversational cases, they are typically poor at answering open-ended “why” or “how” questions, which can lead down a tangent of irrelevance instead of an actual therapeutic conversation.
Dr Torous is Director and Ms Vaidyam is Technology Lead, Division of Digital Psychiatry, Beth Israel Deaconess Medical Center, Boston, MA. Dr Torous is also Digital Section Editor of Psychiatric Times. The authors report no conflicts of interest concerning the subject matter of this article.
1. Vaidyam AN, Wisniewski H, Halamka JD, et al. Chatbots and conversational agents in mental health: a review of the psychiatric landscape. Can J Psychiatry. March 2019; Epub ahead of print.
2. Gardiner PM, McCue KD, Negash LM, et al. Engaging women with an embodied conversational agent to deliver mindfulness and lifestyle recommendations: a feasibility randomized control trial. Patient Educ Couns. 2017;100:1720-1729.
3. Tielman ML, Neerincx MA, Bidarra R, et al. A therapy for post-traumatic stress disorder using a virtual agent and virtual storytelling to reconstruct traumatic memories. J Med Syst. 2017;41:125.
4. Bickmore T, Mitchell S, Jack B, et al. Response to a relational agent by hospital patients with depressive symptoms. Interact Comput. 2010;22:289-298.
5. Bickmore TW, Puskar K, Schlenk EA, et al. Maintaining reality: relational agents for antipsychotic medication adherence. Interact Comput. 2010;22:276-288.
6. Tielman ML, Neerincx MA, Van MM, et al. How should a virtual agent present psychoeducation? Influence of verbal and textual presentation on adherence. Technol Health Care. 2017;25:1081-1096.
7. Miner AS, Milstein A, Schueller S, et al. Smartphone-based conversational agents and responses to questions about mental health, interpersonal violence, and physical health. JAMA Intern Med. 2016;176:619-625.