Challenges with mHealth Apps for Depression and Suicide Prevention

Learn what potential challenges come with mHealth apps and how to overcome them to best fit patient needs.

In the previous 2 parts of this series on mHealth apps, we examined barriers to accessing care and the apps available to assist in the treatment of depression and suicide prevention. In this installment, we cover the potential challenges and unintended consequences of using this technology.

Unintended Consequences: Inaccurate Information and Self-Diagnosis

Apps of all kinds in the Google Play and Apple App Stores calculate ratings based on independent user feedback. That feedback is subjective and does not involve a review process or other rigor: an app may carry a 5-star rating, yet that rating requires no third-party verification of the app's quality, security, efficacy, or validity.1 Given the lack of consistent oversight or regulation, there is no penalty for mHealth apps that provide non-evidence-based or inaccurate information,2 and no verification of information quality. A study by Stawarz et al3 found that many apps with high user ratings had an inconsistent evidence base, demonstrating that the features users prioritize, and that drive higher ratings, do not signal solid validity or sound evidence.

Another challenge specific to mHealth apps for depression is patient self-diagnosis. Patients may identify themselves as depressed, but in the absence of a full psychiatric evaluation by a trained mental health professional, they could well meet criteria for a different condition requiring intervention and treatment tailored to that disorder. Self-diagnosis can therefore lead to negative treatment outcomes and misdirect efforts toward treating a condition that may not apply to a given consumer.

Clinical Validation and Efficacy

The clinical utility and efficacy of the content provided within these apps are important considerations. There is currently no formal review process for mHealth apps, and that has been an ongoing barrier. The existing medical research on apps is not robust and lags far behind how widely mHealth apps are used.1 The studies that do exist often have very small sample sizes4 and tend to be short in duration, without longer-range follow-up to evaluate outcomes.5

Privacy of Patient Information

Security of patient data is an important consideration when using mHealth apps for depression. Protected health information (PHI) breaches are reported in the news nationally and worldwide every day, and all breaches, whether large or small, can have a significant impact on individuals, communities, and larger populations. An in-depth study of mHealth apps found that medical data were transmitted through unsecured processes and that privacy policies were lacking relative to the kind of information being collected from individuals.1

The aim of a cross-sectional study by Tangari et al was to evaluate the collection of user data within mHealth apps and to review the apps' privacy practices and the potential risks to customer information.6 Among the top mHealth apps, privacy disclosures to users were consistently inadequate. The evaluation of privacy and security in these apps is complex and multifactorial; see the Table for an overview of the components addressed in the security review.6

Table. Security Areas Addressed in Review Completed by Tangari et al6

In the study by Tangari et al, 665 different third parties were identified as being able to access patient information contained within the apps.6 A subset of specific third parties accessed the most data compared with the rest of the third parties obtaining this information. The vast majority of the apps reviewed were able to obtain and share users' personal information, and many privacy policy links did not work or were unreliable.

Patient Safety and Risk

There is significant concern related to patient safety, especially when symptoms are severe or suicidal thoughts or behaviors are present. In the Martinengo et al2 review of suicide prevention apps, none of the apps assessed patients' risk factors, protective factors, or triggers, and only 1 of the evaluated apps inquired about a prior history of suicide attempts. Individuals with a history of previous self-harm are at substantially increased risk of suicide compared with the general population.7

Comprehensive suicide risk assessment is a complex process conducted by highly trained clinicians in psychiatry and related fields; evaluating patient risk is difficult even for the most skilled and experienced provider. There are numerous risk factors for suicide, yet suicide outcomes are far more nuanced and complex than any list of potential contributing factors. Significant risk factors include, but are not limited to, a prior history of self-harm or suicide attempt,7 family history, psychiatric illness, recent personal loss, and comorbid medical illness, along with other demographic and social factors.8

Safety planning is an important tool that mental health clinicians use with patients experiencing suicidality who are not acutely in need of inpatient hospitalization. In the review of suicide prevention apps, only 10% of the depression apps and 52% of the suicide prevention apps reviewed included templates to support development of a safety plan,2 and only 4 of the apps allowed users to share the safety plan with a trusted member of their support system.2

Of the 46 suicide prevention apps, only 20% provided users with emergency access to trained mental health care professionals via the app.2 Of the depression management apps reviewed, 24% provided direct access to a crisis line through the app. However, when the crisis contact information was verified during the review process, 9% of the crisis contact details listed in these apps were incorrect.

Some of the suicide prevention apps use chatbots.2 Chatbots work by identifying keywords and phrases that trigger subsequent responses based on various algorithms. In the apps using chatbots, a patient stating that they are "sad" and "hopeless" did not prompt the follow-up question reserved for a "worrying statement," even though such a statement is clearly concerning and would warrant further evaluation and clinical intervention. The simplified sketch below illustrates how this kind of keyword-triggered logic can miss concerning statements.
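As a minimal, hypothetical illustration of this failure mode, the sketch below shows naive keyword-triggered logic; the phrase list, function name, and reply text are invented for the example and are not drawn from any app in the Martinengo et al review.

```python
# Hypothetical illustration only: a naive keyword-triggered chatbot check.
# The phrase list, function name, and reply text are invented for this sketch
# and are not taken from any reviewed app.

WORRYING_PHRASES = {
    "want to die",
    "kill myself",
    "end my life",
}


def triage_reply(message: str) -> str:
    """Return a crisis follow-up only if a listed phrase appears verbatim."""
    text = message.lower()
    if any(phrase in text for phrase in WORRYING_PHRASES):
        return "That sounds serious. Are you thinking about hurting yourself?"
    # Anything not on the fixed list falls through to a generic reply.
    return "Thanks for sharing. How has the rest of your day been?"


# "Sad" and "hopeless" are not on the phrase list, so no crisis follow-up
# is triggered, even though the statement warrants clinical attention.
print(triage_reply("I feel sad and hopeless"))
```

Because the logic matches only a fixed list of phrases, concerning statements worded differently fall through to a generic reply, which mirrors the gap observed in the review.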

Addressing Unintended Consequences

There are unintended consequences of mHealth apps for depression as they currently exist. The lack of external validity measures and the exclusive reliance on user review ratings do not give patients information about the evidence base or efficacy of an app's content. As health care providers, our role is to translate the evidence base into terms patients can understand; given the concerns described above, however, providers may be reluctant to do so for these apps. One opportunity for improvement would be collaboration and partnerships between academic and medical institutions for app development. With their knowledge base and close awareness of security and privacy requirements, such partnerships could produce apps that are evidence-based, externally validated, and suitable for use as tools in collaboration with the health care team.

Privacy of collected data is an additional concern, and it is not unique to mHealth apps. There is a noted lack of transparency on the part of apps in clearly explaining to users how their data are collected and potentially shared.6 Ideally, accountability for these policies should rest with the individual app companies, with standards and regulation set by the app stores that publish the tools.

Best Practices to Support Care Using Patient-Facing Technology

To continue to facilitate mHealth apps that are effective, safe, and useful for patients, several actions can strengthen the factors described above and address the concerns outlined earlier in this series.

A standardized, external review tool to validate apps would provide added accountability and verification of app content. One tool discussed in the reviewed literature is the Mobile App Rating Scale (MARS), in which 2 expert raters evaluate an app on a 23-item scale and their scores are checked for interrater reliability and consistency.1 This scale, like others that have been explored, lacks clearly established validity and has not been used widely enough to evaluate its validity and efficacy further. Including quality measures from expert reviews in app store ratings, alongside the existing system of customer review ratings alone, would be another way to increase accountability for mHealth apps.
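As a rough illustration of the kind of 2-rater comparison such a tool involves, the sketch below computes each rater's overall score across 23 items and a simple agreement check; the item scores and the agreement metric are invented for illustration, and published MARS studies report formal statistics such as intraclass correlation coefficients rather than this crude measure.

```python
# Hypothetical illustration only: comparing 2 raters' scores on a 23-item,
# 1-to-5 scale. The scores below are invented for this sketch.
from statistics import mean

rater_a = [4, 3, 5, 4, 4, 3, 4, 5, 3, 4, 4, 3, 4, 4, 5, 3, 4, 4, 3, 4, 5, 4, 3]
rater_b = [4, 4, 5, 3, 4, 3, 4, 4, 3, 4, 5, 3, 4, 4, 4, 3, 4, 3, 3, 4, 5, 4, 4]
assert len(rater_a) == len(rater_b) == 23

# Each rater's overall quality score: the mean of their 23 item scores.
overall_a, overall_b = mean(rater_a), mean(rater_b)

# Crude consistency check: share of items on which the raters agree exactly.
exact_agreement = sum(a == b for a, b in zip(rater_a, rater_b)) / 23

print(f"Rater A overall: {overall_a:.2f} | Rater B overall: {overall_b:.2f}")
print(f"Exact item-level agreement: {exact_agreement:.0%}")
```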

Because the sheer number of mHealth apps used to treat depression is expansive and potentially overwhelming for health care providers, further research and review may help guide the evaluation of apps and support those who choose to recommend or discuss app use with their patients as an adjunct to professional care. Using an evaluative tool such as MARS would give health care providers a more efficient way to encourage the use of certain apps, or to redirect patients away from others, in support of their mental health.

Ensuring that patients are informed consumers who understand how their information is being used is imperative. App developers should be transparent in writing and sharing privacy policies, and further oversight of third-party sharing should be prioritized at the national level to remove some of the onus from the individual and to hold developers and corporations accountable.

A best practice would be to encourage supportive app tools when clinically appropriate, with clear guidance that they are not a substitute for clinical care, and to approach their use as part of collaborative, patient-centered care between patient and provider. App users must acknowledge disclaimers in order to use these apps, but there should be ongoing checkpoints and redirection to a trained mental health care professional beyond the initial acknowledgment made when first using the app.

Concluding Thoughts

With the rapid influx of new technologies, the use of mHealth apps in mental health care brings many benefits as well as unintended consequences. There is a potential role and benefit for well-designed mHealth apps that address mental health, specifically depression and suicidality. However, as standalone interventions, mHealth tools are not reliable for managing these complex and potentially life-threatening conditions.2 The treatment of depressed and suicidal patients requires a collaborative, team-based approach that includes the patient, family, friends, and a skilled mental health care provider using different treatment modalities.

Some aspects of existing mHealth apps may simply be ineffective, but more important, some components can be harmful. For example, almost 10% of the crisis hotline phone numbers in the apps reviewed by Martinengo et al2 were incorrect. Without a more comprehensive review process or verification mechanism, that kind of error can have serious consequences. This lack of oversight or guidance also places the onus on individual health care providers and on patients/app users, which may further impede progress in this modality. Health care providers may fear liability, which may steer them away from incorporating app use into their practice.

There are significant and seemingly endless opportunities for these apps to benefit patients and providers, provided their use is safe and evidence based and respects the privacy of patient information. Ongoing research is needed, along with policy that encourages and promotes safe app development and use and clear, realistic outcomes associated with these apps.

Mrs Robinson is a psychiatric-mental health nurse practitioner in the Seacoast, New Hampshire area, and a clinical assistant professor and program director of the Post-Masters Psychiatric-Mental Health Nurse Practitioner Certificate Program, Department of Nursing, University of New Hampshire, Durham. Dr Kallmerten is a clinical associate professor and director of the Direct Entry Master's in Nursing, Evidence-Based Nursing, and Clinical Nurse Leader programs at the University of New Hampshire, Durham. Her research interests include the use of health information technology, such as mHealth, in support of health equity initiatives.

References

1. Powell A, Torous J, Chan S, et al. Interrater reliability of mHealth app rating measures: analysis of top depression and smoking cessation apps. JMIR Mhealth Uhealth. 2016;4(1):e15.

2. Martinengo L, Van Galen L, Lum E, et al. Suicide prevention and depression apps’ suicide risk assessment and management: a systematic assessment of adherence to clinical guidelines. BMC Med. 2019;17(1):231.

3. Stawarz K, Preist C, Tallon D, et al. User experience of cognitive behavioral therapy apps for depression: an analysis of app functionality and user reviews. J Med Internet Res. 2018;20(6):e10120.

4. Munoz Ramos R, Ferrer Cheng PG, Jonas SM. Validation of an mHealth app for depression screening and monitoring (psychologist in a pocket): correlational study and concurrence analysis. JMIR Mhealth Uhealth. 2019;7(9):e12051.

5. Deady M, Choi I, Calvo RA, et al. eHealth interventions for the prevention of depression and anxiety in the general population: a systematic review and meta-analysis. BMC Psychiatry. 2017;17(1):310.

6. Tangari G, Ikram M, Ijaz K, et al. Mobile health and privacy: cross sectional study. BMJ. 2021;373:n1248.

7. Chan M, Bhatti H, Meader N, et al. Predicting suicide following self-harm: systematic review of risk factors and risk scales. Br J Psychiatry. 2016;209(4):277-283.

8. Stern TA, Rosenbaum JF, Fava M, et al. Massachusetts General Hospital Comprehensive Clinical Psychiatry. Elsevier; 2014.
