The escalating use of electronic health records (EHRs) and health information exchanges (HIEs) is fraught with unintended and sometimes dire consequences—including medical coding errors and breaches of psychiatric patients’ privacy and confidentiality, according to 2 psychiatrists who scrutinize the field.
At the recent Second Annual International Summit on the Future of Health Privacy, psychiatrist Scott Monteith, MD, Clinical Assistant Professor in the Departments of Psychiatry and Family Medicine at Michigan State University and a medical informaticist, relayed the experience of a patient who discovered that her EHR erroneously reported a history of inhalant abuse. In reality, she had a history of “caffeine intoxication.” After much investigation, the problem was identified: the DSM-IV-TR code (305.90) is used for 4 different diagnoses, including caffeine intoxication and inhalant abuse, but the EHR’s printout displayed only the inhalant abuse diagnosis. Although the error was reported to the EHR vendor, the problem persists after almost 2 years.1
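The ambiguity behind that error can be sketched in a few lines of Python. The diagnosis list and function names here are illustrative, not the vendor's actual code; the point is that a lookup keyed on the billing code alone cannot distinguish among the diagnoses that share it:

```python
# Several distinct DSM-IV-TR diagnoses share the single code 305.90
# (listing per the article; order here is arbitrary and illustrative).
DSM_305_90 = [
    "Inhalant Abuse",
    "Caffeine Intoxication",
    "Phencyclidine Abuse",
    "Other (or Unknown) Substance Abuse",
]

def naive_label(labels_for_code):
    # A printout that shows only one label per code silently picks one
    # diagnosis -- here, a caffeine intoxication patient is printed as
    # an inhalant abuser, mirroring the error Monteith described.
    return labels_for_code[0]

def safe_label(code, recorded_diagnosis, labels_for_code):
    # Safer: keep the clinician's recorded diagnosis alongside the code
    # and flag any code that is ambiguous on its own.
    if len(labels_for_code) > 1:
        return (f"{code} ({recorded_diagnosis}; "
                f"code shared by {len(labels_for_code)} diagnoses)")
    return f"{code} ({labels_for_code[0]})"
```

A record display built on `naive_label` reproduces the bug; one built on `safe_label` preserves what the clinician actually wrote.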
Also presenting at the Summit was a woman identified only as “Julie,” to protect her identity. She is a public sector employee who formerly practiced law. Over a 6-year period, in weekly or biweekly therapy sessions with a psychiatrist at Massachusetts General Hospital, Julie gradually disclosed her depression, her family troubles, and her sexual abuse as a child. In 2002, when Julie began therapy, her psychiatrist made written clinical notes; in later years, the psychiatrist typed them into an EHR. Consequently, more than 200 pages of sensitive notes about Julie became available to any physician within the Partners HealthCare System who treated her.2,3
Julie discovered this widespread distribution when she saw a new internist because of a stomach ailment. The doctor was “acting strangely” and instead of addressing her presenting symptoms, he explained that he had read her therapy notes and asked if she was seeing a psychiatrist. Julie was shocked and angry.
“It makes me feel sick, just having a complete stranger know that much about me,” she told a reporter from Modern Healthcare.2 “It feels like a complete violation.” As a result, Julie found a therapist outside the Partners system who agreed not to put notes into an EHR.
In an exclusive interview, Monteith told Psychiatric Times, “Clinicians tell patients that we will keep their information private and then turn around and put it into an EHR that can do just the opposite. . . . Putting confidential information into an EHR can be like putting it up on a billboard.” Many patients no longer feel they can trust their health care providers, even though trust is the foundation for good relationships, he added.
Monteith described an incident in which, during an initial psychiatric evaluation, he was reviewing a patient’s medical history with her.
“She flat out told me that she wasn’t going to tell me anything about her medical history because I was going to put it into the computer, and she didn’t know where it was going to go and didn’t trust where it might go.” Monteith added that the patient was high-functioning and not psychotic or paranoid.
As the health care industry moves to adopt EHRs—thereby creating multiple and expansive databases in a number of locations—the number of people with access to protected information increases, as do the opportunities for this information to be accidentally or intentionally disclosed, lost, or stolen.4
Over the past few years, health care organizations have reported 364 incidents involving the loss or theft of information ranging from names and addresses to Social Security numbers and medical diagnoses on nearly 18 million patients. That number is equivalent to the population of Florida, according to an Associated Press article.5 The article also reported that a thief stole medical information on more than 4 million patients of Sacramento-based Sutter Health in 2011 by the simple act of breaking a window with a rock at the affiliated Sutter Medical Foundation.
Hidden data flow
In a letter to Farzad Mostashari, MD, National Coordinator for Health Information Technology (HIT) at the US Department of Health and Human Services, Deborah Peel, MD, Founder and Chair of Patient Privacy Rights Foundation, described the value of a project with the Harvard Data Privacy Lab, called the DataMap.org, to map hidden health data flows.6
“It is impossible for consumers to weigh the risks and benefits of using health IT and data exchanges when they have no idea where their data flows, who is using it or the purpose of its use,” wrote Peel, a psychiatrist and psychoanalyst.
She also pointed to a 2009 Agency for Healthcare Research and Quality report on 20 focus groups conducted across the nation.7 According to the report, “the majority of participants felt that their medical data is no one else’s business,” and should not be shared without their permission. They overwhelmingly wanted to be able to communicate directly with their providers with respect to how their personal health information is handled, including with whom it might be shared and for what purposes. What’s more, most believed they should automatically be granted the right to correct misinformation.
In a recent Web site posting (http://patientprivacyrights.org), Peel emphasized the importance of patients being able to control access to sensitive personal health information. Open source consent technologies, she explained, have been used for more than 12 years by many state mental health departments to exchange sensitive mental health and substance abuse data on some 4 million people in more than 8 states.
“Millions of patients/year refuse to seek treatment when they know they cannot control where their data flows,” she wrote. “Any HIE or EHR that cannot selectively share data with the patient’s meaningful consent, withhold data without consent, AND withhold erroneous data is a failed system or technology. The refusal of certain health IT companies to build technologies that comply with the law and what patients expect shows very poor judgment.”8
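The three capabilities Peel insists on can be expressed as a simple filter over a patient's records. The field names and consent model below are illustrative assumptions, not any real HIE's data model:

```python
# Illustrative sketch of Peel's three requirements for an HIE or EHR:
# share only with consent, withhold without consent, and withhold
# erroneous (disputed) data.
def releasable(records, consented_categories):
    """Return only records the patient has consented to share,
    excluding any record the patient has flagged as erroneous."""
    return [
        r for r in records
        if r["category"] in consented_categories and not r["disputed"]
    ]

records = [
    {"category": "mental_health", "note": "therapy note", "disputed": False},
    {"category": "primary_care", "note": "flu visit", "disputed": False},
    {"category": "primary_care", "note": "wrong diagnosis", "disputed": True},
]

# The patient consents to sharing primary-care records only: the
# therapy note and the disputed entry are both withheld.
shared = releasable(records, {"primary_care"})
```

A system lacking any one of the three checks, by Peel's standard, is a failed technology regardless of how well it performs the other two.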
Not ready for prime time
With a few exceptions (eg, the Veterans Health Information Systems and Technology Architecture [VistA], a public domain software program), “EHRs are generally not ready for prime time,” Monteith said. This is despite federal government efforts to persuade physicians and other health care providers to use them.
The Health Information Technology for Economic and Clinical Health (HITECH) Act, enacted as part of the American Recovery and Reinvestment Act (ARRA) of 2009, was designed to promote the adoption and meaningful use of HIT. According to a fact sheet issued by the Office of the National Coordinator for Health Information Technology (ONC), ARRA provides resources to “help health care providers across the country use EHRs to increase quality, safety, and efficiency of health care; train thousands of people for careers in health information technology who will help health care providers implement EHRs; and assist states in creating health information exchanges for the secure and efficient exchange of patients’ EHRs among health care providers.”9
In actuality, Monteith said, the ONC’s $29 billion incentive program perverted the marketplace and caused a rapid expansion of HIT, but expertise and infrastructure to support the expansion are lacking. “As a consequence,” he added, “there are a lot of people who have limited experience implementing complex HIT systems.”
Driving the current iterations of HIT, he said, are entrepreneurs, government policymakers, big business, and other big entities rather than physicians and patients.
Overall, Monteith said, current HIT has not demonstrated the promised benefits in terms of clinical quality improvement, clinical effectiveness, or return on investment.
He acknowledged that EHRs have allowed health care providers to access patient health information from multiple locations and to obtain legible information. But he said that what he and other colleagues too often encounter is “legible gibberish.”
Moreover, he added, “We have far more information about the patient than we have ever had, but finding relevant information is often like finding a needle in the haystack.” In addition, Monteith said, many health care providers are wary of EHRs and are keeping shadow paper records or notes, so there are no time savings.
Costs, both direct and indirect, are tremendous problems, according to Monteith. “For a large hospital system, the rule of thumb is that for every $100 million spent on purchasing an EHR system, expect to spend another $400 million in direct implementation.”
“It is a brave new world,” Monteith said. “The good news is we can do it right. We can protect patient privacy. We can improve quality of care in some respects. But we can’t over-reach, and we can’t do the sloppy work that we are doing now. We need to slow things down and do it right.”