August 2, 2010
By Alice Shepherd
For The Record
Vol. 22 No. 14 P. 10
Critics wonder what good it is to invest in EHR technology if it fails to endear itself to users who feel betrayed by its lack of intuitiveness.
EHR systems promise to increase efficiency and productivity, reduce costs, provide 24/7 accessibility to medical records, and improve clinical outcomes. Today, about 300 vendors vie for market share, adding bells and whistles to already comprehensive product suites. Yet if EHRs are so great, why isn’t every provider singing their praises? The problem, some say, lies not with functionality but with usability.
Stuck in the Past
While EHRs provide numerous benefits, users are sometimes frustrated with the time and effort it takes to enter information. “At this time, most providers are not realizing a productivity increase if they switch from paper to electronic records because it takes much longer to enter data,” says Jiajie Zhang, PhD, the Dr. Doris L. Ross professor and associate dean of research at the University of Texas Health Science Center at Houston (UT Health). Zhang is also the principal investigator of UT Health’s National Center for Cognitive Informatics and Decision Making in Healthcare, which received a $15 million stimulus grant to conduct research to advance the adoption and meaningful use of HIT.
“Usability is a huge barrier to EHR adoption,” adds Eric Ford, PhD, the Forsyth Medical Center professor of healthcare management at the University of North Carolina in Greensboro. “In their current format, EHRs are cognitively burdensome and labor intensive to use. While physicians find that receiving lab and radiology reports electronically is a great time-saver, having to type formerly handwritten material is slow, and making selections from drop-down menus takes too many clicks. It’s still easier and quicker to write a note than to codify a diagnosis in an EHR. Science and industry need to improve the products dramatically.”
Ford and Zhang compare the current state of EHRs with personal computing using DOS 20 years ago. “People had to remember commands, like Shift + F4, and had cardboard labels above the keyboard to remind them of the most common commands,” says Ford. “Since then, consumer products have become far more usable, but that hasn’t happened yet in healthcare.”
“Fundamentally, DOS and Windows accomplish the same thing, but at the user interface level, users interact with the system in different ways,” says Zhang. “Without Windows, few people would use computers today. Fortunately, there are many things we can do to improve the usability of EHRs.”
“The effort to automate and facilitate access to information in the healthcare delivery realm has been increasingly focused on making data available,” says Amy Cueva, founder of Mad*Pow, a user-centered design studio. “With speed to market being a driving factor, user-centered design and usability have taken a backseat. As a result, products on which companies and our government have lavished money and development time may die on the vine rather than thrive. EHRs that are technologically exquisite but unusable become burdens to the very persons they are intended to serve. The ability of physicians and nurses to use EHRs is often a development afterthought; the product suffers and healthcare outcomes take a palpable hit.”
“Doctors describe their frustration with EHRs by listing symptoms such as ‘too many clicks’ or ‘I got lost,’” says Robert Schumacher, PhD, managing director of User Centric, a user experience research and design firm. “The real systemic problem is that EHRs have not been built to match the workflow of different users in clinical practice.”
A Human-Centered Development Process
How can EHRs be misaligned with user workflow when vendors pride themselves on developing their products with clinician input? The problem is that vendors don’t take a systematic approach to human-centered design, according to Zhang. “In the design process, they talk with end users who are not always typical, and they do so in an informal way,” he explains. “Then, when the product receives negative feedback upon release, they try to fix the problems. At that point, it’s too late and too expensive.”
Perhaps vendors are taking too narrow a view, suggests Schumacher. “In the design process, one or two doctors who hold sway end up designing a highly idiosyncratic system that can’t be generalized across a broad spectrum,” he says. “Then, to improve the software, vendors continue to add more and more features until the intended workflows break down. In a true development life-cycle process, product developers design the system with end users in mind from the very beginning, even before they write the very first line of code.”
Cueva sees user-centered design as focusing on understanding the caregiver’s mental model. “Designing positive human experiences with technology has to be done from the perspective of the people who are going to use the software,” she notes. “Successful outcomes are achieved when the infusion of the end user’s perspective happens before developing screens, labels, and proposed interactions. Effective design based on usability criteria ensures that primary tasks are easy to complete; buttons are positioned, phrased, and labeled effectively; and the software’s workflow aligns with the user’s mental model and care practices.”
Schumacher explains how user-centered design begins with a small group of users and then expands to involve larger numbers of clinicians as the design solidifies. Users are actively involved throughout the entire development process, not just through focus groups or opinion surveys. In addition, user performance has to be measured continuously and objectively throughout the development process to create accurate benchmarks for successive trials and iterations, he says.
“Once screens are designed, but before a line of code is written, they have to be tested with end users to see if they can accomplish their key tasks,” says Cueva. “This process ensures that a multitude of different end users in hospitals and physician practices can efficiently accomplish a host of crucial activities. Research, comprehensive design, and usability testing can support solutions that reduce documentation time and training time so that healthcare professionals can focus on care. It’s not a question of building a system and saying, ‘Oh, it doesn’t matter if it’s complicated; we’ll train them to use it.’ That lazy, outdated line of thought should make you very nervous. Imagine if even the most complex systems could be introduced and utilized without extensive training. It’s absolutely achievable, and user-centered design approaches make it possible.”
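The objective, continuous measurement this process calls for can be as simple as tracking task success and completion time across test sessions. The sketch below, in Python, is purely illustrative; the task names, timings, and data shape are hypothetical, not drawn from any real usability study or product.

```python
# Illustrative only: turning raw usability-test sessions into the kind of
# objective benchmarks described above. All task names and timings are invented.
from statistics import mean

def summarize_sessions(sessions):
    """Return per-task success rate and mean completion time (successful runs only)."""
    by_task = {}
    for s in sessions:
        by_task.setdefault(s["task"], []).append(s)
    summary = {}
    for task, runs in by_task.items():
        successes = [r for r in runs if r["completed"]]
        summary[task] = {
            "success_rate": len(successes) / len(runs),
            "mean_time_s": mean(r["seconds"] for r in successes) if successes else None,
        }
    return summary

# Hypothetical sessions from one design iteration
sessions = [
    {"task": "order lab", "completed": True, "seconds": 42},
    {"task": "order lab", "completed": True, "seconds": 58},
    {"task": "order lab", "completed": False, "seconds": 120},
    {"task": "document visit", "completed": True, "seconds": 95},
]
print(summarize_sessions(sessions))
```

Recomputed for each iteration of the design, numbers like these give the accurate benchmarks for successive trials that the process requires.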
Zhang observes that when physicians don’t use the system or can’t use it efficiently, the vendor’s solution is to provide training, which does not solve the problem. “Now extra time has to be spent on training, and additional training is needed each time the system is upgraded,” he says. “The fundamental solution is human-centered design. If you design with the end user in mind from the beginning, training is minimal and efficiency and productivity increase dramatically.”
EHR vendors that fail to invest resources in usability from the very beginning will pay more in the end, warns Zhang. “Consumer software companies such as Apple, Microsoft, and Google devote a large percentage of resources to this effort, but for EHR companies, it’s sometimes zero,” he says. “Their shortsightedness leads to higher expenses for tech support and help desks, hurts their reputation as users complain, and costs them market share.”
“Let’s stop worrying about utility or function; let’s worry about usability for a while,” says Schumacher. “Many software vendors claim that they take usability into account, but how many companies have elevated it to senior levels by appointing a VP [vice president] of user insight?”
In addition, the cognitive burden of poor usability poses a risk to patient safety. “Imagine a doctor at the end of a long shift in a busy ICU,” he says. “If he or she has to deal with a tremendously complicated screen display, an error may happen. We have to improve display characteristics to minimize risks to patient safety in addition to improving productivity.”
EHR vendors can learn from medical device manufacturers, which once developed products with only anecdotal user input or input that came too late, after the device was on the market. Just this year, the FDA recognized a new best-practice standard, HE75 (Human Factors Engineering: Design of Medical Devices), created by the Association for the Advancement of Medical Instrumentation, which provides detailed human factors engineering design guidance, examples, checklists, and case studies. FDA regulators now want manufacturers to provide them with evidence of a human-centered design process, which promises to improve the usability and safety of medical devices.
One goal of UT Health’s research project is to revolutionize the paradigm for EHR data display and data retrieval, similar to the iPhone’s advance over previous smartphones. “The tables, spreadsheets, and text of existing EHRs don’t allow providers to see the big picture,” says Zhang. “We plan to build prototypes, mockups, and perhaps products that may fundamentally change people’s interactions with EHRs. Visualization will be a critical component. For instance, a doctor who sees a patient for the first time needs a quick way to grasp the big picture of the patient’s history and condition. Visualization allows users to see a large amount of data on one screen so they can recognize the patterns.”
Cueva believes the EHR space to date has overlooked huge opportunities for using data visualization schemas to improve quality of care. “EHR systems should allow physicians to quickly ramp up on a patient’s record, get an immediate understanding of the patient’s current condition and trends, and then quickly document and determine next steps,” she says. “Information graphics that effectively utilize color, data visualization summaries, the smart use of space, and erring on the side of simplicity or high-level information (with the capability to drill down for details) will all contribute to bringing EHRs into the 21st century. Many acute and ambulatory care facilities are still living in DOS. As more and more EHR developers adopt user-centered design and data visualization approaches, we’ll see a convergence of data availability and exponentially more positive healthcare outcomes.”
Overused colors, typefaces, and emphasis inhibit visualization rather than aid it. Schumacher has seen EHR screens that had everything highlighted and others on which four or five colors, along with italics and bold, competed for the doctor’s attention. “We need to find ways to display information in rational, reasonable ways that are tuned to the capability of humans as information processors,” he says. “The average EHR makes you tired. Medical informatics professionals should look at cutting-edge designs in other industries such as finance.”
Expanded decision support is another area of opportunity for EHRs. “EHR technology has the potential to go much further than modules that alert doctors to drug-drug interactions and drug allergies,” says Zhang. “Say, for instance, an [emergency department] doctor who is gathering information from a patient is suddenly called away to attend to a critical injury. By the time he or she returns to the first patient, the prior discussion may have been forgotten. Decision support that helps doctors keep track of discussions, decisions, and actions could reduce their workload, the complexity of their environment, and potential medical errors.”
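As a thought experiment only, and not a description of any existing product, the interruption-recovery support Zhang imagines might begin with a per-patient log of discussions, decisions, and pending actions that the system replays when the clinician returns. The patient IDs, event kinds, and entries below are all invented for illustration.

```python
# Hypothetical sketch: a per-encounter log that lets a doctor resume where
# he or she left off after an interruption. Nothing here reflects a real EHR.
from collections import defaultdict

class EncounterLog:
    def __init__(self):
        self._events = defaultdict(list)  # patient_id -> ordered (kind, text) events

    def record(self, patient_id, kind, text):
        self._events[patient_id].append((kind, text))

    def resume_summary(self, patient_id):
        """What was discussed and decided, and which actions are still pending."""
        events = self._events[patient_id]
        pending = [text for kind, text in events if kind == "action"]
        return {"events": events, "pending_actions": pending}

log = EncounterLog()
log.record("pt-17", "discussion", "reports chest pain on exertion")
log.record("pt-17", "decision", "order ECG")
log.record("pt-17", "action", "follow up on troponin result")
print(log.resume_summary("pt-17"))
```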
Ford suggests that another way to lighten physicians’ burden may be to increase patient engagement. “A medical practice’s website could serve as a portal where patients, prior to an initial consult, would enter their chief complaint, family history, insurance information, medication list, and immunizations,” he explains. “At the same time, they could schedule an appointment electronically. This would shift these time-consuming tasks from highly paid staff to the patient. Rather than computerizing data from the forms patients complete at check-in, staff would validate the accuracy of the information patients have entered through the portal. Currently, the cost savings that arise from using an EHR mostly accrue to patients or insurance companies while doctors incur the expense of purchasing the technology. That balance of benefits and costs has to be adjusted.”
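Ford’s portal workflow boils down to a validation step: the patient supplies structured data in advance, and staff confirm it rather than re-key it. A minimal Python sketch, with hypothetical field names that come from the illustration rather than any real system:

```python
# Illustrative only: staff validate portal submissions instead of transcribing
# paper forms. REQUIRED_FIELDS and the submission shape are invented.
REQUIRED_FIELDS = ["chief_complaint", "medications", "insurance_id"]

def staff_review(submission):
    """Return the fields a staff member still needs to confirm or collect at check-in."""
    return [field for field in REQUIRED_FIELDS if not submission.get(field)]

submission = {
    "chief_complaint": "persistent cough",
    "medications": ["lisinopril 10 mg"],
    "insurance_id": "",  # left blank by the patient
}
print(staff_review(submission))  # fields flagged for follow-up at check-in
```

The point of the sketch is the shift Ford describes: the expensive human step becomes spot-checking, not data entry.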
More mature voice recognition technology would greatly enhance usability but only if the software could parse and structure the information into data fields. “Speech recognition technology is usable, but it just gives us more prose,” says Schumacher. “Getting that prose into the right place in the right format is still a challenge.”
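To see why structuring the prose is the hard part, consider a toy extractor that pulls a couple of vitals out of a dictated sentence with regular expressions. This deliberately understates the clinical natural-language processing a real system would need; the patterns and field names are invented for illustration.

```python
# Toy illustration of the structuring problem: speech recognition yields prose,
# and mapping it into discrete fields is the open challenge. Patterns are naive
# and invented; real dictation would defeat them immediately.
import re

def structure_note(prose):
    fields = {}
    bp = re.search(r"\b(\d{2,3})/(\d{2,3})\b", prose)
    if bp:
        fields["bp_systolic"] = int(bp.group(1))
        fields["bp_diastolic"] = int(bp.group(2))
    temp = re.search(r"temperature (?:of )?(\d+(?:\.\d+)?)", prose, re.IGNORECASE)
    if temp:
        fields["temp_f"] = float(temp.group(1))
    return fields

note = "Patient presents with a temperature of 101.2 and blood pressure 138/92."
print(structure_note(note))
```

Even this tiny example breaks as soon as the doctor phrases things differently, which is exactly the gap Schumacher identifies.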
“Intelligent machine design holds great promise but is still years away,” says Ford. “If the healthcare industry pioneered machines that understood what we were saying to them, it would add value to so many other disciplines.”
Ford cites examples of effective, user-friendly technology in the long-term care setting, which may serve as a model for EHR design. One example is the touch screens in residents’ rooms, where staff members enter information that has to be captured several times per day, such as residents’ performance of activities of daily living. Long-term care facilities also have technology that predoses and packages residents’ medications. “It’s a powerful system that cuts down on waste, saves time, and minimizes medication errors,” he says.
Policy and Practice
Studies indicate that next to cost, usability is one of the main reasons that EHR adoption has been slow. A 2009 report by Smelcer et al in the Journal of Usability Studies suggests that 30% of EHR implementations fail, often because physicians cannot use the EHRs efficiently.
Ford believes that approaching EHR adoption from a policy standpoint may be shortsighted. “Doctors are told to deploy EHRs because it’s the right thing to do and that they will be penalized if they don’t comply,” he says. “President Obama, and before him President Bush, routinely visited organizations that have successfully implemented EHRs (eg, Cleveland Clinic, Intermountain Health, Kaiser Permanente), and it’s true that those organizations have done yeoman’s work. But they are not typical. Their efforts were underwritten by government grants, and their doctors tend to be employees who spend a year or two learning the system. Most places don’t have that kind of wherewithal.”
At the same time, legislative pressure is crucial to move EHR adoption forward, and the Office of the National Coordinator for Health Information Technology (ONC) is well aware of the usability challenge. Speaking at the Agency for Healthcare Research and Quality Annual Health IT Grantee and Contractor Meeting in early June, ONC Chief David Blumenthal, MD, announced that the organization will focus more resources on EHR usability.
Schumacher was pleased to see the government agency address the issue. “The ONC has taken on the mantle of a large-scale human factors effort akin to the program that made nuclear power safer after the Three Mile Island disaster and the FAA’s [Federal Aviation Administration] human factors work to increase the safety of air traffic and aircraft,” he says. “The ONC and Dr. Blumenthal deserve tremendous credit for taking a leadership role and publicly embracing usability and human factors to this extent. If we can live up to that vision, we can deliver more usable tools to the community.”
— Alice Shepherd is a southern California-based business-to-business journalist specializing in healthcare topics.