Health Care’s Semantics Challenge
By Brian Levy, MD
For The Record
Vol. 26 No. 5 P. 26
A bird’s-eye view of the current federal health care initiatives reveals a synergistic effort to move the industry toward the triple goal of enhanced patient care, improved population health, and lower costs. But the health information exchange, Big Data, and improved analytics that facilitate these goals all hinge on health care organizations’ ability to accurately aggregate patient data and share them with stakeholders along the continuum. The lack of interoperability is the largest obstacle to achieving this longitudinal view of patient data.
As the movement toward greater information exchange continues to mature, health care providers increasingly are realizing that data normalization is a fundamental component of the equation. A relatively new industry buzzword, data normalization refers to an organization’s ability to reconcile medical terminology across disparate health care systems and eliminate semantic ambiguity.
Many of today’s HIT systems feature their own vocabulary for clinical statements, whether they are lists of problems, diagnoses, procedures, medications, or lab results. The convergence of payer and provider systems increases the problem because of the inherent differences between data for payer-based claims and clinically based provider sources.
While meaningful use moves the industry a step closer to interoperability by introducing new standards, there are no definitive standards that encompass the entire realm of clinical and medical terminology. Raw data in the form of incoming and internal medical terminology must be normalized and mapped to standardized code sets and terminologies such as ICD-9/10, SNOMED CT, LOINC (for labs), and RxNorm (for drugs). Once this piece of the interoperability equation is reconciled, the industry can achieve a more accurate picture of performance for better reporting and analytics.
According to HIMSS, three levels of interoperability must be achieved to mature the data exchange movement. Most health care organizations that are progressively aligned with industry movements have achieved the first step of foundational interoperability, which allows electronic data exchange in understood formats.
There also is a widespread effort to achieve the next level of structural interoperability: providing the capability for IT systems to interpret data at field level. The final step to accurate data exchange demands that HIT systems achieve semantic interoperability—the highest level—to actually understand the information in the data fields.
In a nutshell, data normalization is the process of taking a source terminology and matching it to the most appropriate term in the destination terminology. Medical terminologies coming from disparate systems, whether inside or outside an organization, must be normalized to standard terminologies before they can become actionable knowledge.
Because semantically normalized medical terminology is the foundation of accurate mapping for content standardization, the industry will be limited in its ability to achieve success without this complex step. Accurate mapping supports key initiatives such as evidence-based care protocols, population health management, and improved data sharing, all of which can lead to the success of accountable care organizations (ACOs), patient-centered medical homes, and value-based purchasing.
Consider a health care organization attempting to trend hemoglobin A1c levels for diabetes management. The organization may receive a local code from Lab A: 1234/Hgb A1c Blood. Another code coming from Lab B may already be standardized as LOINC 17855-8. Both codes represent A1c lab results, but they must be normalized to one standard for accurate trending.
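The A1c scenario above can be sketched as a simple crosswalk from each source system's local code to a single target code. This is a minimal illustration, not a production terminology service: Lab A's code 1234 is the article's hypothetical local code, and 17855-8 is the LOINC code for hemoglobin A1c in blood.

```python
# Hypothetical local-to-LOINC crosswalk keyed by (source system, local code).
LOCAL_TO_LOINC = {
    ("LAB_A", "1234"): "17855-8",     # Lab A's local "Hgb A1c Blood" code
    ("LAB_B", "17855-8"): "17855-8",  # Lab B already sends the LOINC code
}

def normalize(source, code):
    """Return the normalized LOINC code, or None if the code is unmapped."""
    return LOCAL_TO_LOINC.get((source, code))

# Two results for the same patient from two different labs.
results = [
    {"source": "LAB_A", "code": "1234", "value": 7.2},
    {"source": "LAB_B", "code": "17855-8", "value": 6.9},
]

# After normalization, both results share one code and can be trended together.
trend = [r["value"] for r in results
         if normalize(r["source"], r["code"]) == "17855-8"]
print(trend)  # [7.2, 6.9]
```

Without the crosswalk, a trending query keyed on either local code alone would silently drop half of this patient's results.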
Another example is computing National Quality Forum measures as part of an ACO or population health management initiative where heart failure is the denominator. A hospital may use SNOMED code 134401001, Left ventricular systolic dysfunction; ICD-9 code 429.9, Heart disease, unspecified; or ICD-10 code I50.20, Unspecified systolic (congestive) heart failure. All these codes could identify someone with heart failure, but they must be accurately normalized and mapped to complete the measurement.
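Computing the denominator comes down to testing whether any of a patient's coded problems falls inside a heart-failure value set that spans multiple code systems. The sketch below uses only the three codes named in the article; real measure value sets are far larger and are published and maintained separately, so treat this as an illustration of the membership check, not an actual value set.

```python
# Illustrative heart-failure value set built from the codes in the article.
HEART_FAILURE_VALUE_SET = {
    ("SNOMED", "134401001"),  # Left ventricular systolic dysfunction
    ("ICD-9", "429.9"),       # Heart disease, unspecified
    ("ICD-10", "I50.20"),     # Unspecified systolic (congestive) heart failure
}

def in_denominator(coded_problems):
    """True if any (code system, code) pair falls in the value set."""
    return any((system, code) in HEART_FAILURE_VALUE_SET
               for system, code in coded_problems)

# A patient coded in ICD-10 counts toward the measure...
print(in_denominator([("ICD-10", "I50.20"), ("ICD-10", "E11.9")]))  # True
# ...while a patient with only a diabetes code does not.
print(in_denominator([("ICD-10", "E11.9")]))  # False
```

The design point is that the value set, not the query logic, absorbs the cross-terminology variation: adding a newly mapped code means extending the set, not rewriting every measure.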
Medical Terminology Strategies
At any given time, health care organizations have inbound patient data entering from various sources, including laboratory, ambulatory, inpatient, home health, hospice, payers, and ACOs. These data originate from HIT systems such as EMRs, practice management systems, and laboratory information systems, many of which use their own inherent medical vocabulary.
As patient data move in and out of these systems, medical terminology strategies become paramount to normalizing disparate data into a common language that can be used to support analytics. HIT vendors are performing some of this normalization as part of their quest to meet meaningful use requirements. However, health care organizations still must build an IT infrastructure that will support effective mapping, modeling, and management of local content alongside a vast landscape of standards and clinical terminologies.
Data received from disparate sources must be entered into several hospital sources, including EMRs, data repositories, and quality reporting tools. Some of these data will be normalized to useful standards, but others will be constituted as proprietary codes or free text. Terminology services can be used to interpret the data and decide which still need to be normalized. They provide workflows to handle the necessary mapping from the local content to standards as well as identify content that might otherwise be missed or captured inaccurately.
For example, if a lab result for A1c sent from an external system already is coded to the LOINC standard, then it can be placed into internal HIT systems. But if a text string of “HBA1C” with a value of 8.1 is transmitted, the terminology service can normalize the information to the appropriate LOINC standard code. Thus, if a patient has a diabetes test performed at multiple outpatient and inpatient facilities, data normalization will allow results comparison.
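The pass-through-or-normalize decision described above can be sketched as follows. The synonym table and matching rules here are assumptions for illustration; an actual terminology service would draw on a managed synonym vocabulary. Only 17855-8 is a real LOINC code.

```python
import re

# Hypothetical synonym table a terminology service might consult.
SYNONYMS_TO_LOINC = {
    "HBA1C": "17855-8",
    "HGB A1C": "17855-8",
    "HEMOGLOBIN A1C": "17855-8",
}

def normalize_result(code_or_text, value):
    """Pass LOINC-formatted codes through unchanged; otherwise attempt a
    synonym lookup on the free text. Returns (code or None, value)."""
    # LOINC codes are 1-5 digits, a hyphen, and a check digit.
    if re.fullmatch(r"\d{1,5}-\d", code_or_text):
        return code_or_text, value
    key = code_or_text.strip().upper()
    return SYNONYMS_TO_LOINC.get(key), value  # None flags an unmapped term

print(normalize_result("17855-8", 6.9))  # ('17855-8', 6.9)
print(normalize_result("HBA1C", 8.1))    # ('17855-8', 8.1)
```

Returning None rather than guessing is deliberate: unmapped content is routed to the mapping workflow described above instead of being dropped or miscoded.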
Managing interoperability’s semantics challenges is a daunting proposition for resource-strapped IT departments already juggling multiple initiatives. Industry tools such as the general equivalence mappings can serve as a starting point, but a full medical terminology management strategy must go deeper to ensure accuracy and position health care organizations for success in an evolving landscape.
At many organizations, it is easy to make the business case for leveraging expert third-party consultation and an advanced enterprise terminology management platform. Tools exist that automate the data normalization process and mediate the differences between disparate classification systems. Automating the incorporation of standard clinical terminology into health care software applications through real-time auto mapping and integration technologies allows data to be safely exchanged and accurately analyzed.
Mapping strategies and medical terminology management will play a key role in moving data from setting to setting and use to use, from informing patient care to influencing national policy decisions. While not an end in itself, data normalization moves the industry closer to the interoperability level needed for better reporting, enhanced quality, and more robust analytics to support patient care.
— Brian Levy, MD, is vice president of global clinical operations at Health Language, part of Wolters Kluwer Health, which offers services for working with standard and enhanced clinical terminologies.