
Spring 2023

Clinical Documentation Integrity’s Technological Evolution
By Susan Chapman, MA, MFA, PGYT
For The Record
Vol. 35 No. 2 P. 14

How Technology Is Transforming CDI

Natural language processing (NLP) has become more prevalent in clinical documentation integrity (CDI) over the last decade.1 This form of artificial intelligence (AI), or machine learning, can extract information from free text, standardize the format of clinical notes,2 and “electronically review notes within the EHR and apply system logic and standard coding rules to propose and group diagnostic-related group (DRG) codes based on the presence of diagnostic words and/or phrases.”1
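
To make the word-and-phrase matching concrete, the following minimal sketch (in Python, not drawn from any vendor’s product) scans free-text note fragments for diagnostic phrases and proposes candidate ICD-10-CM codes. The phrase list, the PHRASE_TO_ICD10 table, and the propose_codes function are illustrative assumptions; a production system would rely on far larger curated terminologies and full coding rules.

import re

# Illustrative phrase-to-code map; real systems use much larger,
# curated terminologies and apply standard coding rules on top.
PHRASE_TO_ICD10 = {
    r"\btype 2 diabetes\b": "E11.9",
    r"\bacute respiratory failure\b": "J96.00",
    r"\bsepsis\b": "A41.9",
}

def propose_codes(note_text: str) -> list[dict]:
    """Return candidate ICD-10 codes for diagnostic phrases found in a note."""
    note_lower = note_text.lower()
    candidates = []
    for pattern, code in PHRASE_TO_ICD10.items():
        match = re.search(pattern, note_lower)
        if match:
            candidates.append({
                "code": code,
                "evidence": match.group(0),
                "position": match.start(),  # where the phrase appears in the note
            })
    return candidates

if __name__ == "__main__":
    note = "Assessment: Type 2 diabetes, poorly controlled. Concern for sepsis."
    for candidate in propose_codes(note):
        print(candidate)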

Machine-learning models can be programmed to find any set of words in any context. “However, while NLP can find ICD-10 codes or the verbiage that is reflective of a specified diagnosis code for that particular diagnosis, there are limitations in that if those specific words do not appear, it might miss them,” explains Amy Campbell, RN, MSM, CCDS-O, formerly a clinical documentation improvement manager with Wolters Kluwer, who now works for Harmony Healthcare as a contract clinical documentation specialist in the ambulatory setting for Duke University Health System. “So, that’s where clinical natural language processing, or CNLP, comes in on the clinical side. Not only are we able to find diagnoses but we’re also finding the supporting documentation, the clinical indicators. Those clinical indicators are little pieces of information that a provider might include about the patient—what has changed in the patient’s condition or how the provider might treat a particular condition.

“Diabetes is something that is commonly discussed and can give a clearer picture of what I mean by ‘clinical indicators,’” Campbell continues. “With diabetes, retrospectively, there are things that help establish the progression of the condition. Some are found in the examination, but it also applies to labs a provider might order to track how well the condition is being managed. If I send a query to a provider stating only that the provider ordered the patient’s hemoglobin A1c or chemistry labs before the next visit and that diabetes may exist, that wouldn’t be useful. I’m more concerned with the lab values prospectively, because it’s more valuable to the provider to know that this particular lab has been ranging between these two levels during this time. So, those clinical indicators give providers a better sense of what needs to be done to control the condition.”
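
As a hypothetical illustration of pairing a diagnosis with the prospective lab trend Campbell describes, the sketch below summarizes how an A1c range over time might be surfaced alongside a documented diabetes mention. The sample data, field names, and summarize_indicators helper are invented for the example.

from datetime import date

# Hypothetical extracted data; a real clinically tuned pipeline would pull
# these from the notes and from structured lab feeds.
diagnosis_mentions = [{"term": "type 2 diabetes", "note_date": date(2023, 1, 10)}]
a1c_results = [
    {"date": date(2022, 7, 2), "value": 8.4},
    {"date": date(2023, 1, 5), "value": 7.6},
]

def summarize_indicators(mentions, labs):
    """Pair a diagnosis mention with a prospective lab trend the provider can act on."""
    if not mentions or not labs:
        return None
    labs = sorted(labs, key=lambda r: r["date"])
    low = min(r["value"] for r in labs)
    high = max(r["value"] for r in labs)
    return (
        f"{mentions[0]['term'].title()} documented; "
        f"A1c has ranged {low}%-{high}% from {labs[0]['date']} to {labs[-1]['date']}."
    )

print(summarize_indicators(diagnosis_mentions, a1c_results))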

AHIMA’s practice director and CDI and Clinical Foundations Accredited Provider Program Director Tammy Combs, RN, MSN, CDIP, CCS, CNE, an AHIMA-approved ICD-10-CM/PCS/CDI trainer, agrees that NLP and AI are indeed transforming the CDI industry. “The technology is advancing and growing from the ability to look at the documentation in the health record and identify specific diagnoses and phrases to then alerting CDI professionals,” she explains. “There are several different things that are taking place. It’s reviewing all the documentation, scanning it based on logic that’s built in on the back end. The technology can review clinical evidence and compare that to what’s been documented. It can send alerts to the CDI professional to go in and determine if a query may be needed. Further specificity can be brought into the documentation. From that perspective, it’s helping to prioritize what needs to be reviewed.”
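
A simplified way to picture the alert-and-prioritize workflow Combs describes is sketched below: documented diagnoses are compared against clinical evidence, and records with unexplained evidence rise to the top of a review worklist. The record structure and the build_worklist function are assumptions for illustration, not any specific product’s logic.

# Hypothetical worklist prioritization: flag records where clinical evidence
# appears without a matching documented diagnosis, so a CDI professional can
# decide whether a query is warranted.
records = [
    {"id": "A100", "documented": {"pneumonia"}, "evidence": {"pneumonia", "acute kidney injury"}},
    {"id": "A101", "documented": {"heart failure"}, "evidence": {"heart failure"}},
]

def build_worklist(records):
    worklist = []
    for rec in records:
        gaps = rec["evidence"] - rec["documented"]  # evidence with no documented diagnosis
        if gaps:
            worklist.append({"id": rec["id"], "possible_query_topics": sorted(gaps)})
    # Review the records with the most documentation gaps first.
    return sorted(worklist, key=lambda w: len(w["possible_query_topics"]), reverse=True)

for item in build_worklist(records):
    print(item)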

At the University of California, Davis Medical Center, where Tami McMasters-Gomez, BS, CDIP, CCDS, MHL, an AHIMA-approved ICD-10-CM/PCS trainer, is director of coding and CDI services, NLP and AI are being used to expand her team’s review capacity without increasing staffing. There, AI essentially works ahead of, or behind, CDI, performing clinical reviews and looking for opportunities to “nudge” the providers in real time and ask them to further specify documentation. “Whether it’s the acuity of congestive heart failure or whether it’s the specificity behind the type of pneumonia the patient has, the technology is doing the work that CDI traditionally would have done and in advance of CDI even looking at the record,” McMasters-Gomez explains. “This then frees up our team to spend more time looking at some of that more complex clinical documentation, the more complex technical reviews that typically, in the past, we may not have gotten around to doing.”

The ability of the technology to address issues in real time, as they arise, according to McMasters-Gomez, also allows physicians to spend more time addressing patient care issues. “It brings patient care to the forefront again because the AI is pinging them at the time they’re documenting,” she says. “Once a patient is no longer in front of them after discharge, providers no longer have their cases in the forefront of their minds because they’ve moved on to new patients. So, having real-time capability at the point of care is very important.”

But Combs notes that just as there are many forms of the EHR produced by a wide array of vendors, so, too, does the AI technology used in CDI vary. CDI professionals are indeed able to review the documentation and compare it to the clinical evidence in the health record. But, according to Combs, if the logic has matured and is reliable enough to auto-generate documentation queries when necessary, then CDI professionals do not even have to perform that type of review and issue a query themselves. “Hopefully, the CDI team has been involved in building out the logic on the backend and has indicated those elements the computer should be looking for, because the technology is only going to look for the logic that’s been built into it,” Combs notes. “The technology can reduce the time it takes for a provider to get a query, delivering it in near real time rather than waiting for someone to review the record and then send one,” she adds. “And there is some technology out there that can, and does, operate in real time. But not every vendor and technology will follow that process where an alert just pops up as things are being entered. Most of the time, though, the documentation is there, and it’s being reviewed. It’s pretty close to real time, which is helping with the timeliness of queries coming through. There is a lot of potential out there, but I think there’s some work to be done yet to ensure the reliability, the compliance component, and fitting it into the workflow.”

Combs further explains that there are gray areas: gathering evidence to support a diagnosis can be challenging because patients can present with different symptoms and laboratory findings. “All of that has to be taken into consideration when determining whether or not a query should be sent to the provider,” she says. “From discussions I’ve had, sometimes the evidence that has been built in, whether there is quite a bit to support the need for further specifics, can determine whether it has a high reliability score. There is a lot of functionality that’s being implemented, and it’s growing. It’s advancing and maturing and becoming more reliable now.”
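
One hedged way to imagine the reliability scoring Combs mentions is sketched below: the more independent types of clinical evidence support the need for further specificity, the more confidently a query can be suggested. The weights, thresholds, and function names are invented for the example and would differ in any real system.

# Hypothetical reliability scoring: the more independent clinical indicators
# support the need for further specificity, the higher the confidence that a
# query is appropriate.
INDICATOR_WEIGHTS = {"lab": 0.4, "medication": 0.3, "imaging": 0.2, "vital_sign": 0.1}
AUTO_QUERY_THRESHOLD = 0.6   # above this, suggest the query automatically
REVIEW_THRESHOLD = 0.3       # between thresholds, route to a CDI professional

def reliability_score(indicators):
    """Sum the weights of distinct indicator types found in the record (capped at 1.0)."""
    return round(min(1.0, sum(INDICATOR_WEIGHTS.get(kind, 0.0) for kind in set(indicators))), 2)

def disposition(indicators):
    score = reliability_score(indicators)
    if score >= AUTO_QUERY_THRESHOLD:
        return score, "suggest query to provider"
    if score >= REVIEW_THRESHOLD:
        return score, "route to CDI review"
    return score, "no action"

print(disposition(["lab", "medication", "imaging"]))  # strong supporting evidence
print(disposition(["vital_sign"]))                    # weak supporting evidence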

Advancing the Technology
As Combs has pointed out, the technology is only as good as the logic behind it, and she believes that creating a comprehensive system requires input from all stakeholders. “One of the challenges with NLP and AI, as with any technology, is that the logic has to be built in and structured in such a way that the technology recognizes the correct terms,” she says. “The natural language processing component helps, of course, because it can learn, and it’s starting to learn from what’s done in the health record. But CDI leaders need to be aware of what they’re instructing the technology to ‘think about’—the different phrases and abbreviations that may be used whenever they’re talking about a diagnosis. One provider might write out heart failure; another one may use the abbreviation HF or CHF. So, when the technology is being created, programmers have to be aware of things like that.”
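
A minimal sketch of the synonym handling Combs describes might look like the following, where documented variants such as “heart failure,” “HF,” and “CHF” all normalize to a single concept. The CONCEPT_SYNONYMS table is an illustrative assumption; real systems draw on much larger clinical vocabularies.

import re

# Illustrative synonym table: each documented variant normalizes to one concept.
CONCEPT_SYNONYMS = {
    "heart_failure": ["heart failure", "hf", "chf", "congestive heart failure"],
    "atrial_fibrillation": ["atrial fibrillation", "afib", "a-fib"],
}

# Build one regex per concept so abbreviations and full phrases match alike.
CONCEPT_PATTERNS = {
    concept: re.compile(r"\b(" + "|".join(map(re.escape, terms)) + r")\b", re.IGNORECASE)
    for concept, terms in CONCEPT_SYNONYMS.items()
}

def find_concepts(note_text: str) -> dict[str, str]:
    """Return each concept found in the note and the exact wording that triggered it."""
    found = {}
    for concept, pattern in CONCEPT_PATTERNS.items():
        match = pattern.search(note_text)
        if match:
            found[concept] = match.group(0)
    return found

print(find_concepts("Pt with CHF and new a-fib, started on diuretics."))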

“We need a tool that can evolve and is able to capture all of the clinical indicators that are so important, even if the technology isn’t aware of them yet,” Campbell adds. “Along with capturing those clinical indicators, we want the tool to allow me to choose which clinical indicators I want to include for the provider to see. I can add additional ones. It could be that I see something in a note that maybe the tool didn’t see. Maybe we haven’t built that particular concept yet, but it’s always something that can be added to the tool to help track what the coder saw or what the CDI professional plans to use to present to the provider.”

False findings, or noise, can occur with NLP and AI, and reducing them is an area on which Campbell is particularly focused. “We’re looking at things that are noisy, things that might be a false finding, and trying to understand why that happens,” she says. “We want to figure out how to reduce that so that the technology works more efficiently with the CDI team. The analogy I like to use with AI is that it’s almost like a conductor directing multiple sections of an orchestra to work in concert. They all do something different, but they have to work together to make that sound beautiful. The algorithms used in different models are all synced by AI, so it allows the models to work together, to organize and save data into relevant sets for the end user.”
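
To illustrate the conductor analogy in the simplest possible terms, the sketch below combines the findings of several hypothetical models and keeps only those that more than one model supports, treating single-model hits as likely noise. The models, findings, and agreement threshold are assumptions, not a description of Campbell’s actual system.

from collections import Counter

# Hypothetical outputs from three independent models reviewing the same record.
model_findings = [
    {"sepsis", "acute kidney injury"},   # model 1
    {"sepsis"},                          # model 2
    {"sepsis", "malnutrition"},          # model 3
]

def consensus_findings(findings_per_model, min_agreement=2):
    """Keep only findings supported by at least `min_agreement` models;
    single-model hits are treated as likely noise."""
    votes = Counter(f for findings in findings_per_model for f in findings)
    return {finding for finding, count in votes.items() if count >= min_agreement}

print(consensus_findings(model_findings))  # {'sepsis'}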

Central to the technology at Campbell’s organization is what the team refers to as the coding workbench, which allows the CDI professional to click a link within the clinical indicators that leads to the location of that information within the document. “That is something that is really invaluable to a provider. They’re not just taking my word for it; they can also click a link and go to it themselves. They can verify a test or a lab result and maybe improve upon the diagnosis or offer additional information. It really helps providers streamline their thoughts,” she says.
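
The link-back behavior Campbell describes can be pictured with a small sketch: if each extracted indicator carries the character span where it was found, an interface can jump straight to the supporting text. The IndicatorLink structure and sample note are hypothetical, not the coding workbench’s actual design.

from dataclasses import dataclass

@dataclass
class IndicatorLink:
    label: str    # what the indicator is, e.g., "A1c 8.4%"
    note_id: str  # which document it came from
    start: int    # character offset where the evidence begins
    end: int      # character offset where it ends

note_id = "progress-note-42"
note_text = "Labs reviewed: A1c 8.4%, creatinine 1.9. Continue metformin."

# A hypothetical extractor would record where each indicator was found.
evidence = "A1c 8.4%"
start = note_text.find(evidence)
link = IndicatorLink(label=evidence, note_id=note_id, start=start, end=start + len(evidence))

# A viewer can use the stored span to highlight the exact supporting text.
print(link)
print(note_text[link.start:link.end])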

To have a technology solution that’s as nimble as each organization requires, Combs proposes that all stakeholders be involved at the very beginning of the process, when the AI technology is chosen and the logic is built to align with organizational needs. “The people involved have to do more than understand how AI works and just come in to create the technology and make everyone comply,” Combs explains. “They really need to involve everyone who will be a part of the process. When you incorporate a true team approach and ensure that the technology is as good as possible, that it’s really flowing into the workflow of your CDI team, your physicians, all of those involved, it’ll be successful. You want to have a high confidence level across the board in what the technology can do and do accurately and efficiently.”

McMasters-Gomez reports that she and her team have worked together in the way Combs describes and have tried hard to ensure that their AI is not only physician facing but also CDI facing. “We’re working diligently to customize the nudges that we send out to the providers. Nudges are synonymous with queries. The vendor we’ve gone with had some out-of-the-box nudges that they recommended we turn on, but we needed things that are specific to UC Davis’ culture, our clinical criteria,” she says. “So, we’ve gone on a journey to customize and create our own nudges, using clinical content that aligns with what we’ve adopted organizationally. We want to further enhance the workflow from the physician-facing AI perspective, to ensure that our clinical content aligns with what the physicians are using day in and day out and that our nudges align with it as well.”
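
One way to picture an organization-specific nudge is as a configurable rule, sketched below: a trigger term fires the nudge only when the locally required specificity is missing from the note. The rule contents, message wording, and evaluate_nudges function are invented for illustration and do not reflect UC Davis’ or its vendor’s actual clinical criteria.

# Hypothetical nudge configuration: each rule pairs a trigger with the
# organization's own clinical criteria and the message shown to the provider.
NUDGE_RULES = [
    {
        "name": "pneumonia_specificity",
        "trigger_terms": ["pneumonia"],
        "required_detail": ["aspiration", "bacterial", "viral", "hospital-acquired"],
        "message": "Please specify the type of pneumonia being treated.",
    },
]

def evaluate_nudges(note_text: str) -> list[str]:
    """Fire a nudge when a trigger term appears without any of the required detail."""
    text = note_text.lower()
    messages = []
    for rule in NUDGE_RULES:
        triggered = any(term in text for term in rule["trigger_terms"])
        detailed = any(detail in text for detail in rule["required_detail"])
        if triggered and not detailed:
            messages.append(rule["message"])
    return messages

print(evaluate_nudges("Assessment: pneumonia, start antibiotics."))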

As the HIM landscape continues to evolve, there’s a movement within CDI to improve both the quality of outcomes and the efficiency of the process. “A clinically tuned NLP tool that can accurately capture all the clinical indicators of a condition, allow me to choose which indicators I want to include, and then take a clinical documentation specialist or provider directly to where that neatly organized information exists is a game changer for my workflow,” Campbell explains. “It can take up to an hour to manually review a record thoroughly, but with the right technology, that time could be reduced by 25% to 50%. In CDI, we’re always looking for ways to expand our reach because we simply can’t get to every member of the target population. Further, the proportion of inpatient to outpatient encounters is shifting dramatically toward the outpatient setting. Outpatient CDI is a small group of clinicians today with a rapidly expanding need, and AI and NLP technology offers a means of working smarter, not harder.”

— Susan Chapman, MA, MFA, PGYT, is a Los Angeles–based freelance writer and editor.

 

References
1. Poland L. The evolution of coding: understanding how technology is assisting us. AHIMA website. https://www.ahima.org/media/3a3axqt1/evolution_of_coding_whitepaper_agshealth_ada.pdf

2. Townsend H. Natural language processing and clinical outcomes: the promise and progress of NLP for improved care. J AHIMA. 2013;84(3):44-45.