
September 2014

A True Reflection?
By Lisa A. Eramo
For The Record
Vol. 26 No. 9 P. 10

For better or worse, how health care organizations document care has enormous repercussions for patient safety and quality rankings.

When Consumer Reports recently released patient safety ratings for hospitals nationwide, Lake Cumberland Regional Hospital, a 295-bed acute care facility in Somerset, Kentucky, found itself with the state’s lowest safety rating. While the average hospital scored a 51, Lake Cumberland tallied a meager 20, prompting patients and others to view the facility as subpar and even unsafe.

Because it had been monitoring its PEPPER (Program for Evaluating Payment Patterns Electronic Report) data, officials at Lake Cumberland had been aware of the rating long before the Consumer Reports story was published, according to Chief Medical Officer Mike Citak, MD, MBA, who says the data indicated that the facility had fallen below the 20th percentile for surgical complications and comorbidities (CCs) and major CCs (MCCs). Knowing that Lake Cumberland patients were typically far more ill than the data portrayed, Citak suspected that physicians weren’t documenting all relevant diagnoses in the medical record.

What ensued was the creation and formalization of a clinical documentation improvement (CDI) program that, over time, has grown to focus on quality and reimbursement. Within six months of launching the program three years ago, Lake Cumberland began to see a decrease in its risk-adjusted mortality rate, in morbidities such as wound and central line infections, and in hospital-acquired conditions (HACs), all of which have helped improve its patient safety scores.

“Without really changing patient care necessarily, but changing how we indicate that care, we were able to get credit for how sick our patients actually were, which altered our risk adjustment and put us right in line with risk-adjusted mortalities,” Citak says.

Unfortunately for Lake Cumberland, Consumer Reports and other entities that rate patient safety often rely on Medicare Provider Analysis and Review (MedPAR) data that is typically several years old. For example, The Leapfrog Group uses 28 national performance measures, many of which are drawn from Centers for Medicare & Medicaid Services (CMS) data, to assign a single composite safety score denoting a hospital’s overall performance in keeping patients safe from preventable harm and medical errors. (To view these measures and Leapfrog’s scoring methodology, visit www.hospitalsafetyscore.org.)
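How such composites are built varies by rating entity, and Leapfrog publishes its own weighting scheme at the URL above. As a purely illustrative sketch, the general idea is a weighted combination of standardized measure scores; the measure names, scores, and weights below are hypothetical and do not reproduce any rating organization’s actual methodology.

```
# Purely illustrative: combine standardized measure scores (0-100) into a
# single weighted composite. Measure names, scores, and weights are
# hypothetical, not Leapfrog's published methodology.

def composite_score(scores, weights):
    """Weighted average of per-measure scores."""
    total_weight = sum(weights[m] for m in scores)
    return sum(scores[m] * weights[m] for m in scores) / total_weight

scores = {"cpoe_adoption": 80, "hand_hygiene": 65, "central_line_infections": 40}
weights = {"cpoe_adoption": 1.0, "hand_hygiene": 1.0, "central_line_infections": 2.0}

print(round(composite_score(scores, weights), 1))  # one number summarizing all measures
```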

Healthgrades, which also rates patient safety and honors organizations for excellent performance in safeguarding consumers from serious, potentially preventable conditions during a hospital stay, relies on MedPAR data as well as information collected by the Agency for Healthcare Research and Quality (AHRQ). (To view its patient safety scoring methodology, visit https://d2dcgio3q2u5fb.cloudfront.net/e4/2d/179eeda24d739df9bbefcd9edabc/patient-safety-methodology-2013.pdf.)

“The majority of the data did not reflect the work that we were currently doing,” says Citak, adding that Consumer Reports pulled data from 2009 to 2012. Although the article was based on historical data, he says it continues to be an important reminder of the ongoing power of documentation. “I think the Consumer Reports article was a bit of a wake-up call to say, ‘Look docs, the coders can’t make this stuff up. They can only code what you write. If you don’t get it right, then we all look bad—all of us collectively.’ It was kind of a good motivation for them,” Citak says.

This summer, Lake Cumberland plans to engage an outside vendor to roll out an intensive weekly physician education initiative that Citak hopes will take its CDI efforts to an entirely new level. The facility’s current CDI program includes three CDI specialists and five quality analysts, all of whom are RNs. The quality analysts perform abstracting for core measures and verify quality measure data. The CDI specialists primarily look for CCs and MCCs.

Together, the quality analysts and CDI specialists have been able to improve documentation as well as the facility’s risk-adjusted mortality rates for various diagnoses. However, Citak says the raw mortality numbers have remained unchanged for more than a decade. “This shows me that the black eye we took was not because more people were dying. Our mortality rate wasn’t worse—it was the risk adjustment that was incorrect,” he says, adding that this finding bolsters the argument that documentation really does matter in painting an accurate portrait of a facility’s quality.
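To make the arithmetic behind that claim concrete, consider a minimal sketch with hypothetical numbers: risk-adjusted mortality is commonly expressed as an observed-to-expected (O/E) ratio, so capturing CCs and MCCs raises the expected death count and improves the ratio even though the observed death count never changes.

```
# Hypothetical numbers only: better severity documentation raises expected
# deaths, improving the observed-to-expected (O/E) mortality ratio even
# though observed deaths are unchanged.

def oe_ratio(observed_deaths, expected_deaths):
    return observed_deaths / expected_deaths

observed = 30                   # actual deaths; documentation does not change this
expected_underdocumented = 24   # expected deaths when CCs/MCCs go uncaptured
expected_fully_documented = 33  # expected deaths when severity is fully documented

print(oe_ratio(observed, expected_underdocumented))   # 1.25 -> appears worse than expected
print(oe_ratio(observed, expected_fully_documented))  # ~0.91 -> in line with expectations
```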

CDI’s Role in Quality Measures
Experts agree that CDI must evolve as the industry moves toward value-based purchasing (VBP), in which organizations are rewarded for providing high-quality care. Government agencies as well as private entities are collecting and purchasing all types of hospital data to assess the quality of care provided. Some of these entities are more transparent than others in terms of the methodologies they use to rate hospitals.

However, the bottom line is that organizations must pay attention to the story that their data are telling the public. CDI specialists can certainly help in this regard; however, incorporating quality and safety into any busy CDI program is often easier said than done.

At Sisters of Charity of Leavenworth Health System, composed of hospitals in Montana, Kansas, and Colorado, integrity and revenue go hand in hand thanks to the efforts of Jane Hoyt, BA, RN, CCDS, CPC-H, CDIP, system director for clinical documentation integrity. “Integrity and revenue are so overlapped. If the integrity is right, the revenue is right,” she says.

Hoyt’s dedication to integrity inspired her to change the meaning of the acronym CDI from clinical documentation “improvement” to clinical documentation “integrity” when she joined the health system one year ago. “We’re not improving what physicians do—we’re making sure that the record reflects what happened to the patient during the stay at the hospital,” she says.

Before she joined Sisters of Charity of Leavenworth Health System, Hoyt says, each of the system’s eight hospitals had a different CDI program, many of them established by different vendors and featuring disparate reporting structures and goals. Some programs reported to the HIM director and had more of a revenue focus, while others reported to nursing, care management, or quality and had less of an understanding of documentation’s effects on revenue.

Hoyt was hired specifically to consolidate CDI leadership systemwide and to optimize performance. Currently, the CDI specialists, all of whom are RNs except for one who holds a Master of Social Work degree, report to Hoyt, who, in turn, reports to the health system’s vice president of HIM.

In addition to identifying CCs and MCCs, CDI specialists also pose integrity queries to verify the severity of illness and risk of mortality for each case. More recently, they’ve started to develop a process for identifying HACs concurrently. “It hasn’t historically been part of the CDI role, but it’s all about the integrity of the record, and that’s what I keep going back to,” Hoyt says. “I see us growing into HACs, medical necessity, length-of-stay issues, and even core measures perhaps down the line.”

In addition, the health system created a CDI task force at each site that includes quality, case management, HIM, the chief medical officer, the CFO, and those involved in the recovery auditor process. The team meets monthly to discuss ongoing overlaps between coding and quality. Hoyt says this collaborative effort helps to ensure that the coded data tell an accurate story regarding the quality and safety of the care provided.

Still, patient safety and quality have yet to become as important as revenue integrity in many CDI programs, says Jonathan Elion, MD, CEO of ChartWise Medical Systems. “Many CDI programs are unfortunately still being asked to show their return on investment,” he says. “The patient safety indicators and quality indicators are soft benefits that are harder to measure. It isn’t until an organization sees a red flag or has unfavorable data published that it chooses to shift its focus.”

Laurie Prescott, RN, MSN, CCDS, CDIP, a CDI education specialist at HCPro, says she has noticed a shift in CDI as VBP programs have grown. “What I’ve found is that [CDI] is evolving, and it’s evolving really quickly,” she says, adding that CDI specialists are beginning to realize that although hospitals receive a diagnosis-related group payment, there are a lot of other available bonuses based on patient safety, quality of care, and patient satisfaction.

But should—or can—CDI specialists be looking for both coding and quality opportunities as they’re reviewing documentation? “My biggest concern is that it overextends the CDI specialist,” Prescott says. “I think organizations need to thoughtfully figure out how to allocate responsibilities. I’m not saying that CDI specialists can’t do all of it, but it’s a harder job to do.”

Hoyt, who plans to eventually expand the responsibilities of the CDI specialists at Sisters of Charity of Leavenworth Health System, says productivity decreases are a concern and will be monitored going forward.

Considering CDI Software
CDI software can become a critical component of CDI programs that choose to integrate patient safety- and quality-related measures. Experts say any technology must include alerts that notify CDI specialists when patients present with diagnoses (eg, heart attack, heart failure, and pneumonia) that correspond with patient safety indicators or core measures. CDI specialists then can work with quality managers or case managers to initiate processes related to core measures and ensure that physicians document all complications and concerns. Using this type of concurrent software prevents CDI specialists from having to manually incorporate another checklist of quality- and safety-related items while reviewing the documentation.
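As a rough sketch of the kind of alerting described here, concurrent software essentially checks each documented diagnosis against a maintained list of quality triggers. The mapping below is hypothetical and far simpler than any vendor’s rule set, which would be keyed to ICD codes and updated continually rather than matched on diagnosis text.

```
# Hypothetical mapping for illustration only; real products maintain much more
# detailed, regularly updated rule sets keyed to ICD codes rather than text.

QUALITY_TRIGGERS = {
    "acute myocardial infarction": "AMI core measure set",
    "heart failure": "heart failure core measure set",
    "pneumonia": "pneumonia core measure set",
    "postoperative sepsis": "PSI 13 review",
}

def quality_alerts(documented_diagnoses):
    """Return (diagnosis, trigger) pairs that warrant looping in quality or case management."""
    return [(dx, QUALITY_TRIGGERS[dx.lower()])
            for dx in documented_diagnoses if dx.lower() in QUALITY_TRIGGERS]

print(quality_alerts(["Heart failure", "Chronic kidney disease"]))
# [('Heart failure', 'heart failure core measure set')]
```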

The goal is to improve patient safety and quality scores as well as effect change in real time, says Mel Tully, MSN, CCDS, CDIP, senior vice president of clinical services and education at Nuance. “If you do identify patient safety issues, the whole point is to take this information and use it to prevent these issues in the future,” she says.

Without the aid of software that incorporates information related to quality and safety measures, CDI specialists would likely struggle to keep track of exclusions, numerators, denominators, and more. “You have to stay updated. That’s one of the biggest challenges,” Tully says. “CMS is constantly coming out with new measures, new composites of patient safety indicators, and quality measures. It’s a very dynamic program that’s always changing.”
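One way to picture the bookkeeping Tully describes: each measure boils down to a denominator (eligible cases), a numerator (cases with the event), and a list of exclusions, all of which shift as CMS updates its specifications. The structure below is a minimal, hypothetical sketch, not an actual CMS or AHRQ specification.

```
# Simplified, hypothetical structure; real measure specifications are far more detailed.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class QualityMeasure:
    name: str
    in_denominator: Callable[[dict], bool]   # which cases are eligible
    in_numerator: Callable[[dict], bool]     # which eligible cases count as events
    exclusions: List[Callable[[dict], bool]] = field(default_factory=list)

    def rate(self, cases: List[dict]) -> float:
        eligible = [c for c in cases
                    if self.in_denominator(c) and not any(excl(c) for excl in self.exclusions)]
        events = [c for c in eligible if self.in_numerator(c)]
        return len(events) / len(eligible) if eligible else 0.0
```

Keeping dozens of such definitions current as the specifications change is precisely the maintenance burden Tully points to, and it is why she argues the tracking belongs in software rather than on a manual checklist.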

Real-time monitoring of these measures is key, Elion says. “Quality indicators and measures are moving away from retrospective to concurrent,” he says. “Do you want to know that something happened in your intensive care unit six months ago or that something happened on your intensive care unit six minutes ago? It’s always difficult to be tracking these things down retrospectively.”

According to Citak, Lake Cumberland Regional Hospital plans to explore CDI software options. The hope is that the software will serve as a resource to help CDI specialists and quality analysts perform their duties more effectively and efficiently.

At Lake Cumberland, CDI specialists report to the HIM director, who reports to the CFO. The quality analysts, meanwhile, report to the quality director, who reports to Citak. Although the CDI software may further integrate the two roles, Citak doubts both functions will ever become fully integrated into one role. “There is a tremendous amount to know. I’m not sure that one person can do all of it,” he says.

Sisters of Charity of Leavenworth Health System also plans to use CDI software as it rolls out its concurrent HAC identification initiative. The software will prompt CDI specialists to contact a quality nurse or other provider to ensure that he or she properly documents the present-on-admission (POA) indicator. The software also will allow specialists to track the number of diagnoses that require provider follow-up regarding POA status and how much time it takes to perform this task.
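A minimal sketch of the tracking described above might look like the following; the field names and workflow are assumptions for illustration, not the vendor’s actual design.

```
# Hypothetical sketch: log each diagnosis flagged for POA clarification, then
# report how many remain open and how much time closed follow-ups took.
from datetime import datetime

followups = []

def flag_for_followup(patient_id, diagnosis):
    followups.append({"patient": patient_id, "diagnosis": diagnosis,
                      "opened": datetime.now(), "closed": None})

def close_followup(patient_id, diagnosis):
    for f in followups:
        if f["patient"] == patient_id and f["diagnosis"] == diagnosis and f["closed"] is None:
            f["closed"] = datetime.now()
            return

def summary():
    closed = [f for f in followups if f["closed"]]
    minutes = sum((f["closed"] - f["opened"]).total_seconds() / 60 for f in closed)
    return {"open": len(followups) - len(closed), "closed": len(closed),
            "minutes_spent": round(minutes, 1)}
```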

Although the CDI program will continue to evolve, Hoyt hopes that quality reviews will continue to be a primary function of CDI. “I think it’s a necessary evolution of CDI. This is what we should be doing. We should be looking at the whole record,” she says.

As CDI continues to mature, CDI specialists must focus on data integrity. Regardless of its impact on safety or quality ratings, documentation must be accurate. For example, it’s inappropriate to falsify POA data to increase patient safety scores. Elion says although this behavior is rare, it’s something to keep in mind as software prompts specialists to indicate this information. “It’s not difficult to simply check ‘Yes, the condition is POA’ rather than ‘No, the condition developed after admission,’” he says.

Whether it’s a singular function or two departments working together with the help of software, successful CDI programs will find a way to unite CDI with quality in whatever format works best for the organization, Elion says. However, open lines of communication must be in place for the software to be effective. “Software does not solve personnel problems. It doesn’t take people who don’t talk to each other and make them talk to each other,” he says.

— Lisa A. Eramo is a freelance writer and editor in Cranston, Rhode Island, who specializes in HIM, medical coding, and health care regulatory topics.

 

Strategies to Improve Data Veracity
Clinical documentation improvement (CDI) specialists may or may not realize that the data they review and collect daily can have a significant impact on patient safety and quality scores. “The CDI folks have data that is so potentially valuable to the quality folks on a real-time basis,” says Jonathan Elion, MD, CEO of ChartWise Medical Systems. “The CDI professionals are out there every day in the charts looking for and mining the data that the quality professionals need.”

Consider the following strategies to ensure more accurate quality and safety scores:

• Ensure the correct capture of the present-on-admission (POA) indicator. Because it indicates whether a patient presented with a diagnosis or developed that diagnosis after admission, the POA indicator plays an important role in patient safety scores. If the POA indicator is not coded properly, the patient safety indicator (PSI) rate can be artificially inflated (see the sketch following this list). CDI specialists must pay close attention to POA documentation for pressure ulcers, central venous catheter-related bloodstream infections, and deep vein thrombosis. Cases in which patients are transferred from another facility also must be properly documented.

• Review the record for coding integrity. For example, if an intentional procedure (eg, a laceration of plaque) is incorrectly coded as an accidental puncture or laceration (PSI 15), a CDI specialist must bring this to the coder’s attention. Also ensure that any diagnoses that could trigger a PSI actually occurred and were not ruled out. For example, a diagnosis documented as “rule out pneumothorax” must not be coded as an actual pneumothorax.

• Ensure coding specificity. When reviewing documentation for surgeries, any events that occur after admission but prior to surgery must be documented thoroughly to avoid being incorrectly labeled as a postoperative complication or event. Pay close attention to PSI 4 (death among surgical inpatients with serious treatable complications), PSI 7 (central venous catheter-related bloodstream infection), PSI 13 (postoperative sepsis), and PSI 9 (postoperative hemorrhage or hematoma).

• Don’t stop at the first complication and comorbidity (CC) or major CC (MCC). Instead, Elion says CDI specialists must capture all relevant CCs and MCCs regardless of their impact on reimbursement because this information affects severity of illness and risk of mortality, both of which can explain poor outcomes.
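The POA point in the first bullet above boils down to a few lines of logic, sketched here with hypothetical fields and heavily simplified rules; actual PSI specifications handle exclusions and transfer cases in far more detail.

```
# Hypothetical, simplified logic: only conditions that were NOT present on
# admission (POA = 'N') should count toward a PSI, so a missing or incorrect
# POA indicator can make a community-acquired condition look hospital-acquired.

def counts_toward_psi(case):
    return case["poa_indicator"] == "N" and not case.get("transferred_in", False)

cases = [
    {"diagnosis": "pressure ulcer", "poa_indicator": "Y"},                          # arrived with it
    {"diagnosis": "pressure ulcer", "poa_indicator": "N"},                          # developed in-house
    {"diagnosis": "pressure ulcer", "poa_indicator": "N", "transferred_in": True},  # transfer: document carefully
]

print(sum(counts_toward_psi(c) for c in cases))  # 1 -> only the true hospital-acquired case counts
```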

Laurie Prescott, RN, MSN, CCDS, CDIP, a CDI education specialist at HCPro, says severity of illness and risk of mortality greatly affect the observed vs. expected death rate, which is often a more accurate measure of quality of care. “If we get the documentation that captures a higher severity of illness, that equates to a higher risk of mortality,” she says, adding that organizations must consider these data going forward as value-based purchasing initiatives continue to grow and more patients choose a provider based on comparative data.

This issue also is important when assigning chronic conditions or CCs to patients who expire. Organizations that don’t assign these diagnoses can inadvertently inflate PSI 2 (death in low-mortality diagnosis-related groups).

For more documentation and coding issues pertaining to each patient safety indicator, refer to tables 1 and 2 of the Agency for Healthcare Research and Quality’s “Documentation and Coding for Patient Safety Indicators” at www.ahrq.gov/professionals/systems/hospital/qitoolkit/b4-documentationcoding.pdf.

— LAE