January 2014

Risky Business
By Mike Bassett
For The Record
Vol. 26 No. 1 P. 12

Are health care organizations paying enough attention to proper HIPAA risk analysis? If not, it could prove costly.

Last summer, Affinity Health Plan, a managed care plan company based in New York, agreed to pay the HHS Office for Civil Rights (OCR) $1.2 million to settle a HIPAA data breach case.

While data breaches may not be unusual, this case was unique. The 2010 incident compromised the data of more than 300,000 individuals after that information was left on the hard drives of copy machines Affinity had leased and later returned. According to the OCR, it was the first HIPAA settlement involving copiers.

The salient point is that not only did the breach cost Affinity a lot of money (along with the resulting bad publicity), it also demonstrated how exposed organizations can be when it comes to privacy and security, says Mike Semel, CEO and founder of Semel Consulting, which specializes in HIPAA compliance. “How many people even knew that copiers had hard drives?” he asks, pointing out that the Affinity breach demonstrates that when an organization performs a HIPAA risk analysis or assessment, it must be thorough.

Under the HIPAA security and privacy rules, covered entities such as health care organizations are required to conduct a risk analysis to thoroughly assess “the potential risks and vulnerabilities to the confidentiality, integrity, and availability of electronic protected health information held by the covered entity.”

It’s a critically important step for an organization when it comes to forming a HIPAA compliance program. However, the problem is that organizations aren’t doing a good job of conducting risk analyses—if they are performing them at all, says Bob Chaput, CISSP, CIPP/US, CHP, CHSS, founder and CEO of Clearwater Compliance, who points out that in 2012, 68% of the 115 covered entities audited by the OCR had adverse findings related to risk analyses. “That is to say, they either didn’t do [a risk analysis] or they did it wrong,” he says. “And when it came to health care providers, 80% had adverse findings.”

Chaput also notes that since the OCR began stepping up enforcement of the privacy and security rules in 2009, every investigation that has resulted in a corrective action plan has cited the organization for failing to perform a complete risk analysis.

“When you look at the fines, penalties, and investigations that the Office of Civil Rights is conducting, most of them go back to the risk analysis,” Semel says. “And it will say that the reason there was a breach when a doctor lost his laptop while traveling or when a technician lost a hard drive while going from one office to another is that the organization never did a proper risk analysis to identify where data lived, and how it moved in, out, and within the organization, and that if a good job had been done with the risk analysis in the first place, the organization could have identified and managed the risks.”

But why is something so critical being done so poorly—if at all? There appear to be several factors at work. According to Chaput, even though the HIPAA privacy and security rules have been in effect for about a decade, “little or no enforcement had been going on. As a consequence, even though the regs have required this for a long time, organizations simply haven’t done them because there haven’t been any bad consequences.”

Eileen Elliott, JD, a partner at the law firm of Dunkiel, Saunders, Elliott, Raubvogel & Hand, which advises health care providers on HIPAA privacy and confidentiality issues, says providers don’t put the necessary effort into risk analyses because they are complicated and counterintuitive to the primary mission of providing medical care. “Most providers got into the business because they wanted to be involved in health care and weren’t thinking at all about the risks that technology brings to their profession,” she says. “And the security part sometimes is almost overwhelming.”

Lesley Berkeyheiser, cofounder of the consulting firm N-Tegrity Solutions Group, believes that the combination of HIPAA’s complexity and the multitude of project deadlines facing health care organizations could account for what appears to be a cavalier attitude toward risk analysis. “I’m speaking primarily of provider organizations here, and a lot of this isn’t necessarily their fault,” she says. “They’re busy trying to take care of patients, let alone trying to meet all the requirements and regulations required of small businesses. Then, on top of that, they’re getting slammed by meaningful use and [the move to adopt] EMRs. All of this incredible change is happening at the same time, and a lot of it is taking precedence over being able to think about and document how you are safeguarding health care information in all of its forms.”

Chaput thinks the problem is more fundamental, questioning whether organizations understand the basics of risk analysis and the definition of risk. “They think it [risk] is something where you can walk around a room looking for and finding it,” he says. “It’s not. It’s a derived value that may not require a knowledge of calculus but [needs] at least a little bit of algebra.”

Performing a Proper Risk Analysis
Having worked with Red Cross disaster services, Semel views himself as someone who can walk into a situation and identify risks that other people may not see. For example, he worked with a clinic that recently switched to an online cloud service to back up its servers after having previously done so on tape. “There are retention rules for medical data, but they’re no longer using those tapes,” Semel says. “In fact, they no longer have a tape drive that can read those tapes or the software that was used to actually create the backups, so the tapes are useless.”

As it turns out, the old tapes are being stored in boxes in the server room, with the rest residing in a storage facility. It’s a setup fraught with risk, according to Semel. “First of all, just having them sit around is a risk,” he points out. “What happens if someone decides to clean out the server room and ends up throwing the tapes in a dumpster? That’s a data breach.”

Whether the issue is a batch of useless tapes or a copier with a hard drive, it’s clear health data sources can pop up anywhere. To get a handle on this sprawl, Chaput says it’s necessary to complete an asset information inventory. “We are looking for anything that creates, receives, maintains, or transmits protected health information,” he says.

Tracking down this plethora of data can be a laborious, time-consuming, manually driven task, Chaput says. “Never mind electronic health records,” he says. “Think about where all of this information exists. It can exist on paper records, in voice mails, on fax machines. A lot of times when organizations have a breach, it came from a source they didn’t even realize contained the information.”

Clearwater Compliance uses an asset inventory process that Chaput describes as an “exhaustive template” of possible locations of protected health information. The idea is to help “stimulate the thinking” about where all of an organization’s health care data are hiding. “It’s largely a manual process but, as with everything in life, you can put some order and discipline around it,” he says.
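The inventory process Chaput describes — cataloging anything that creates, receives, maintains, or transmits protected health information — can be sketched in a few lines of code. This is a hypothetical illustration only; the field names, example assets, and the flagging rule are assumptions for the sake of the sketch, not Clearwater Compliance’s actual template.

```python
from dataclasses import dataclass

@dataclass
class PhiAsset:
    name: str        # what the asset is
    location: str    # where it lives or travels
    functions: set   # subset of {"creates", "receives", "maintains", "transmits"}
    encrypted: bool  # is the PHI on it encrypted at rest?

# Example entries echoing the article's scenarios (copiers, old tapes, laptops).
inventory = [
    PhiAsset("leased copier hard drive", "records office", {"maintains"}, False),
    PhiAsset("legacy backup tapes", "server room / offsite storage", {"maintains"}, False),
    PhiAsset("physician laptop", "travels between office and home", {"receives", "maintains"}, True),
    PhiAsset("fax machine memory", "front desk", {"receives", "transmits"}, False),
]

# Flag anything that handles PHI but holds it unencrypted --
# these are the candidates that feed into the risk analysis.
flagged = [a.name for a in inventory if a.functions and not a.encrypted]
print(flagged)
```

Even a simple listing like this imposes the “order and discipline” Chaput mentions: once every asset is a record, nothing (a copier, a voice mail system, a box of tapes) escapes review simply because no one thought of it.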

Semel says that when looking for data, “You have to get past the preconceived notion that all data is in the patient care record or the electronic medical record.” In that sense, he points out, the Affinity copier case is a useful reminder that data can live in the most unexpected places.

One Semel Consulting client, a centralized surgery center, recently sent out a memo directing that no more data can leave the facility on portable devices without authorization. “They immediately got back e-mails from their nurses saying that the doctors were all bringing in thumb drives and downloading their operative notes for importing into their own system, and we didn’t know that was happening,” Semel says. “This became part of the risk analysis, and now they’re trying to come up with a more secure way [of enabling the doctors to transfer that information].”

Chaput says once assets have been identified, organizations must determine possible threats, such as a laptop being lost or stolen, and the asset’s vulnerability, such as the fact that the laptop is unencrypted. “When all of these things exist—the asset, the threat, and the vulnerability—you have a risk that needs to be analyzed,” he says.

And that risk can be measured by the probability that a security issue will occur and its potential impact on the organization. There’s no risk if an organization locks away an unencrypted laptop in a secure site with no Internet access. But that same laptop in the hands of a physician who carries it from his office to his home presents a risk. “But most organizations don’t understand this is a detailed process,” Chaput says. “They think they can simply observe something and declare that there’s a risk, but they really don’t understand what risk is and what it’s not.”
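Chaput’s point that risk is a derived value — existing only when an asset, a threat, and a vulnerability coincide, and measured by likelihood and impact — can be expressed as a small function. The 1-to-5 scales and the scores below are illustrative assumptions, not an official scoring method.

```python
def risk_score(has_asset, has_threat, has_vulnerability, likelihood, impact):
    """Return a numeric risk rating, or 0 when no risk exists.

    Risk requires the coincidence of all three elements; it is then
    derived as likelihood x impact (here, both on a 1-5 ordinal scale).
    """
    if not (has_asset and has_threat and has_vulnerability):
        return 0  # no asset, threat, or vulnerability => no risk to analyze
    return likelihood * impact

# Unencrypted laptop locked in a secure site with no Internet access:
# the vulnerability exists, but the theft/loss threat is removed.
locked_away = risk_score(True, False, True, likelihood=1, impact=5)

# The same laptop carried daily between a physician's office and home:
# asset, threat, and vulnerability are all present.
carried_home = risk_score(True, True, True, likelihood=4, impact=5)

print(locked_away, carried_home)  # prints 0 20
```

The contrast between the two calls mirrors the article’s laptop example: identical hardware, very different risk, because risk is computed from circumstances rather than observed by walking around a room.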

Who Performs the Analysis?
Berkeyheiser recommends pulling information about the privacy and security regulations from the Web, whether from the Office of the National Coordinator for Health Information Technology (ONC) or the OCR. This is particularly true for organizations creating a risk analysis from scratch.

As for where whoever handles the risk analysis should focus their attention, Berkeyheiser says one aspect should take precedence. “Focus on breach. Breach is by far the most important thing that has happened in the last few years that continues to get a lot of public attention,” she notes.

Since 2009, the OCR has been recording and publishing information about large-scale health data breaches on its website. Berkeyheiser recommends risk analysis staff members peruse those incidents to determine whether any could possibly apply to their organization.

Elliott also suggests organizations, particularly smaller providers, visit federal websites for privacy and security information pertaining to issues such as administrative security and physical and technical safeguards. “But [performing a risk analysis] is still a big undertaking, particularly as you’re moving into electronic health records,” she says, adding that larger organizations probably have their own privacy and security officer and should be capable of hiring an outside firm to conduct a HIPAA analysis if necessary.

On its website, the ONC lists the top 10 myths of security risk analysis, one of which is that providers must outsource the function. It notes that “it is possible for small practices to do risk analysis themselves using self-help tools. However, doing a thorough and professional risk analysis that will stand up to a compliance review will require expert knowledge that could be obtained through services of an experienced outside professional.”

Although it suggests that small providers can perform their own risk analysis, Semel believes it is revealing that the ONC emphasizes that any analysis must be able to stand up to a compliance review. “These are the same people who are sending out audit letters to people applying for meaningful use money, so we know there is scrutiny. So I think what they are really saying is that because they are doing audits, and there is this scrutiny, it [the risk analysis] really should be outsourced to a professional because we are checking these things.”

Analyzing HIPAA compliance risk “is not a one-and-done” deal, particularly in light of the changes associated with EHR implementations and meaningful use, Berkeyheiser says, while noting that it is important to choose opportune times for an analysis. “You should be looking for significant changes in the way you handle your data, whether it’s an organizational change or a technological change, and then do a risk analysis,” she says.

For example, N-Tegrity Solutions Group recently enacted a corrective action plan involving a client that made configuration changes to an online application. “[The organization] neglected to check whether this was successful and, as a result, had a breach,” Berkeyheiser says. “The point is that it should be ingrained in your organization that any time you make a change, you should push back and test and see whether there is an issue from a privacy and security point of view.”

— Mike Bassett is a freelance writer based in Holliston, Massachusetts.