June 9, 2008
Minding the Privacy Store
By Elizabeth S. Roop
For The Record
Vol. 20 No. 12 P. 12
Healthcare organizations can employ every security gadget known to man to protect electronic patient information, but it’s all for naught if the correct “people” policies are not in place.
Unauthorized snooping into the medical records of celebrities and politicians earlier this year set off a maelstrom of debate over the ability of health systems to maintain the privacy of patient information in an electronic environment that, to many, appears to have just as many vulnerabilities as the paper systems they are replacing.
Privacy threats go beyond the nosy employees, stolen laptops, and malicious system attacks that make the headlines. More often than not, the breach is unintentional: a file sent to the wrong printer, an unattended computer monitor displaying a patient’s record, or a helpful soul holding the door open for a stranger can all result in privacy violations regardless of how many state-of-the-art security features an electronic health record (EHR) system has.
Privacy breaches have affected millions of patients over the past three years, although few have made the front page. No organization specifically tracks the number of violations involving protected health information (PHI), although the Privacy Rights Clearinghouse does track breaches involving sensitive information such as Social Security, account, and driver’s license numbers.
Since January 2005, the clearinghouse has identified security breaches involving 224.5 million records. In the first half of April 2008 alone, more than 3.2 million records were compromised. Of those, more than 2.2 million involved a healthcare facility or payer, including 2.1 million patient records stolen from the University of Miami when computer tapes were taken from a van used by a private off-site storage company.
Breaches of such magnitude illustrate the conundrum of the electronic exchange of PHI.
“Technology gives us the tools to be even more private and secure than we could with paper. On the other hand, the movement of the information electronically, more rapidly, and in greater volume also magnifies the risk. Whereas, in the past, if a box of paper records ended up in the wrong room or next to a dumpster, it was 50 records rather than 100,000 that go out the door when someone hacks into a system,” says Deven McGraw, director of the Center for Democracy & Technology’s Health Privacy Project.
“You can have the greatest security [features] in the world, but if they allow unfettered use of information once someone has it in their hands, then the fact that you have great security won’t make much difference. If you have a lot of great privacy policies but lack security, you have an incomplete picture. You need both,” she adds.
Staying Off the Front Page
To avoid making headlines with privacy violations, facilities must take a close look at both their EHR security features and the privacy policies and procedures that guide human behaviors.
“The problem is you can’t have privacy if you don’t have security,” says Chris Apgar, CISSP, president of Apgar & Associates LLC, a privacy and information security consulting firm specializing in healthcare and financial services. “Most privacy breaches that make it to the front page of the newspaper occurred because someone didn’t do their job on the back end and it really was a security breach.”
Security will be determined largely by the EHR’s functional capabilities. For that, facilities need to ensure that the system offers a minimum set of features. A good starting point for that determination is the Certification Commission for Healthcare Information Technology (CCHIT), which has beefed up its requirements related to privacy and security and is in the process of adding more criteria for 2009.
Certification criteria are developed by a volunteer panel of industry experts and then vetted through numerous public comment periods before finalization. However, it is important to note that CCHIT certification applies only to an application’s functionality and not postimplementation compliance.
“There is privacy, and there is security. Both of those require not just technology—hardware and software—but also policies and practices that involve the people. Technology alone without the right training, policies, and enforcement is not going to protect privacy, nor can people do it alone if they have inadequate technology,” says CCHIT Chair Mark Leavitt, MD, PhD. “We are not certifying the hospital or doctor’s office after implementation. We are certifying the product before they buy it to help them pick a good product.”
To that end, earning CCHIT certification means EHR products must successfully complete numerous scenarios that demonstrate the application’s access control and audit and authentication abilities.
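The access control those certification scenarios exercise typically takes a role-based form: a user’s role, not the individual, determines which operations on a record are permitted. A minimal sketch follows; the role names and permission strings here are hypothetical illustrations, not CCHIT criteria or any vendor’s actual scheme.

```python
# Role-based access control (RBAC) sketch. Roles and permissions are
# invented for illustration; a real EHR would also log every check
# (see the audit discussion later in this article).
ROLE_PERMISSIONS = {
    "physician": {"read_chart", "write_orders", "read_labs"},
    "nurse": {"read_chart", "read_labs", "write_vitals"},
    "billing_clerk": {"read_demographics", "read_billing"},
}

def can_access(role: str, permission: str) -> bool:
    """Return True only if the role explicitly grants the permission.

    Unknown roles get an empty permission set, so access is denied by
    default rather than granted by omission.
    """
    return permission in ROLE_PERMISSIONS.get(role, set())
```

The deny-by-default lookup is the important design choice: a misconfigured or unrecognized role fails closed instead of open.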
The process also examines the availability of documentation providing guidelines for configuration and use of the EHR security controls, as well as technical services to support confidentiality of PHI that is delivered over the Internet or other known open networks through encryption and an open protocol.
“Inside the EHR, we address documentation integrity, if you will. The other side is security that is showing that the box—the system and structure of it—is maintaining [integrity] from a tracking perspective,” says Bonnie Cassidy, MPA, RHIA, FAHIMA, FHIMSS, the CCHIT’s strategic group leader for privacy and compliance. “Unless you have ongoing monitoring that ensures you’re identifying every time there is a breach, then you’re in a position of trying to defend the integrity of your operations. You can’t just say ‘once we saw it.’ It has to be that you’re trending it and that you’ve got flags in the system so you know what’s happening and can take action.”
In addition to CCHIT certification, Apgar recommends reviewing the flow of information through the EHR out to the Internet or other external destinations to ensure it will be properly encrypted throughout the transaction. He also notes that many organizations are going beyond HIPAA requirements and encrypting data at rest—in other words, encrypting the information while it is stored.
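Encryption at rest means the stored blob is useless without the key, even if a backup tape or laptop walks out the door. The sketch below shows the shape of that flow—generate a fresh nonce, derive a keystream, XOR it into the record before writing—using only the Python standard library. The keyed-hash keystream is strictly a teaching device; a production system should use a vetted cipher from an established cryptography library, never a hand-rolled construction like this one.

```python
import hashlib
import secrets

NONCE_LEN = 16  # random per-record nonce, stored alongside the ciphertext

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a keystream from keyed BLAKE2b. Illustration only—
    real systems should use a standard, audited cipher."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.blake2b(
            nonce + counter.to_bytes(8, "big"), key=key
        ).digest()
        counter += 1
    return out[:length]

def encrypt_record(key: bytes, plaintext: bytes) -> bytes:
    """Encrypt a patient record before it is written to storage."""
    nonce = secrets.token_bytes(NONCE_LEN)
    stream = _keystream(key, nonce, len(plaintext))
    return nonce + bytes(a ^ b for a, b in zip(plaintext, stream))

def decrypt_record(key: bytes, blob: bytes) -> bytes:
    """Recover the plaintext from a stored blob, given the key."""
    nonce, ciphertext = blob[:NONCE_LEN], blob[NONCE_LEN:]
    stream = _keystream(key, nonce, len(ciphertext))
    return bytes(a ^ b for a, b in zip(ciphertext, stream))
```

The point of the sketch is the workflow, not the cipher: every record gets its own nonce, and the key never touches the storage medium—which is exactly why the stolen University of Miami tapes would have been worthless had the data been encrypted this way.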
He says the EHR is only as secure as the organization’s network, as well as its administrative and physical safeguards. EHRs exist in a dynamic environment that must be examined regularly, with the appropriate security controls implemented as conditions change.
Finally, the presence of a specific security function within an application means nothing if those functions are not properly implemented. Audit logs are a prime example.
“Each application has a number of audit logs to turn on. The key gets back to what I should do as an organization. If you turn on an audit log, you darn well better look at it, write a report on it, and document the fact that you looked at it, otherwise you create a very specific liability for yourself,” says Apgar. “I’m not advocating turning off the audit log, but if you create an audit log and do nothing about it—if you don’t even look at it—you create a very specific liability for yourself.”
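Acting on an audit log, as Apgar urges, can start as simply as aggregating entries and flagging outliers for human review. The sketch below counts how many distinct patient records each user touched and surfaces anyone above a threshold—the entry format and the threshold are illustrative assumptions, not any vendor’s actual log schema.

```python
import collections

# Illustrative audit-log entry: timestamp, who, whose record, what action.
# Real EHR logs carry far more detail (workstation, outcome, reason codes).
AuditEntry = collections.namedtuple(
    "AuditEntry", "timestamp user_id patient_id action"
)

def flag_unusual_access(entries, threshold=3):
    """Return {user_id: distinct_patient_count} for users who accessed
    more distinct patient records than the threshold allows.

    This is the kind of trending report that turns a raw audit log into
    something an organization can document having reviewed.
    """
    patients_by_user = collections.defaultdict(set)
    for entry in entries:
        patients_by_user[entry.user_id].add(entry.patient_id)
    return {
        user: len(patients)
        for user, patients in patients_by_user.items()
        if len(patients) > threshold
    }
```

A report like this, run and documented on a schedule, is precisely the “write a report on it” step Apgar describes—the log alone proves nothing unless someone demonstrably looked.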
Environmental security is another area of concern and can range from the basic position of a computer screen to controlled access to areas where data are stored.
Barbara Demster, MS, RHIA, CHCQM, a senior consultant with Just Associates, which specializes in privacy and security, and the chair of the HIMSS Privacy and Security Steering Committee, recommends that facilities pay close attention to setting clear security expectations and then training staff in acceptable security behaviors.
Electronic isn’t necessarily paperless, meaning the data output from the EHR in paper form must be managed and monitored just as closely as it is when it resides electronically. Access to key areas must also be controlled, which includes community output devices such as printers and fax machines.
“The physical security of a place is frequently one of the toughest things to ensure. I’ve tailgated into most organizations that I have done surveys and audits on because people are basically friendly. They want to be helpful, so they hold the door and let in a string of folks” who shouldn’t be there or who should at least be escorted, says Demster.
Those kinds of human behaviors are best handled through comprehensive policies and procedures that dictate everything from who can access what areas of a patient record to how personnel should respond when a security breach is discovered.
And just as an EHR’s security functions are meaningless if they are not properly used, policies and procedures aren’t worth the paper they’re printed on if the staff are not fully trained on how to follow them.
“The thing about privacy is that it’s softer, so much of it is behavioral. Security is, too. They’re people behaviors,” says Demster. “You can program a system to respond the way you want, but it’s very difficult to program people to respond the way you want them to.”
Ongoing audits and risk analysis are important components of any facility’s privacy and security program. Regular reviews will identify any problem areas that require additional attention and should be conducted at least every two years, “but areas that are high risk should be looked at more frequently,” says Demster.
In fact, Demster says facilities should evaluate systems during the selection process to ensure that the EHR’s features and functionality fit their precise needs and any necessary process reengineering or environmental redesign can be undertaken prior to deployment. Not only will this save time on the actual implementation, but it will also help to avoid overcompensating in some areas, which can raise the price of an EHR install unnecessarily.
It also helps set expectations and gets people thinking about how the EHR will impact their daily routines.
“The electronic environment is going to change the way you do things—guaranteed. There is no way for it to just overlay existing processes,” says Demster. “Anticipation is the operative word. Anticipate the impact on your current business processes. You need to think through the privacy and security processes [and] think through the interaction required with all the other systems.”
— Elizabeth S. Roop is a Tampa, Fla.-based freelance writer specializing in healthcare and HIT.
Will Violators Ever Pay the Price?
Facilities needing a little extra motivation to lock down privacy and security should look no further than the potential penalties for violations. HIPAA violations carry civil penalties of up to $100 per violation, capped at $25,000 for all violations of an identical provision in a single year. Criminal penalties range from $50,000 to $250,000 in fines and up to 10 years in jail.
But it’s not just HIPAA that organizations need to fear.
“The big, mean dog out there is the Federal Trade Commission [FTC],” says Chris Apgar, CISSP, president of Apgar & Associates LLC. “They not only levy very strict fines, but they can also require you to perform a regular audit of your security on an annual basis and provide those for 20 years. So the FTC is the organization you really don’t want coming in.”
Civil cases can also be brought against violators and can have severe financial implications, but Apgar says the price paid for privacy violations goes far beyond dollars.
“You have loss of trust, loss of revenue, a damaged reputation, and even the potential all the way up to closure,” he says. “If my reputation is such that people don’t trust me anymore and I can’t get patients to come into my front door, I’ll go out of business. There are pretty significant issues around privacy that aren’t necessarily dollars and cents.”
But even with the avenues for both civil and criminal punishments set forth in the HIPAA regulations and despite the sheer volume of breaches that have occurred in the past three years alone, no fines have ever been levied against a facility.
That may change with implementation of the Centers for Medicare & Medicaid Services’ program to randomly audit compliance with HIPAA security regulations and the Office for Civil Rights’ plans to step up enforcement of privacy regulations.
However, Deven McGraw, director of the Center for Democracy & Technology’s Health Privacy Project, fears that five years of no enforcement or penalties may have sent the wrong message.
“I don’t know that any sort of surprise audits … by enforcing agencies will do a lot of good if there still isn’t any action taken at the end of the day. The impression that ends up leaving with the public and the [regulated] entities is that there are not any real penalties for noncompliance,” she says. “This is not to suggest that [organizations] feel like they’re free to be completely lax. But if you’re not really facing a stiff penalty for noncompliance, you might prioritize your institutional resources in some other way.”
Another, perhaps greater issue is the fact that while civil penalties and disclosure of breaches may create financial hardships and public relations problems for the offending organizations, they cannot undo the damage that is done to the patient whose medical information has fallen into the wrong hands.
“The individual organization’s reputation might be harmed, but who is responsible for the collective harm when people are now afraid to seek care?” asks McGraw. “To what extent do [organizations] have to be held responsible for their contributions to the chilling effect these breaches, when taken together, have on people’s willingness to seek care and be completely truthful with their doctors?”