
June/July 2019

The Labyrinth That Is HIPAA
By Elizabeth S. Goar
For The Record
Vol. 31 No. 6 P. 18

Whether it’s phishing attacks or the introduction of another ‘voice’ in patient rooms, the law often finds itself in the middle of sticky situations.

In general, complying with HIPAA is a challenge for health care organizations. But some aspects are more complicated than others—and technological advances are quickly testing the limits of the law. In some cases, it’s HIPAA’s nuances that create challenges in protecting patient information. In others, it’s the security threats posed by the increasingly sophisticated measures taken to steal that information, often through something as simple as tricking employees into opening e-mails.

The latter can be seen in the findings of a recent study published in JAMA Network Open, which found that phishing represents a major cybersecurity risk for hospitals: roughly 1 in 7 simulated phishing e-mails was clicked on by employees.

Phishing Vulnerabilities
Hospitals are increasingly falling victim to cyberattacks, which carry a high price tag consisting of not only service disruptions but also potential fines for HIPAA violations. According to Proofpoint, e-mail fraud attacks on health care organizations increased a jaw-dropping 473% in 2017, with 77% of organizations seeing five or more employees targeted by e-mail fraud attacks. What’s more, the 2019 HIMSS Cybersecurity Survey found that “negligent insiders” were one of the biggest threats.

This increasing vulnerability to phishing scams in particular was the catalyst behind researchers’ interest in determining just how great a threat these e-mail attacks are in the industry.

“We’ve recognized for some time that cybersecurity is far more important than an administrative burden or behind-the-scenes hospital issue, and that increasingly we see cybersecurity as a risk for actual clinical care delivery,” says lead researcher William J. Gordon, MD, MBI, with the division of general internal medicine and primary care at Brigham and Women’s Hospital. “We focused on phishing given how commonly it is employed as an attack vector.”

For the retrospective study, a convenience sample of six US health care organizations provided data obtained through 101 simulated phishing campaigns involving nearly 3 million e-mails sent over a seven-year period. Roughly 1 in 7 simulated e-mails was clicked, with a median click rate of nearly 17% across all campaigns and organizations. These results, notes Gordon, represent a serious cybersecurity risk to hospitals.
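
To make the arithmetic concrete, here is a minimal sketch of how a per-campaign click rate and the median across campaigns might be computed. The campaign figures below are invented for illustration; they are not the study’s data.

```python
# Minimal sketch: per-campaign click rates and their median.
# Campaign figures are hypothetical, not the study's data.
from statistics import median

campaigns = [
    {"emails_sent": 25_000, "clicks": 4_200},
    {"emails_sent": 18_500, "clicks": 2_960},
    {"emails_sent": 31_000, "clicks": 5_270},
]

rates = [c["clicks"] / c["emails_sent"] for c in campaigns]

# Note the distinction: the pooled rate aggregates all e-mails,
# while the median summarizes campaign-level rates.
pooled = sum(c["clicks"] for c in campaigns) / sum(c["emails_sent"] for c in campaigns)
print(f"pooled: {pooled:.1%}, median: {median(rates):.1%}")
```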

“We had some inkling that click rates would be high, but I think we were surprised by the magnitude of the cybersecurity risk,” he says. “Cybersecurity awareness is increasing in general, yet our study shows that people are still clicking on phishing e-mails.”

Researchers identified a number of factors contributing to the high click rates, including high employee turnover within hospitals, which creates a constant stream of new employees who may be coming in without cybersecurity training. Hospitals are also vulnerable due to significant “end point complexity,” which is created by a large number of IT devices such as employee smartphones connected to the network that could be targeted in an attack.

Hospital information systems are also interdependent. For example, EHRs are dependent on laboratory systems that require a network connection to the laboratory analyzer system to process results. This creates a situation in which attacking one system can impact all other downstream systems.

Finally, researchers note that it is extremely difficult to lock down hospital systems because of the vast network of users. Just one of those users clicking on one phishing e-mail can take down an entire organization.

However, there was a bright spot in the findings. The regression model showed that “repeated phishing campaigns were associated with decreased odds of clicking on a subsequent phishing e-mail,” Gordon says. “The more campaigns an institution ran, the less likely employees were to click on subsequent e-mails in general.”
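
As a rough illustration of the kind of model behind that finding, the sketch below fits a simple logistic regression of click outcome on the number of prior campaigns, using synthetic data. The study itself used adjusted models on institutional data, so this is only a simplified analogue.

```python
# Simplified analogue of the finding: odds of clicking fall as the
# number of prior campaigns rises. All data here are synthetic.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 50_000
prior_campaigns = rng.integers(0, 20, size=n)

# Synthetic ground truth: each prior campaign lowers the log-odds of a click.
log_odds = -1.6 - 0.05 * prior_campaigns
clicked = (rng.random(n) < 1 / (1 + np.exp(-log_odds))).astype(int)

X = sm.add_constant(prior_campaigns.astype(float))
fit = sm.Logit(clicked, X).fit(disp=0)

# Odds ratio per additional prior campaign; a value below 1.0 means
# decreased odds of clicking, mirroring the reported association.
print(np.exp(fit.params[1]))
```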

The study recommends several strategies for preventing or mitigating the consequences of phishing attacks: preventing malicious e-mails from being received in the first place by using filtering technologies, minimizing the value of stolen usernames and passwords by requiring multifactor authentication or special access controls for specific systems, and increasing employee awareness through phishing simulation campaigns and educational efforts such as posters, decals, and training.
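
As one concrete example of the “filtering technologies” category, mail gateways commonly check sender-authentication policies such as DMARC before accepting a message. The hedged sketch below uses the third-party dnspython package to look up a domain’s published DMARC policy; the domain is a placeholder, and DMARC is only one building block of such filtering.

```python
# Sketch: look up a domain's published DMARC policy, one building block
# of e-mail filtering. Requires dnspython (pip install dnspython);
# the domain passed at the bottom is a placeholder.
import dns.resolver

def dmarc_policy(domain):
    """Return the DMARC TXT record for a domain, or None if absent."""
    try:
        answers = dns.resolver.resolve(f"_dmarc.{domain}", "TXT")
    except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
        return None
    for rdata in answers:
        record = b"".join(rdata.strings).decode()
        if record.lower().startswith("v=dmarc1"):
            return record
    return None

print(dmarc_policy("example.org"))
```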

“Our study suggests that simulated phishing campaigns can reduce click rates. However, phishing simulation is just one component of a larger information security program. It is important to stay vigilant and invest in information security programs to minimize risk,” Gordon says.

Minors’ Health Records
Falling under the “nuances that complicate HIPAA compliance” category is parental access to a minor child’s medical records. The privacy laws surrounding minors and their health information received national attention in a recent Newsweek article about a father who was informed by the University of Iowa Medical Centers that he would no longer have access to his 12-year-old daughter’s medical records without her consent.

Shocking as that sounds, it is within the letter of the law. HIPAA grants parents full access to their children’s medical records when doing so doesn’t conflict with state laws governing the age of consent. Unfortunately for uniformity’s sake, state laws are all over the place when it comes to the age at which parents must have their child’s permission to view those records.

Iowa has no set age, leaving it up to individual hospitals to determine when consent is required. In this case, the University of Iowa Medical Centers has set that age at 12, although the decision is under review after several parents complained that 12 is too young to make independent medical decisions.

“When you consider the patchwork of federal laws and state regulations that must be navigated to make a release of information decision, it is complicated on a good day,” says Susan M. Lucci, RHIA, CHPS, CHDS, AHDI-F, a senior privacy and security consultant with tw-Security and a member of For The Record’s Editorial Advisory Board. “Add to that, the facility set its own policy regarding the age at which records can be withheld from parents, one that aligns with the states of California and Vermont. With all this complexity, it’s little wonder that the OCR [Office for Civil Rights] has taken a special interest when it comes to patient access.”

Adding to the confusion, while minors have long been considered incompetent to give consent for medical treatment, most states now give them the right to do just that in certain situations. Among these are court-ordered and/or situational emancipation, such as when minors are married or have their own children. In several states, including California and Vermont, minors as young as 12 are allowed to consent to certain treatments, including those for substance abuse, mental health, and birth control.

In the University of Iowa Medical Centers case, the facility was acting within the law to set the age of required consent at 12, despite parental objections. However, there are steps the health system and others can take to reduce the fallout from justifiably upset parents wanting to monitor their children’s care.

Lucci suggests that the notice of privacy practices could go a bit further to ensure regulations are understood. For example, while it shouldn’t be necessary to go into details about specific privacy requirements at the time of registration, an additional form could be provided for signature that explains the age of consent for record release decisions involving substance abuse or mental health in particular, as those ages can differ considerably from the age of majority.

“Another thing that would be extremely helpful is to be certain that people coming in for treatment understand their rights beyond the notice of privacy practices,” Lucci adds. “For example, encourage the parents to speak with the privacy officer regarding how privacy may be different with addiction than if it were a standard medical admission. This would be true also for other types of admissions, their concerns, and what they can expect.”

Voice in Patient Rooms
As technology advances, so do the challenges associated with HIPAA compliance. One of the most recent issues surrounds the use of Alexa and other voice-activated devices in patient rooms.

In April, Amazon announced that the Alexa Skills Kit was HIPAA eligible, enabling selected covered entities (CEs) and their business associates (BAs) to build Alexa skills that transmit and receive protected health information (PHI) as part of an invite-only program. The company also announced that six new Alexa health care skills were already operating in its HIPAA-eligible environment.

But hospitals interested in deploying voice-activated devices in patient rooms will find HIPAA compliance to be something other than a smooth path, says David Finn, executive vice president of strategic innovation for CynergisTek.

“We’ve heard a lot … about HIPAA-compliant Alexa. The problem with that is that Amazon didn’t say that; Amazon said they are providing a ‘HIPAA eligible environment’ for select developers,” Finn says. “Amazon recognizes that there is no such thing as ‘HIPAA compliant,’ only that a device may have the technical capability to be deployed in a fashion to meet the HIPAA Security Rule requirements. It will always depend on the people using and the processes surrounding the implementation of the device that bring you into conformance with HIPAA.”

Therefore, when it comes to leveraging the latest technologies such as voice-activated devices, the best answer anyone can give to the question of potential compliance pitfalls is, “It depends.”

“It depends on how the device is being used, what data are being captured, where they are stored, and who has access to those data,” Finn says.

He notes that in some cases, the implications may be trivial. For example, if a patient asks the device what is on the day’s lunch menu, that’s clearly not a privacy issue. However, if the doctor goes into that same patient’s room, verifies the patient’s identity and then proceeds to deliver a diagnosis, treatment, and prognosis, “you have a very different scenario, particularly if the voice assistant heard—or thinks it heard—a trigger word and it’s now ‘listening,’” Finn says.
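
To make the low-risk end of that spectrum concrete, here is a minimal sketch of what a non-PHI “lunch menu” skill handler might look like using the Alexa Skills Kit SDK for Python. The intent name, menu text, and deployment wiring are hypothetical, not an actual hospital implementation.

```python
# Hypothetical sketch of a non-PHI Alexa skill handler: answering a
# lunch-menu question involves no patient data. Uses the Alexa Skills
# Kit SDK for Python (pip install ask-sdk-core); "LunchMenuIntent"
# and the menu text are invented for illustration.
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_intent_name

class LunchMenuIntentHandler(AbstractRequestHandler):
    def can_handle(self, handler_input):
        return is_intent_name("LunchMenuIntent")(handler_input)

    def handle(self, handler_input):
        # Static, non-sensitive content only; no PHI is read or stored.
        speech = "Today's lunch is vegetable soup, a turkey sandwich, and fruit."
        return handler_input.response_builder.speak(speech).response

sb = SkillBuilder()
sb.add_request_handler(LunchMenuIntentHandler())
handler = sb.lambda_handler()  # entry point if deployed as an AWS Lambda
```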

“No one has clearly and definitively answered all the questions about data collection, storage, or sharing. Is this a powerful and important tool for improving the patient experience? Absolutely, but we can’t get ahead of privacy and security concerns,” he continues. “Providers should experiment, but they should start small with simple information requests and they should question the data at every step of the workflow.

“Last, but not least, these are Internet of Things devices—anything that can send to the internet can also receive from the internet. You need to know what you are sending, certainly, but just as importantly, you need to know who is ‘talking’ back to your personal assistants. If the device is on your production network, anyone connecting to the device is too.”

But what of the protections offered by BA agreements? Again, it’s murky. While the device may come from a single company, such as Amazon with Alexa, the device itself isn’t the issue—it’s who can access PHI that may be captured, transmitted, stored, or used via that device. If access to the data being collected is limited to the CE, then no BA agreement is needed with the device maker.

“But typically, and by design, these devices listen for ‘trigger’ words … and then begin capturing the voice, transmitting it in a digital format to the digital assistant provider’s system, where a human or an application accesses the data. CEs should understand whether those data are then stored or have secondary uses,” Finn says. “The premise of digital assistants includes operating in an ‘always on’ state: they have the capability to monitor and capture all communications from their placement in the clinical environment and do not discriminate in the origin of the voice communication or whether the data contain PHI. The digital assistant provider is monitoring the device in service to the CE, and if that information includes PHI, then they are a BA.”

Until more of these issues are addressed, Finn recommends organizations move slowly when it comes to adopting the latest and greatest technology for purposes that might or will involve PHI. He offers the following suggestions when considering the use of voice-activated devices:

Define the scope of the implementation. What problem are you trying to solve for the provider or patient?

Keep it focused. Choose locations wisely; don’t start in areas that are noisy or where PHI is often exchanged verbally.

Involve patients. It is important when using a new patient-facing technology to keep those patients in the loop—not only in “scoping” but also in explaining the issues and understanding their concerns.

“Voice assistants require a good bit of training and so will users. Many organizations using digital assistants are still not comfortable sharing care information with patients through these devices,” Finn says. “Focus on use cases that don’t involve sensitive information yet still enhance the patient experience. That builds comfort for both the user and the privacy and security teams as they learn to use this important new tool.”

Proceed With Caution
More than two decades after HIPAA’s passage, compliance has not gotten any easier. Technology continues to advance, creating new issues. Meanwhile, society also continues to evolve, with laws rushing to keep pace with new expectations and outlooks.

At the end of the day, the key to compliance is to understand how data are being collected and used, and by whom—and how that impacts privacy.

“There are many issues around patient consent to what data get accessed—how and by whom. Today, that is not even confined to digital assistants but extends to the EMR itself,” Finn says. “Now we’ve layered on yet another technology that can capture and store data and can be used to access the same data. That is why I recommend going slow and simple. Make sure you understand how things work, where data flow, and how they are stored, captured, and retrieved.”

— Elizabeth S. Goar is a freelance writer based in Tampa, Florida.