
September 10, 2012

Big Data Creates Big Privacy Concerns
By Elizabeth S. Roop
For The Record
Vol. 24 No. 16 P. 10

There are riches to be found inside mountains of patient information, but at what cost?

The relationship is young, but healthcare has embraced “big data” as a solution to many of the cost, quality, and safety issues that plague patient care. In particular, larger academic medical centers and health systems are applying analytics to massive volumes of patient data, performing predictive modeling, and identifying clinical risk factors, utilization patterns, population health trends, and much more.

Few will argue that big data can bring value to healthcare. The question for many is just how high that value will be—and at what cost.

“I don’t know if we have a good sense yet of all the value that big data will bring to healthcare over the years. Some early indications, especially around safety and optimizing healthcare organizations from an operational perspective, are that it holds a lot of promise,” says Jeremy Maxwell, vice chair of the EHR Association Privacy & Security Workgroup and an application security architect with Allscripts.

Uncertain, Unequal Value
The true potential of big data lies in its ability to reveal patterns and trends that will enable healthcare organizations to deliver better care at lower costs. Its use for business analytics also helps drive more informed decisions on resource allocation, service lines, and other investments.

“Some organizations are already realizing or starting to realize the value [of big data]. Some of it depends on how far along they are in collecting and capturing electronic data,” says Deven McGraw, director of the Center for Democracy & Technology’s Health Privacy Project. “You can’t do big data on paper, so the big data revolution is coming to clinical care in a broader context, and institutions that are further along the curve in adoption and use of electronic records are starting to reap its benefits.”

But what is the value of those benefits? According to a May 2011 report from McKinsey Global Institute, the answer is $300 billion. That is the amount researchers say will be generated by the creative, effective use of big data by the US healthcare industry. Two-thirds of that amount would come from an 8% reduction in expenditures. (Two-thirds of $300 billion is roughly $200 billion, which works out to about 8% of the roughly $2.5 trillion the United States was spending on healthcare each year at the time.)

Some are skeptical of those figures and unconvinced that patients will benefit more than tangentially from big data’s application in healthcare.

“Who knows how McKinsey arrived at that huge amount of money saved, but statements like that are designed to hype big data and McKinsey’s consulting services to decision makers,” says Deborah C. Peel, MD, founder of Patient Privacy Rights. “It also fits perfectly with the government mantra for the healthcare system, [that being] if we can save money on healthcare we should keep throwing billions at health IT. For big data, that means a windfall for the few corporations and academic institutions capable of analyzing it.”

Peel and other privacy advocates charge that the only true beneficiaries of big data as it is currently managed are the organizations selling and mining the information. Indeed, IDC reports that the big data business generated more than $30 billion in revenues in 2011 for companies with extensive data-mining experience. The firm further predicts big data will hit $34 billion in 2012, driven largely by its increased use within healthcare.

Peel says the value of big data to healthcare businesses appears to be unlimited. However, it is consumers who will pay the price unless informed consent is secured for all data collection and use. She notes that US patients currently have no control over how their data are used or what information can be gleaned from them and sold to employers or others who may use them for negative purposes. As evidence, she points to a survey conducted in 1999—when most medical records were paper—that found 35% of Fortune 500 companies looked at medical records before making hiring and promotion decisions.

“The entire premise of big data is that the collection of more and more terabytes of the most deeply personal, intimate information about each individual’s mind and body, geolocations, exercise, eating habits, etc will be used for research or data analytics,” Peel says. “But collecting big data and analyzing it without informed consent is illegal and unethical. Today, everything about us is collected … but by whom? How does industry get big data? Surreptitiously, by theft. The nation has no idea how much data is collected about them or how many millions of entities collect and sell it. Big data, as practiced today and envisioned by the industry in the future, equals Big Brother and the end of democracy.”

Real Concerns
Some brush off data privacy concerns as overblown or even outdated, particularly given the strengthening of many aspects of HIPAA’s privacy and security rules under the HITECH Act and the stringent security requirements under meaningful use.

However, unauthorized disclosure of protected health information (PHI) remains a serious issue. According to Redspin, a security audit company, the total number of patient records breached increased 97% between 2010 and 2011. In all, more than 19 million patient health records have been breached, with the average number of records per breach increasing by 80% in 2011 to nearly 49,400.

Ninety-seven percent of the healthcare organizations that participated in a Ponemon Institute survey reported suffering a data breach in the previous two years, breaches the institute estimates have cost the industry an average of $6.5 billion annually since 2010. The most common causes of the breaches were lost or stolen equipment, errors by third parties, and employee mistakes.

It is a trend that some say will continue as big data makes greater inroads into healthcare, an industry that has failed to adequately protect its small data despite widespread availability of affordable, user-friendly encryption and other security technologies.

“Certainly to the extent that some of the approaches to big data are creating big databases, that is problematic,” McGraw says. “It’s mostly been carelessness and failure to encrypt portable media. The healthcare industry seems to have an aversion to encrypting data, and that’s a problem. They have every excuse under the sun … but they’re empty excuses.”

McGraw notes that the expanded disclosure regulations may ultimately serve to accelerate adoption of more stringent security measures as organizations weigh the damage negative publicity surrounding breaches can have on their reputations. However, she says, “We’re in a period where it isn’t [changing] quickly enough in an environment where we should be trying to build patient trust.”

Peel, who concurs that the pace at which PHI protections are being adopted is too slow, says there is plenty of blame to spread around. She calls the improvements made to HIPAA modest at best, adding that “massive security and design defects remain. And patients’ long-standing rights to control who can see and use personal health records, from prescriptions to DNA to diagnoses, [were] not restored.”

Peel also points to the Consumer Privacy Bill of Rights, which focuses on delivering data privacy protections to win consumers’ trust in technology but specifically excludes healthcare data.

“Health data security is abysmal, and 80% of health data is not even encrypted. More large databases plus ever more widespread secondary and tertiary uses of data will equal a massive, exponential increase in data breaches,” says Peel, adding that not only is insufficient attention being paid to protecting PHI in an era of big data, but consumers are woefully uninformed about the risks of electronic systems.

“Government and industry have colluded to present a false sense that all is well,” she says. “Denial and magical thinking reign. There is just too much money to be made. Other factors? Who wants to miss out on the next big wave of innovation? Who wants to be seen as out of date?”

Moving Beyond the Traditional
Not everyone agrees that big data goes hand in hand with an increase in data breaches. In fact, Maxwell points out that there are two views on the issue. One is that big data poses a greater risk because a large, centralized repository creates a single point of failure. The other holds that big data provides “greater security because big data collectors can afford more expensive and comprehensive security controls. It’s a tradeoff between those two philosophies,” he says, adding that “pundits have been predicting that malicious [breaches] and hackers are the next big thing. They’ve been predicting that for several years but … it remains to be seen.”

There are several ways to keep breach predictions from coming to fruition without lessening the potential value the healthcare industry can realize from big data. It is possible to “maintain data so useful access is preserved while maintaining security. There is a lot of room for innovation,” Maxwell says.

The key, he says, is to follow industries such as banking and finance that long ago found the right balance between big data security and usability. The foundation is a multilevel approach. For example, most banks have security alarms on doors and windows, armed guards, time locks on safes, security cameras, and teller-activated entry-point mantraps and distress buttons.

“Similarly, all healthcare organizations have to have levels. It starts with policies and procedures to ensure appropriate access at the starting point. From there, it goes into technical controls like encryption with intelligent key management,” Maxwell says. “Encryption is often looked at as a silver bullet, but the key is how you manage those encryption keys that control how secure that encryption is.”
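
To make Maxwell’s key management point concrete, here is a minimal sketch of envelope encryption written for this article (it is not Allscripts code, and the record contents and key names are hypothetical). It uses the open source Python cryptography library: each record is encrypted with its own data key, and the data keys are in turn wrapped by a master key that can be stored and rotated separately from the data.

```python
# Minimal envelope-encryption sketch; record contents and key names are hypothetical.
# Requires the open source "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

# Master key-encryption key (KEK). In practice it would live in an HSM or key
# vault, stored and rotated separately from the encrypted records.
master_key = Fernet.generate_key()
kek = Fernet(master_key)

def encrypt_record(plaintext: bytes) -> dict:
    """Encrypt one record with a fresh data key, then wrap that key with the KEK."""
    data_key = Fernet.generate_key()           # per-record data encryption key
    ciphertext = Fernet(data_key).encrypt(plaintext)
    wrapped_key = kek.encrypt(data_key)        # only the wrapped key is stored
    return {"ciphertext": ciphertext, "wrapped_key": wrapped_key}

def decrypt_record(record: dict) -> bytes:
    """Unwrap the data key with the KEK, then decrypt the record."""
    data_key = kek.decrypt(record["wrapped_key"])
    return Fernet(data_key).decrypt(record["ciphertext"])

if __name__ == "__main__":
    stored = encrypt_record(b"patient: Jane Doe | dx: hypertension")
    print(decrypt_record(stored))
```

The design choice matters: a compromised data key exposes only one record, the plaintext master key never sits next to the stored data, and rotating the master key means rewrapping the data keys rather than re-encrypting every record.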

Healthcare organizations that wish to benefit from big data also must accept the fact that stellar security is the price of entry. They must be willing to invest in a minimum set of controls and be ready to adapt to the changing security and privacy landscape. It is no longer sufficient to deploy what Maxwell dubs the M&M model of security: a hard shell of perimeter security wrapped around a gooey interior protected only by soft user policies that don’t provide adequate controls. Firewalls and virtual private networks are no longer enough.

“We can’t afford to do business the way we have been,” Maxwell says. “Regardless of the big data discussion, what is clear is that connected systems are here to stay. We are more connected than we ever have been and that will only increase in the future. Because of that connectivity, layers are required. Securing the perimeter is no longer sufficient. The best defense is multiple layers, including firewalls, intrusion detection, and encryption.”

McGraw recommends avoiding the single point-of-failure risk by using a distributed network when big data is aggregated from multiple locations. She also advises keeping data in the least identifiable form that still preserves usability.

“It doesn’t necessarily have to be deidentified to have greater privacy associated with it. Data minimization is the key tool of privacy. [Entities should also] expose the data only to those people who actually have to use it. It [requires] access control and clear data retention policies that must be adhered to,” McGraw says. “It’s the three-legged stool: good data architecture, good technology, and good policies.”
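
As a rough illustration of how data minimization and a distributed design can work together (the sites, fields, and query below are hypothetical and are not drawn from McGraw or any particular network), the following Python sketch has each participating site answer an analytics question locally and return only aggregate counts, so identifiable, patient-level data never leave the site.

```python
# Distributed, minimized analytics query; site names, record fields, and the
# age-band query are hypothetical.
from collections import Counter

# Each site holds its own patient-level records; those rows never leave the site.
SITE_RECORDS = {
    "site_a": [{"age": 67, "dx": "diabetes"}, {"age": 54, "dx": "asthma"}],
    "site_b": [{"age": 72, "dx": "diabetes"}, {"age": 66, "dx": "diabetes"}],
}

def age_band(age):
    """Generalize an exact age into a coarse band (one form of data minimization)."""
    low = (age // 10) * 10
    return f"{low}-{low + 9}"

def local_aggregate(records, diagnosis):
    """Run the query inside the site and return only counts per age band."""
    return Counter(age_band(r["age"]) for r in records if r["dx"] == diagnosis)

def distributed_query(diagnosis):
    """The coordinator combines per-site aggregates; it never sees row-level data."""
    total = Counter()
    for records in SITE_RECORDS.values():
        total += local_aggregate(records, diagnosis)
    return total

if __name__ == "__main__":
    print(distributed_query("diabetes"))   # Counter({'60-69': 2, '70-79': 1})
```

In a real network, the per-site aggregates would typically also sit behind the access controls and retention policies McGraw describes, and very small counts might be suppressed to further reduce reidentification risk.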

Peel notes that deploying state-of-the-art, comprehensive, and meaningful data security and privacy protections for big data will go a long way toward establishing a higher level of consumer trust in how PHI is being used. Establishing that confidence will ensure that the entire system—patients included—will be able to realize the full value of big data.

“Those who embrace the rights of patients and individuals to control information about themselves will just ask for the data, explain why they want to use it, and keep their word to use it only in the ways individuals want. [They] will thrive, get better data … and be paid for providing real services to the public while upholding the law and medical ethics,” she says. “Each patient in America has the right to decide how to balance his privacy and whether his or her data should be used for any purpose. … In the realm of healthcare, individuals have very strong rights to decide for themselves where to draw the lines. If we do not provide real privacy exercised via control of data use, millions will continue to refuse treatment every year. The lack of privacy causes bad outcomes, not better care.”

— Elizabeth S. Roop is a Tampa, Florida-based freelance writer specializing in healthcare and HIT.


Does Big Data Require a HIPAA Redo?
Between the rise in patient data breaches and the momentum big data is gaining in healthcare, some question whether HIPAA has what it takes to keep pace with a rapidly evolving HIT environment. Does the law need a makeover?

For most, the answer is no. According to Jeremy Maxwell, an Allscripts application security architect and vice chair of the EHR Association Privacy & Security Workgroup, “The core framework of the HIPAA Security Rule is good and the right framework. It is a risk-based framework that scales with the size and type of organization.”

However, others suggest the law could be refreshed, with the addition of some sharper teeth. “HIPAA and HITECH already require very strong data security measures, risk assessments, etc, but industry just ignores them. HITECH added more enforcement, but it is far from enough,” says Deborah C. Peel, MD, founder of Patient Privacy Rights. “HIPAA burdens industry with conflicting regulations and delayed rulemaking while failing to protect patients’ rights to privacy and control over personal health information. HIPAA fails industry and the public.”

According to Deven McGraw, director of the Health Privacy Project, what is needed is not an overhaul of HIPAA but rather some guidance for healthcare organizations on how best to comply with what is a complex set of regulations.

In fact, she says the Office for Civil Rights, which enforces HIPAA, has indicated an interest in providing more concrete guidance for healthcare organizations and is currently taking suggestions on an informal basis on what that guidance would entail.

“The problem is not the law but interpretation of the law,” McGraw says. “Better guidance from the regulators is always helpful to give healthcare institutions comfort that if they follow a set of policies and best practices for secondary uses of big data they will be in compliance. What is ideally needed is guidance that provides safe harbor.”

— ESR


Mapping Big Data
Do you know where your data are? For the vast majority of US healthcare consumers, the answer is a resounding no.

“In the US, there is no chain of custody for any sensitive personal information and no way to control who gets it. There is no way to track or prevent the flow of health information to hidden data users and thieves,” says Deborah C. Peel, MD, founder of Patient Privacy Rights.

That is why Patient Privacy Rights has teamed up with Latanya Sweeney, PhD, founder and director of the Data Privacy Lab, part of Harvard University’s Institute for Quantitative Social Science, to develop theDataMap (www.thedatamap.org). The project is enlisting consumers, technology experts, whistleblowers, and geeks to act as “data detectives” to report and vet hidden health data-sharing arrangements. Peel says it’s a way to map hidden data flows and ultimately reveal how everyone’s personal health information is being used and exchanged by the various organizations that hold it.

“US patients can’t weigh the risks vs. benefits of using electronic health systems without knowing who has copies of personal health records, from prescription records to DNA to diagnoses,” she says. “We don’t know if it is sold as intimate health profiles, used for research or data analytics, for fraud, for extortion, or for ID or medical ID theft.”

The hope is that theDataMap, once launched, will reveal how data are being used and by whom so patients can regain control.

— ESR