
April 2018

Shopping for Data Analytics Solutions
By Susan Chapman
For The Record
Vol. 30 No. 4 P. 20

Industry experts offer ideas on the proper approach and advice on how to handle buyer's remorse.

For many health care organizations, data analytics is becoming a higher priority. The first step in realizing its benefits is the purchasing decision. Because such technology comes at a significant cost, organizations must weigh many factors before selecting the data analytics platform that's right for them.

Decisions, Decisions
Health care leaders in search of a data analytics platform have plenty of options, each offering a variety of functions. For instance, some platforms are for operational use, others focus on research, and then there are hybrid programs created from a range of development tools.

Which path to pursue depends on an organization's institutional objectives.

"If the objective is to be able to create regular reports for the purpose of evaluating, say, alarm traffic within a critical care unit, then that implies a platform that can create customized and tailored reports using a turnkey approach by clinical staff members," says John Zaleski, PhD, CAP, CPHIMS, chief analytics officer at Bernoulli Health. "Such users would not want a tool that requires tinkering, but rather one that can create a specific type of report and do so repeatedly. Staff members may wish to create customized reports by care unit or may want flexibility to interrogate data for specific purposes as well. On the other hand, if an organization has a staff of dedicated analysts, it may be able to use off-the-shelf development tools and build out their own customized capabilities from those."

"While often spoken of together, data and analytics are not a single platform. There are a lot of components involved, including but not limited to structural data warehouses, big data platforms, data governance, data quality, data visualization, reporting, and predictive analytics," says Christer Johnson, health care analytics advisory services leader at Ernst & Young. "More and more, people are starting to look for user-friendly data and analytics tools that allow business users to easily gain access to the insights they need to do their jobs more effectively and efficiently."

One of the biggest challenges health care organizations face is relying solely on the software package to address their data analytics needs. "We found that some of the real difficulties are more often on the people side, not the software side. Not just people to do analysis but people to clarify how the analytical insights will be used," Johnson says.

Greg Nelson, MMCi, CPHIMS, CEO and founder of ThotWave and author of The Analytics Lifecycle Toolkit, agrees. "Essentially, we have to view software and technology as part of the resources that help deliver a capability. If we have a strategy, then we understand the capabilities that are needed. Then we can choose a resource informed by the lineage of strategy," he explains.

To develop an effective strategy, health care organizations are advised to include perspectives from across their cultures, including payers and providers. "We work across the health care and life sciences spectrum. In our own organization, we gain experience from other sectors to help us design our software tools and adapt our platform," says Francis "FX" Campion, MD, chief medical officer at Ayasdi.

While technology is important and a significant enabler of the decision-making process, workflow considerations must be addressed. "We must view data and analytics in the context of the decision lifecycle. The traditional data pipeline ideally contains all of the data needed for analytics. That works in a perfect world if you use a historical perspective of data warehousing," Nelson says. "Most organizations have data warehousing and data strategy to help feed the needs of the organization. In analytics, we rely on those data but there is always more needed. In the insurance space, for example, there is a need to be able to use consumer data to help understand transitions in life to assess marketing strategies. There will likely always be data that were not considered when the data warehouse was designed. Advanced analytics are adaptive and need care and feeding. We have to take into consideration the workflows in order to make an informed decision."

Nelson believes flexibility is a critical factor when choosing a platform. "If you have an organization that is built on innovation and experimentation, the bane to that culture is a huge monolithic system that doesn't adapt to the way that people want to work. What we need is a platform or framework that balances each of the three factors at play: the data pipeline, the analytics lifecycle, and the decision lifecycle," he says.

Jonathan Symonds, chief marketing officer at Ayasdi, believes performance, usability, and standards are critical to selecting the appropriate data analytics platform, and that decision makers should weigh each of these factors when assessing their organization's requirements.

Symonds elaborates, "In particular, when considering AI [artificial intelligence], there needs to be a framework of five things. First, you have to have the ability to look at and understand relationships and patterns in that data without questions, which is a key capability because medical data are complex. Next, you have to have prediction. If you're able to combine discovery and prediction, you'll get even better prediction. Justifiability is also important. We need to do more than simply explain how we arrived at our decision. Next, the ability to go into action. You have to be able to encapsulate that intelligence in the first three steps into an application that is consumable by practitioners. Finally, the ability to learn over time. Applications that aren't getting smarter are getting less intelligent. If you can do all of those things, you can move the needle in this environment."

Yohan Vetteth, chief analytics officer at Stanford Health Care, believes the purchasing process for data analytics platforms is similar to that used for other HIT solutions. "I think there are two ways to look at analytics applications," he says. "The software tools that you use to develop analytic solutions, and analytic solutions that provide additional value in the way of leading practice visualizations, benchmarking of metrics across organizations, or advanced algorithms to help predict actions. With the first set of tools, price plays a more significant role because those tools are more generic. But as you look at more advanced analytics solutions, a specific kind of dashboard or benchmarking or predictive analytics, you start to look at the potential improvement in clinical or operational outcomes. These potential improvements can be evaluated by results at other organizations, published papers supporting the outcomes, or if they directly address a problem that we have been trying to solve."

Addressing Dissatisfaction
Despite organizations' best efforts to choose the appropriate data analytics platform, there can be times when the software does not fit the bill.

"I believe in the old adage, 'Try before you buy,'" Zaleski says. "Write expectations into the contract regarding capabilities, support, response time, expected reporting capability, and features. It's important to remember that every health system has business and clinical needs that make them different from other organizations. A vendor partner with deep knowledge regarding the unique aspects of your organization not only will help you avoid common mistakes but will also keep you focused on detailed integration points and workflows."

Zaleski adds that an effective vendor partner will provide guidance and ensure that all parties are accountable. "A positive and fruitful collaboration allows hospitals to establish benchmarks and ensure that configuration and interoperability are optimized and seamless," he says. "An excellent vendor also acts as a consultant and educator, making hospital staff comfortable with new technology and uncovering strategies for optimizing workflow."

The quality of the vendor partner is just as important as the product's virtues, Zaleski says. "The importance of evaluating the vendor as much as the product they are delivering cannot be stressed enough. Vendors that lack expertise, training capabilities, and clear steps toward go-live and beyond are critical red flags," he says.

Nelson offers a different perspective. "Depending on the vendor, there is likely not a 'try it before you buy it' scenario because the installation takes so long, and there is disruption to the system. Organizations take years reengineering their systems," he says.

Nelson has seen organizations try to develop workarounds when they have purchased an ill-fitting product. "Some organizations either do these workarounds or they just stay the course," he says. "We see those scenarios manifest in employee turnover if they aren't being utilized. A talent shortage in analytics then becomes a big problem."

Johnson says an inability to achieve the desired value from data analytics is rarely the result of buying the wrong software. Rather, he cites an industry talent shortage as the cause of most failures.

"Where I think companies get into problems is that they listen to a vendor that tells the organization the software will solve this or that business problem, but then it turns out to require a lot of work to prepare the data for analysis and then integrate the analytical insights into operational or strategic business processes," he says. "In analytics, the goal is to create insights that inform key business decisions in a timely manner. You need cross-functional teams with the necessary business, analytics, and technology skills and experience to design and develop analytical solutions that efficiently and effectively inform a company's key business decisions."

Symonds classifies dissatisfaction into two categories: the software itself and the buyer. If it's the former, he says, "You have to ask yourself why you're dissatisfied. Is it performance based? Is the software not doing what it's purported to be able to do? If that's the case, you probably have some contractual outs. The first thing you have to do is engage with your vendor to make sure that it understands what you're not happy about. There may be capabilities that help you get what you want that may not necessarily be in the documentation."

Dissatisfaction with the software may lie at the feet of the organizations themselves, some of which believe they're more sophisticated than they actually are, Symonds says. "That can happen for a number of different reasons—turnover of key technical assets, perhaps confidence. When you encounter those situations, it's difficult," he says. "It requires a moment of self-reflection: 'Maybe I'm not as far along the analytic pipeline as I thought I was.' There has to be some reckoning as to where the true blame lies. The solution can be services based or with training. Generally, the tools are stronger than the talent, which is particularly true in the medical field. You have to find the middle ground."

Vetteth believes most organizations are more comfortable purchasing analytics solutions based on an evaluation of the business case. "With a few strategic analytics partners, we have structured an at-risk component that depends on the value that the solution drives," he says.

Vetteth adds that a clinical or operational implementation strategy must be in place prior to committing to a vendor. "The key drivers in selecting analytics solutions are whether or not they have a proven approach to solving a problem that [the organization is] facing, proprietary algorithms, and … benchmarks embedded in the solution," he says.

Vetteth explains that by employing a thoughtful strategy around the selection of analytic solutions, the risk of dissatisfaction is greatly diminished. Nevertheless, there are hurdles to clear. "Some organizations may have a more dispersed selection criteria in selecting analytic point solutions that become difficult to manage over a lot of different areas. It is also difficult to be able to draw insights and value from the data that ends up being distributed across a lot of different point solutions," he says.

Staff Involvement
Zaleski points out that failed HIT implementations are not uncommon. "HIT projects either fall short of business and clinical goals or are completely abandoned at an astonishing rate," he says. "Overrun budgets and functionality problems are often cited as culprits. From the HIT perspective, however, the failure to include direct-care clinical staff—particularly nursing—in the evaluation, implementation, and training of new technology should not be overlooked."

End users not involved in the technology's selection, adoption, and implementation are unlikely to become owners of the product, Zaleski points out. "Achieving measurable progress in HIT adoption requires that hospitals identify and support internal champions in all relevant departments," he explains. "The formidable task list that comes with any technology implementation requires the input and expertise of a project team, which ideally should comprise leadership from myriad stakeholders, including IT networking, facilities, patient safety experts, educators, informatics nurses, laboratory staff, pharmacists, electrical engineers, biomedical engineers, quality improvement specialists, vendors, and direct-care clinical staff."

This wide-ranging team is ultimately responsible for every phase of deployment—evaluation, acquisition, rollout, implementation, and transition to live operations. "These staff members will determine the hospital's objectives and integration goals as well as vendor evaluations, business and clinical requirements, risk management concerns, patient safety goals, and costs," Zaleski says. "Designating a nursing champion—or superuser, for instance—at the outset allows other nurses and direct-care clinical staff to receive information, training, and support during all phases of adoption. These superusers would be working closely with the interdisciplinary team assembled for the project's implementation."

While this team approach does not guarantee success, it does increase the project's chances of being sustainable, Zaleski says. "Today's direct-care clinical staff have neither the desire nor the option to be passive consumers of HIT. The seamless integration of technology requires that direct-care clinical staff have influence in the design and testing of equipment and applications. Involving end-users in the early stages of system analysis and design specifications can lead to better adoption of new technology as well as identify how current technology can be adapted for greater user acceptance," he says.

Moving forward, Campion says the process of accessing and utilizing data is only going to get more complicated. "The data we have are substantially more than what we had 20 years ago, and they continue to expand," he says. "As an organization, you have to be confident that you have the right analytics partner and that partner is in it for the long haul."

Nelson says, "By adopting a framework, technologies become enablers of invention rather than their albatross."

— Susan Chapman is a Los Angeles-based freelance writer.