The Rise of the Chatbots
By Keith Loria
For The Record
Vol. 31 No. 2 P. 18
A look at how these conversational agents work in a health care setting.
One of the most influential pieces of technology in recent years has been the chatbot, a software application that uses artificial intelligence (AI) to mimic human conversation through spoken or written exchanges.
The chatbot goes by many different names—smartbot, talkbot, chatterbot, bot, interactive agent, conversational interface—but regardless of what it’s called, the purpose is the same: to simulate a conversation or interaction with a real person. It’s an attribute that could be a game changer in the health care industry.
A Frost & Sullivan report says that by 2021, 85% of all consumer interactions with enterprises will be automated—and health care is expected to play a major role in that development.
There are several service categories in health care where this conversational form of AI can be useful. Jon Elion, MD, founder and chief innovation officer at ChartWise Medical Systems, cites patient engagement, where chatbots can help with ongoing monitoring.
“This might involve prompting and verifying that medications are being taken, checking daily weights, verifying that symptoms are improving in response to a new medication, etc.,” he says. “The second [category] is more clerical, helping, for example, to make it easier and less intimidating to make appointments. We all get frustrated with waiting on hold as call centers get busy. With a chatbot, wait times can be reduced or eliminated and the interaction can be far more inviting than repeated requests to press buttons on the handset.”
A third service area—and probably the technology’s most challenging application—is triage: assessing a patient’s symptoms to determine whether the immediate problem should be handled by a phone call, an office visit, or even a trip to the emergency department.
“This does not need to involve actual diagnosis, but since triage is difficult even for a highly trained human professional, this remains a young and growing application of the technology,” Elion says.
Kyle Cooksey, president of CareThrough, creators of the LifeLink text-enabled chatbot, says chatbots have emerged as a useful tool to engage with patients. For example, he says a text-enabled chatbot can reduce wait times, preappointment variability, and unprepared patients. For post-op care, it can improve patient care plan literacy and remote monitoring.
“A lot of times hospitals will give a patient a piece of paper that has a lot of really important information that most of the time ends up in the trash, and so the chatbot automates that discharge disposition and helps drive outcomes,” Cooksey says.
According to Cooksey, 92% of patients do not make it past the first page of a hospital website. One of the goals of leveraging LifeLink is to improve conversion rates with simple, conversational bots that make it easier for patients to find physicians and make appointments.
Additionally, just 26% of patients can explain the risks and benefits of upcoming operations. In that regard, the use of chatbots can improve patient comprehension, maximize on-time operating room starts, and decrease patient anxiety.
Northwell Health, the largest integrated health system in New York State, recently launched Northwell Health Chats (powered by Conversa Health’s conversational AI platform), which optimizes patient engagement, care team satisfaction, and clinical and financial outcomes through targeted patient outreach and coordinated care management.
Vish Anantraman, MD, Northwell’s chief innovation architect, says with technology such as Alexa invading homes, people are becoming more accustomed to “conversing” with voice interaction platforms. As a result, Northwell considered experimenting with the technology from both a patient-centric and provider perspective.
Anantraman says the hope is that chatbots can reduce physician and nurse burnout.
“It’s a well-known fact that doctors and nurses are spending a lot of time—in some cases, almost 50% of their day—in front of a computer, and they are getting to the point where they are spending more of their day with computers than patients,” he says. “Our inspiration for doing this was that we needed to start thinking about how we can save some of the time of the doctor or nurse and make it much easier for them to do what they do best, which is interact with patients and spend more time at the patient’s bedside.”
Chris Edwards, chief marketing and experience officer for Conversa, says automated patient engagement is becoming a significant driver in health care.
“We know providers and patients need better communication around chronic condition management, postdischarge, medication adherence, patient education, even pre- and postsurgery communication,” he says. “We are not trying to be the Terminator machine, but more of the Iron Man machine, coming alongside care teams, and helping and augmenting care vs taking doctors’ jobs.”
Thus far, Northwell Health has reported positive outcomes using Health Chats. According to Edwards, the health system has realized 97% patient satisfaction and a significant reduction in postacute care expenses across several of its hospitals. Based on the positive results, Northwell is rolling out these personalized, automated conversations to other service lines, including radiation for head and neck oncology patients, marketing’s health risk assessments (HRAs), community health for colonoscopy prep, emergency medicine for postdischarge, and general surgery for pre- and postsurgery.
The Patient/Physician Relationship
When examining the effect chatbots have on the patient-physician relationship, one of Elion’s favorite expressions comes to mind: “high touch supported by high tech.”
“It would be a mistake to think that a chatbot could ever replace all aspects of a patient-physician interaction or relationship. But it might be able to relieve the provider from some of the more mundane aspects of data gathering, follow-up, and education, and allow a more meaningful interaction in the time allotted for an appointment,” Elion says. “There is always the risk that the technology gets in the way of the relationship rather than augmenting it. The burden is on the providers to ensure that technology is used properly and to be aware that one size does not fit all, meaning that it may not be suitable for all patients.”
Cooksey says chatbots can reduce the amount of administrative and clerical work taking place in the care setting.
“In the current environment, a nurse practitioner would complete the HRA with the patient, which takes a lot of unnecessary time, and then the physician would review it with the patient in person, do a lot of clerical work, and then get to where they would start with LifeLink on engagement to begin with,” he says. “So we are helping them perform top-of-license work so they can move away from the computer and move back into the patient’s presence.”
Northwell is investigating ways it can push information through a chatbot to allow physicians to interactively ask questions. For instance, a physician who wanted to check on a lab result or explain to a patient what is happening with their test could make a specific request such as “Show me the last blood glucose.”
“They wouldn’t need to log onto a system or turn their back to the patient or be on their phone,” Anantraman says. “That’s what we are working on. It improves patient/physician communication.”
Patients and nurses also can benefit from the presence of chatbots. For example, instead of hospitalized patients buzzing the nurse to adjust the room temperature, a verbal command of “I’m cold” can do the trick. “The chatbot would connect to the nurse who can instantly act on the information, like bring a blanket, and it would come much faster,” Anantraman says.
Conversa helps facilitate patient profile-driven and clinically based conversations, Edwards says, noting that follow-up occurs to gauge patient satisfaction. For example, a patient may be asked, “Were these chats helpful to you today in helping you manage your care?”
“What’s great about what we’re seeing is having these more clinically intelligent conversations that are data driven, integrating insights from patient data that are actually helping provide value to these patients,” Edwards says. “They are providing value to the health systems that are acting and managing the patients.”
Impact on Medical Records
In today’s health care environment, patient data are no longer compiled strictly from face-to-face encounters with physicians. Telehealth visits, e-mails, and texts can all contain fodder for a patient’s medical record. Where does the information gathered from chatbots fit into this equation?
Elion says it’s possible to add the transcript of a chatbot interaction verbatim into the chart, adding that any documentation must be reviewed by the provider.
Cooksey says chatbot conversations can be linked to an EHR. “Instead of waiting on test results, which generally is a big source of stress and tension, [patients] can see a trigger from the EHR that would say to the patient, ‘Hey, your test results are ready and a care team provider will be discussing them with you shortly,’” he says. “It kind of automates that approach to move as a concierge kind of experience and it helps them understand what’s going on even when no one is telling them.”
Natural language processing (NLP) plays a key role in determining whether chatbots succeed in a health care setting. The ability to accurately translate chatbot exchanges into clinical terms hinges on the technology.
In its simplest form, NLP finds parts of speech (POS), trying to identify nouns, verbs, negations, etc. This is not an exact science, Elion says.
For example, in one of his earliest experiments with NLP, Elion fed the computer a sentence about a man who presented to the emergency department with shortness of breath. The NLP translated that to be the patient had “shortness” (lack of height) and he was breathing.
“It did not yet know the medical expression ‘shortness of breath’ or that this was referred to as dyspnea,” Elion says. “The next step in NLP is to follow the POS recognition with training on medical nomenclature. There are additional steps to look at statistical analysis of words and phrases, as well as using context to remove ambiguities, but POS and nomenclature training are the strongest parts of the process.”
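The two-step process Elion describes—POS recognition followed by training on medical nomenclature—can be sketched as a second pass that maps multiword medical phrases to standard terms, so that “shortness of breath” is recognized as the single concept dyspnea rather than the noun “shortness” plus a verb. The following is a minimal, hypothetical Python illustration; the phrase table is invented for the example and is not an actual clinical vocabulary:

```python
# Illustrative phrase table: multiword medical expressions mapped
# to standard nomenclature (not a real clinical vocabulary).
NOMENCLATURE = {
    ("shortness", "of", "breath"): "dyspnea",
    ("chest", "pain"): "chest pain",
}

def map_nomenclature(tokens):
    """Greedy longest-match replacement of known medical phrases."""
    result, i = [], 0
    while i < len(tokens):
        match = None
        # Try the longest phrases first so multiword terms win
        # over their individual words.
        for phrase, concept in sorted(NOMENCLATURE.items(),
                                      key=lambda kv: -len(kv[0])):
            if tuple(tokens[i:i + len(phrase)]) == phrase:
                match = (concept, len(phrase))
                break
        if match:
            result.append(match[0])
            i += match[1]
        else:
            result.append(tokens[i])
            i += 1
    return result

tokens = "patient presented with shortness of breath".split()
print(map_nomenclature(tokens))
# → ['patient', 'presented', 'with', 'dyspnea']
```

Production NLP pipelines use statistical models and far richer vocabularies (as Elion notes, statistical analysis and context resolve ambiguities), but the lookup step above captures the basic idea of nomenclature training layered on top of POS recognition.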
Depending on how numerators and denominators are used, quantifying NLP’s accuracy and precision can be elusive. Generally speaking, experts agree that NLP is 80% to 90% accurate. Therefore, any documentation generated via NLP must be reviewed.
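The point about numerators and denominators can be made concrete with a toy confusion matrix: the same extraction run yields different percentages depending on whether easy true negatives are counted (accuracy) or only the extracted and expected terms are compared (precision and recall). The counts below are invented purely for illustration:

```python
# Toy counts for a hypothetical NLP term extractor
# (numbers invented for illustration only).
tp = 85   # terms correctly extracted
fp = 10   # terms extracted but wrong
fn = 15   # terms missed
tn = 890  # non-terms correctly ignored

# Accuracy counts every decision, including the many easy negatives.
accuracy = (tp + tn) / (tp + fp + fn + tn)
# Precision: of what was extracted, how much was right.
precision = tp / (tp + fp)
# Recall: of what was actually there, how much was found.
recall = tp / (tp + fn)

print(f"accuracy={accuracy:.2%} precision={precision:.2%} recall={recall:.2%}")
# → accuracy=97.50% precision=89.47% recall=85.00%
```

A vendor quoting the first number and a reviewer measuring the last would disagree by more than 12 points on the same system, which is why any NLP-generated documentation still needs human review.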
Still, Elion says, a credible conversation can be had even with some misinterpretations. However, he points out the importance of reviewing the interaction prior to the transcript entering the chart.
It’s not the conversation itself that must be HIPAA compliant, but rather the technology used to transmit and record the information, Elion explains.
“If an app is used on a smartphone, for example, the transmission—also called ‘data in motion’—must be encrypted,” he says. “The security of the public networks cannot be assured; therefore, all by itself a chatbot conversation may not be HIPAA compliant.”
Citing the regulatory demands and security checkpoints that must be navigated before a chatbot can launch, Cooksey believes the technology is HIPAA compliant.
“When the chatbot texts itself to the patient, the patient clicks on the text, and it launches in the browser. The patient still has to go through those compliance factors for a security checkpoint, regardless of what engagement they’re using,” he says. “But even when it’s an integrated model into the EHR, no protected health information is ever being shared, so results would never be delivered via the chatbot. You would never go to get your discharge summary or your abstract for your release of information. None of that is ever communicated via the chatbot.”
With major health systems such as Atrium, Centura Health, and Ochsner as partners, Conversa takes security seriously. “We need to go through several different verifications. Part of the process in our deployments and in our infrastructure and our technology is that it ensures compliance with the HIPAA security and privacy rules,” Edwards says. “As you can imagine, health care systems wouldn’t be using us if we weren’t.”
As a fairly new technology, especially in the health care space, chatbots have several issues to overcome.
For voice-based systems, Elion says the most obvious challenge is dealing with voice recognition issues, especially when it comes to accents. He also notes that, when it comes to technology, there are still Luddites out there.
“Even in the character-based typed world, not everyone is proficient with a keyboard or comfortable with the technology,” he says. “We need to realize that we cannot have one size fits all, and be able to adjust and adapt to patients’ comfort levels.”
Anantraman notes that chatbots may not be able to supply physicians with all the information they require. An encounter may reach a point where it becomes complicated enough that the chatbot no longer provides sufficient value.
While chatbots appear to have a future in health care, some believe the technology may not be as influential as its proponents proclaim. Plus, questions are being raised about chatbots’ effectiveness.
A recent Forbes article examined British app maker Babylon, which introduced an artificially intelligent chatbot that promised to give diagnostic advice on common ailments. However, company physicians are claiming the advice offered is often faulty.
No doubt, the entire concept of chatbots needs to be carefully monitored and realistic expectations set.
“There is a classic analysis called ‘hype cycle’ that is likely to apply here, where there is an [innovation trigger] followed by a peak of inflated expectations—probably where we are now—then the trough of disillusionment, the slope of enlightenment—as realistic expectations are set and [technology] adjusted to catch up—and finally the plateau of productivity,” Elion says. “Chatbots are no panacea, and we need to be careful not to set magical or unrealistic expectations.”
— Keith Loria is a freelance writer based in Oakton, Virginia.