
February 2016

mHealth in the Mental Health Sector
By Dava Stewart
For The Record
Vol. 28 No. 2 P. 18

On the heels of the explosion in wellness tools come solutions for mental health. What do consumers and clinicians need to know?

Software, devices, and applications that track physical metrics, such as heart rate and number of steps taken, have been shown in various studies to help consumers make positive lifestyle changes. While this technology has become almost commonplace, its use in the behavioral health arena is considerably newer.

Software developers, research scientists, and clinicians are still exploring the possible uses of applications designed to help patients with mental health issues, while attorneys are assessing the myriad legal implications.

Any time technology is used in a new way in health care, there are questions to answer, problems to solve, and puzzle pieces to fit together. Beyond the ubiquitous privacy issues, there are quality control questions to settle, as well as potentially thorny ethical problems associated with the use of mobile technology in the realm of behavioral health.

Research and Quality Control
Along with the questions and problems, new applications also are surrounded by hype and excitement, and the apps and tools being developed for behavioral health are no different. "The research shows there is lots of reason for excitement. If these tools are developed well, they will help people. When people use these things, you can get a big impact," says Lisa Marsch, PhD, director of the Center for Technology and Behavioral Health at the Dartmouth Psychiatric Research Center.

Marsch, whose work focuses on using science to inform the development of technology targeting mobile health, says the key is making sure tools are well developed and based on sound scientific research. "One key thing for me as a research scientist is to make sure these things really work. We need to have a good sense of what is really going to be effective and have an impact, and then we have to test it," she says.

Jacques Habra, CEO of SelfEcho, the developer of Mobile Therapy, agrees that sound research based on scientifically rooted questions is paramount to providing an effective tool. "Somehow we have to find a way to extract psychometrics using scientifically validated methods such as EMA [ecological momentary assessment]. Phrasing and timing of questions must be considered," he says.

Others are less convinced that apps can be appropriate tools for mental health. Irene Gorodyansky, PsyD, who practices as a psychological assistant at Live Free Psychotherapy in San Francisco, says, "In terms of research, as long as the researcher conducting the study has a vested interest in a particular outcome, the study will be biased. This doesn't mean people should only use a product or service that is backed by sound research, but it is unethical for companies to make claims about efficacy based on biased studies."

Consumer vs Clinical Products
One complication that may not be immediately apparent is distinguishing between apps and tools designed for consumer use and those intended for clinical use. Consumer-facing products often find more support among clinicians than clinically oriented ones. There are legal considerations for both types of products as well.

Many consumer-facing apps are available on various platforms. Some, such as Happify and Personal Zen, are game-based and operate in a similar way to so-called brain-training apps such as Lumosity. Others, like Spire, track metrics such as breathing and skin temperature to alert users when they are exhibiting symptoms of stress.

"There is a whole series of things that happen before a person ends up at a psychiatrist's office. There are lots of points along the way. Direct-to-consumer apps can be helpful for people during those contemplation stages," says Jennie Byrne, MD, PhD, a board-certified adult psychiatrist who practices at Cognitive Psychiatry of Chapel Hill in North Carolina, who believes tools to manage mental health can be useful, just as activity-tracking tools can help people manage their weight and fitness.

Other practitioners are less inclined to think that any app can be useful for establishing good mental health. "I don't believe that long-term mental health can be achieved through an app by itself. An app is a blunt instrument that isn't built for the individual," Gorodyansky says.

On the other side of the coin, apps developed for clinicians to use as part of their practice require a much more stringent level of quality control. "Any time you are having contact with a patient, whether it be direct or through a remote tool, you have to be very careful," Byrne says, adding that psychiatrists are trained to be careful in person, but not in the use of third-party tools. That may change, but as of now, many clinicians agree that there are too many risks and too much at stake to fully embrace unproven tools.

The Legal Perspective
"What is true today might not be true tomorrow—or even really true today—it may just look like it," says Jeffery Drummond, an attorney and partner at the Dallas law firm Jackson Walker LLP. Legally, apps are medical devices, according to the FDA. The process for getting them approved depends in large part on how "deeply" they affect the health of end users. "If a device is closer to helping a patient keep their own records, it's not covered. If it's connected to a true medical device then it probably is covered," Drummond says. "There's a whole bunch of stuff in the middle that may connect with a medical record."

The FDA document "Mobile Medical Applications: Guidance for Industry and Food and Drug Administration Staff," issued in 2015, contains "nonbinding recommendations" which, Drummond says, "means they can change their mind anytime they want to." At the most basic level, the recommendation is that developers should decide if their devices are closer to calorie counters or pacemakers, and proceed accordingly. If the device lies somewhere in the middle, the developer should contact the FDA for guidance. The FDA will then issue a letter of clearance or require the device to be approved as any other medical device would be. "That's really all we have as far as regulation for those things," Drummond says.

Usability and Acceptance Issues
Regardless of the app's purpose, usability is key. A technology that is complex or difficult to use is unlikely to be effective. In the case of apps aimed at mental health, both patients and clinicians must be willing and able to use the app.

One issue for clinicians is that there is often too much information available. Physicians who see large numbers of patients don't have time to analyze volumes of information or interpret hundreds of data points. "Clinicians don't want this much data or the responsibility," Marsch says.

Even Byrne, who describes her practice as being "a little different in that we are relatively low volume," says that having more information is not always better. For example, the benefit of having a check-in through a third-party tool may not outweigh the risks of using it.

Cognizant of these concerns, software developers are taking steps to alleviate the overload, as well as to make the information accessible and useful. For example, the Mobile Therapy app provides clinicians with an overview. "It takes the clinician two to three minutes to easily see the best and worst," Habra says. "The advanced reports allow the clinician to look at a specific span of time." The tool also has the capacity to set up alerts based on a client's state of mind according to the app.
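To make the alerting idea concrete, here is a minimal sketch of how a threshold-based alert on self-reported mood scores could work. The field names, slider scale, and thresholds are hypothetical illustrations, not SelfEcho's actual design:

```python
# Hypothetical sketch of a clinician alert based on self-reported mood scores.
# Field names, scale, and thresholds are illustrative, not an actual product's design.

from dataclasses import dataclass
from statistics import mean

@dataclass
class CheckIn:
    timestamp: str   # ISO 8601 time of the slider response
    mood: int        # slider value, 0 (worst) to 100 (best)

def should_alert(checkins: list[CheckIn], threshold: int = 30, window: int = 3) -> bool:
    """Flag the client if the average of the last `window` check-ins
    falls below `threshold`."""
    recent = checkins[-window:]
    if len(recent) < window:
        return False  # not enough data to judge
    return mean(c.mood for c in recent) < threshold

checkins = [CheckIn("2016-02-01T09:00", 55),
            CheckIn("2016-02-01T15:00", 28),
            CheckIn("2016-02-01T21:00", 22),
            CheckIn("2016-02-02T09:00", 25)]
if should_alert(checkins):
    print("Alert: client's recent mood scores are below threshold")
```

In practice, a clinical tool would weigh far more signals than a single slider, but the shape of the logic, recent readings compared against a clinician-set threshold, is the same.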

Privacy and Mental Health
For patients, the issue of usability may be entwined with privacy concerns. Federal regulations do not differentiate between physical and mental health information, protecting both equally, but Habra says, "We have to go beyond HIPAA. In order to gain the trust of clinicians and hospitals, we must have the most stringent security imaginable."

Drummond says apps must comply with both HIPAA and the regulations set forth by the Federal Trade Commission. "The issue is whether the device is appropriately guarding privacy, and whether or not it protects the privacy of the individual," he says.

The Mobile Therapy app serves as an example of how tools designed to help clinicians and patients manage mental health may cross boundaries that other types of health-related apps don't need to cross. Mobile Therapy works on two levels: a relatively simple, question-and-answer level, where the patient is pinged at different times throughout the day with questions that can be answered with a slider; and passive information collection that uses GPS, data mining, and linguistic analysis.
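As an illustration of the first, question-and-answer level, the sketch below shows one way EMA-style prompts might be scheduled at semi-random times and answered with a slider. The question text, time windows, and record format are assumptions for the example, not Mobile Therapy's documented behavior:

```python
# Illustrative EMA-style prompt scheduler: ping the patient at a random time
# within set daily windows and record a 0-100 slider response. The windows,
# question, and record format are hypothetical, not Mobile Therapy's design.

import random
from datetime import datetime, time, timedelta

PROMPT_WINDOWS = [(time(9, 0), time(12, 0)),   # morning
                  (time(13, 0), time(17, 0)),  # afternoon
                  (time(18, 0), time(21, 0))]  # evening

def schedule_prompts(day: datetime) -> list[datetime]:
    """Pick one random prompt time inside each window, so pings are
    unpredictable (a common EMA practice to reduce anticipation bias)."""
    prompts = []
    for start, end in PROMPT_WINDOWS:
        start_dt = day.replace(hour=start.hour, minute=start.minute)
        span = (end.hour * 60 + end.minute) - (start.hour * 60 + start.minute)
        prompts.append(start_dt + timedelta(minutes=random.randrange(span)))
    return prompts

def record_response(prompt_time: datetime, slider_value: int) -> dict:
    """Store one answer to a question such as 'How anxious do you feel?'"""
    assert 0 <= slider_value <= 100
    return {"asked_at": prompt_time.isoformat(), "anxiety": slider_value}

prompts = schedule_prompts(datetime(2016, 2, 1))
print("Ping times:", prompts)
print(record_response(prompts[0], 42))  # patient answered the first ping with 42
```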

The app mines all of the patient's mobile data, including social media posts, e-mails, and GPS information, in 12-hour increments. The program then analyzes the patient's language to determine his or her state of mind. "The clinician gets all the data and can really understand the impact of who they [their patients] are with, where they are, and help them optimize their life and schedule," Habra says.
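A heavily simplified sketch of that kind of windowed linguistic analysis appears below. The 12-hour bucketing follows the description above, but the toy word lists and scoring are invented; a real product would rely on validated psycholinguistic methods rather than a hand-built lexicon:

```python
# Toy sketch of windowed linguistic analysis: bucket a patient's texts into
# 12-hour windows and score each window with a tiny sentiment lexicon.
# The lexicon and scoring are invented, purely for illustration.

from datetime import datetime

POSITIVE = {"happy", "calm", "great", "hopeful"}
NEGATIVE = {"sad", "anxious", "alone", "hopeless"}

def window_key(ts: datetime) -> str:
    """Group timestamps into 12-hour increments (AM/PM halves of each day)."""
    half = "AM" if ts.hour < 12 else "PM"
    return f"{ts.date()} {half}"

def score_windows(messages: list[tuple[datetime, str]]) -> dict[str, int]:
    """Sum +1 for positive words and -1 for negative words per window."""
    scores: dict[str, int] = {}
    for ts, text in messages:
        key = window_key(ts)
        words = text.lower().split()
        delta = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
        scores[key] = scores.get(key, 0) + delta
    return scores

msgs = [(datetime(2016, 2, 1, 8), "feeling anxious and alone today"),
        (datetime(2016, 2, 1, 19), "dinner with friends, feeling hopeful")]
print(score_windows(msgs))  # {'2016-02-01 AM': -2, '2016-02-01 PM': 1}
```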

How willing patients are to have their data examined in this manner may come down to a dynamic familiar from the medical doctor's office, according to Habra. He points out that most healthy people would be unwilling to undress in front of others, while those who are ill allow physicians and nurses to see them undressed in the hopes of getting well. Similarly, those seeking help with mental health issues may not object to having their mobile data examined, Habra says, adding that the app does not store any of the information.

"Some people are really willing to have this stuff captured about them because it can be put into Big Data models and nothing personal is identified," Marsch says. In some cases, she says capturing these data can be beneficial to the user. "Maybe an older person who doesn't want to go to an assisted-living facility would want to have their data captured to show that they are capable of continuing to live alone," Marsch says.

According to Drummond, it's important that patients understand exactly how their data will be collected and communicated. Taking Habra's analogy a step further, he asks, "Would it be OK if your doctor installed cameras in your bedroom? You are consenting to a different level of sharing your private information."

One way developers handle privacy concerns is to create end-user agreements that spell out how the app collects, communicates, and stores data. Drummond says patients can share their private health information as they see fit. "You could put your medical information on a billboard. People go on shows like The Jerry Springer Show and Dr. Phil and talk about very personal stuff that you couldn't dig up on somebody and talk about without their consent," he notes.

Clinicians have legal concerns as well. Byrne says many of those in the industry, including software developers and vendors, don't always understand the life-and-death consequences of the work performed by mental health professionals. "We are liable for everything," she says. "If the software says someone is happy, but they are suicidal, we are liable."

Although this area of liability has not been fully developed, Drummond says it's possible that a software developer could be held liable in such a situation. The clinician may have some protection under the learned intermediary doctrine. "Let's assume the app has gotten FDA clearance, and that they actually went through all the steps and got the full-blown clearance. Is there a difference between the clinician using the app and a cardiologist implanting a faulty pacemaker?" Drummond asks. "Of course anyone can sue anyone for anything, but usually the physician has some protection if the device has FDA clearance."

The Future
As technology continues to play a large role in health care, the various legal issues are likely to become more settled. For Marsch, it's imperative that the tools, both those designed for consumer use and those favored by clinicians, are properly vetted: "We need to have a good sense of what is really going to be effective and have an impact, then we have to test it."

Drummond says it's important that any new tools, including apps, are "the icing and not the cake," and that patients are aware that communicating directly with their physicians is typically the best course of action. Practicing clinicians such as Byrne and Gorodyansky recommend a cautious approach to introducing the latest technology to patients as the market becomes saturated with new products.

— Dava Stewart is a freelance writer based in Tennessee.