The combination of big data, machine learning and other quantitative approaches has promised to bring cost-effective and personalized healthcare to the broadest segment of the human population.
In the words of Marc Andreessen, “Software is eating the world,” and despite being one of the slowest industries to adapt to new technologies, healthcare is no exception.
Software is the medium through which scientific discoveries, predictive models, and algorithms are translated into practice, enabling data-driven decisions at the point of care. However, the adoption of healthcare software and technology, especially tools that leverage predictive models and algorithms to generate patient insights and guide care, has been excruciatingly slow. For example, in the realm of precision dosing, although the academic community has accepted for nearly two decades that model-based dosing is superior to standard, one-size-fits-all dosing guidelines, very few institutions have adopted, or are even aware of, this approach to individualizing treatment.
For these powerful approaches to become mainstream, it is important to understand the obstacles that stand in the way. There are several limitations that, if addressed, would significantly increase the use of predictive algorithms via clinical decision support systems (CDSS) in healthcare, moving us toward broadly practiced precision medicine:
User interface / user experience (UI/UX)
Healthcare software has historically been plagued by poor UI/UX for several reasons, including the disconnect between end users and software developers, complex clinical workflows, and the fact that Electronic Health Record (EHR) software was originally designed to manage and submit insurance claims. Because usability was not prioritized during the early stages of EHR development, these legacy EHR systems inadvertently set a precedent of poorly designed, non-intuitive user interfaces for healthcare software that has endured for years.
Integration into clinical workflow
Healthcare practitioners perceive that diverting their attention to a different screen in order to use a tool is too time-consuming and burdensome. This makes intuitive sense. Imagine you were writing a research article on your laptop in Microsoft Word but had to turn to a different computer screen to use a dictionary or thesaurus. The chances that you would actually use the thesaurus are greatly diminished simply because of how onerous it is to split your attention across two devices. Similarly, in healthcare we quickly discovered that clinicians and clinical pharmacists do not want to manually enter data into a third-party application, or even reach for a tablet, in order to perform complex dosing calculations. This ultimately makes sense: the clinician's job is to take care of patients, not to wrangle external software. Integrating software into existing or improved EHR systems will therefore be one of the primary drivers of software adoption in healthcare.
The need to validate and improve underlying models and algorithms
Despite the fact that predictive models and algorithms are usually externally validated before clinical deployment, there is a significant need to put in place a mechanism to continuously validate and improve the underlying models. A mechanism that continues to improve the accuracy of predictions increases trust from the end user, and ultimately leads to an improvement of downstream patient outcomes.
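One way such a mechanism might look in practice is a simple feedback loop that logs each prediction alongside the outcome later observed in the patient, tracks a rolling error metric, and flags the model for recalibration when error drifts past a threshold. The sketch below is illustrative only; the function names, the choice of MAPE as the metric, and the threshold are all assumptions, not part of any particular product.

```python
# Minimal sketch of continuous model validation: compare logged predictions
# against subsequently observed outcomes (e.g., measured drug concentrations)
# and flag the model when the rolling error exceeds a threshold.
# All names and threshold values here are hypothetical.

from statistics import mean

def mean_absolute_percentage_error(predicted, observed):
    """MAPE (%) across paired prediction/observation lists."""
    return mean(abs(p - o) / o for p, o in zip(predicted, observed)) * 100

def needs_recalibration(predicted, observed, threshold_pct=20.0):
    """True when rolling prediction error warrants model review."""
    return mean_absolute_percentage_error(predicted, observed) > threshold_pct

# Example: predicted vs. observed trough concentrations (mg/L)
predicted = [12.0, 8.5, 15.0]
observed = [10.0, 9.0, 14.0]
print(round(mean_absolute_percentage_error(predicted, observed), 1))  # 10.9
print(needs_recalibration(predicted, observed))                       # False
```

Because each clinical use of the tool contributes a new prediction/observation pair, accuracy monitoring becomes a byproduct of routine care rather than a separate study.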
Not all data is always available
If the data required as inputs to the software are not available in the medical record database, the software either cannot be used or critical assumptions about the missing data need to be made. A prime example is genetic information, which typically does not exist in an EHR.
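When a fallback is acceptable, one common pattern is to substitute population-typical defaults for the missing fields and surface those assumptions to the clinician rather than hide them. The sketch below illustrates the idea; the covariate names and default values are hypothetical placeholders, not values from any real dosing model.

```python
# Minimal sketch of handling model inputs missing from the EHR: fall back to
# population defaults and record which fields were assumed, so the clinician
# can see the substitution. Field names and defaults are illustrative only.

POPULATION_DEFAULTS = {
    "cyp2c9_genotype": "*1/*1",   # assume wild-type when genotype is absent
    "weight_kg": 70.0,            # population-typical adult weight
}

def resolve_inputs(ehr_record):
    """Fill missing covariates with defaults, tracking every assumption made."""
    inputs, assumed = {}, []
    for field, default in POPULATION_DEFAULTS.items():
        value = ehr_record.get(field)
        if value is None:
            value = default
            assumed.append(field)  # surfaced to the user, not silently applied
        inputs[field] = value
    return inputs, assumed

record = {"weight_kg": 82.0}      # genotype not present in this EHR record
inputs, assumed = resolve_inputs(record)
print(inputs)   # {'cyp2c9_genotype': '*1/*1', 'weight_kg': 82.0}
print(assumed)  # ['cyp2c9_genotype']
```

Making the assumption explicit matters as much as the fallback itself: a prediction built on a default genotype should be presented with less confidence than one built on measured data.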
Lack of cost-benefit analysis / ROI
We need to show, on an economic level, that machine learning-based software tools create financial benefit for one or more stakeholders within the healthcare system. This may be the most important factor in the long run for achieving widespread adoption. Aligning clinical outcome improvement with gains in ROI will lead to buy-in from key decision makers within the health system. If the economic evidence becomes compelling enough, payers and accountable care organizations (ACOs) may also be interested in reimbursing providers for using the product.
Education
The value of such tools, along with the underlying approaches, needs to be communicated in a manner the end user can understand, whether that is an advanced clinical user with specialized knowledge or a typical user without domain expertise. At the same time, it is important to strike a fine balance: explaining terminology and concepts simply without stripping a term of its contextual meaning. We need to translate esoteric terminology in a way that makes sense. For example, instead of the term 'pharmacokinetics', we can provide contextual meaning and say something like 'drug exposure'. Education extends beyond simplifying terminology to teaching the underlying principles and product value to the full spectrum of users. For example, we have taken steps to introduce our product into the educational curriculum of pharmacy schools, allowing future end users to learn advanced pharmacological concepts through use of the platform.
Regulatory uncertainty
The regulatory climate for clinical decision support tools is still evolving. This uncertainty creates hesitation within health systems, adding another friction point for widespread use. Healthcare innovators and regulatory agencies will need to work together to design a regulatory process suited to agile software development, as the current FDA Precertification pilot program aims to accomplish.
Addressing these barriers
Focusing on two or three of the aforementioned roadblocks will minimize the effort required to overcome the remaining ones. Achieving an optimal user experience and scalable integration capabilities will increase user adoption to the point where validating and improving the underlying models and algorithms can become automated, since data collection will be far easier.
Integrating a predictive tool into existing EHR systems will create a data network effect, which in turn will improve the predictive performance of the tool each time it is used. We can also readily expand the subset of variables pulled from the EHR, allowing researchers and clinical investigators to uncover the influence of other patient factors such as genetic polymorphisms on a particular patient outcome (e.g. drug response).
A successful integration with the EHR allows for efficient cost-benefit analysis of the software. For example, we can start to look at the influence of the tool on diagnosis codes, and tie that information to concrete economic gains. A clear value proposition can then be made to key purchasing decision makers to implement model-based approaches as well as other CDSS tools for precision medicine at the point of care.
Addressing these limitations will require a concerted effort from the greater community of clinicians, entrepreneurs, healthcare technology specialists, software developers, designers, scientists, and regulatory agencies. Although we have a ways to go, I am confident that we are heading in the right direction: building toward a future in which a learning health system that enables precision medicine is integrated nationally as part of routine care.