To date, most of the progress in AI has come in relatively narrow decision tasks, such as evaluating a patient's risk of developing sepsis, predicting readmissions, or reading a chest X-ray for signs of pneumonia. Modern AI has proven capable of outperforming human experts on many of these narrowly defined tasks, but attempts to deploy the resulting models to front-line operations often fall flat. While much of the attention in the popular press has been on the AI technology itself, successful deployment requires starting from clearly defined goals and measures of success, a careful focus on workflow and process integration, and diligent monitoring of progress toward targeted outcomes (see figure below).
[Figure: How to put AI into practice]
The focus on workflow is critical
Deployed wisely, AI can augment human decision makers and care processes, improve the accuracy and consistency of decision making, reduce cognitive burden, and free up staff to focus on higher-level problems. But if workflows and user interfaces aren't adjusted to accommodate AI's unique strengths and limitations, the guidance may be ignored, or may even contribute to alert fatigue and potential patient harm.
When considering an AI-enabled process improvement or decision support tool, involve front-line decision makers in the process from the start. Their experience with the interaction between patients, providers, and technology will help determine the most appropriate way to deliver AI-fueled insights. The principles of incorporating AI into workflows have close parallels to the five rights of clinical decision support, where AI needs to deliver:
- The right information;
- To the right person;
- In the right format;
- Through the right channel; and
- At the right time in their workflow.
An illustration of the importance of workflow comes from the Mayo Clinic's efforts to improve clinical trial matching in its breast care clinics. Mayo Clinic partnered with IBM Watson Health to develop an automated matching technology that suggests clinical trials a patient may be eligible for based on the patient's clinical chart, including unstructured notes. Initial indications were that the matching technology worked well, suggesting appropriate trials with high accuracy. But when the technology was rolled out at the point of care, front-line staff declined to use it.
The Mayo Clinic team dug in and assessed the reasons for the poor adoption. Among other factors, they discovered that the technology was interrupting clinical staff at an awkward point in the patient encounter, surprising them with information they were not ready to act on. Process engineering experts were brought in to rework the integration and adapt the broader workflow, allowing clinicians to review the guidance before meeting with patients. Redeployed six months later, the same core matching technology quickly won widespread adoption and delivered significantly higher clinical trial enrollments, fulfilling the goal of the initiative.