October 5, 2021

Mayo Clinic's 8 lessons on using data to improve surgical outcomes

Daily Briefing

    Writing for Harvard Business Review, Mohamad Bydon and Fred Meyer, both professors of neurosurgery at Mayo Clinic, explain how data can improve surgical outcomes and the eight steps they took to create a successful performance measurement program in their own department.


    How Mayo Clinic leverages clinical registries

    In 2016, leaders at Mayo Clinic's Department of Neurologic Surgery recognized a need to accurately collect and measure the quality of surgeries throughout the department's six campuses across the country, Bydon and Meyer write.

    To do this, they turned to the department's clinical registry, which collects information from each patient encounter—including comorbidities, surgical procedures, post-operative complications, and patient-reported outcomes. According to Bydon and Meyer, data from clinical registries can be used to provide a comprehensive overview of a clinic's performance in terms of cost, quality, and volume for different procedures.

    Using data from clinical registries, Bydon and Meyer helped develop a dashboard to help improve the neurosurgery department's clinical and financial performance. A pilot program testing the dashboard was completed in 2019, and Bydon and Meyer write that it helped the department identify areas of improvement, negotiate more favorable contracts with payers, and improve discussions and decision-making between patients and physicians.

    Based on the success of the pilot, other departments at Mayo Clinic are launching similar programs. To help guide other health care organizations, Bydon and Meyer outline the eight steps they took when developing their department's program.

    1. Identify the need for a performance measurement program and potential barriers to success.

    Bydon and Meyer write that they began the project by identifying a need to measure performance in their highly specialized neurosurgery department.

    However, the task was "daunting" due to Mayo Clinic's complex and highly integrated structure. Bydon and Meyer point out that geographic distance complicated the project's development: the neurosurgery department spans six campuses and 51 neurosurgeons across Florida, Arizona, and several Midwestern states.

    2. Get widespread support for the project.

    According to Bydon and Meyer, widespread support of the project—not only at the departmental level, but also at an institutional level—is necessary for it to be successful.

    Bydon and Meyer say they were able to gain support by holding educational sessions at each of the department's six locations, where they discussed the need to gather data from clinical registries and have user-friendly solutions to measure performance. They emphasized that the project would be essential to ensuring high-quality, cost-effective care and keeping the organization competitive.

    3. Secure adequate funding.

    A significant investment was necessary to effectively transform clinical registry data into a dashboard for business operations decisions and payer contracts. In total, the project cost $1.5 million over five years, to cover time, personnel, and active abstraction.

    4. Select important variables of interest.

    As they developed their dashboard, Bydon and Meyer decided on specific variables that would allow them to evaluate performance in several areas, including:

    • Operations, which were represented by case volumes, outcomes, and relative value units (RVUs), a measure used by commercial insurers, Medicare, and Medicaid to determine reimbursement
    • Patients, which were represented by post-operative readmissions, surgical complications, mortality, duration of hospital stay, returns to the operating room, and discharge disposition for all surgical patients
    • Financial outcomes, which were represented by total hospital cost and charges for all operative surgical admissions

    The information on the dashboard provides a "balanced scorecard," Bydon and Meyer write, which allows providers to quickly access insights about the department's quality and cost-effectiveness.
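    To make the three variable groups concrete, here is a minimal sketch of how registry records could be rolled up into scorecard metrics. The field names and values are illustrative assumptions, not Mayo Clinic's actual registry schema:

```python
from statistics import mean

# Hypothetical registry records; field names are illustrative only.
cases = [
    {"site": "Rochester", "rvus": 45.2, "readmitted": False, "los_days": 3, "cost": 28000},
    {"site": "Rochester", "rvus": 60.1, "readmitted": True,  "los_days": 7, "cost": 51000},
    {"site": "Phoenix",   "rvus": 38.7, "readmitted": False, "los_days": 2, "cost": 24000},
]

def scorecard(records):
    """Summarize the three variable groups: operations, patients, financial."""
    n = len(records)
    return {
        "case_volume": n,                                               # operations
        "total_rvus": sum(r["rvus"] for r in records),                  # operations
        "readmission_rate": sum(r["readmitted"] for r in records) / n,  # patients
        "avg_length_of_stay": mean(r["los_days"] for r in records),     # patients
        "avg_cost": mean(r["cost"] for r in records),                   # financial
    }

print(scorecard(cases))
```

    The same aggregation would typically be repeated per campus, surgeon, or procedure to support the comparisons the dashboard displays.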

    5. Form a team with talent in all essential areas.

    The project team included experts from a wide range of areas, including clinical, administrative, financial, IT, quality reporting, supply chain, and practice improvement.

    In addition, Bydon and Meyer write that it was important to have both "data experts" and "data novices" on the team so the end product would be usable across skill levels.

    The team spent months "build[ing] bridges of understanding and common purpose" among all the different specialists so everyone could work together and understand each other's needs and processes.

    6. Continue to improve and enhance the dashboard through feedback.

    According to Bydon and Meyer, the dashboard needed data that met three requirements:

    • It could be imported from multiple sources automatically.
    • It summarized key performance measures by location, surgeon, and procedure that could be displayed visually.
    • It could be exported for critical review of underlying patient information, such as when a specific patient number is needed to review a surgical case.

    Stakeholders continuously reviewed the dashboard, Bydon and Meyer write, and made changes to improve its operational capabilities, clinical appropriateness, and relevance for market strategy.
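    The three data requirements above can be sketched in a few lines. This is a toy illustration under assumed data shapes (two hypothetical source feeds keyed on a patient identifier), not a description of Mayo Clinic's actual pipeline:

```python
from collections import defaultdict

# Requirement 1: import and merge data from multiple sources (hypothetical feeds).
ehr_feed = [
    {"patient_id": "P1", "surgeon": "A", "site": "Jacksonville", "procedure": "laminectomy"},
    {"patient_id": "P2", "surgeon": "B", "site": "Jacksonville", "procedure": "fusion"},
]
billing_feed = {"P1": {"cost": 18000}, "P2": {"cost": 42000}}
merged = [{**rec, **billing_feed.get(rec["patient_id"], {})} for rec in ehr_feed]

# Requirement 2: summarize key measures by location, surgeon, and procedure.
summary = defaultdict(lambda: {"cases": 0, "cost": 0})
for rec in merged:
    key = (rec["site"], rec["surgeon"], rec["procedure"])
    summary[key]["cases"] += 1
    summary[key]["cost"] += rec["cost"]

# Requirement 3: export the underlying patient records for critical case review.
def cases_for(site, surgeon):
    return [r for r in merged if r["site"] == site and r["surgeon"] == surgeon]

print(dict(summary))
print(cases_for("Jacksonville", "A"))
```

    In a real deployment the merge and summary steps would run automatically on a schedule, feeding the visual dashboard, while the export function backs the drill-down into individual surgical cases.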

    7. Add predictive analytics to the dashboard.

    In addition to descriptive analytics that allowed providers to better understand changes that have occurred, Bydon and Meyer also included a predictive tool they called "the neurosurgical risk calculator."

    The calculator uses real-time evidence to estimate the operative risk for a particular patient profile, which providers and patients can then use to discuss surgical options and estimate the risks of surgery, including potential complications.
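    A risk calculator of this kind is often built on a logistic model over patient-profile features. The sketch below shows the general shape of such an estimate; the coefficients and features are invented for illustration and are not the published neurosurgical risk calculator's parameters:

```python
import math

# Made-up coefficients for demonstration only; a real model would be fit
# to registry data and validated before clinical use.
COEFFS = {"intercept": -4.0, "age": 0.03, "comorbidity_count": 0.4, "prior_surgery": 0.6}

def complication_risk(age, comorbidity_count, prior_surgery):
    """Estimate the probability of a post-operative complication for a patient profile."""
    z = (COEFFS["intercept"]
         + COEFFS["age"] * age
         + COEFFS["comorbidity_count"] * comorbidity_count
         + COEFFS["prior_surgery"] * (1 if prior_surgery else 0))
    return 1 / (1 + math.exp(-z))  # logistic function maps score to probability

# Example: a 70-year-old with two comorbidities and a prior surgery.
risk = complication_risk(70, 2, True)
print(f"Estimated complication risk: {risk:.1%}")
```

    An estimate like this gives the provider and patient a shared, concrete number to anchor the discussion of surgical options and expected risks.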

    8. Use the data.

    Ultimately, the dashboard helped the neurosurgery department improve in many areas, Bydon and Meyer write. For example, system-wide quality improvement projects were launched to address areas of underperformance.

    In addition, the dashboard data was used to gain competitive contracts, including bundled care arrangements, from employers and insurers. Being able to compare patient outcomes to national benchmarks also led to more patient referrals from payers, which increased the department's volume.

    Finally, the data was used to help providers and patients make surgical decisions together and to inform patients about expected outcomes.

    "After significant investment and commitment, we developed a database and dashboard that is accomplishing its goals of providing hard data on quality, cost, volumes, and outcomes," Bydon and Meyer write. "It has been helping us transform our practice." (Bydon/Meyer, Harvard Business Review, 9/30)
