Algorithms that are widely used to guide care for millions of Americans incorporate race in ways that could exacerbate inequities in health care, according to a study published Wednesday in the New England Journal of Medicine.
Researchers identify racial adjustments in commonly used health care algorithms
For the study, researchers examined 13 algorithms—including digital risk calculators, lab tools, and common risk-adjustment algorithms—that were designed to assess patients' potential health risks and outcomes in nine different areas of medicine.
David Shumway Jones of Harvard Medical School, the study's senior author, said he and his team of researchers found that "race adjustments" developers included in the algorithms typically were based on studies that showed a correlation between patients' race and certain medical outcomes.
However, instead of considering the role of other factors that may have driven the correlations between patients' race and certain medical outcomes—such as access to primary care, discrimination, or socioeconomic status—many developers incorporated adjustments based solely on a patient's race, which can be a crude proxy for the variables that truly affect risk.
Further, the adjustments often were based on "outdated science or biased data," the researchers found. According to the researchers, these adjustments sometimes can lead to black and Hispanic patients being less likely to receive needed care.
Here are some of the algorithms the researchers explored:
- A thoracic surgery calculator identifies black patients as having a higher risk of post-operative complications. The researchers found that a risk calculator used by thoracic surgeons indicates that black patients face a higher risk of post-operative complications. According to Jones, the increase in assessed risk could "make surgeons steer black patients away from bypass surgery, mitral valve repair and replacement," and other potentially life-saving procedures. "If I have a black patient and the risk calculator tells me he has a 20% higher risk of dying from this surgery, it might scare me off from offering that procedure," Jones said.
- Kidney algorithms could make black patients less likely to receive specialty care or transplants. One algorithm used to determine patients' kidney function includes a so-called "race adjustment" that can make black patients' kidney filtration rates appear higher, or healthier. The adjustment could reduce the likelihood that those patients would receive specialty care or a kidney transplant. A separate algorithm used by transplant surgeons identifies kidneys from black donors as more likely to fail than kidneys from other donors, the researchers found. Because black patients are more likely to receive organs from black donors, the algorithm could reduce the number of kidney donations to black patients.
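To make the mechanism concrete: one widely used kidney-function equation of the kind described above is the re-expressed MDRD study formula, which multiplies the estimated filtration rate by a fixed coefficient when the patient is identified as black. The sketch below uses the published MDRD coefficients for illustration; it is not code from the study discussed here.

```python
def mdrd_egfr(serum_creatinine_mg_dl: float, age_years: float,
              female: bool, black: bool) -> float:
    """Estimated GFR (mL/min/1.73 m^2) via the re-expressed (2006)
    MDRD study equation, including its race coefficient."""
    egfr = 175.0 * serum_creatinine_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742
    if black:
        # The race adjustment: identical lab values yield a roughly
        # 21% higher (healthier-looking) eGFR for a black patient.
        egfr *= 1.212
    return egfr
```

Because specialty referrals and transplant eligibility are often triggered when eGFR falls below a threshold, the multiplier alone can keep a black patient above the cutoff when a non-black patient with the same labs would fall below it.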
- Black and Hispanic patients are assigned higher risk scores for vaginal birth complications. An algorithm used by obstetricians to determine whether a woman is at high risk of complications during vaginal birth automatically identifies black and Hispanic women as having a higher risk than other women. According to the researchers, that risk adjustment factor is based on a study that found women who are unmarried and lack health insurance have an increased risk of complications from vaginal birth, but the algorithm doesn't include specific adjustments for either of those socioeconomic factors.
How should providers and researchers move forward?
According to Brian Powers, a physician and researcher at Brigham and Women's Hospital who was not involved in the study, many of the algorithms examined for the study "are widely used and have a substantial impact on patient care." Powers said that while adjusting for race can sometimes "lead to better patient care or even be a tool for addressing inequities," in some cases the adjustments lead to an even deeper gap in care quality.
But Jones, the study's senior author, said "[m]odern tools of epidemiology and statistics could … show that much of what passes for race is actually about class and poverty" when it comes to patients' risk factors.
Some in the medical community have pushed for providers to reconsider using the algorithms that incorporate race. For instance, a few years ago, Melanie Hoenig, a kidney specialist at Harvard Medical School, and a group of students at Harvard University approached executives at Beth Israel Deaconess Medical Center about one of the kidney algorithms examined in the study.
"We went to the chief of medicine, we went to the head of clinical labs, we went to the head of the kidney division, we went to a lot of people and spun our story," Hoenig said.
In 2017, Beth Israel removed the race adjustment from the algorithm, and some other providers have since followed suit.
However, Jones cautioned that the tests and the research behind them are still "woven into the fabric of medicine."
"Well-meaning individuals acting without racist intent can still produce work with racist consequences," he said. "Despite mounting evidence that race is not a reliable proxy for genetic difference, the belief that it is has become embedded, sometimes insidiously, within medical practice" (Begley, STAT News, 6/17; Kolata, New York Times, 6/17).