Google has developed an artificial intelligence (AI) system that has shown, in some cases, it can detect breast cancer better than radiologists, according to a study published recently in Nature.
Challenges in breast cancer diagnosis
According to STAT News, "Interpreting the signs of breast cancer can be equal parts art and science: Only an experienced radiologist can look at a series of blurry, black-and-white mammograms and distinguish cloudy, nefarious masses from clusters of healthy tissues." At the same time, diagnostic errors aren't uncommon, STAT News reports.
According to the American Cancer Society (ACS), clinicians miss a breast cancer diagnosis in about 20% of cases. ACS data also show that about half of women who get yearly mammograms will, over the course of a decade, receive a false positive.
In an effort to improve those results, researchers at Google Health and its British subsidiary, DeepMind, developed a new AI tool designed to detect breast cancer.
To test the product, Google's teams worked with experts from Northwestern University and two UK-based organizations, Cancer Research U.K. Imperial Centre and Royal Surrey County Hospital. The researchers trained the AI system on about 91,000 de-identified mammograms from the United States and the United Kingdom, in which breast cancer diagnoses had already been confirmed by biopsy.
Once the algorithm was trained, researchers gave it a different set of 25,856 mammograms from the United Kingdom and 3,097 mammograms from the United States and compared the AI's ability to diagnose breast cancer to that of radiologists.
AI system outperforms doctors (in some cases)
The researchers found that, compared with radiologists, the AI system reduced missed cases of breast cancer in the United States by 9.4% and in the United Kingdom by 2.7%. The system also reduced false-positive readings in the United States by 5.7% and in the United Kingdom by 1.2%.
Further, in the United Kingdom, where mammograms are typically read by two radiologists, the researchers found that the AI system performed no worse than the second radiologist and had the potential to reduce those radiologists' workload by 88%.
As part of the study, the researchers also asked six radiologists in the United States to look at 500 mammograms and compared their responses to those of the AI. They found the AI system generally outperformed the radiologists in determining whether a woman would develop breast cancer.
That said, the radiologists did outperform the AI system in some instances in that testing exercise. For instance, all six radiologists at some point caught a cancer case that the AI missed, according to Mozziyar Etemadi, a research assistant professor in anesthesiology and biomedical engineering at Northwestern University and a co-author of the study. Still, the opposite also occurred, according to Etemadi.
Typically, the cancers caught by the AI were more invasive than those caught by the radiologists. The researchers said they had no explanation for this.
Experts look to the past and urge caution
Dominic King, UK lead at Google Health and a co-author on the study, said the results represent "another step towards this technology actually making a difference in the real world."
Separately, Connie Lehman, chief of the breast imaging department at Massachusetts General Hospital who was not involved in the study, said the technology has the potential to "exceed human capacity to identify subtle cues that the human eye and brain aren't able to perceive."
However, Lehman also expressed concerns about the study's methods, saying the patients in the study may not be truly representative of the general population, as the racial makeup of the participants wasn't noted and a large share of them had cancer.
Similarly, Etta Pisano, chief research officer at the American College of Radiology, in an editorial on the study wrote, "The real world is more complicated and potentially more diverse than the type of controlled research environment reported in this study."
Lehman said that computer-aided detection (CAD), a technology approved by FDA in 1998 to help radiologists read mammograms, should serve as a cautionary tale.
When CAD was approved, it became widely adopted, and some hospital administrators even pressured radiologists to use it regardless of their feelings on the technology because patients could be charged more for it, Lehman said.
However, research later showed that CAD actually made radiologists' performance worse.
"We can learn from the mistakes with CAD and do it better," Lehman said. She added that AI has grown more powerful than CAD was and continues to improve as it gets more data. "Using computers to enhance human performance is long overdue," Lehman said (Abbott, Wall Street Journal, 1/1; Steenhuysen, Reuters, 1/1; Grady, New York Times, 1/1; Brodwin, STAT News, 1/1).