Many so-called "breakthrough" studies you may read about in the news don't hold up in further studies, according to new research published in PLOS One.
For the study, researchers reviewed 4,723 primary studies, included in 306 meta-analyses, that examined risk factors related to psychiatric, neurological, and somatic diseases. Those studies and meta-analyses were classified as either relating to lifestyle factors—such as smoking—or non-lifestyle factors, such as genetic risk.
The researchers then reviewed 5,029 news articles included in the Dow Jones Factiva database, which excludes online-only publications, television broadcasts, and NPR stories. Of those articles, the researchers found 1,561 that covered a total of 156 primary studies—including 63 lifestyle studies—and five meta-analyses, which, according to Vox, typically "provide a more comprehensive answer than a single study."
Overall, the researchers found that initial studies were about five times more likely than follow-up studies to be covered in newspapers, while meta-analyses garnered hardly any newspaper coverage. That's a problem, Brian Resnick writes for Vox, because the researchers also found that just 48.7 percent of the 156 studies covered by newspapers were later confirmed in a meta-analysis. That figure fell to 34 percent when the researchers looked only at initial studies.
The researchers also found that newspapers strongly preferred to cover studies with positive findings, even though studies with negative findings are "equally valuable," Resnick writes. Overall, the researchers found that the newspapers they assessed "never covered initial studies reporting null findings and rarely reported subsequent null observations."
In addition, the researchers found that journalists generally preferred to cover studies relating to lifestyle choices, such as diet or exercise—particularly when they were published in prestigious journals. Studies on non-lifestyle topics, such as brain imaging or genetics, were less frequently covered.
The researchers found newspapers gave roughly equal coverage to initial (12.8 percent) and subsequent (9.7 percent) lifestyle studies. But for non-lifestyle studies, newspaper coverage heavily favored initial studies (13.1 percent) over subsequent ones (1.2 percent).
The researchers highlighted a 2003 study on depression, originally published in Science, as an example. While 50 newspaper articles were written about the initial 2003 study, not one newspaper in the Dow Jones Factiva database covered any of the 11 subsequent studies that tried and failed to replicate the original findings. And when scientists finally published a meta-analysis of those subsequent studies in JAMA in 2009, only four newspaper articles covered it, while newspapers overall continued to cite the findings of the 2003 study.
Researchers and experts said the study shows how both the media and scientists fall short in conveying the significance of medical research to the public. They also said the study suggests "that most journalists from the general press do not know or prefer not to deal with the high degree of uncertainty inherent in early biomedical studies."
The researchers recommended that journalists covering new studies contact the study authors to determine whether the study is an initial finding and, if so, specify in their coverage that the study findings are tentative and need to be validated by further research.
As Resnick explains, part of the issue is that "journalists have a need for digestible headlines that convey simple, accessible, and preferably novel lessons." But university press releases also contribute to the problem, Resnick writes, because they are more likely to highlight a "striking and dramatic single study" than a definitive "meta-review." Such press releases or the scientific papers themselves also can "spread hype" about research, he writes.
The study's lead author, Estelle Dumas-Mallet, a biologist at the University of Bordeaux in France, said in an email to Vox that the study was only preliminary—and limited to a comparatively small sector of science media coverage. "Our result only refers to a small sample of the scientific research," she wrote. "Also, we cannot extrapolate these results to other domains such as physics and chemistry" (Harris, "Shots," NPR, 3/6; Resnick, Vox, 3/3).
Are you leading an evidence-based organization?
Despite the shift toward broad acceptance of evidence-based practice (EBP) among medical staff, over half of physicians report that they do not actually use available guidelines in day-to-day practice. As a result, organizations continue to see tremendous variation in clinical practice—as well as in costs and outcomes.
Our infographic outlines four principles you can use to support EBP at your organization, along with action steps to implement each one and pitfalls to avoid along the way.