A major new study on the health effects of fish oil and vitamin D prompted seemingly contradictory headlines from news outlets—illustrating how easily a study's findings can be misunderstood or overhyped in the press, Mary Jaklevic writes for Health News Review.
'Effective' or 'ineffective'? One study leads to two very different storylines.
The study enrolled 25,871 healthy people ages 50 and older and randomly assigned them to receive omega-3 fish oil capsules, vitamin D supplements, both, or placebos. The results were published in two papers: one examining the fish oil intervention and another examining the vitamin D intervention.
The researchers concluded that fish oil "did not result in a lower incidence of major cardiovascular events or cancer than placebo" and that vitamin D supplements "did not result in a lower incidence of invasive cancer or cardiovascular events than placebo."
Even so, headlines from major news media outlets seemed to suggest otherwise, Jaklevic writes. The Washington Post reported that fish oil was "effective" at protecting against heart events and that vitamin D was linked to "a decline in cancer deaths." Similarly, Reuters reported that fish oil could "dramatically reduce the odds of a heart attack" and that the benefits of vitamin D "seem to come from lowering the risk of death from cancer."
On the other hand, the New York Times reported that both fish oil and vitamin D supplements were "ineffective," while NPR reported the supplements "mostly disappoint."
How some news outlets may have gone astray
A major cause of the confusion is that some news outlets focused on the study's secondary findings, which Jaklevic writes "aren't the primary questions researchers set out to answer."
Specifically, researchers observed that people who took fish oil had 28% fewer heart attacks, and those who took fish oil but didn't eat much fish experienced an even larger 40% reduction. But heart attacks alone weren't the primary outcome measurement used in the study; rather, the researchers lumped heart attacks in with strokes and deaths from cardiovascular causes as "cardiovascular events"—and they didn't see a significant improvement on this broader metric.
Jaklevic argues that secondary findings need to be "reported cautiously," as they don't "have the same statistical authority as primary findings and are more likely due to chance."
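The statistical point here can be made concrete with a small simulation. The sketch below is illustrative only and is not drawn from the study itself: it assumes a supplement with no true effect and shows that the more endpoints a study checks at the conventional p < 0.05 threshold, the more likely at least one will look "significant" purely by chance, which is one reason secondary findings carry less authority than a prespecified primary outcome.

```python
import random

# Illustrative simulation (an assumption for this sketch, not data from the
# study): under a true null effect, each endpoint tested at alpha = 0.05 has
# a 5% chance of a spuriously "significant" result. Checking many secondary
# endpoints inflates the chance of at least one false positive.

random.seed(0)

def false_positive_rate(n_endpoints, trials=100_000, alpha=0.05):
    """Fraction of simulated null studies with >= 1 'significant' endpoint."""
    hits = 0
    for _ in range(trials):
        # Each endpoint independently crosses the threshold with prob alpha.
        if any(random.random() < alpha for _ in range(n_endpoints)):
            hits += 1
    return hits / trials

print(false_positive_rate(1))   # roughly 0.05 for a single primary endpoint
print(false_positive_rate(10))  # roughly 0.40 when 10 endpoints are checked
```

The exact numbers depend on the assumed number of independent endpoints, but the direction is the point: a study reporting many secondary analyses will produce some striking-looking results even when nothing is going on, which is why Jaklevic urges caution.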
Some news organizations did act cautiously in reporting these secondary findings. Kaiser Health News, for example, said that analyzing the data in smaller chunks could produce "unreliable" results, and that "links between fish oil and heart attacks—and vitamin D and cancer death—could be due to chance." Other outlets, however, were less circumspect.
Christopher Labos, a cardiologist, said, "You can talk about the secondary endpoints if you want, but not to the point where you obscure that the overall study was negative."
Brigham and Women's Hospital, which conducted the study, argued in a statement that its own characterization of the secondary conclusions in a press release was accurate and "supported by the findings." JoAnn Manson, lead researcher on the study and the hospital's chief of preventive medicine, added that the secondary analyses were planned before the study began and were included in the published research.
Some experts attributed the conflicting media reports to a broader bias against covering negative results. Matt Shipman, a public information officer at North Carolina State University, called that bias unfortunate, saying negative findings "can have tremendous value" and that, in this study, they "are the larger story."
Martha Gulati, chief of cardiology at the University of Arizona, said the study findings "didn't need to be spun," and that they can help consumers make more-informed choices. "As a health consumer myself, I wouldn't want to be spending my hard-earned money on something that doesn't give a benefit," she said (Jaklevic, Health News Review, 11/15).
Learn more about why study design matters
Been a while since your last statistics class? It can be difficult to judge the quality of studies, the significance of data, or the importance of new findings when you don't know the basics.
Download our cheat sheets to get a quick, one-page refresher on some of the foundational components of evidence-based medicine.
- Evidence-based practice (EBP)
- Observational studies
- Randomized controlled trials (RCTs)
- Systematic reviews
- Statistical significance