A recent study published in JAMA found that consuming three or four eggs per week was associated with an increase in all-cause mortality, but if you read headlines last week you might have been given the false impression that "eggs actually cause people to die," Craig Garthwaite, a health care economist at Northwestern University, writes for Forbes.
Cheat sheets: Evidence-based medicine 101
Study finds association between dietary cholesterol intake and all-cause mortality
For the study, researchers examined data from over 29,000 people followed for an average of 17.5 years. The researchers determined that consuming an additional 300 milligrams of dietary cholesterol a day—equivalent to about three to four eggs a week—was associated with a 3.2% higher risk of heart disease and a 4.4% higher risk of all-cause mortality. The study did not say, broadly, that eggs are bad for you, Garthwaite writes.
What the media reported (and what it missed)
However, last week media headlines suggesting that eggs were actually causing people to die flooded the internet, Garthwaite writes. For example, CNN reported, "Three or more eggs a week increase your risk of heart disease and early death." The Boston Globe and the New York Times both ran headlines asking whether eggs are bad for you. USA Today had the least offending headline, noting that the study found only a "link," not a direct causal relationship: "Eggs linked to higher risk of heart disease and early death, study says."
These were "reasonable" conclusions to draw "if you casually skimmed this … study (as many lay readers likely would) or the press release describing the results," Garthwaite writes, but those conclusions are misguided.
This study suggested a correlation between egg consumption and all-cause mortality, not causation, Garthwaite writes.
This is further evidenced in the appendix of the study, which shows the negative health effects of eggs related to coronary heart disease go away once the researchers controlled for other foods such as poultry, fish, and red meat, Garthwaite writes. "This suggests it's quite possibly not the delicious eggs but rather some other unobserved variable driving the observed relationship."
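Garthwaite's point about an "unobserved variable driving the observed relationship" can be illustrated with a toy simulation (the variable names, coefficients, and data here are hypothetical, chosen only to show the mechanism, not taken from the study). If some third factor, say overall red-meat intake, drives both egg consumption and health risk, a regression on eggs alone shows a strong "effect" that disappears once the confounder is controlled for:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical confounder: red-meat intake drives BOTH variables.
red_meat = rng.normal(size=n)
eggs = 0.6 * red_meat + rng.normal(size=n)   # eggs correlate with red meat
risk = 0.5 * red_meat + rng.normal(size=n)   # risk is driven ONLY by red meat

def ols_slopes(X, y):
    # Least-squares fit with an intercept; returns the slope coefficients.
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

naive = ols_slopes(eggs, risk)[0]
adjusted = ols_slopes(np.column_stack([eggs, red_meat]), risk)[0]

print(f"egg coefficient, eggs alone:          {naive:.3f}")    # clearly positive
print(f"egg coefficient, controlling red meat: {adjusted:.3f}") # near zero
```

The naive regression attributes the confounder's effect to eggs; adding the confounder as a control variable shrinks the egg coefficient to roughly zero, which is exactly the pattern Garthwaite describes in the study's appendix.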
'Some studies frankly just shouldn't be written'
The problem with the egg study is one the scientific community has seen many times over in dietary research, which often relies on a method known as data dredging, or p-hacking.
Studies like these often follow a similar pattern, Garthwaite writes. First, study authors find a statistically significant correlation between a type of food and a health outcome using large datasets. Then, the paper is written using words like "associated with" instead of "caused," which allows the authors to deny suggesting a causal relationship if the study results are misinterpreted, Garthwaite writes.
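The statistical trap behind this pattern can be shown in a few lines of simulation (a sketch with made-up data, not a reanalysis of the study): if researchers test enough foods against an outcome, some will clear the conventional p < 0.05 bar by pure chance.

```python
import numpy as np

rng = np.random.default_rng(42)
n_people, n_foods = 100, 200

# Pure noise: by construction, NO food actually affects the outcome.
foods = rng.normal(size=(n_people, n_foods))
outcome = rng.normal(size=n_people)

# Pearson correlation of each food column with the outcome.
f = (foods - foods.mean(axis=0)) / foods.std(axis=0)
o = (outcome - outcome.mean()) / outcome.std()
r = f.T @ o / n_people

# Approximate two-sided 5% significance cutoff for |r| at this sample size.
cutoff = 1.96 / np.sqrt(n_people)
hits = int(np.sum(np.abs(r) > cutoff))

print(f"'significant' food-outcome links found by chance alone: {hits} of {n_foods}")
```

With 200 candidate foods and a 5% false-positive rate, roughly ten spurious "statistically significant" associations are expected even when nothing is real, which is why a lone correlation dredged from a large dietary dataset is weak evidence on its own.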
"Given this well-known pattern, which has only been exacerbated in our current world of 'hot takes' and social media, academics have a duty to analyze how our work will be interpreted and amplified by the media," he writes. "In such a world, some studies frankly just shouldn't be written."
If these papers were about making small steps toward progress on a difficult issue, that would be acceptable, Garthwaite writes. But studies like the egg study "don't take such small steps forward," Garthwaite contends. "Instead, they are plagued by systematic biases that make them unable to ever deliver on the promises of the paper's writing and packaging."
In instances in which researchers either can't find the correct causal data or don't have the right empirical setting to find that data, "it's best we just take a pass," Garthwaite writes. "[A]s academics we must have the discipline to avoid providing misleading answers to these questions until the right setting or data appear." He concludes, "Continuing to write studies with caveats buried deep in the paper that predictably result in misinterpretation by the media is either nefarious or reminds me of a certain definition of insanity" (Garthwaite, Forbes, 3/20; Scutti, CNN, 3/15; Bakalar, "Well," New York Times, 3/15; Miller, USA Today, 3/15).
Learn more about why study design matters
Been a while since your last statistics class? It can be difficult to judge the quality of studies, the significance of data, or the importance of new findings when you don't know the basics.
Download our cheat sheets to get a quick, one-page refresher on some of the foundational components of evidence-based medicine.
- Evidence-based practice (EBP)
- Observational studies
- Randomized controlled trials (RCTs)
- Systematic reviews
- Statistical significance