October 24, 2019

5 pieces of conventional wisdom in health care, debunked

Daily Briefing

    Advisory Board experts read hundreds of articles about health care every month—and occasionally they come across commonly held beliefs that don't pass muster. Here are five pieces of conventional health care wisdom that fall flat today.

    Cheat sheets: Evidence-based medicine 101

    1. Conventional wisdom: Women make 80% of health care decisions in the U.S.

    By Shay Pratt, Executive Director, Research

    Well, actually: Women make an estimated 80% of health care decisions for their kids. This statistic, which seems to date back to a 2014 report from the U.S. Bureau of Labor Statistics, is often inappropriately expanded to cover all health care decisions. And while other research has supported the initial claim about women's role in their children's health care decisions, it has also shown that men are often far more engaged than expected in health care decision-making overall. It therefore seems best to look beyond gender when trying to understand the complex decisions people make about their health and the health of their families.

    Infographic: What your patients expect from their care—from millennials to the silent generation

    One note: One thing that does vary significantly by gender is time spent on medical caregiving. According to several studies, women make up anywhere from 57% to 81% of family caregivers—a role that can place them at risk for depression, anxiety, and burnout. Efforts specifically targeted at easing the burden on female caregivers can help advance gender parity in the home.

    2. Conventional wisdom: The biggest nursing workforce concern is a shortage of nurses


    By Marisa Deline, Practice Manager, Nursing Executive Center

    For years, we've been warned about an upcoming nursing shortage poised to beset the United States—hurting margins, compromising patient care, and accelerating burnout along the way. The problem with this idea? Projections show that future RN supply is actually set to outpace demand, resulting in an excess of almost 300,000 nurses by 2030.

    It is true that some states—seven, according to the latest projections—will face RN shortages (and some markets within the other states could also see shortages). But at the national level, concerns about an RN shortage seem overblown based on the overall numbers. There is, however, a key shortage that is not overblown: a shortage of nursing experience.

    How to overcome nursing's newest challenge: The 'experience-complexity gap'

    As retirement-age nurses exit the workforce and the influx of new nurses continues to grow, the net effect is a decline in the overall experience of the nursing workforce. There simply aren't enough Generation X, mid-career nurses to fill the roles left by retiring Baby Boomers. At the same time, the patient population is becoming older and more complex, with more chronic comorbidities. Simply put, more novice nurses are having to deliver more complex care.

    If left unaddressed, the experience-complexity gap will widen, putting clinical quality and safety at risk. And the answer isn't simply to hire more experienced nurses—there just aren't enough of them in the market. Instead, your organization has to learn how to prepare novice nurses to provide the complex care they must deliver today and to redistribute experience across the organization. To learn the three key strategies you should use, download our infographic on the topic, or, for nursing members, download our full new report on the experience-complexity gap.

    3. Conventional wisdom: You should take a multivitamin every day, according to research  

    By Jackie Kimmell, Senior Analyst

    The number of Americans taking multivitamins decreased over the past decade—falling from 37% in 2000 to 31% in 2012—in part because the science supporting taking them has largely been inconclusive.

    Catherine Price, a science journalist, notes that there are 13 vitamins scientists believe are associated with good health. Those vitamins can be gleaned from everyday food, but most Americans don't eat the recommended amounts of fruits and vegetables, so many aim to consume their vitamins and minerals in pill form instead.

    But to date, there's been limited evidence that a daily multivitamin has a causal impact on long-term health. A meta-analysis conducted for the U.S. Preventive Services Task Force found that there was no clear evidence of dietary supplements impacting total mortality, cardiovascular disease, or cancer. 

    Cheat sheets: Meta-analyses—and 5 other components of evidence-based medicine

    Another study found no impact of multivitamins on cognitive decline, and another found no link to the risk of repeat cardiovascular events.

    Other studies have found that multivitamins could, in fact, be harmful. One study of 38,000 women from ages 50 to 60 found that daily multivitamin use was actually associated with a slightly (6%) increased risk of total mortality.

    This might be because, while vitamins seem intuitively appealing (if minerals and vitamins are good for you, taking them in a pill should be good for you too), the excess amounts often found in supplements can be too much for the body. For example, studies have shown that too much beta carotene and vitamin E can lead to cancer, and excess vitamin A can cause liver damage, coma, and death.

    The fact that Americans spent $28 billion on dietary supplements in 2010 led an editorial in the Annals of Internal Medicine to declare: "Enough is enough. Stop wasting money on vitamin and mineral supplements."

    However, for those at risk of deficiencies, multivitamins can be useful. People with inflammatory bowel disease or celiac disease, or those who have had gastric bypass surgery, may need a multivitamin to meet their dietary needs.

    4. Conventional wisdom: The United States overutilizes health care compared to other countries  


    By Matt Morrill, Senior Consultant, Imaging Performance Partnership

    It's an open secret that the United States spends more per capita on health care than any other industrialized nation—about double the average of OECD countries. We also don't necessarily achieve the highest performance given our spending.

    The reasons for our high spending are multifactorial, and U.S. policymakers have for years tried to address it through various programs. Overutilization has been a common target: we've seen stories about how the United States has too many physicians ordering too many tests and procedures. Programs ranging from accountable care organizations to bundled payments and clinical decision support have all been driven by a desire to reduce unnecessary utilization.

    Infographic: A snapshot of health systems around the world

    However, when you look at the data, the United States really does not use care more often than other countries. A recent JAMA study showed that overall U.S. health care utilization was on par with that of other advanced economies. The only areas in which U.S. utilization tended to be higher were CT and MRI imaging.

    So what is the cause of high spending in the United States? Well, as the late Uwe Reinhardt once remarked, "It's the Prices, Stupid." In fact, a recent study from the Health Care Cost Institute found that while U.S. health care utilization stayed steady or declined, prices paid by commercial payers grew substantially in recent years. Private payers now pay hospitals about three times what Medicare pays for the same outpatient services. Perhaps for this reason we're seeing payers try to aggressively steer patients towards lower-cost sites of care and reduce the payment discrepancy between hospital-based and freestanding outpatient providers.

    5. Conventional wisdom: Early exposure to peanuts increases the risk of peanut allergies

    By Heather Bell, Editor-in-Chief, Daily Briefing

    Back in 2008, pediatricians formally dropped recommendations that advised against introducing peanuts to high-risk infants under age three—such as those with egg allergies or severe eczema, a form of skin rash. However, many parents today remain wary of early exposure.

    That's in part because several studies have shown peanut allergy prevalence in U.S. children has risen over the past two decades. One study found peanut allergy prevalence grew threefold between 2001 and 2007 among infants under one year old. Today, researchers estimate about 1.2 million children and teens, or about 2% of the non-adult U.S. population, are allergic to peanuts.

    But research has found no correlation between early exposure to peanuts and an increased risk of developing an allergy. In fact, an expert panel sponsored by NIH's National Institute of Allergy and Infectious Diseases in 2017 concluded that babies should be exposed to peanut-containing foods before age one to help build a tolerance to peanuts.

    The panel advised that parents of infants deemed high-risk consult with a pediatrician or an allergist as early as age four to six months to determine whether peanut-containing foods should be introduced at a physician's office or at home under a parent's supervision. Parents should monitor their children for signs of an adverse reaction, and if none are observed, peanut-based foods should be eaten about three times a week to build a tolerance. However, the panel cautioned that this early introduction should only take place when infants are healthy.
