September 30, 2021

YouTube just banned vaccine misinformation

Daily Briefing

    YouTube on Wednesday banned several high-profile anti-vaccine influencers and announced that, moving forward, it would remove all content that falsely claims approved vaccines are dangerous.

    Radio Advisory: Aaron Carroll on how clinicians can combat medical misinformation

    YouTube cracks down on vaccine misinformation

    YouTube said it would take down any content that makes false claims about approved vaccines, including claims that the vaccines cause autism, cancer, or infertility, or that they do not reduce the transmission or contraction of disease.

    The policy applies to all vaccines approved by health authorities, not just those for Covid-19.

    "We've steadily seen false claims about the coronavirus vaccines spill over into misinformation about vaccines in general," the company said in a blog post. "We're now at a point where it's more important than ever to expand the work we started with Covid-19 to other vaccines."

    Critical content regarding vaccines that are still undergoing clinical trials will be allowed, as will personal stories about reactions to vaccines, as long as such stories aren't posted by an account with a history of posting vaccine misinformation.

    YouTube said that since 2020, the site has removed more than 130,000 videos for violating its policies regarding Covid-19 vaccines. It has now permanently removed several channels associated with high-profile spreaders of vaccine misinformation, including those associated with Joseph Mercola and Robert F. Kennedy, Jr.

    A person familiar with YouTube's policymaking process told the New York Times that, after the company developed rules surrounding Covid-19 vaccine misinformation last year, it began to consider broadening the policy.

    The company found that many videos that included Covid-19 vaccine misinformation also incorporated broader misinformation, making it difficult to narrowly tamp down on Covid-19 vaccine misinformation without forming a broader policy.

    However, developing the rules took months due to the difficulty of addressing content across many languages and determining the boundaries of what users would be allowed to post, the person told the Times.

    Reaction

    YouTube's new policy broadly aligns it with other major social media sites, such as Facebook and Twitter.

    Facebook in February announced it would remove posts with false claims about vaccines, including those for diseases other than Covid-19. In March, Twitter implemented a new policy explaining penalties for sharing vaccine misinformation—although that policy focused primarily on Covid-19 and provided a "five strikes" approach of escalating penalties for violations.

    Experts on misinformation have said that anti-vaccine content on social networks is a factor in vaccine hesitancy. YouTube videos are often the source of that content, which then goes viral on platforms like Facebook and Twitter, the Times reports.

    "One platform's policies affect enforcement across all the others because of the way networks work across services," Evelyn Douek, a lecturer at Harvard Law School who focuses on online speech and misinformation, said. "YouTube is one of the most highly linked domains on Facebook, for example."

    "It's not possible to think of these issues platform by platform," Douek added. "That's not how anti-vaccination groups think of them. We have to think of the internet ecosystem as a whole." (Sebastian, Wall Street Journal, 9/29; Alba, New York Times, 9/29; Seitz, Associated Press, 9/29)

    Aaron Carroll on how clinicians can combat medical misinformation

    Listen to the Radio Advisory episode

    Radio Advisory, a podcast for busy health care leaders.

    Medical misinformation has been a significant problem for a long time, but amid the Covid-19 pandemic, the problem has become even more widespread. In this episode, host Rachel Woods sits down with Dr. Aaron Carroll, an author, professor, and Indiana University's chief health officer, to discuss what all clinicians should do to combat medical misinformation.

    Plus, Advisory Board experts Solomon Banjo and Pam Divack offer their take on clinicians' role in online spaces (with patients and with each other) and translate those same principles for the rest of the industry.

    Listen now
