YouTube cracks down on vaccine misinformation
YouTube on Wednesday banned several high-profile anti-vaccine influencers and announced that, moving forward, it would remove all content that falsely claims approved vaccines are dangerous.
YouTube said it would take down any content that makes false claims about approved vaccines, including claims that the vaccines cause autism, cancer, infertility, or don't reduce the transmission or contraction of disease.
The policy applies to all vaccines approved by health authorities, not just those for Covid-19.
"We've steadily seen false claims about the coronavirus vaccines spill over into misinformation about vaccines in general," the company said in a blog post. "We're now at a point where it's more important than ever to expand the work we started with Covid-19 to other vaccines."
Critical content regarding vaccines that are still undergoing clinical trials will be allowed, as will personal stories about reactions to vaccines, as long as such stories aren't posted by an account with a history of posting vaccine misinformation.
YouTube said that since 2020, the site has removed more than 130,000 videos for violating its policies regarding Covid-19 vaccines. It has now permanently removed several channels associated with high-profile spreaders of vaccine misinformation, including those associated with Joseph Mercola and Robert F. Kennedy, Jr.
A person familiar with YouTube's policymaking process told the New York Times that, after the company developed rules surrounding Covid-19 vaccine misinformation last year, it began to consider broadening the policy.
The company found that many videos that included Covid-19 vaccine misinformation also incorporated broader misinformation, making it difficult to narrowly tamp down on Covid-19 vaccine misinformation without forming a broader policy.
However, developing the rules took months due to the difficulty of addressing content across many languages and determining the boundaries of what users would be allowed to post, the person told the Times.
YouTube's new policy broadly aligns it with other major social media sites, such as Facebook and Twitter.
Facebook in February announced it would remove posts with false claims about vaccines, including those for diseases other than Covid-19. In March, Twitter implemented a new policy explaining penalties for sharing vaccine misinformation—although that policy focused primarily on Covid-19 and provided a "five strikes" approach of escalating penalties for violations.
Experts on misinformation have said that anti-vaccine content on social networks contributes to vaccine hesitancy. YouTube videos are often the original source of that content, which then goes viral on platforms like Facebook and Twitter, the Times reports.
"One platform's policies affect enforcement across all the others because of the way networks work across services," Evelyn Douek, a lecturer at Harvard Law School who focuses on online speech and misinformation, said. "YouTube is one of the most highly linked domains on Facebook, for example."
"It's not possible to think of these issues platform by platform," Douek added. "That's not how anti-vaccination groups think of them. We have to think of the internet ecosystem as a whole." (Sebastian, Wall Street Journal, 9/29; Alba, New York Times, 9/29; Seitz, Associated Press, 9/29)