Facebook and YouTube are taking new steps to reduce the spread of misleading health claims after a Wall Street Journal investigation found the websites are being "flooded" with misinformation, the Journal reports.
The announcements are part of a broader effort by technology and social media companies to crack down on misleading health care information, the Journal reports. Earlier this year, Amazon, Google, Facebook, and Pinterest announced steps they have taken to reduce anti-vaccine content on their sites, as measles outbreaks have continued to spread across the United States.
About the investigation
According to the Journal, the latest moves by Facebook and YouTube came after a Journal investigation found that deceptive and "harmful" health information—particularly misinformation about cancer therapies—is widespread on Facebook and YouTube and appears alongside advertisements for proven treatments.
For the investigation, the Journal quantified the spread of scientifically unproven cancer treatments and spoke with dozens of company representatives, oncologists, patients, and privacy experts. The Journal found that some of the misleading content on Facebook and YouTube had received millions of views.
For example, the Journal found that YouTube videos promoting the use of an ointment called black salve to treat skin cancer were among those that had been viewed millions of times. David Gorski, a professor of surgery at Wayne State University School of Medicine, said there's no evidence that the ointment kills cancerous growths beneath the skin, as the videos suggest, and that it can in fact burn or kill healthy skin cells.
In addition, the Journal found that videos published on a Facebook account featuring Robert Young, a salesperson who sold discredited cancer treatments, such as baking-soda injections and juicing regimens, received more than 900,000 views. According to the Journal, Young in 2016 was convicted in a San Diego County court of practicing medicine without a license; he was released from jail in November 2017, at which point he returned to Facebook to promote his discredited theories.
The Journal said it shared the findings of its investigation with Facebook and YouTube.
How Facebook is pushing back against false medical information
Travis Yeh, a Facebook product manager, wrote in a blog post published Tuesday that Facebook last month updated its News Feed algorithms to minimize the spread of deceptive health information on the website.
Yeh wrote that Facebook's updated algorithm reduces the number of times posts with sensational health claims will appear on a Facebook user's News Feed. For example, the algorithm will affect posts claiming there is a miracle cure for a condition or promoting baking-soda injections as a cancer therapy, the Journal reports.
Yeh wrote, "Misleading health content is particularly bad for our community," adding that Facebook will "continue working to minimize low-quality health content on Facebook."
How YouTube is pushing back against false medical information
Meanwhile, a spokesperson for Alphabet's YouTube said the company has teamed up with doctors to identify content promoting unproven medical treatments and has adjusted its algorithms to reduce the number of times users see that content.
The spokesperson said the company removed 8.3 million videos—including those promoting inaccurate health information—during the first three months of the year for violating its policy barring videos that can result in immediate harm. The spokesperson acknowledged that YouTube's search results for cancer information are not perfect but said they have improved, the Journal reports (Hernandez/McMillan, Wall Street Journal, 7/2; Owens, "Vitals," Axios, 7/3; Chakraborty, Reuters, 7/2; Ohlheiser, Washington Post, 7/2; Yeh, Facebook, 7/2).