
YouTube Expands Crackdown On Vaccine Misinformation, Targets Specific Accounts

Photo by SAM YEH/AFP via Getty Images

Ailan Evans, Deputy Editor

YouTube is expanding its vaccine misinformation policies to include all vaccines approved by health agencies, pursuing more aggressive enforcement against anti-vaccine content and deleting specific accounts.

The social media company will remove any content that “falsely alleges” vaccines approved by the World Health Organization (WHO) are dangerous or ineffective at reducing transmission, as well as content that claims the vaccines contain certain substances, YouTube said in a blog post Wednesday announcing its new guidelines. Examples include claims that vaccines cause autism, infertility, or include microchips and tracking devices, the company said.

YouTube will also remove conspiratorial vaccine content, such as claims that vaccines are part of a depopulation agenda. The tech company said it implemented these new policies after consulting with health experts. (RELATED: Glenn Greenwald Blasts YouTube For Suspending Rand Paul, Says ‘We’re Now At The Point’ Almost ‘No Dissent Is Allowed’)

YouTube is also removing specific accounts deemed to have a pattern of sharing vaccine misinformation, YouTube executives told several outlets Wednesday. These accounts include anti-vaccine advocates Joseph Mercola, Erin Elizabeth, Sherri Tenpenny, and Robert F. Kennedy Jr., all of whom were featured in a report titled “The Disinformation Dozen” by the Center for Countering Digital Hate, which the White House cited when it urged Facebook to crack down on COVID-19 misinformation.

A picture taken on August 13, 2021, shows empty Pfizer-BioNTech COVID-19 vaccine vials at a Health Services vaccination clinic in the Palestinian neighbourhood of Beit Hanina, in Israeli-annexed east Jerusalem. (Photo by AHMAD GHARABLI/AFP via Getty Images)

“There is information, not from us, but information from other researchers on health misinformation that has shown the earlier you can get information in front of someone before they form opinions, the better,” Garth Graham, YouTube’s global head of health care and public health partnerships, told reporters.

YouTube said it would make exceptions for certain types of content that express skepticism about vaccines, such as personal testimonies. However, the content would be removed “if the speaker then goes on to generalize and make calls for all parents not to vaccinate or makes broad claims about vaccines not being safe or effective,” the company’s vice president of global trust and safety Matt Halprin told reporters.

The company will also permit content about “vaccine policies, new vaccine trials, and historical vaccine successes or failures” in the interest of scientific discussion and debate.

YouTube has already removed over one million videos for promoting COVID-19 misinformation, including misinformation related to the COVID-19 vaccine. The company attracted controversy in August for suspending Sen. Rand Paul’s account over claims related to the efficacy of masks in curbing the spread of the virus.
