
YouTube Says It’s Taken Down 30,000 Videos For COVID-19 Vaccine Misinformation


Varun Hukeri, General Assignment & Analysis Reporter

YouTube has removed more than 30,000 videos containing misinformation about the COVID-19 vaccine since October 2020, the company said Thursday.

The videos taken down “included claims about COVID-19 vaccinations that contradict local and global health authorities,” YouTube spokeswoman Elena Hernandez told the Daily Caller. “Overall, since February 2020, we have removed over 800,000 videos related to dangerous or misleading coronavirus information,” she added.

The Google-owned company announced the creation of a COVID-19 medical misinformation policy in May 2020, which prohibits content that contradicts local health authorities or the World Health Organization (WHO). The guidelines apply to videos that deny the existence of COVID-19 or contain misinformation about treatment and prevention.

YouTube expanded its medical misinformation policy in October 2020 to cover misinformation about COVID-19 vaccines, the company said. Examples of such misinformation include claims that the vaccines cause death or are used to implant microchips in recipients.


A healthcare worker in Miami, Florida, receives a Pfizer-BioNTech COVID-19 vaccine (Joe Raedle/Getty Images)

Content suspected of violating the platform’s misinformation policy is flagged and reviewed by either automated systems or human moderators, the company said. Accounts that violate the policy are subject to a “strike” system, which can result in permanent suspension.

Misinformation about COVID-19 and vaccines has proliferated on social media platforms during the pandemic. Facebook expanded its user policy last month to include new rules affecting accounts and pages that promote vaccine misinformation, NPR reported. Twitter similarly announced an update to its vaccine misinformation policy earlier this month.

An Axios-Ipsos poll released in January found Americans are becoming more receptive to taking the COVID-19 vaccine, but roughly 30% remain hesitant or suspicious. (RELATED: In One State, Coronavirus Vaccines Are Available To All Residents 16 And Older)

Some public health experts believe doubts about the vaccine have been fueled by misinformation and conspiracy theories promoted on social media, a link supported by multiple studies.

The Centers for Disease Control and Prevention said Wednesday that roughly 62.5 million people have received at least one dose of a COVID-19 vaccine, of whom roughly 32.9 million are fully vaccinated.