Tech

A Month After The Election, YouTube Decided To Crack Down On Political Misinformation

(Photo by Robyn Beck / AFP/Getty Images)

Alex Corey, Contributor

YouTube announced Wednesday it would begin removing content alleging voter fraud or disputing Joe Biden’s victory in the 2020 presidential election, more than a month after the former vice president was declared the winner.

The Google-owned service said in a blog post that it has updated its information panel to include a “2020 Electoral College Results” page from the Office of the Federal Register that confirms Biden’s win.

YouTube had already been linking users to The Associated Press and the “Rumor Control” page from the Cybersecurity and Infrastructure Security Agency. The service said it has also been using “relevant fact check information panels” since Election Day to help debunk misinformation, citing “Dominion voting machines” and “Michigan recount” as examples.

“Yesterday was the safe harbor deadline for the U.S. Presidential election and enough states have certified their election results to determine a President-elect,” the video platform said in the post. “Given that, we will start removing any piece of content uploaded today (or anytime after) that misleads people by alleging that widespread fraud or errors changed the outcome of the 2020 U.S. Presidential election, in line with our approach towards historical U.S. Presidential elections.”

YouTube said that since September it has removed more than 8,000 channels and thousands of “harmful and misleading” election-related videos for violating its existing guidelines. However, the company said existing videos that violate the new policy would be allowed to remain on the platform.

The platform is not the only tech company that has vowed to fight misinformation. (RELATED: Twitter Starts The Week By Repeatedly Dinging Trump’s Claims, Including 2 Tweets Where He Said He ‘Won The Election’)

Twitter said in a blog post last month that it labeled 300,000 tweets as “disputed and potentially misleading” between Oct. 27 and Nov. 12, accounting for 0.2% of all U.S. election-related tweets sent during that period.

In September, Facebook announced steps to help slow the spread of misinformation on its platform, including refusing to accept new campaign ads in the week before the election and attaching labels to content it felt could “delegitimize” the outcome of the election.

The tech giant also announced last month that it planned to restrict posts on both Instagram and Facebook that its systems flag as “misinformation” so they are seen by fewer users, according to a report from Politico. Facebook also said it would limit the distribution of election-related live videos.

YouTube said it plans to continue recommending content only from reputable sources, such as news outlets like ABC News or Fox News. While the company said more than 70% of recommendations on election-related topics already come from authoritative news sources, it acknowledged it can do better going forward to reduce the 1% of “problematic misinformation” watched by users.

“We welcome ongoing debate and discussion and will keep engaging with experts, researchers and organizations to ensure that our policies and products are meeting that goal,” YouTube said. “And as always, we’ll apply learnings from this election to our ongoing efforts to protect the integrity of elections around the world.”