Facebook Makes ‘Considerable Progress’ Downgrading Anti-Vax ‘Misinformation’ This Year


Kira Mautone, Contributor

Facebook claims to have made “considerable progress” this year in minimizing the spread of vaccine “misinformation,” particularly content questioning the COVID-19 vaccines.

Amid the plethora of claims disputing the safety and efficacy of the new vaccines that filled social media feeds, Facebook employees believed they had found a way to halt the spread of anti-vaccine sentiment, according to the LA Times.

Despite the proposed solutions, Facebook was slow to act on the issue, raising the question of whether the company prioritized engagement and division over the health of its users, according to the outlet.

According to documents released by former employee turned whistleblower Frances Haugen, several Facebook employees are on record as having suggested solutions to halt the spread of anti-vaccine misinformation. One idea put forth by a Facebook researcher was to disable comments on vaccine posts until the company had a better handle on dealing with unfavorable messages, according to the Times.

However, the proposal was ignored, and critics suggest Facebook was slow to act because the change could have cut into the company’s profits, according to the outlet.

“Why would you not remove comments? Because engagement is the only thing that matters,” said Imran Ahmed, the CEO of the Center for Countering Digital Hate, according to the LA Times.

Spokeswoman Dani Lever said the internal documents “don’t represent the considerable progress we have made since that time in promoting reliable information about COVID-19 and expanding our policies to remove more harmful COVID and vaccine misinformation,” reported the Times.

Employees work in Facebook’s “War Room,” during a media demonstration on October 17, 2018, in Menlo Park, California. NOAH BERGER/AFP via Getty Images

In the midst of the pandemic, Facebook carefully investigated how its platform spread information about the COVID-19 vaccines, as well as how to reduce misinformation, according to the documents, the Times reported. For more than 6,000 users in the U.S., Mexico, Brazil and the Philippines, posts in their feeds were selected based on trustworthiness rather than on typical engagement metrics, according to the Times. Through these subtle algorithm changes, Facebook was able to alter how vaccine-related posts were ranked in users’ newsfeeds, prioritizing posts from legitimate sources such as the WHO, the outlet reported.

This study is in line with Haugen’s call for Facebook to reform its algorithms to prevent the spread of COVID-related “misinformation,” and for a regulatory agency to oversee content decisions, allowing for “soft interventions.” Haugen reportedly previously worked as a product manager on Facebook’s Civic Misinformation team, where she oversaw policies dealing with misinformation and hateful content. (RELATED: Facebook’s Whistleblower Could Be The Best Thing To Ever Happen To Big Tech)

While Mark Zuckerberg has, since Haugen’s claims, said that many of her statements were “illogical,” he agreed with her that Congress should take action to address social media companies.

Facebook’s COVID-19 and Vaccine Policy currently reads, “Based on input from experts in health communication and related fields, we are also taking additional steps amid the pandemic to reduce the distribution of content that does not violate our policies but may present misleading or sensationalized information about vaccines in a way that would be likely to discourage vaccinations.”