
Facebook Is Causing A Stir By Asking People If They Are Worried About An ‘Extremist’ Friend. Here’s What That’s About



In a pilot initiative to combat “extremism,” Facebook began notifying users when they may have been exposed to extremist content and asking whether they are concerned that someone they know is becoming an extremist.

The feature, which Facebook said is in a testing phase, either warns users that they may have been exposed to extremist content or asks whether they are concerned about a friend becoming an extremist. A Facebook spokesperson told CNN Business that the feature is part of Facebook’s commitment to combating extremism on its platform. (RELATED: Federal Judge Orders Florida To Halt Bill Aimed At Stopping Big Tech Censorship Of Conservatives)

Facebook has struggled in the past with terrorists recruiting on the platform. A 2020 report found that ISIS and other terrorist groups had found ways to evade content moderators, such as disguising their content with reports from legitimate news outlets, BBC reported. The groups were able to rack up tens of thousands of views on their recruiting materials before Facebook removed them.

Facebook sorts content that violates its policies into several categories: violence and incitement; dangerous individuals and organizations; coordinating harm and publicizing crime; regulated goods; and fraud and deception. Each category has specific descriptions of what falls under it, but some have argued that it’s not enough.

“This test is part of our larger work to assess ways to provide resources and support to people on Facebook who may have engaged with or were exposed to extremist content, or may know someone who is at risk,” a Facebook spokesperson told CNN. “We are partnering with NGOs and academic experts in this space and hope to have more to share in the future.”

The notifications ask users if they would like to receive “confidential support” for themselves or a friend.

“Are you concerned that someone you know is becoming an extremist?” one notification asks. “We care about preventing extremism on Facebook. Others in your situation have received confidential support.”

Another alert warns users about “violent groups” trying to “manipulate” them.

“Violent groups try to manipulate your anger and disappointment,” it says. “You can take action now to protect yourself and others.”

Screenshots of the alerts quickly went viral on social media, with some commentators saying the initiative went too far.

Freelance writer Jack Hunter said that Facebook is becoming extremist, not its users.

Republican New York Rep. Elise Stefanik slammed Facebook’s “unconstitutional censorship.”

2020 Libertarian Party vice presidential candidate Spike Cohen joked about Facebook refusing to tell users what content was allegedly harmful.

The warning is part of Facebook’s response to calls for the platform to do more to moderate content. After accusations that then-President Donald Trump incited the Jan. 6 riot using Facebook, the platform, along with most major social media platforms, suspended his account. During a November hearing on social media and misinformation, Democratic lawmakers pressured Facebook and Twitter to censor more content, including “hate speech.”

A report by The Associated Press said that QAnon and militia movements were able to “promote violence” during the election using Facebook groups. The Guardian cited experts who said Facebook’s policy change to stop recommending political groups to users doesn’t “do enough to combat the long history of abuse that’s been allowed to fester on Facebook.” Morning Consult said that Facebook has “neither the capacity nor the will to comprehensively remove violent extremist content and misinformation.”

Prominent accounts on Twitter have also called for Facebook to remove more content.

Facebook did not respond to the Daily Caller’s request for comment.