In a pilot initiative to combat “extremism,” Facebook began notifying users that they may have been exposed to extremist content and asking whether they were concerned that someone they knew was becoming an extremist.
The notification, which Facebook said was in the testing phase, either tells users that they have potentially been exposed to extremist content or asks them if they are concerned about a friend becoming an extremist. A Facebook spokesperson told CNN Business that the feature is part of Facebook’s commitment to combating extremism on its platform. (RELATED: Federal Judge Orders Florida To Halt Bill Aimed At Stopping Big Tech Censorship Of Conservatives)
Facebook has previously struggled with terrorists recruiting on the platform. A 2020 report found that ISIS and other terrorist groups had found ways to evade content moderators, such as disguising their content with reports from legitimate news outlets, BBC reported. The groups were able to get tens of thousands of views on their recruiting materials before Facebook removed them.
Facebook sorts content that violates its policies into several categories: violence and incitement, dangerous individuals and organizations, coordinating harm and publicizing crime, regulated goods, and fraud and deception. Each category has specific descriptions of what falls into it, but some have argued that it’s not enough.
“This test is part of our larger work to assess ways to provide resources and support to people on Facebook who may have engaged with or were exposed to extremist content, or may know someone who is at risk,” a Facebook spokesperson told CNN. “We are partnering with NGOs and academic experts in this space and hope to have more to share in the future.”
The notifications ask users if they would like to receive “confidential support” for themselves or a friend.
“Are you concerned that someone you know is becoming an extremist?” one notification asks. “We care about preventing extremism on Facebook. Others in your situation have received confidential support.”
Another alert warns users about “violent groups” trying to “manipulate” them.
“Violent groups try to manipulate your anger and disappointment,” it says. “You can take action now to protect yourself and others.”
Screenshots of the alerts quickly went viral on social media, with some commentators saying that the initiative took things too far.
Freelance writer Jack Hunter said that Facebook is becoming extremist, not its users.
I’m more concerned Facebook is becoming extremist.
— Jack Hunter (@jackhunter74) July 1, 2021
Republican New York Rep. Elise Stefanik slammed Facebook’s “unconstitutional censorship.”
Big Tech strikes again! Facebook is now sending notifications to conservatives saying they’ve been exposed to “harmful extremist content”.
STOP Facebook’s unconstitutional censorship and take a stand against BIG TECH!
Sign and Share: https://t.co/Dpc2kqSU5i pic.twitter.com/laXCqHUtxZ
— Elise Stefanik (@EliseStefanik) July 2, 2021
2020 Libertarian Party vice presidential candidate Spike Cohen joked about Facebook refusing to tell users what content was allegedly harmful.
Facebook: Hey you’ve been exposed to harmful extremist content.
User: Oh no! What content was that?
U: Who did it came fromm
FB: Would you like some help to protect yourself and others?
U: Yes can you tell me what it was th-
FB: This was a test.
— Spike Cohen (@RealSpikeCohen) July 2, 2021
The warning is part of Facebook’s response to people calling for the platform to do more to moderate content. After accusations that President Donald Trump incited the Jan. 6 riot using Facebook, the platform, along with most major social media platforms, suspended his account. During a November hearing on social media and misinformation, Democratic lawmakers pressured Facebook and Twitter to censor more content, including “hate speech.”
A report by The Associated Press said that QAnon and militia movements were able to “promote violence” during the election using Facebook groups. The Guardian cited experts who said that policy changes to no longer recommend political groups to people don’t “do enough to combat the long history of abuse that’s been allowed to fester on Facebook.” Morning Consult said that Facebook has “neither the capacity nor the will to comprehensively remove violent extremist content and misinformation.”
Prominent accounts on Twitter have also called for Facebook to remove more content.
Time and time again, Facebook makes it OUR responsibility to stop our own harassment, instead of making systemic changes to protect Black users. And not only that, but their algorithms consistently amplify hate speech, misinformation and extremism.
— ColorOfChange (@ColorOfChange) July 8, 2021
Online hate speech against women can, and is, preventing #women from participating in healthy debate online.
What can social media networks do to stop it?
Marisa Jiménez, from @Facebook, tells us what they are doing.
— EPP Group (@EPPGroup) July 1, 2021
oh bullshit! Facebook has allowed Conservatives to get away with murder in terms of TOS because they know that their users tend to skew older, hence Conservative so they havent clamped down on bullshit/hate speech or misinformation hardly at all. Meanwhile Trump violated Twitter
— Nik Carter (@TheNotoriousNIK) July 2, 2021
Facebook did not respond to the Daily Caller’s request for comment.