Did This Fake News Experiment Go Horribly Wrong For Facebook?
Facebook recently concluded a test in which it promoted comments mentioning “fake news” to the top of certain posts, making them far more likely to be seen, according to a BBC report published Tuesday.
The experiment was meant to prioritize “comments that indicate disbelief,” the social media company reportedly told the BBC, which says that feeds from news outlets including the BBC itself and The New York Times were affected. The modification appeared on a variety of news stories, but only for a portion of users.
People across the Twittersphere expressed their displeasure with the new feature.
Can @facebook explain why the featured comment on every article contains “fake news”? 1000’s of comments. It has to be on purpose.
— Derek Spent (@derekspent) November 5, 2017
Facebook is showing “fake news” comments on the top, even if they don’t have enough likes pic.twitter.com/8ZMKyURhx9
— Héctor Bonilla (@hectorlbonilla) October 21, 2017
The fact that users noticed an inordinate number of comments containing “fake news,” and suspected it was Facebook’s doing, is telling: the pattern was blatant enough to stand out even though the phrase has become quite popular in the past couple of years.
The trial is an interesting move given people’s contemporary tendency to cry foul over misleading or fraudulent news. Many posts center on political issues — whether from a news organization or an old acquaintance from high school — and will therefore almost inevitably cause strife and virtual bickering in this hyper-polarized political climate. One of the newer favorite tactics in online arguments is shouting “fake news,” so promoting such comments means a lot of content’s top comment would center on that trite criticism.
“We’re always working on ways to curb the spread of misinformation on our platform, and sometimes run tests to find new ways to do this,” Facebook told The Daily Caller News Foundation. “We wanted to see if prioritising comments that indicate disbelief would help. We’re going to keep working to find new ways to help our community make more informed decisions about what they read and share.”
The test is just one of many “fake news”-fighting endeavors from Facebook. The company has been pressured by both public officials and parts of the public to crack down on disinformation, all while risking censorship and the abandonment of its ostensible free expression ethos.
Send tips to firstname.lastname@example.org.
Content created by The Daily Caller News Foundation is available without charge to any eligible news publisher that can provide a large audience. For licensing opportunities of our original content, please contact email@example.com.