
Facebook Is Testing A New Feature That Provides ‘Context’ To News Articles


Eric Lieberman, Managing Editor

Facebook announced Thursday that it’s starting to test a new feature that provides “additional context” to articles seen on users’ News Feeds.

The goal, Facebook says, is to “provide people some of the tools they need to make an informed decision about which stories to read, share, and trust.”

The extra context, which can be pulled up with the click of a dedicated button, includes “information from the publisher’s Wikipedia entry,” related articles on the topic, and how many people are sharing the story, among other details.

The tech conglomerate developed the idea with feedback from the community and, more importantly, with the help of certain publishers that have worked with the company on its Facebook Journalism Project.

But the particular organizations and people working with Facebook may be unsettling to those who want impartiality in news story selection and a diverse set of viewpoints on any given topic.

Facebook CEO Mark Zuckerberg, for example, said last year that Facebook would be partnering with Snopes as one of its third-party fact-checking organizations, despite the fact that it almost exclusively employs left-leaning reporters and has suspect verification skills. (RELATED: Snopes Deliberately Omits Key Details To Protect Kerry’s State Dept.)

Facebook also hired former CNN anchor Campbell Brown, who is an adamant member of the “never Trump” faction.

Facebook continues to push ahead with its News Feed initiatives, as well as its fight against “fake news,” even though Zuckerberg recently declined shareholders’ demands at a board meeting to be more aggressive. The push is likely because Facebook, and Zuckerberg specifically, have been under constant pressure in recent months to identify and remove deceptive or false news stories on the platform, even though subjectivity creeps into even the most seemingly scientific processes. (RELATED: Obama Gave Zuckerberg A Talking-To About Facebook’s ‘Fake News’ Problem)

Facebook tried to eliminate that subjectivity by firing its trending news team after a former Facebook worker told Gizmodo that the team routinely suppressed conservative news. It subsequently replaced the human workers with algorithms, which can also be biased, as John Giannandrea, Google’s chief of artificial intelligence, recently said.

But algorithms, which are created by humans, can also be altered by humans if bias or other problems occur.

A recent BuzzFeed story made it seem as if Facebook’s algorithms are uncontrollable and will keep feeding users misleading or fraudulent news. But several tech experts rejected the notion that algorithms (portrayed by the author as an irrepressible monster) can’t be changed, and fake news on Facebook barely affected people’s opinions of the presidential candidates, according to Stanford and NYU economists, and certainly did not change the outcome of the election.

Algorithms are not directly mentioned in Facebook’s press release on the new feature, but they will likely power the contextual tool at least in part. Direct assistance from “publishers,” however, may be worrisome to those who fear partisan or slanted censorship.

Follow Eric on Twitter

Send tips to eric@dailycallernewsfoundation.org.
