BuzzFeed Wants You To Fear Facebook’s Algorithms

Eric Lieberman, Managing Editor

A BuzzFeed story arguing that Facebook is run by monstrous, inhuman algorithms ignores the simple truth that humans came up with those algorithms in the first place.

As news that Facebook sold a relatively small number of ads to a Russian firm has come to light in recent weeks, the clamor over the tech giant’s influence continues to heat up. A number of articles addressing Silicon Valley’s growing power in general, including two from BuzzFeed, point to a palpable shift within the industry, specifically in its relationship to politics and society.

But BuzzFeed’s Sunday piece, titled “Mark Zuckerberg Can’t Stop You From Reading This Because The Algorithms Have Already Won,” appears to sow seeds of fear without concentrating on how algorithms come to be: through human direction.

In computer science, an algorithm is, in general terms, a process or set of rules, carried out as a series of calculations, that a computer follows to accomplish a task.
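
To make that concrete, here is a minimal, hypothetical Python sketch (not Facebook’s actual code; the weights and field names are invented purely for illustration) of what such a rule set looks like in practice: a feed-ranking function whose rules were chosen by, and can be changed by, human engineers.

    # A minimal, hypothetical feed-ranking sketch -- not Facebook's actual code.
    # The rules below are ordinary instructions written and weighted by people.
    def rank_posts(posts):
        """Order posts by a simple engagement score picked by a human designer."""
        def score(post):
            # Arbitrary illustration weights; changing them changes what users
            # see, and only people can decide to change them.
            return 10.0 * post["shares"] + 1.0 * post["likes"] + 0.5 * post["comments"]
        return sorted(posts, key=score, reverse=True)

    feed = rank_posts([
        {"id": 1, "likes": 120, "shares": 4, "comments": 30},  # score 175.0
        {"id": 2, "likes": 15, "shares": 40, "comments": 5},   # score 417.5
    ])
    print([p["id"] for p in feed])  # [2, 1] -- heavily shared posts rise to the top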

“The trouble comes in with humans, who can frequently be unpredictable in their awfulness,” Caleb Watney, technology policy associate at the think tank R Street, told The Daily Caller News Foundation. “Whether it’s saying nasty words, or attempting to manipulate the information flows of Americans to influence the election, humans are generally the problem – not algorithms.”

BuzzFeed’s Charlie Warzel asserts that Zuckerberg will likely be forced to read at least the headline of his story because the algorithm used for Facebook’s trending news section would probably detect his own name and company and compare it to his internally recorded interests.

“And there’s little the Facebook CEO can do to stop it, because he’s not really in charge of his platform — the algorithms are,” Warzel claimed.

The vast majority of algorithms are created by humans, and thus can be controlled and altered by humans.

“The algorithms are not all powerful, and we humans are not powerless,” Bret Swanson, visiting fellow at the American Enterprise Institute and fellow at the U.S. Chamber of Commerce Foundation, told TheDCNF. “We KNOW that more of our world is seen and processed through code, and we can deploy even more technology and human ingenuity to leverage, adjust to, and mitigate the bad effects of that code.”

Warzel also contends that Zuckerberg wouldn’t know how to explain the intricacies of Facebook’s algorithms if he were called to testify before Congress, something that seems increasingly likely. Regardless, even if Zuckerberg, the leader of the company, couldn’t explain the algorithms’ nuances in detail, surely someone at the company could, whether a software engineer or the CTO.

Warzel goes even further, saying not only would Facebook probably be unable to properly describe the algorithms in congressional testimony — which is plausible, given the federal government’s technology knowledge gap — but also that they’re “enormously difficult to monitor.”

“The idea that the algorithms can’t be monitored is, quite frankly, ludicrous,” Richard Bennett, one of the original creators of the Wi-Fi system, told TheDCNF. “It’s good software engineering practice to create logs of system decision-making so the flow can be monitored, tweaked, corrected, and optimized.”
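
The practice Bennett describes can be shown with a short, hypothetical Python sketch (the pipeline, file name, and fields are assumptions for illustration, not Facebook’s systems): every automated ranking decision is written to an audit log that engineers can later review, tweak against, and correct.

    # A minimal sketch of decision logging for a hypothetical ranking pipeline.
    import json
    import logging
    import time

    logging.basicConfig(filename="ranking_decisions.log", level=logging.INFO)

    def log_decision(post_id, score, shown_to_user):
        """Append one ranking decision, with its inputs, to an audit log."""
        logging.info(json.dumps({
            "timestamp": time.time(),
            "post_id": post_id,
            "score": score,
            "shown_to_user": shown_to_user,
        }))

    # Example: record why a given post was surfaced in a user's feed.
    log_decision(post_id=42, score=417.5, shown_to_user=True)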

Bennett said the reason for the apparent rise of fake news inside certain echo chambers is that such activity generates advertising revenue, not that the algorithms have been left unchecked.

“The Facebook algorithms didn’t create themselves,” Bennett added. “They were designed by Facebook engineers to serve the company’s business objectives.”

As reports surfaced that Zuckerberg handed over the 3,000 ads purchased by a Russian firm to congressional investigators, the tech wunderkind promised last week to protect “election integrity” with a number of new measures.

“In the next year, we will more than double the team working on election integrity. In total, we’ll add more than 250 people across all our teams focused on security and safety for our community,” he said, enumerating other changes.

The prospect of Russian companies with connections to the Kremlin trying to cultivate a more schismatic political landscape in America by purchasing divisive political ads further worries lawmakers who long feared Russia’s influence on the U.S. presidential election.

But most of the ads reportedly did not focus on then-Republican presidential nominee Donald Trump or then-Democratic presidential candidate Hillary Clinton, and the total sales amounted to $100,000. As Axios reporter Sara Fischer notes, that is a very small sum, especially spread over a two-year span. Measured against the massive political advertising ecosystem, the small number of ads means the ultimate impact on the election was probably limited.

Fake news in general had a very limited effect on people’s opinion of the candidates, according to a study conducted by economists at Stanford and New York University.

Nevertheless, following the election, many credited or blamed Facebook in particular for the apparent rise of “fake news” and for the election results. (RELATED: BuzzFeed’s Infamous Trump Dossier Is Facebook’s Most Read News Story In Past Year)

“For all its emotional appeal, the idea that Russia was able to change the outcome of the presidential election with a $100,000 Facebook ad buy is absurd. If it were true, then every political consultant in the U.S. would be out of a job,” Bennett told TheDCNF. “Hillary and her supporters spent lots of money on social media campaigns, just not as wisely as the Trump campaign. The election turned out the way it did because Hillary not only failed to win the white working class vote, she didn’t even bother to ask for it. Voters don’t like being disrespected.”

Facebook can still be blamed, as Warzel blames it, for constructing and configuring an algorithm that failed to yield optimal results. But Facebook will — and, more notably, always can — modify its algorithms to foster what it deems desirable outcomes, or even scrap them altogether. And Facebook, valued at roughly $500 billion, could probably afford to hire additional workers to oversee the algorithms and provide human filtering to accompany the embedded technology.

Zuckerberg originally called “the idea that fake news on Facebook” — which he clarified was a “very small amount of the content” — could influence the election “pretty crazy.” Now, he has cooperated with official investigators, including special counsel Robert Mueller, who is investigating Russian interference in the 2016 election. (RELATED: Obama Gave Zuckerberg A Talking-To About Facebook’s ‘Fake News’ Problem)

“If Zuckerberg is trapped by anything, it’s not by the power of Facebook’s algorithms,” said Watney. “It’s by the tangle of promises he’s made to the general public about the openness of the platform and the constant ways that people are trying to take advantage of that openness for their own gain.”

A cacophony of public demands may have confused Zuckerberg and Facebook leadership, perhaps leading them to deviate from their personal convictions and business-oriented mindset.

“A year ago Facebook faced criticism for doing too much curation on their platform and now they are under scrutiny for not doing enough,” Will Rinehart, director of technology and innovation policy at the American Action Forum, told TheDCNF. (RELATED: Facebook’s News Content Staff Was Filled With Liberals)

“Finding the signal from the noise is difficult, not just on Facebook, but on all platforms. Overall, I think we need to be more constrained in what we are asking of platforms. Finding an ever changing signal at scale is an incredibly hard task, which those in the industry rightly recognize,” he continued.

Warzel concluded with a reference to a New York Times article which described Facebook’s algorithms, among other things, as its “Frankenstein problem.”

“And in terms of responsibility, the metaphor is almost too perfect,” Warzel opined. “After all, people always forget that Dr. Frankenstein was the creator, not the monster.”

With the analogy, Warzel and The New York Times portray Facebook as its own worst enemy, the creator of its possible demise. But as the creator and controller of its oft-criticized algorithms, Facebook is just as likely to be its own savior.

Facebook did not respond to TheDCNF’s request for comment by the time of publication.

Follow Eric on Twitter

Send tips to eric@dailycallernewsfoundation.org.
