- Conservatives are railing against a New York Times report that appeared to depict moderate conservative pundits as far-right ideologues.
- YouTube’s new algorithm might be inadvertently brainwashing young men, shifting them from conservatism to liberalism, The NYT reported.
- The NYT report suggests liberal internet trolls are hijacking algorithms to brainwash conservatives.
YouTube might be inadvertently brainwashing young people, turning them into far-right ideologues and then back into left-leaning activists, The New York Times reported Saturday.
The company’s business model rewards extremist videos with advertising dollars, while its algorithms are designed to keep viewers glued to that content. Such content is changing users’ ideological positions, the report notes. Conservatives on Twitter, meanwhile, blasted The NYT report for depicting some moderate pundits as alt-right.
“Yes, I know I’m on the front page of today’s New York Times, in a story about YouTube and far-right radicals. The story doesn’t mention me, so I can’t ask for a correction, but much like Milton Friedman, I’m obviously not ‘far-right,’” Candice Malcolm, a writer at the Toronto Sun, wrote on Twitter Sunday. She was referring to The NYT’s front-page presentation of the story, which appeared to paint moderate conservative voices as extremist ideologues.
Others were similarly upset. “The ‘newspaper of record’ publishing a front page story about one guy who watched youtube videos. There is no data, its an anecdote, the framing inverts the conclusion of the story, the core premise is easily debunked,” freelance journalist Tim Pool told his Twitter followers Sunday. Pool frequently reports on what he and many conservatives believe is big tech’s penchant for censorship.
The NYT report comes as other conservatives argue Google, which controls YouTube, has too much influence on human behavior. YouTube tweaked its artificial intelligence in 2015 and created a type of “addiction machine,” according to the report. The new technology was the brainchild of Google’s AI division, Google Brain, which helped transform YouTube’s system into a type of AI that mimics the human brain.
The report focuses on a 26-year-old man named Caleb Cain, who told The NYT that he “fell down the alt-right rabbit hole” on YouTube five years ago. He now distances himself from the movement, but he described how he was supposedly radicalized as the algorithm slowly but surely pulled him further and further into so-called right-wing content. (RELATED: Josh Hawley Introduces A Bill Cracking Down On Big Tech Exports To China)
“I just kept falling deeper and deeper into this, and it appealed to me because it made me feel a sense of belonging,” Cain said. “I was brainwashed.” Things began changing again for Cain sometime in 2018 after a group of liberal trolls began hijacking the algorithm delivering content, the report notes.
Cain found videos by Natalie Wynn, a former philosopher who goes by the name ContraPoints. She is part of a group of YouTubers who call themselves BreadTube, a reference to an 1892 book by the anarcho-communist Peter Kropotkin. The group also includes people like British philosopher Oliver Thorn, who hosts shows about topics like transphobia, racism and Marxist economics.
BreadTube’s strategy is a kind of algorithmic hijacking. By talking about many of the same topics that far-right creators do — and, in some cases, by responding directly to their videos — left-wing YouTubers are able to get their videos recommended to the same audience. Cain was a Donald Trump supporter in 2016, but the shift to Wynn and other content transformed him again, according to the report.
Cain started his own YouTube channel recently where he talks about politics from a left-wing perspective. “You have to reach people on their level, and part of that is edgy humor, edgy memes,” he told a reporter. “You have to empathize with them, and then you have to give them the space to get all these ideas out of their head.” Cain said he has no intention of curbing his YouTube viewing habits.
Some academics worry Google has too much influence. Former Google design ethicist Tristan Harris, for one, warns that big tech companies’ algorithms are effectively controlling people’s minds. “A handful of people working at a handful of technology companies, through their choices, will steer what a billion people are thinking today,” Harris said at a TED talk in 2017. Others worry such influence could affect elections.
Psychologist Robert Epstein, for instance, calls the kind of influence that The NYT suggests influenced Cain the Search Engine Manipulation Effect. “These are new forms of manipulation people can’t see,” he told the Los Angeles Times in March. Algorithms “can have an enormous impact on voters who are undecided. … People have no awareness the influence is being exerted.”
Epstein tracked 47,300 searches by undecided voters in the California districts of Democratic Reps. Katie Porter, Harley Rouda and Mike Levin, all of whom won their elections in 2018. Mainstream news outlets like the LA Times and The NYT dominated the Google search results, his research found. Searches on Yahoo and Bing, on the other hand, linked people to outlets like Breitbart.
Epstein’s model showed that roughly 35,455 voters were persuaded to vote for a Democrat entirely because of the sources Google fed them. Google CEO Sundar Pichai dismissed Epstein’s premise in a congressional hearing in 2018, telling lawmakers that the professor’s methods were flawed.