YouTube’s move to sanitize itself for “family friendly” advertisers is crippling news channels and independent journalists, who are now unable to use the platform to report news from the ground.
Machine-driven algorithms are mass-deleting videos, and channels caught up in the fray are also being suspended.
YouTube’s rules for offering “context” — meant to help journalists avoid being flagged — only exacerbate the censorship issue, as they prevent journalists from even describing the topics they cover.
At present, YouTube is the de facto platform for citizen journalists and alternative media commentators who report on news events that most people would not know about, were it not for their work. Often living in the regions they cover, these journalists can go to places where mainstream outlets do not dare to tread.
YouTube’s crackdown on extremist religious content is hurting their ability to get the word out. For reporters on the ground in Syria, evidence of war crimes and atrocities is being systematically flagged and removed by the platform’s automated systems before it is even reviewed by humans.
The New York Times recently reported that organizations like Airwars, which tracks international airstrikes, were being affected by the new algorithm.
Additionally, over 6,000 videos hosted by Qasioun News Agency to document the war in Syria were deleted when the channel was suspended. Other affected channels included the Syrian Ministry of Defense and at least five channels belonging to Syrian opposition groups — all of which provided different angles on the ongoing conflict.
Beyond these channels, an unspecified number of citizen journalists and vloggers covering the stories have also been impacted by the sweep. Only a few videos (including Qasioun’s) were reinstated following a review by actual humans. Many of these videos were initially removed for “graphic content” despite containing nothing of the sort.
YouTube’s efforts to purge the platform of extremist content are largely powered by machine learning — a capability the company bragged about in a blog post several weeks ago. At the time, popular YouTubers told The Daily Caller that they were worried their content would be censored. They weren’t wrong.
Besides skeptic and social commentary YouTubers, conservative and independent news channels were also hit by the algorithm, which caused their channels to be demonetized or delisted.
Affected YouTubers speculate that the algorithm prioritizes channels that already have a flagged video: the instant one video is flagged for removal, the rest follow suit. Under YouTube’s three-strikes system, a single strike can quickly avalanche into countless strikes, causing channels to be deleted within hours of a takedown.
While news and human rights organizations can share hosting duties amongst themselves, independent journalists rely almost entirely on YouTube to host and share their work. Prosecutors building cases against war criminals will also have more difficulty gathering evidence due to YouTube’s actions, since many rely on videos hosted on the platform.
Organizations and journalists intent on covering warzones are advised to follow YouTube’s guide for “context.” One of the rules states that contributors should “avoid using descriptions, titles, tags and thumbnails that highlight the most provocative or shocking aspects of your video.”
While avoiding gory thumbnails is understandable, one might ask how it would be possible to offer context in titles, tags, and descriptions for news reports that are shocking by their very nature.
How would one describe the recent terrorist attack in Barcelona that left 14 people dead? “Van in crowded Barcelona street alarms Europe” is hardly descriptive. And euphemisms would be nothing short of disrespectful. Would the Pulse nightclub shooting be described as “Pious man goes to nightclub, makes sure others don’t have a good time?”
Due to YouTube’s censorship of “shocking” content, the only safe way to cover these topics is to not name them at all — rendering the reports undiscoverable to anyone searching for the stories.