Pedophiles have taken advantage of Instagram’s algorithms and networking functions to engage in predatory behavior en masse, according to a Wall Street Journal investigation.
While child sexual predators have long used the internet to facilitate their crimes, the report found that Instagram’s recommendation algorithms actively promote thinly veiled illicit content and enable pedophiles to connect with one another. The platform’s algorithms link buyers and sellers with similar niche interests, including accounts trafficking in highly sexualized and disturbing explicit material, according to the report.
Researchers from Stanford University and the University of Massachusetts Amherst found that the platform permitted users to search for content under hashtags such as “preteensex” and “pedowhore,” according to the report. These hashtags aggregated accounts and content directly tied to the exchange of graphic, illegal sexual material involving children, the report continues. Some of these accounts purported to make children available for in-person contact with pedophiles willing to pay, according to the report.
This work requires sustained, meaningful investments in Trust & Safety teams given the significant harms. We’ll continue to work with platforms to improve detection & prevention of risks to child safety, both directly and in concert w/ @missingkids & @tech_coalition. 7/7
— Stanford Internet Observatory (@stanfordio) June 7, 2023
Posting and promoting graphic sexual content and child pornography violates the user rules of Instagram and its parent company, Meta. Such activity also flagrantly violates federal laws designed to protect children from sexual exploitation.
Accounts that engage in the commercialized exchange of child sexual content rarely do so explicitly, according to the report. Instead, they advertise their offerings on “menus” that thinly disguise the true nature of what is for sale. These niche sexualized accounts do not appear to the vast majority of the platform’s users; rather, they exist and interact with one another in the platform’s dark corners, selling items such as videos of young children engaging in sexual activity with animals, according to the report and the researchers’ findings.
“Child exploitation is a horrific crime,” Meta said in a statement to the WSJ. The company is actively exploring “ways to actively defend against this behavior,” it said in the same statement. Meta claimed to have dismantled 27 pedophile networks on the platform and is looking to shut down more, according to the report. The company has established an internal task force to further address the problem of child sexual exploitation on Instagram, according to a report from The Verge.
The report comes in the wake of a heated public debate over the role that Instagram and other social media platforms ought to play in moderating the content and speech their users post. While platform moderators targeted content relating to allegations of election fraud and the efficacy of COVID-19 vaccines, the WSJ report indicates that content moderators missed considerable amounts of sexually exploitative content.
Elon Musk said that stamping out child sexual exploitation networks on Twitter was his “#1 priority” when he acquired the platform in Oct. 2022. In a Dec. 2022 tweet following the resignations of former Twitter content moderators, Musk said “it’s a crime that (former management) refused to take action on child exploitation for years.”
All content created by the Daily Caller News Foundation, an independent and nonpartisan newswire service, is available without charge to any legitimate news publisher that can provide a large audience. All republished articles must include our logo, our reporter’s byline and their DCNF affiliation. For any questions about our guidelines or partnering with us, please contact email@example.com.