Opinion

LARSON: After Purchasing Twitter, Elon Musk Must Stop The Spam, Child Abuse, And Deplatforming


Caleb Larson, Contributor

Under the leadership of Elon Musk, Twitter may finally stop enforcing political censorship and instead focus on properly protecting its users, vulnerable children, and our nation’s tradition of free and open discourse. With his recent offer to buy Twitter at his original price, Musk has a chance to right the ship. The site may currently be safe for only one side of the public discourse, but it doesn’t have to be this way. If Musk does finally purchase Twitter, he must have the courage to excise its leftward slant and focus the Big Tech company’s efforts on solving its real problems: censorship, cybersecurity, and child exploitation material.

Musk now has a chance to overhaul the priorities of one of Big Tech’s most influential sites. It is no secret that Twitter is outright adversarial to conservatives who post content contrary to the “accepted” liberal narratives of the day. Yet while Twitter pours resources into censorship in the name of safety, there are dark and serious problems it refuses to address with adequate attention and resources.

In July, the former head of Twitter’s security, Peiter Zatko, sent a letter to the Securities and Exchange Commission outlining what he says are “extreme, egregious deficiencies in … user privacy, digital and physical security, and platform integrity / content moderation.” Rather than upholding its claims of a solid security plan, Zatko says, Twitter overlooked vulnerable software, withheld breach information, and failed to protect user data and account integrity, failures that led to the takeover of high-profile accounts such as those of Barack Obama and Donald Trump. As for spam activity and the large number of bots on the platform (something about which Elon Musk has expressed reservations), Zatko claims Twitter focused on growth rather than on eliminating spam because executives gain financially from adding users regardless of their quality or authenticity.

Even more sinister than lax cybersecurity controls is the presence of child sexual abuse material (CSAM) on Twitter. Recently, many advertisers pulled their ads from Twitter after learning they had been appearing next to tweets soliciting CSAM. Ghost Data, a cybersecurity company that researched the prevalence of accounts posting CSAM-related content on Twitter, found 500 accounts that had shared or requested CSAM over a 20-day period; Twitter removed fewer than 30% of them. Those are unacceptable numbers for a company that declares it has “zero tolerance for child sexual exploitation.”

Perhaps this is to be expected from a company that considered monetizing the pornography on its site through a subscription service similar to OnlyFans, only to be stopped by an internal report that found “Twitter cannot accurately detect child sexual exploitation and non-consensual nudity at scale.” Employees claim executives know about the CSAM problem but do not do enough to address it. This is particularly troubling given that the National Center for Missing & Exploited Children’s CyberTipline received 29.3 million reports of suspected child sexual exploitation in 2021, a 35% increase over 2020.

Twitter is a notorious Big Tech platform that focuses on censoring ideas and debate while apparently ignoring the security of its users, the volume of spam on the platform, and the proliferation of child abuse material. Elon Musk has generated positive outcomes in space exploration, electric vehicle innovation, and much more, and he has indicated he has a vision for Twitter that would benefit public discourse. If he completes the purchase, he must reject the Big Tech consensus and dedicate the company’s resources to defending the best of what the internet has to offer rather than ignoring or embracing the worst.

Instead of punishing viewpoints that go against the mainstream media’s narratives, Twitter could allow a flourishing of free information exchange uninhibited by top-down dictates and permanent bans. Prioritizing healthy user growth by solving the spam problem would make the platform more enjoyable for users, more profitable, and a model for what an online public square should look like. Most importantly, unrelenting focus should be placed on relieving the plight of vulnerable children. All it would take is a bit of courage.

Caleb Larson is a cybersecurity researcher, policy analyst with the Internet Accountability Project, a Heritage Foundation alum, and contributor at Human Events where he writes about cybersecurity-related issues facing the United States.