Content moderation can be controversial, as demonstrated earlier this week when leading online platforms removed content and accounts posted by Alex Jones and his media property “Infowars.”
Many conservatives contend that the removal of Alex Jones’s content violated his freedom of speech. Ironically, these are often the same people who argue that private businesses should be able to operate the way they want.
Private entities, including online platforms, are not bound by the First Amendment, which applies only to action by the government. Private actors are bound by corporate policies and market forces.
The right to free speech only prevents governments from passing laws that restrict it; it does not force private platforms to host whatever content their users produce.
Online platforms like Facebook, Pinterest, and YouTube are private businesses that can allow or disallow content as they choose. Social media platforms can and do remove content for both legal and business reasons, and each platform has its own rules and processes.
Each platform publishes terms of service that operate as an agreement between the platform and users who post content. Beyond legal compliance, these agreements inform users how they may and may not use the platform.
These agreements also outline the rights of the platforms to moderate content. And platforms like Facebook and YouTube have implemented robust community standards and guidelines that minimize arbitrary takedowns of content.
Even the website managed by Alex Jones has terms of service describing content it allows and disallows: “If you violate these rules, your posts and/or username will be deleted. Remember: you are a guest here. It is not censorship if you violate the rules and your post is deleted.”
Each platform has its own business model for the experience it wants to deliver to users who post and consume content. Some platforms are lenient in their approach to content moderation and allow pornography, violent images, and offensive language. Others are very strict, banning almost anything that could be deemed offensive to readers.
Choosing how to moderate content is crucial for any business running a digital platform. The type of content they choose to host will determine how the platform is used and the audience they attract and retain.
Few people would use a platform if they were persistently offended and upset by content posted there. Platforms that want widespread adoption will, therefore, moderate content that is commonly deemed inappropriate.
Alex Jones is a controversial character and his content is often written to be offensive to a segment of the public. Why should a private company be obligated to host this content if they don’t want to?
Alex Jones has used his own app and website rather than rely just on platforms like Facebook and YouTube. Since his removal from several platforms, the “Infowars” app has become a top download.
Platforms that remove abusive content, be it racist, homophobic, or just downright distasteful, are going to be more popular with the wider public than those that don’t — whether supporters of Jones like it or not.
Facebook, YouTube, and others banning Alex Jones are not attempting to silence him. They are simply trying to protect their business model and retain their customers. Conservatives have to respect that right even as they argue for Alex Jones’s right to free expression.
Carl Szabo is general counsel for NetChoice, a trade association of eCommerce businesses and online consumers. Facebook is a member of NetChoice. Szabo is also an adjunct professor of privacy law at the George Mason University Antonin Scalia Law School.
The views and opinions expressed in this commentary are those of the author and do not reflect the official position of The Daily Caller.