
‘A Disastrous Effect’: Landmark Supreme Court Case Could Muzzle The Internet, Experts Warn


Kate Anderson, Contributor
  • The Supreme Court is hearing a case challenging Section 230, the law that has shielded internet platforms from liability, and a ruling against Google could completely change the internet, according to experts.
  • After the 2015 Paris terror attacks that killed 130 people, relatives of the victims sued Google, alleging its algorithms contributed to the radicalization of the terrorists responsible.
  • “If the court were to rule in favor of Gonzalez you’re going to have this really weird legal situation where technically if you do any type of moderation, that signals that you have knowledge that there is bad content on your platform and therefore could be held liable for it,” an expert told the DCNF.

A Supreme Court case challenging Section 230, the law that has broadly shielded tech companies from liability for user content, could lead to “disastrous” changes in how companies recommend content to their users, according to some experts.

In November 2015, coordinated terror attacks in Paris killed 130 people, and several of the victims’ relatives filed a lawsuit against Google claiming that the company’s algorithms created a rabbit hole that radicalized the terrorists, according to Lawfare. If the Supreme Court were to rule in favor of Reynaldo Gonzalez, a relative of one of the victims, experts told the Daily Caller News Foundation, there would be far less speech online and the internet as we have traditionally understood it would fundamentally change.

Chris Marchese, general counsel for NetChoice, a tech trade association that advocates for online free speech, counts Amazon, TikTok and Lyft among its members and is fighting similar cases in Texas and Florida, told the DCNF the “ramifications of this case for the internet are really severe.” (RELATED: ‘Take Aim’: Adam Schiff Threatens Big Tech Unless They Censor More Content)

“First and foremost, I think everyone is in agreement that the facts of Gonzalez are atrocious; innocent American lives were lost and international terrorism is at fault,” Marchese said. “If the Supreme Court were to agree with the Department of Justice, as well as Gonzalez, and hold that section 230 does not apply to algorithms [then] … [y]ou really can’t have an online website of any kind without some kind of algorithm organizing the content.”

Marchese explained that social media and even the Google search bar would fundamentally change, since much of what those platforms do relies on algorithms. Without the ability to suggest content, tech platforms could not perform basic functions such as surfacing trending topics on Twitter or creating playlists for music and videos.

Section 230 was enacted by Congress in 1996 to let individuals express themselves on a platform, consistent with the First Amendment, without the companies being held liable for speech that could be hateful, violent or dangerous, according to the Bipartisan Policy Center. Section 230(c)(1), a 26-word provision, has been dubbed the “words that created the internet.”

“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider,” the section reads.

Rachel Bovard, senior director of policy at the Conservative Partnership Institute, told the DCNF that “Gonzalez’s case is unique because of the terrorism aspect.”

“This case looks at whether or not the algorithmic amplification of content is still protected under Section 230,” Bovard stated. “This case is trying to say, well when it comes to this case which deals with terrorist content, the action that YouTube/Google took to act upon that content by promoting it is not protected, so they are in fact liable.”

Gonzalez alleges that before the Paris attacks, Google used its algorithms to create and recommend content and “knowingly” allowed terrorist groups like ISIS to use the platform to recruit and “incite violence,” in violation of the Antiterrorism Act, which allows victims of international terrorism to seek civil damages from those who aid it, according to the ACLU. As a result, Gonzalez argues, Google falls outside the protection of Section 230 and is partially responsible for the terrorism in Paris.

Marchese said there would be “far less speech on the internet” if Section 230 were “gutted” by the Supreme Court. Adam Candeub, director of the Intellectual Property, Information & Communications Law Program at Michigan State University and former Acting Assistant Secretary of Commerce for Telecommunications during the Trump administration, echoed similar thoughts but said it would be the platforms that would have to be “a lot more careful about what they say.”

City workers clean the sidewalk and the street in front of the Bataclan concert hall in Paris on December 22, 2015, after the sidewalk in front of the venue was once again made accessible to pedestrians. (FRANCOIS GUILLOT/AFP via Getty Images)

A potentially larger concern is whether a ruling in favor of Gonzalez would prevent tech companies from freely removing content that is harmful and violent, such as abuse against children. Eric Schnapper, the University of Washington law professor representing Gonzalez, argued that this was a “separate issue.”

“We are in favor of social media sites being able to take down content that’s offensive or dangerous,” Schnapper told the DCNF. “Our concern is that they are recommending things and promoting things that are dangerous.”

Candeub seemed to agree with Schnapper, explaining that Section 230(c)(2) addresses such concerns. Section 230(c)(2) protects any “provider” or “user of an interactive computer service” from liability for restricting content that is “obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable,” according to the Bipartisan Policy Center.

“The platforms prefer (c)(1) because the immunity is absolute. Under (c)(2), however, they can only get it if they show good faith and that the content that they are removing fits into that category, so they have a lot more to prove,” Candeub said. “What [Google] will be arguing is that (c)(1) covers everything and therefore if (c)(1) is cut back then they won’t be able to protect children, and that’s not true. If (c)(1) is cut back they will have to rely on (c)(2), and they don’t like (c)(2) as much.”

Marchese disagreed, arguing that narrowing the scope of Section 230(c)(1) would have a “disastrous effect.”

“If the court were to rule in favor of Gonzalez you’re going to have this really weird legal situation where technically if you do any type of moderation, that signals that you have knowledge that there is bad content on your platform and therefore could be held liable for it,” Marchese pointed out. “Which is precisely what section 230 was meant to do away with.”

The Supreme Court is scheduled to hear Gonzalez v. Google LLC on Feb. 21. Candeub said he would be “surprised if the court ruled in favor of Gonzalez.”

“There’s just not enough facts in this case for the Supreme Court to say anything useful,” Candeub noted. “We don’t know how these algorithms work. You can’t just wave your hands and expect immunity, Google can’t do that. The question is, is it a speaker, is it communicating its own message through its algorithms?”

Google LLC counsel Lisa Blatt did not respond to the Daily Caller News Foundation’s request for comment.
