Justices at the United States Supreme Court have expressed uncertainty over whether to narrow a legal shield protecting internet companies from a wide array of lawsuits, in a major case involving YouTube and the family of an American student fatally shot in a 2015 rampage in Paris.
The justices heard arguments in an appeal by the family of Nohemi Gonzalez, a 23-year-old California State University, Long Beach student who was studying in France when she was killed, after a lower court dismissed their lawsuit against Google LLC-owned YouTube. Google and YouTube are part of Alphabet Inc.
In dismissing the lawsuit, the San Francisco-based 9th US Circuit Court of Appeals relied on a federal law called Section 230 of the Communications Decency Act of 1996, which protects internet companies from liability for content posted by their users.
This case marks the first time the Supreme Court is examining the scope of Section 230.
The justices asked questions reflecting their concerns about the potential consequences of limiting immunity for internet companies and about where any such line should be drawn. They also conveyed scepticism that these businesses should be shielded for certain types of harmful or defamatory content.
“These are not the nine greatest experts on the internet,” liberal Justice Elena Kagan said of her fellow Supreme Court members, eliciting laughter in the courtroom.
Gonzalez’s family claimed that YouTube, through its computer algorithms, unlawfully recommended videos by the group ISIL (ISIS), which claimed responsibility for the Paris attacks that killed 130 people. The recommendations helped spread ISIL’s message and attract recruits, the lawsuit said.
Kagan told a lawyer for the Gonzalez family, Eric Schnapper, that algorithms are widely used to organise and prioritise material on the internet and asked: “Does your position send us down the road such that [Section] 230 really can’t mean anything at all?”
Schnapper replied “no” and added, “As you say, algorithms are ubiquitous. But the question is, ‘What does the defendant do with the algorithm?’”
The lawsuit, accusing the company of providing “material support” for “terrorism”, was brought under the US Anti-Terrorism Act, a federal law that lets Americans recover damages related to “an act of international terrorism”.
The justices wondered whether YouTube should lose immunity if the algorithms that provide recommendations are “neutral” or are used to organise content based on users’ interests.
“I’m trying to get you to explain to us how something that is standard on YouTube for virtually anything that you have an interest in suddenly amounts to aiding and abetting because you’re in the ISIS category,” Justice Clarence Thomas told Schnapper.
Upending the internet?
Justice Samuel Alito asked Lisa Blatt, the lawyer representing Google, “Would Google collapse and the internet be destroyed if YouTube and therefore Google were potentially liable for hosting and refusing to take down videos that it knows are defamatory and false?”
Blatt responded, “Well, I don’t think Google would. I think probably every other website might be because they’re not as big as Google.”
The justices also questioned where the line should be drawn if Section 230 protections were narrowed.
Conservative Chief Justice John Roberts asked whether Section 230 should apply given that recommendations are provided by YouTube itself. “The videos don’t just appear out of thin air. They appear pursuant to the algorithms,” he said.
Kagan wondered about a website delivering defamatory content to millions of its users.
“Why should there be protection for that?” Roberts asked.
Google and its supporters have said a win for the plaintiffs could prompt a flood of litigation against platforms and upend how the internet works. Many websites and social media companies use similar technology to give users relevant content such as job listings, search engine results, songs and movies.
The case is a threat to free speech, they added, because it could force platforms to suppress anything that could be considered remotely controversial.
Section 230 protects “interactive computer services” by ensuring they cannot be treated as the “publisher or speaker” of information provided by users. Legal experts note that companies could employ other legal defences if Section 230 protections are eroded.
Critics of the law have said it too often prevents platforms from being held accountable for real-world harms. Many liberals have condemned misinformation and hate speech on social media. Many conservatives have said voices on the right are censored by social media companies under the guise of content moderation.
President Joe Biden’s administration has called for Section 230 to be reformed and asked the Supreme Court to revive the lawsuit by Gonzalez’s family, including her mother Beatriz Gonzalez and stepfather Jose Hernandez.
The 9th US Circuit Court of Appeals in 2021 ruled that the lawsuit was barred by Section 230 because it was seeking to hold Google accountable for the ISIL content, and its algorithms did not treat the group’s content differently than any other user-created content.