US Supreme Court leaves protections for internet companies unscathed

The U.S. Supreme Court handed internet and social media companies a pair of victories on Thursday, leaving legal protections for them unscathed and refusing to clear a path for victims of attacks by militant groups to sue these businesses under an anti-terrorism law.

The justices in the case involving Google LLC's video-sharing platform YouTube sidestepped making a ruling on a bid to weaken a U.S. law called Section 230 of the Communications Decency Act that safeguards internet companies from lawsuits for content posted by users. The justices in a second ruling shielded Twitter Inc from litigation seeking to apply a federal law called the Anti-Terrorism Act.

In both cases, families of people killed by Islamist gunmen overseas had sued to try to hold internet companies liable because of the presence of militant groups on their platforms or for recommending their content.

The justices in a 9-0 decision reversed a lower court's ruling that had revived a lawsuit against Twitter by the American relatives of Nawras Alassaf, a Jordanian man killed in a 2017 attack, claimed by the Islamic State militant group, during a New Year's celebration at an Istanbul nightclub.

In the case involving YouTube, which along with Google is part of Alphabet Inc, the justices returned to a lower court a lawsuit by the family of Nohemi Gonzalez, a college student from California who was fatally shot in an Islamic State attack in Paris in 2015. The justices declined to address the scope of Section 230, concluding that they did not need to take that step because the family's claims appeared likely to fail given the ruling in the Twitter case.

Section 230 provides safeguards for "interactive computer services" by ensuring they cannot be treated for legal purposes as the "publisher or speaker" of information provided by users.

Calls have come from across the ideological and political spectrum - including Democratic President Joe Biden and his Republican predecessor Donald Trump - for a rethink of Section 230 to ensure that companies can be held accountable for content on their platforms. This case marked the first time the Supreme Court had examined Section 230's reach.

"Countless companies, scholars, content creators and civil society organizations who joined with us in this case will be reassured by this result," said Google General Counsel Halimah DeLaine Prado. "We'll continue our work to safeguard free expression online, combat harmful content and support businesses and creators who benefit from the internet."

Critics have said Section 230 too often prevents platforms from being held accountable for real-world harms. Many liberals have condemned misinformation and hate speech on social media. Many conservatives have said voices on the right are censored by social media companies under the guise of content moderation.

The Istanbul massacre on Jan. 1, 2017, killed Alassaf and 38 others. His relatives accused Twitter of aiding and abetting the Islamic State, which claimed responsibility for the attack, by failing to police the platform for the group's accounts or posts in violation of the Anti-Terrorism Act, which enables Americans to recover damages related to "an act of international terrorism."

Gonzalez's family argued that YouTube provided unlawful assistance to the Islamic State by recommending the militant group's content to users. In their brief ruling on Thursday, the justices wrote that they "decline to address the application of (Section 230) to a complaint that appears to state little, if any, plausible claim for relief."

Twitter and its backers had said that allowing lawsuits like the one brought by Alassaf's family would threaten internet companies with liability for providing widely available services to billions of users because some of them may be members of militant groups, even as the platforms regularly enforce policies against terrorism-related content.

The case hinged on whether the family's claims sufficiently alleged that the company knowingly provided "substantial assistance" to an "act of international terrorism" that would allow the relatives to maintain their suit and seek damages under the anti-terrorism law.

After a judge dismissed the lawsuit, the San Francisco-based 9th U.S. Circuit Court of Appeals in 2021 allowed it to proceed, concluding that Twitter had refused to take "meaningful steps" to prevent Islamic State's use of the platform.

Conservative Justice Clarence Thomas, who authored the ruling, said the allegations made by the plaintiffs were insufficient because they "point to no act of encouraging, soliciting or advising the commission" of the attack.

"Rather, they essentially portray defendants as bystanders, watching passively as ISIS carried out its nefarious schemes," Thomas added.

Biden's administration supported Twitter, saying the Anti-Terrorism Act imposes liability for assisting a terrorist act and not for "providing generalized aid to a foreign terrorist organization" with no causal link to the act at issue.