- The Washington Times - Thursday, May 18, 2023

The Supreme Court gave some breathing room to technology companies on Thursday by shooting down two lawsuits by relatives of victims of terrorism who wanted to hold social media platforms liable for hosting terrorist content.

The justices ducked major questions about how far federal law goes in shielding companies from liability for content posted on their sites. Instead, the justices ruled that the families failed to show sufficiently firm connections between general terrorist activity online and the specific attacks that killed their relatives.

“Plaintiffs’ allegations are insufficient to establish that these defendants aided and abetted ISIS in carrying out the relevant attack,” Justice Clarence Thomas wrote for the court. “They essentially portray defendants as bystanders, watching passively as ISIS carried out its nefarious schemes. Such allegations do not state a claim for culpable assistance or participation.”

The court delivered unanimous rulings in the two cases, which all sides in the social media debate watched closely.

The cases make clear that those challenging social media companies’ operations must prove specific connections between online postings and the damages they claim.

Some social media critics hoped the court would narrow the scope of tech companies’ liability protections under Section 230 of the Communications Decency Act, which generally holds internet firms harmless for postings on their sites.

The justices said the case that specifically challenged Section 230 was too tenuous for them to address.

“It has become clear that plaintiffs’ complaint — independent of §230 — states little if any claim for relief,” the court said in an unsigned opinion.

That case involved a challenge by relatives of Nohemi Gonzalez, an American killed in an Islamic State attack in Paris. Her family said YouTube, owned by Google, should be liable because it didn’t do enough to purge ISIS material and even promoted it to users through algorithms for recommended videos.

Another case involved a challenge against Twitter, Google and Facebook by U.S. relatives of Nawras Alassaf, a Jordanian who was killed in an Islamic State-inspired mass shooting in Istanbul. The family said the social media giants aided and abetted ISIS by hosting its content and, in some cases, deriving ad revenue from it.

They sued under the Justice Against Sponsors of Terrorism Act.

In his 38-page opinion, Justice Thomas said the family would have had to show that the tech companies knowingly provided substantial assistance to terrorists that helped them commit the attack.

He said general claims that Twitter and the other companies weren’t diligent enough to weed out terrorist content were insufficient.

“Our legal system generally does not impose liability for mere omissions, inactions, or nonfeasance; although inaction can be culpable in the face of some independent duty to act, the law does not impose a generalized duty to rescue,” he wrote.

During oral arguments this year, attorneys for the victims’ families told the justices that the tech companies weren’t doing enough.

Eric Schnapper, an attorney for the Gonzalez family, said the challenge wasn’t against Google’s ability to run social media sites but rather the algorithms that promoted content based on users’ searches.

That meant someone looking for ISIS content was fed even more of it, which Mr. Schnapper argued crossed lines. He said the government and media told the company about promoting violent content “dozens of times.”

“They did almost nothing about it,” he told the justices.

The justices wondered where to draw lines. They pondered whether a telecommunications company could be liable for someone who uses a phone to make criminal plans or whether a taxi driver who unknowingly transports someone who then commits a crime would bear liability.

The court said Thursday that those questions weighed heavily in favor of the tech companies.

“If aiding-and-abetting liability were taken too far, then ordinary merchants could become liable for any misuse of their goods and services, no matter how attenuated their relationship with the wrongdoer. And those who merely deliver mail or transmit emails could be liable for the tortious messages contained therein,” Justice Thomas wrote. “For these reasons, courts have long recognized the need to cabin aiding-and-abetting liability to cases of truly culpable conduct.”

Chris Marchese, director at NetChoice Litigation Center, a trade group for tech companies, called the rulings a victory for “free speech on the internet.” He said the court’s decisions prevent the companies from having to strictly police postings on their sites.

“Even with the best moderation systems available, a service like Twitter alone cannot screen every single piece of user-generated content with 100% accuracy. Imposing liability on such services for harmful content that unintentionally falls through the crack would have disincentivized them from hosting any user-generated content,” Mr. Marchese said.

Sen. Ron Wyden, an Oregon Democrat who helped write Section 230, called the rulings “thoughtful” and said they should derail a push on Capitol Hill to rewrite Section 230.

He said most liability challenges to tech companies would fail on First Amendment grounds or, as in the Supreme Court cases, for an inability to prove a connection to the companies.

“While tech companies still need to do far better at policing heinous content on their sites, gutting Section 230 is not the solution,” Mr. Wyden said.

He said Congress should focus instead on protecting consumer privacy and reining in data brokers “in ways that don’t make it harder for users to speak or receive information.”

Not all senators agreed.

Senate Judiciary Committee Chairman Richard J. Durbin, Illinois Democrat, said the court’s rulings were disappointing.

“The Justices passed on their chance to clarify that Section 230 is not a get-out-of-jail-free card for online platforms when they cause harm,” Mr. Durbin said. “Enough is enough. Big Tech has woefully failed to regulate itself. Congress must step in, reform Section 230, and remove platforms’ blanket immunity from liability.”


• Stephen Dinan can be reached at sdinan@washingtontimes.com.

• Alex Swoyer can be reached at aswoyer@washingtontimes.com.

Copyright © 2024 The Washington Times, LLC.
