The Supreme Court announced Monday that the justices will take up cases that go to the heart of whether the law shields YouTube, Twitter and other internet companies from liability over the way they promote content to their users.
Both cases involve families of victims of terrorist attacks, who say the tech giants should be held responsible for fueling the rise of the Islamic State, or ISIS.
One case argues that the tech companies don’t do enough to police what’s posted on their sites, while the other case directly challenges the algorithms social media companies use to recommend more content to their users.
That case could upend Section 230 of the Communications Decency Act, a law that until now has generally shielded social media companies as conduits rather than actual speakers.
Google, which owns YouTube, had urged the justices to leave the current framework in place, warning that changes could rewrite the internet as we know it.
“This Court should not lightly adopt a reading of section 230 that would threaten the basic organizational decisions of the modern internet,” Google’s lawyers wrote in their first brief to the justices in the case Gonzalez v. Google.
The court will also hear a similar case in Twitter v. Taamneh.
The Gonzalez case was brought by the family of Nohemi Gonzalez, a 23-year-old U.S. citizen who was studying abroad in Paris in 2015 when she was gunned down in an ISIS terrorist attack.
Her family is suing Google, saying the company aided and abetted ISIS by allowing the terrorist group to post radicalizing videos and recruitment material on YouTube.
Although the lower court found Google shielded by Section 230 in the Gonzalez case, the same appeals panel allowed a separate suit under federal anti-terrorism law to proceed, finding the plaintiffs had plausibly alleged that Google, Twitter and Facebook aided and abetted international terrorism.
The lawyer leading the Gonzalez family’s legal team cheered the high court’s move to hear the cases, but wouldn’t say whether it’s a sign the justices will curtail tech companies’ legal shield.
“We are of course pleased that the Court took Gonzalez. What it will mean for tech companies in the future will depend on how the Court decides the cases,” attorney Eric Schnapper said.
The Twitter lawsuit was brought by U.S. relatives of Nawras Alassaf, a Jordanian citizen who was shot and killed at a nightclub in Istanbul in 2017, in an attack claimed by ISIS.
That lawsuit focuses on the Anti-Terrorism Act, with the plaintiffs arguing tech companies can be held liable for “aiding and abetting” ISIS by not doing more to take down its content.
Neither lawsuit alleges the tech companies were directly involved in the attacks.
The cases come to the justices at a time when the tech giants are under intense scrutiny for their handling of deeply divisive political debates, including the COVID-19 pandemic and the 2020 election.
The tech companies say they are enforcing their terms of service when they curtail the spread of disinformation and hateful or violent speech.
Both Democrats and Republicans on Capitol Hill have called for updating the law, though there’s little agreement so far on how to do it.
States are moving ahead, however.
Texas and Florida have enacted laws that would allow individuals or the state’s attorney general to sue large social media platforms for squelching a viewpoint. A federal appeals court has said the Texas law can take effect, while the Florida law has been enjoined.
Litigants have already petitioned the justices to take up those cases.
A key question in the Google case is whether Section 230’s protections extend to the algorithms platforms use to decide which content to promote to users.
Those algorithms are central to the modern concept of social media, and a ruling against Google could force major changes on the industry.
“Today the income of many large interactive computer services is based on advertising,” Mr. Schnapper wrote in his brief to the justices on behalf of the Gonzalez family. “Internet firms that rely on advertising have a compelling interest in increasing the amount of time that individual users spend at their websites. The longer a user is on a website, the more advertising the user will be exposed to; that in turn will increase the revenue of the website operator.”
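To make the economic logic in Mr. Schnapper’s brief concrete, the toy sketch below ranks hypothetical videos by expected ad exposure. The class names, numbers and scoring formula are invented purely for illustration under the assumption of engagement-driven ranking; they do not describe YouTube’s actual recommendation system.

```python
# Illustrative only: a toy model of engagement-driven ranking, in which
# recommendations are ordered by predicted watch time, so content expected
# to keep a user on the site longer is surfaced more prominently.

from dataclasses import dataclass

@dataclass
class Video:
    title: str
    predicted_watch_minutes: float  # hypothetical engagement estimate
    ad_slots_per_minute: float      # hypothetical ad load

def expected_ad_impressions(video: Video) -> float:
    """More expected watch time means more ads shown, hence more revenue."""
    return video.predicted_watch_minutes * video.ad_slots_per_minute

def rank_recommendations(candidates: list[Video]) -> list[Video]:
    """Order candidates so the highest expected ad exposure comes first."""
    return sorted(candidates, key=expected_ad_impressions, reverse=True)

if __name__ == "__main__":
    feed = [
        Video("Cooking tutorial", predicted_watch_minutes=4.0, ad_slots_per_minute=0.5),
        Video("Hour-long commentary", predicted_watch_minutes=35.0, ad_slots_per_minute=0.4),
    ]
    for v in rank_recommendations(feed):
        print(v.title, round(expected_ad_impressions(v), 1))
```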
At least four justices had to vote to hear the cases for them to be added to this term’s docket.
The court does not disclose that vote, but Justice Clarence Thomas was likely among them. He signaled in a 2020 statement that he was looking for “an appropriate case” to delve into Section 230.
He said that section of law was written in 1996, at the “dawn of the dot-com era,” when the big legal question was whether chat rooms could be held liable for what users posted. Section 230 was meant to protect companies by establishing that they were not the publishers of third-party content and, so long as they did not knowingly host illegal material, could not be held liable for it.
But Justice Thomas said in the years since, lower courts have expanded that into “sweeping immunity” for tech companies. He said in his 2020 statement that it was time the justices took a look at what the law said, and compared it to what courts have actually done.
• Alex Swoyer can be reached at aswoyer@washingtontimes.com.