The Supreme Court cast a skeptical eye Tuesday on a challenge to big social media companies, with justices expressing concern that stripping them of legal protection for how they promote content on their websites could upend the entire internet economy.
“Lawsuits will be nonstop,” predicted Justice Brett M. Kavanaugh.
The case before the justices came from the family of a woman who died in an Islamic State terrorist attack in Paris in 2015.
The family, citing U.S. anti-terrorism law, said Google, which owns YouTube, could be held liable for promoting ISIS content because its algorithms suggest the terrorist group’s videos to people who go looking for them.
But the implications of the case are much broader.
Several justices said those algorithms are at the heart of the internet, and that opening companies up to liability could destroy everything from search engines to dating sites to restaurant reviews.
Justice Amy Coney Barrett questioned whether the act of retweeting or “liking” someone else’s content could expose a user to liability.
Lisa Blatt, Google’s lawyer, said the law’s theory is that when harm flows from a posted video or story, it is the content’s creator, not the internet company, that is speaking. The company contributes only the algorithm that decides which content to suggest to each user, and that, she said, is not speech.
“There are billions of hours of video a day watched on YouTube,” she said. “They have to organize it somehow.”
Tuesday’s case, and another one to be heard Wednesday, both challenge the way tech companies claim protection under Section 230 of the Communications Decency Act, a 1996 law that said the companies shouldn’t be considered publishers for content provided by third parties.
Relatives of Nohemi Gonzalez, an American who was killed in Paris, say ISIS posted content on YouTube, that Google wasn’t diligent enough in removing it, and that in some instances the platform would actually “recommend” ISIS videos to users.
Wednesday’s case was brought by U.S. relatives of Nawras Alassaf, a Jordanian who was killed in an ISIS-inspired mass shooting in Istanbul. They said Twitter, Facebook and Google aided and abetted ISIS by hosting its content and, in some cases, deriving ad revenue from it.
Eric Schnapper, a lawyer for the Gonzalez family, said the companies had been told by the government and the media about the promotion of the violent content “dozens of times.”
“They did almost nothing about it,” he said.
The Biden administration argued that service providers can be sued over their own organizational choices and decisions while still enjoying legal protection under Section 230 for content created by a third party. The government suggested the issue for YouTube and Google lies in recommending certain videos.
“It’s still the platform’s own choice,” said Malcolm Stewart, deputy solicitor general.
The technical questions Tuesday sometimes flummoxed the justices, all of whom are at least 50 years old and grew up in a pre-internet world.
It’s a point Justice Elena Kagan made.
“We’re a court. We really don’t know about these things. These are not like the nine greatest experts on the internet,” she said.
She said given the potential stakes, with tech companies warning an adverse ruling could crash the digital economy, judges should be cautious about inserting themselves into the complicated argument.
“Isn’t that something for Congress to do, not the court?” she said.
Mr. Schnapper said the families aren’t challenging the companies’ hosting of content, but rather the algorithms that promote content or automatically show additional material.
He said those tools cross the line because the company is doing the promoting, giving content more visibility than it would have received otherwise.
Justices said his argument was confusing.
“I don’t know where you’re drawing the line, that’s the problem,” said Justice Samuel A. Alito Jr.
Justice Clarence Thomas, who has signaled skepticism over the broad protection big tech has enjoyed, was also confused by Mr. Schnapper’s attempt to draw lines.
“You have to give us a clearer example of what your point is exactly,” Justice Thomas told the family’s counsel.
And Justice Ketanji Brown Jackson pronounced herself “thoroughly confused.”
At the same time, she voiced concern about leaving tech companies unchecked.
She said when Congress crafted Section 230 in the 1990s, its goal was to offer protection to services that were trying to erase objectionable content. She said the companies are now turning that on its head.
“You’re saying the protection extends to internet platforms that are promoting offensive materials,” she told Ms. Blatt.
The cases against both Google and Twitter come to the justices at a time when the tech giants are under intense scrutiny for their handling of deeply divisive political debates, including the COVID-19 pandemic and the 2020 election.
Both Democrats and Republicans on Capitol Hill have called for updating Section 230, but there’s little agreement on how to do it.
States are moving ahead, however. Texas and Florida have enacted laws that would allow individuals or the state’s attorney general to sue large social media platforms for squelching a viewpoint. Litigants have already petitioned the justices to take up those cases.
The high court has asked the federal government to weigh in on those cases as the justices consider what to do with the lawsuits against Google and Twitter.
Court watchers have suggested those cases could get swept up in the questions over Section 230 liability that the justices will decide this term, with rulings likely by the end of June.
• Stephen Dinan can be reached at sdinan@washingtontimes.com.
• Alex Swoyer can be reached at aswoyer@washingtontimes.com.