The Washington Times - Thursday, August 29, 2024

A federal appeals court ruled this week that social media companies can be held liable for the algorithms they create to promote content to their users, raising new questions about when the companies could face liability for their own speech versus the postings of others on their sites.

The case in the 3rd U.S. Circuit Court of Appeals stems from the death of a 10-year-old girl who watched a viral challenge video on TikTok that had been promoted to her “For You” feed by the tech company’s algorithm. The court ruled the case could proceed because the company can’t escape liability under Section 230 of the Communications Decency Act, which generally holds internet firms harmless for postings on their sites.

Legal experts say the case could finally force the Supreme Court to grapple with Section 230.

“This is a developing area of law with developing facts and new scenarios,” said Ilya Shapiro, director of constitutional studies at the Manhattan Institute. “If this ends up with a finding of liability and significant damages against TikTok, then I think the Supreme Court would have to take it up because that would be a novel thing.”

The high court has shied away from diving into the scope of Section 230.

The justices last year took up a dispute involving the families of victims of a terrorist attack in Paris, who said the tech companies aided and abetted the terrorist organization in its recruitment.

The justices decided not to weigh the platforms’ claims of immunity under Section 230 and instead evaluated the claims under a federal anti-terrorism act.

Last month, the justices declined to take up a Section 230 dispute against Snap, the parent company of Snapchat, brought by an unidentified man who overdosed at age 15 on drugs given to him by his science teacher, who had groomed him for a sexual relationship through the secret messaging app.

The man, identified as Doe, sued over what he called a flawed design that aids sexual predators, but the lower courts dismissed the claim, reasoning that Section 230 gives Snap immunity.

Justice Clarence Thomas and Justice Neil M. Gorsuch dissented from the court’s move to reject Doe’s case.

Mary Graw Leary, a professor at the Catholic University of America, said lower court rulings are pressing the justices to guide the scope of immunity for tech platforms.

“Section 230 has shut the courthouse doors to so many victims,” she said. “Ultimately, the court is going to have to respond.”

The TikTok case, brought against the platform and its parent company, ByteDance, involves a “blackout challenge” that dared users to choke themselves with belts or strings until they passed out. The people doing the challenge would record themselves and post the videos to TikTok, placing those posts on certain users’ “For You” feeds.

Nylah Anderson died in 2021 while taking the challenge.

Her mother, Tawainna Anderson, sued in 2022, saying the company was aware of the blackout challenge and was negligent in recommending and promoting the videos to her daughter.

The district court dismissed the case, reasoning that Section 230 protects social media companies from liability for content created by third parties.

In its 11-page ruling, a three-judge panel of the 3rd Circuit vacated that decision in part and sent the dispute back to a lower court for further proceedings.

The appeals panel, comprising one Obama appointee and two Trump appointees, reasoned that because TikTok created the algorithms that promoted the content to the user, the recommendation is “first-party” speech by TikTok, to which Section 230’s protections do not apply.

Citing a Supreme Court ruling last term about states that passed strict laws governing social media platforms, the panel said the justices noted that platforms “engage in protected first-party speech under the First Amendment when they curate compilations of others’ content via their expressive algorithms. … It follows that doing so amounts to first-party speech under § 230, too.”

A spokesperson from TikTok did not respond to a request for comment.

The 3rd Circuit referenced the high court case Moody v. NetChoice, in which the justices unanimously ducked a decision on whether Florida’s and Texas’ social media laws, which restrict companies from taking certain actions on users’ posts and accounts, violated the First Amendment.

In July, the justices returned the dispute to the lower courts for further proceedings, but the majority opinion recognized that social media companies exercise a First Amendment right when they carry out “content moderation.”

The 3rd Circuit, relying on that precedent, said Section 230 applies only to information posted by another party.

“Here, because the information that forms the basis of Anderson’s lawsuit — i.e., TikTok’s recommendations via its [For You] algorithm — is TikTok’s own expressive activity, § 230 does not bar Anderson’s claims,” the court ruled.

Adam Feldman, a Supreme Court scholar and creator of the Empirical SCOTUS blog, said the high court did not put Section 230 “through a rigorous analysis.”

“This decision may well come to the Supreme Court either as it relates to Moody or in its own right,” he said. “Ultimately, the Supreme Court has a lot of work to do in this area, and it seems to be letting the issues percolate from below before it goes in to presumably clear up any doctrinal ambiguities.”

Jeffrey Goodman, an attorney for the Anderson family, said the 3rd Circuit’s ruling was a significant loss for Big Tech.

“Big Tech just lost its ‘get out of jail free’ card,” he said. “This ruling ensures that the powerful social media companies will have to play by the same rules as all other corporations, and when they indifferently cause harm to children, they will face their day in court.”

The Anderson family released a statement saying they hope the court’s decision helps others.

“Nothing will bring back our beautiful baby girl. But we are comforted knowing that — by holding TikTok accountable — our tragedy may help other families avoid future, unimaginable suffering. Social media companies must use their technology to prevent dangerous content from being consumed by young children; they’ve got to stop exploiting children in the name of profit,” the statement read.

• Alex Swoyer can be reached at aswoyer@washingtontimes.com.

Copyright © 2024 The Washington Times, LLC.