A federal appeals court ruled this week that social media companies could be liable for algorithms they create to promote content to users.
The dispute stems from the death of a young girl who watched a viral challenge video on TikTok that had been promoted to her “For You Page” by the tech company’s algorithm. The case was brought against TikTok and its parent company, ByteDance.
A spokesperson from TikTok did not immediately respond to a request for comment.
The Philadelphia-based 3rd U.S. Circuit Court of Appeals ruled Tuesday that the case against TikTok brought by the mother of 10-year-old Nylah Anderson could proceed because the company can’t escape liability under Section 230 of the Communications Decency Act.
Nylah died in 2021 after attempting the “Blackout Challenge,” in which people choke themselves with belts or strings until they pass out. Participants would record themselves and post the videos to TikTok, which would then place those posts on certain users’ “For You Page.”
Her mother, Tawainna Anderson, sued in 2022, claiming the company was aware of the Blackout Challenge and was negligent in recommending and promoting the challenge videos to her daughter.
The district court had dismissed the case, reasoning that Section 230 gives social media companies protection from liability for content created by third parties.
Section 230 of the Communications Decency Act generally holds internet firms harmless for postings on their sites.
But in its 11-page ruling, a three-judge panel vacated that decision in part and sent the dispute back to the lower court for further proceedings.
The appeals panel — composed of one Obama appointee and two Trump appointees — reasoned that because TikTok created algorithms to promote certain content to a user, it’s “first-party” speech by TikTok itself and thus Section 230’s third-party immunity is inapplicable.
Citing a Supreme Court ruling earlier this term about red states that had passed strict laws governing social media platforms, the panel said the justices noted that platforms “engage in protected first-party speech under the First Amendment when they curate compilations of others’ content via their expressive algorithms … it follows that doing so amounts to first-party speech under § 230, too.”
The high court case referenced by the 3rd Circuit was Moody v. NetChoice, in which the justices unanimously declined to decide whether Florida’s and Texas’ social media laws restricting companies from taking certain actions on users’ posts and accounts ran afoul of the First Amendment.
The justices, in July, sent the dispute back to lower courts for further proceedings. But in its opinion, the high-court majority recognized that social media companies have a First Amendment right in how they carry out “content moderation.”
The 3rd Circuit, relying on that precedent, said that Section 230 only applies to information posted by another party.
“Here, because the information that forms the basis of Anderson’s lawsuit — i.e., TikTok’s recommendations via its [For You Page] algorithm — is TikTok’s own expressive activity, § 230 does not bar Anderson’s claims,” the court ruled.
Jeffrey Goodman, a lawyer for the Anderson family, said the ruling was a major loss for Big Tech.
“Big Tech just lost its ‘get-out-of-jail-free’ card,” he said. “This ruling ensures that the powerful social media companies will have to play by the same rules as all other corporations, and when they indifferently cause harm to children, they will face their day in court.”
The Anderson family released a statement following the ruling, saying they hope the court’s decision helps others.
“Nothing will bring back our beautiful baby girl. But we are comforted knowing that – by holding TikTok accountable – our tragedy may help other families avoid future, unimaginable suffering. Social Media companies must use their technology to prevent dangerous content from being consumed by young children; they’ve got to stop exploiting children in the name of profit,” the statement read.
• Alex Swoyer can be reached at aswoyer@washingtontimes.com.