The Washington Times - Wednesday, June 28, 2023

Legal analysts say the hundreds of school districts suing Big Tech over social media’s negative impact on youth mental health won’t succeed and their litigation likely will be dismissed.

Although the districts have good intentions, the analysts say, they will have trouble proving that tech platforms such as Facebook, TikTok and Snapchat have caused depression and anxiety among students.

“It is a lawsuit that is pushing the boundaries in multiple ways,” said Don Gifford, a law professor at the University of Maryland. “The cause of those problems could be things at home. It could be the fact they are going to school and could potentially end up being shot. There are a lot of reasons people get anxious and depressed, so the causation issue is going to be very difficult for the plaintiff.”

“We are dealing with [free] speech and gigantic causation issues,” said Larry Levine, a law professor at the University of the Pacific. “The courts are going to just think it’s too remote or too hard to prove.”

Montgomery County Public Schools, the largest school system in Maryland, joined dozens of other school systems this month to sue several major technology companies in federal court. The districts are in several states, including Pennsylvania, Arizona and Florida.

The massive litigation accuses the social media companies of profiting from vulnerable children and causing depression, violence and self-harm.

James Frantz of Frantz Law Group is spearheading the lawsuit on behalf of several school districts. He anticipates that more than 1,000 districts in at least 35 states will be involved in the litigation within the next month.

The litigation aims to hold Big Tech liable for using algorithms that target youths. The lawsuit says the algorithms are “intentionally and deliberately designed to exploit and cause minors to become addicted, which has caused the harm.”

“It is a travesty what has happened, and these social media companies won’t take it upon themselves to regulate themselves,” Mr. Frantz told The Washington Times when he filed the lawsuit for Montgomery County.

His 107-page lawsuit says Meta and its social networks Facebook, Instagram and WhatsApp; Snap Inc., the parent company of Snapchat; TikTok and its parent company, ByteDance; and Alphabet and its companies, Google and YouTube, should be held accountable under federal law for negligence and conspiracy to cause harm to minors.

Google spokesperson Jose Castaneda rejected the allegations.

“Protecting kids across our platforms has always been core to our work. In collaboration with child development specialists, we have built age-appropriate experiences for kids and families on YouTube, and provide parents with robust controls. The allegations in these complaints are simply not true,” Mr. Castaneda said.

Meta has added more parental controls because of adolescent mental health concerns. The move allows parents to monitor how much time their children spend on social media and which accounts they follow.

“We want to reassure every parent that we have their interests at heart in the work we’re doing to provide teens with safe, supportive experiences online. We’ve developed more than 30 tools to support teens and their families, including tools that allow parents to decide when, and for how long, their teens use Instagram, age verification technology, automatically setting accounts belonging to those under 16 to private when they join Instagram, and sending notifications encouraging teens to take regular breaks,” said Antigone Davis, head of safety at Meta.

“We’ve invested in technology that finds and removes content related to suicide, self-injury or eating disorders before anyone reports it to us. These are complex issues, but we will continue working with parents, experts and regulators such as the state attorneys general to develop new tools, features and policies that meet the needs of teens and their families,” Ms. Davis added.

A spokesperson for Snapchat differentiated the platform from others.

“Our app opens directly to a camera rather than a feed of content that encourages passive scrolling and is primarily used to help real friends communicate. We aren’t an app that encourages perfection or popularity, and we vet all content before it can reach a large audience, which helps protect against the promotion and discovery of potentially harmful material. While we will always have more work to do, we feel good about the role Snapchat plays in helping friends feel connected, informed, happy and prepared as they face the many challenges of adolescence,” the spokesperson said.

The lawsuit says Section 230 of the Communications Decency Act, which shields internet companies from legal liability for content posted by third parties, shouldn’t be an escape from accountability for the tech giants because the companies know of the harm and don’t censor the damaging content.

The Supreme Court heard two cases this term challenging the liability of technology companies, including an opportunity to chip away at Section 230, but the justices left the protections intact.

Lawsuits against the tech companies have been mounting this year, so the federal cases are being consolidated in the Northern District of California under U.S. District Judge Yvonne Gonzalez Rogers, an Obama appointee. Seattle Public Schools filed the first lawsuit in January.

Mr. Levine said he doubts the tech companies will even settle the lawsuits. The litigation process for the plaintiffs will be costly if the judge allows them to go to discovery in an attempt to prove the liability of the social media giants, he said.

Meta CEO Mark Zuckerberg said in a 2021 Facebook post that his company doesn’t push content on users to induce responses.

“The argument that we deliberately push content that makes people angry for profit is deeply illogical,” Mr. Zuckerberg wrote.

The U.S. surgeon general this year issued an advisory report titled “Social Media and Youth Mental Health,” which found that social media has both positive and negative effects on youths.

It said social accounts allow self-expression and connections but cited a study finding that 12- to 15-year-olds who spent more than three hours a day on social media "faced double the risk of experiencing poor mental health outcomes including symptoms of depression and anxiety."

A report this year from the Centers for Disease Control and Prevention found that teen girls are experiencing increased sadness and violence related to social media.

“In 2021, 16% of high school students were electronically bullied, including through texting, Instagram, Facebook or other social media, during the past year. Female students were more likely than male students to be electronically bullied,” that report states.

“Essentially, schools are saying, ‘Look at what’s happening to our youth, and you, the social media companies, are responsible,’” Robert Hachiya, an education professor at Kansas State University, told Education Week in April about the litigation. “There’s no question there is a problem. The issue is, how can social media companies be assigned some kind of liability for this problem?”

He said it isn’t about money. “It’s about getting them to change their practices.”

• Alex Swoyer can be reached at aswoyer@washingtontimes.com.
