By Susan Ferrechio - The Washington Times - Tuesday, March 19, 2024

Chase Nasca, 16, took his life two years ago by stepping in front of a moving train. His parents and many others suspect TikTok drove him to suicide by bombarding him with more than 1,000 unsolicited, psychologically disturbing videos.

As Congress weighs legislation that could lead to a nationwide TikTok ban, opponents of the social media app say China’s access to and control over the platform is just one of its problems. Critics say the app is uniquely harmful to children and deadly in some cases.

“It would be unwise to fail to acknowledge that what we’re dealing with is the potential exploitation of our children on a mass scale, with the interests of a geopolitical adversary in mind,” said Michael Toscano, executive director of the conservative-leaning Institute for Family Studies.

Many children see disturbing images on TikTok the second they open the app.

The “For You” page on the Chinese-affiliated social media platform runs videos on autoplay, often serving shocking content chosen by its recommendation algorithm that, in Chase’s case, centered on violence and suicide.

The Institute for Family Studies filed an amicus brief in the case of TikTok Inc. v. Knudsen. It sided with Montana’s legally contested legislation that would ban the app unless it separates from its parent company, Beijing-based ByteDance.

Montana is the only state that has moved to ban TikTok, but the law is mirrored in federal legislation that the House has passed and President Biden has pledged to sign. It could lead to a nationwide ban by requiring TikTok to be sold to a non-Chinese entity.

The House bill centers on cutting off China’s access to TikTok and its user data.

Mr. Toscano and other opponents of the app say the content is dangerous, particularly to children.

Chase was one of many children who TikTok critics say fell victim to harmful content on the site.

Lalani Erika Walton, 8, and Arriani Jaileen Arroyo, 9, died in 2021 by accidental self-strangulation.

In separate incidents, the two were attempting the “blackout challenge,” a trend that went viral on TikTok. Users posted videos of self-strangulation until they lost consciousness. The videos were featured on Lalani’s and Arriani’s “For You” pages.

Both were on the social media platform despite TikTok’s requirement that users be at least 13.

Arriani was found hanging by her dog’s leash in her room. Lalani was strangled after tying a rope to her bed, where her bathing suit had been laid out in anticipation of swimming later in the day.

“As child advocates, we don’t care who owns TikTok if they keep sending suicidal content, harmful content and addictive materials to young people. It doesn’t matter who owns it,” said Matthew P. Bergman, who represents the families of Chase, Lalani and Arriani in lawsuits against TikTok. “Separate and apart from appropriate national security concerns is a public safety concern that is omnipresent.”

More than 1,200 families are involved in lawsuits against TikTok and other social media companies. Child psychologists, meanwhile, are warning parents against allowing children younger than 16, and in some cases 18, to access social media apps.

TikTok, they say, is uniquely harmful to children because, unlike YouTube, Instagram and other social media apps, it uses an algorithm designed to aggressively push ever more extreme content. Rather than basing a user’s video feed solely on preferences, sharing and “likes,” it pulls users “down a rabbit hole” by feeding them more extreme content based in part on how long each video is viewed.

A user’s feed can quickly be overwhelmed with videos about eating disorders, suicide and other harmful content that could foster damaging addiction in children.

“The algorithm is watching you, to see what you watch to the end and to see what you watch again. And it starts to customize its offerings,” Dr. Leonard Sax, a pediatrician and psychologist who lectures parents on the online habits of children, told The Washington Times.
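The mechanism Dr. Sax describes is an engagement-driven feedback loop. The sketch below is a deliberately simplified illustration of that general dynamic; the names, weights and structure are hypothetical, and it is not TikTok’s actual code, which has never been made public.

```python
# A toy engagement-weighted ranker of the general kind critics describe.
# Everything here -- names, weights, structure -- is a hypothetical
# illustration, NOT TikTok's actual system.
from dataclasses import dataclass


@dataclass
class Video:
    video_id: str
    topic: str


@dataclass
class WatchEvent:
    topic: str
    watched_fraction: float  # 1.0 means the video was watched to the end
    rewatches: int           # how many times the user replayed it


def topic_scores(history: list[WatchEvent]) -> dict[str, float]:
    """Infer interest from completion and rewatch signals alone --
    no 'likes' or shares required."""
    scores: dict[str, float] = {}
    for event in history:
        # Lingering on a video counts as interest, whether or not
        # the user actually wanted more of it.
        signal = event.watched_fraction + 0.5 * event.rewatches
        scores[event.topic] = scores.get(event.topic, 0.0) + signal
    return scores


def next_feed(catalog: list[Video], history: list[WatchEvent], k: int = 3) -> list[Video]:
    """Rank candidates by inferred interest; the feed drifts toward
    whatever the user watched longest, harmful or not."""
    scores = topic_scores(history)
    return sorted(catalog, key=lambda v: scores.get(v.topic, 0.0), reverse=True)[:k]


# Example: one fully watched, twice-replayed self-harm video is enough
# to push that topic to the top of the next feed.
catalog = [Video("a", "sports"), Video("b", "self-harm"), Video("c", "cooking")]
history = [WatchEvent("self-harm", watched_fraction=1.0, rewatches=2),
           WatchEvent("sports", watched_fraction=0.3, rewatches=0)]
print([v.topic for v in next_feed(catalog, history)])
# ['self-harm', 'sports', 'cooking']
```

Nothing in a loop like this distinguishes healthy curiosity from unhealthy fixation: whatever a distressed user lingers on is precisely what gets amplified.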

Dr. Sax said children believe the app can practically read their minds based on customized video feeds, and not necessarily in a positive way.

The app appears to recognize and exploit users’ vulnerabilities, accelerating feeds of dark and disturbing content that can encourage dangerous behavior.

“It’s leading kids down a rabbit hole, and girls especially are being sucked into these threads about anorexia, valorizing self-harm, and suicide,” Dr. Sax said.

TikTok has more than 170 million users in the United States. Company representatives did not respond to requests for data on age breakdown, but a 2023 Pew Research Center report that polled nearly 1,500 children ages 13-17 found that 63% used TikTok and nearly half used the app almost constantly or several times a day.

Many users may be younger children.

In 2020, The New York Times revealed internal TikTok data that classified more than one-third of its daily American users as 14 or younger.

In a recent examination of the app’s impact on children, the Center for Countering Digital Hate set up several TikTok accounts based on profiles of 13-year-olds.

The researchers briefly paused each account on videos about body image and mental health and “liked” them.

In less than three minutes, TikTok recommended “suicide content.” Within eight minutes, it served content related to eating disorders. On average, it recommended videos about body image and mental health to the teen accounts every 39 seconds.

“The results are every parent’s nightmare. Young people’s feeds are bombarded with harmful, harrowing content that can have a significant cumulative impact on their understanding of the world around them and their physical and mental health,” said Imran Ahmed, CEO of the Center for Countering Digital Hate.

Last year, TikTok introduced a 60-minute daily time limit for users younger than 18, but the restriction is reportedly easy to bypass.

China imposes much stricter limits on Douyin, TikTok’s domestic counterpart. Young users in China can watch for only 40 minutes daily, and the content is limited to child-friendly videos.

Researchers say the U.S. version of the app is not uniformly harmful and has helped some users with mental health issues find support and information.

A 2023 University of Minnesota study found that TikTok’s algorithm could start with helpful videos for those seeking mental health support but then evolve into a harmful spiral of negative content. Study participants said clicking the app’s “Not interested” button did not stop disturbing videos from appearing in their feeds.

TikTok’s U.S. fate now lies in the Senate, where the House-passed bill requiring the app to be sold to a non-Chinese company has stalled but appears to have bipartisan support.

TikTok CEO Shou Zi Chew told users last week that the legislation “will lead to a ban on TikTok in the United States” if it becomes law and urged users to call the Senate and “make your voices heard.”

Though the bill passed by an overwhelming bipartisan vote in the House, dozens of lawmakers voted against banning the app. The bill’s opponents argued that the platform is critical for small businesses and provides social connections and support for users that would otherwise be difficult to find.

Rep. Robert Garcia, California Democrat, said TikTok is a place for gathering information about the world, politics, entertainment and pop culture.

“We know that TikTok is also a space for representation, and banning TikTok also means taking away a voice and a platform for communities of color and queer creators that have made TikTok their home,” Mr. Garcia said.

Child advocates said forcing new ownership would not save children from TikTok’s harmful content.

“This whole debate about what’s going to happen in the Senate is irrelevant to the topic I discuss with parents, which is whether TikTok is safe,” Dr. Sax said. “The answer is no, which means your kid should not be on it.”

• Susan Ferrechio can be reached at sferrechio@washingtontimes.com.

Copyright © 2024 The Washington Times, LLC.
