Parents say TikTok algorithms killed their kids

Chase Nasca was 16 years old when he took his own life two years ago by stepping in front of a moving train. His parents, like many others, believe TikTok is responsible for his death because it bombarded him with more than 1,000 psychologically disturbing videos.

While Congress considers legislation that could lead to a nationwide TikTok ban, critics say China's control of and access to the platform are only part of the problem. They claim the app itself can be harmful, even deadly, to children.

Michael Toscano is executive director of the conservative Institute for Family Studies.

As soon as they open TikTok, he says, many children are confronted with disturbing images.

The "For You" page of the Chinese-affiliated platform plays algorithmically selected videos automatically, and the results can be shocking. In Chase's case, the algorithm served him content centered on violence and suicide.

The Institute for Family Studies filed an amicus curiae brief in the case TikTok Inc. v. Knudsen, supporting Montana's controversial legislation, which would have banned the app unless TikTok Inc. was separated from ByteDance, its Beijing-based parent company.

Montana is the only U.S. state to have banned TikTok. However, federal legislation that has passed the House could lead to a nationwide ban unless TikTok is sold to a non-Chinese entity.

The House bill focuses on blocking China's access to TikTok and its users' data.

Mr. Toscano and other critics of the app say its content is harmful to children.

Chase is one of many children who, critics of TikTok say, were harmed by content on the platform.

Lalani Erika Walton and Arriani Jaileen Arroyo died by self-strangulation in 2021.

Both girls were trying to complete the "blackout" challenge, a viral trend on TikTok in which users uploaded videos of themselves choking themselves until they lost consciousness. These videos appeared on Lalani's and Arriani's "For You" pages.

Both were on TikTok despite being under the platform's minimum age of 13.

Arroyo was found hanging from a dog leash in her bedroom. Walton was found with a rope around her neck, tied to her bed, where her bathing suits were laid out in anticipation of going swimming later that day.

Matthew P. Bergman, who represents families suing TikTok, said that as far as child advocates are concerned, it doesn't matter who owns the app so long as it continues to push harmful, addictive and suicidal material to children. Separate and apart from the legitimate national security concerns, he said, there is an omnipresent public safety concern.

Child psychologists warn parents not to allow children to use social media apps until they are 16 or even 18 years old.

They say TikTok is particularly harmful to children because, unlike YouTube, Instagram, and other social media apps, its algorithm is designed to push harmful content aggressively. Rather than basing a user's video feed on preferences, shares or "likes," it pushes users "down a hole" of ever more extreme content based on how long they watch each video.

Videos about eating disorders, suicidal thoughts and other damaging content can overwhelm a user’s feed.

"The algorithm watches to see which videos you finish and which you replay, and it starts to tailor its offerings," said Dr. Leonard Sax, a pediatrician, psychologist, and speaker who teaches parents about their children's online habits.

Dr. Sax stated that children think the app is able to read their minds, and not always in a good way.

The app appears to identify and exploit user vulnerabilities, and its algorithm can quickly steer users toward dark content that encourages dangerous behavior.

Dr. Sax said the app leads children down a rabbit hole. Girls in particular are drawn into threads about self-harm, suicide, and anorexia.

TikTok is used by more than 170 million people in the United States. The company did not provide an age breakdown. However, a Pew Research Center survey of nearly 1,500 13- to 17-year-olds in 2023 found that 63% used TikTok, and nearly half said they used it almost constantly or multiple times per day.

Some users are younger children.

Internal data published by The New York Times in 2020 showed that more than a third of TikTok's daily American users were 14 or younger.

In a recent study of the app's effect on children, the Center for Countering Digital Hate created several TikTok accounts posing as 13-year-olds.

The accounts "liked" videos about mental health and body image and briefly paused on them.

TikTok served content about eating disorders within eight minutes and recommended videos about mental health and body image to the teen accounts every 39 seconds.

"The results are every parent's worst nightmare," said Imran Ahmed, CEO of the Center for Countering Digital Hate. "Young people's feeds are bombarded with harmful, harrowing material that can have an impact on their understanding of the world around them, and their mental and physical health."

Last year, TikTok imposed a 60-minute daily time limit for users under 18. However, the restriction was reportedly easy to bypass.

China imposes much stricter restrictions on the app's domestic version. There, teens can use it only 40 minutes a day and are limited to videos aimed at children.

Researchers note that the U.S. version of the app is not entirely harmful content, and that it has helped people with mental health problems find information and support.

A 2023 University of Minnesota study found that TikTok's algorithm may start with helpful videos for those seeking mental health support, but the feed can then spiral into darker content. Participants reported that clicking the app's "Not interested" option did not stop disturbing videos from appearing in their feeds.

The fate of TikTok in the U.S. now rests with the Senate. The House-passed bill, which would require the app to be sold to a non-Chinese company, has stalled there, but it appears to enjoy bipartisan support.

Shou Zi Chew, CEO of TikTok, told users that if the bill becomes law it “will result in a ban on TikTok within the United States.” He urged users to “make their voices heard” by calling the Senate.

The bill was passed with bipartisan support.