TikTok ‘Poisoning’ Vulnerable Young Minds by Promoting Eating Disorders, Self-Harm, and Suicide: Report


A new report by the Center for Countering Digital Hate (CCDH) reveals that the Chinese-owned social media platform TikTok is pushing harmful content involving self-harm and eating disorders into children’s feeds.

In its study, the CCDH established accounts posing as 13-year-olds in the United States, United Kingdom, Canada, and Australia. One account in each nation was assigned a traditional female name. A second account was also created in each country with a username containing the characters “loseweight” in addition to the name. Researchers used these characters after finding that people with issues like body dysmorphia often signal their condition through their usernames.

The team then examined the first 30 minutes of content recommended by TikTok in these accounts’ “For You” feeds. When videos with potentially dangerous content about disordered eating, self-harm, or mental health issues appeared, the researchers would pause on and like them, as a typical teenager might.

Every 39 seconds on average, the accounts were served videos related to body image and mental health. Content referencing suicide was shown on one account within two and a half minutes. Another account received eating disorder content within eight minutes.

The accounts with “loseweight” in their usernames were delivered three times more harmful content than the other accounts. These accounts were also exposed to 12 times more suicide and self-harm videos. According to the CCDH, an eating disorder community hosted on TikTok had amassed more than 13.2 billion video views.

Harmful Content

Imran Ahmed, CEO of the CCDH, pointed out that TikTok was designed to influence young users into giving up their time and attention. The research proved that the app is “poisoning their minds” as well.

“It promotes to children hatred of their own bodies and extreme suggestions of self-harm and disordered, potentially deadly, attitudes to food,” he said, according to the report.

“Parents will be shocked to learn the truth and will be furious that lawmakers are failing to protect young people from Big Tech billionaires, their unaccountable social media apps, and increasingly aggressive algorithms.”

In an interview with AP, Josh Golin, executive director of Fairplay, a nonprofit that advocates for children’s online safety, noted that TikTok is the only platform failing to protect its young users from harmful content. The app also engages in aggressive data collection, he added.

In a statement, a TikTok spokesperson questioned the study’s conclusions, insisting that the results were skewed because the researchers did not use the app the way typical users do.

Lingering Harms of TikTok

In a recent interview with The Epoch Times, Anthony Luczak, an advanced practice registered nurse in pediatric primary care, warned that the Chinese-owned social media app behaves “exactly like a psychological warfare tool” designed to undermine the psychological health of an enemy population.

Though harmful trends are common on social media, TikTok is different in that it “seems to adapt” its harmful content to the individual user, he stated.

“TikTok is actively trying to make each person using it into a worse version of themselves, and this is incredibly dangerous for children.”

The app seems to know which children trend toward negative behaviors such as anxiety, drug use, pornography, depression, or criminal activity, and it “exploits those tendencies,” he stated.

In May, attorneys general from states including California, Florida, Kentucky, Massachusetts, Tennessee, New Jersey, Vermont, and Nebraska launched an investigation into TikTok and the app’s potentially harmful effects on young people.

Naveen Athrappully


Naveen Athrappully is a news reporter covering business and world events at The Epoch Times.
