Shocking study: TikTok bombards users with content promoting suicide and eating disorders

TikTok recommends self-harm and eating disorder content to some users within minutes of their joining the platform, according to a new report published by the Center for Countering Digital Hate (CCDH).

The center’s study, published on Wednesday, had researchers create TikTok accounts posing as 13-year-old users interested in content about body image and mental health.

The study found that TikTok’s algorithm recommended content encouraging suicide within 2.6 minutes of an account joining the app, and content encouraging eating disorders within 8 minutes of browsing.

During the study, researchers also found 56 TikTok hashtags hosting eating disorder videos with more than 13.2 billion views in total.

“The TikTok algorithm is bombarding teens with harmful content that promotes suicide, eating disorders, and body image issues that fuel teen mental health issues,” said James P. Steyer, founder and CEO of Common Sense Media.

TikTok, which was launched worldwide by the Chinese company ByteDance in 2017 and is driven by algorithms that draw on personal data (likes, follows, watch time, and user interests), has become the fastest-growing social media application in the world, reaching one billion monthly active users by 2021.

The CCDH report details how TikTok’s algorithms refine the videos shown to users as the app collects more information about their preferences and interests.

Algorithmic suggestions in the “For You” feed, as the app calls it, are designed to be a central element of the TikTok experience. But the new research shows that the video platform can push harmful content to vulnerable users as it seeks to keep them engaged.
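This feedback loop is simple to illustrate. The sketch below is a minimal toy model in Python, not TikTok’s actual system: the class, signal weights, and topic labels are all hypothetical assumptions, meant only to show how a feed that scores videos by accumulated engagement signals converges on whatever a new user pauses on and likes.

```python
# Hypothetical sketch of an engagement-driven feed; all names and weights
# are illustrative assumptions, not TikTok's real recommendation system.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    topics: list  # e.g. ["body image", "fitness"]

# Assumed signal weights: a "like" counts for more than pausing on a video.
SIGNAL_WEIGHTS = {"pause": 1.0, "like": 3.0}

class FeedModel:
    def __init__(self):
        # Per-user affinity score for each topic, accumulated from engagement.
        self.affinity = defaultdict(float)

    def record_engagement(self, video, signal):
        """Raise the user's affinity for every topic on an engaged-with video."""
        for topic in video.topics:
            self.affinity[topic] += SIGNAL_WEIGHTS.get(signal, 0.0)

    def rank(self, candidates):
        """Order candidate videos by the user's accumulated topic affinity."""
        return sorted(
            candidates,
            key=lambda v: sum(self.affinity[t] for t in v.topics),
            reverse=True,
        )

# Pausing on and liking one body-image video is enough to reorder the feed.
model = FeedModel()
clip = Video("v1", ["body image"])
model.record_engagement(clip, "pause")
model.record_engagement(clip, "like")
ranked = model.rank([Video("v2", ["cooking"]), Video("v3", ["body image"])])
print([v.video_id for v in ranked])  # ['v3', 'v2']
```

The point of the toy model is the asymmetry: a system like this needs only a handful of signals to start specializing a brand-new feed, which is exactly the window the CCDH researchers set out to measure.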

To test the algorithm, CCDH researchers registered as users in the US, UK, Canada, and Australia and created both “strong” and “weak” TikTok accounts, that is, accounts with full user information and accounts with less of it.

A total of eight accounts were created, and data was collected from each account for its first 30 minutes of use. CCDH researchers say this short recording window was chosen to measure how quickly the video platform can profile each user and begin pushing potentially harmful content to them.

Each researcher posed as a 13-year-old, the minimum age at which TikTok allows users to sign up for its service.

In every test, the researchers paused on videos about body image and mental health and “liked” them, as a teenager interested in that content might.

When views were compared against the benchmark, the researchers found that users were shown 3 times more harmful content in total, and 12 times more videos specifically about self-harm and suicide, than accounts that were served content about weight loss and general mental health.
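The comparison itself is straightforward to reproduce in outline. The following sketch uses hypothetical session logs (the timestamps and flags are invented for illustration, not CCDH’s actual data or pipeline) to show how time-to-first-harmful-recommendation and the harmful-content ratio between account types would be computed.

```python
# Illustrative only: hypothetical session logs, not CCDH's actual data.
# Each event is (seconds_since_signup, is_harmful) for one recommended video.

def time_to_first_harmful(events):
    """Seconds until the first harmful recommendation, or None if none seen."""
    return next((t for t, harmful in events if harmful), None)

def harmful_rate(events):
    """Fraction of recommended videos in a session flagged as harmful."""
    return sum(1 for _, harmful in events if harmful) / len(events)

# Two invented 30-minute sessions: a benchmark account and a "weak" one.
benchmark = [(30, False), (90, False), (480, True), (700, False)]
weak = [(30, False), (156, True), (300, True), (420, True)]

print(time_to_first_harmful(weak))                   # 156 seconds (~2.6 min)
print(harmful_rate(weak) / harmful_rate(benchmark))  # 3.0 (3x more harmful)
```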

Imran Ahmed, CEO of the Washington, D.C.-based CCDH, which advocates for the Kids Online Safety Act, a bill that would put up barriers to protect minors online, said TikTok is able to identify users’ vulnerabilities and seeks to exploit them. “It’s part of what makes TikTok’s algorithms so insidious; the app is constantly testing our children’s psyche and adapting to keep them online,” he said.

TikTok response

A TikTok spokesperson challenged the study’s methodology, according to the CBS report. “We regularly consult with health experts, remove violations of our policies, and provide access to supportive resources to anyone in need,” the spokesperson said.

The spokesperson added that the platform recognizes that triggering content is unique to each individual, and that it remains focused on fostering a safe and comfortable space for all.

The study comes as more than 1,200 families are filing lawsuits against social media companies, including TikTok.

The lawsuits allege that content on social media platforms profoundly affected their children’s mental health and, in some cases, contributed to their deaths. More than 150 of the lawsuits are expected to move forward next year.