In an effort to appear more transparent, TikTok has introduced a new tool that explains how its algorithm works and why certain videos are recommended to users.
The tool, available in the For You feed of the TikTok app, displays a panel titled “Why are you watching this video?” that lists reasons such as “This video is popular in your area” or “This video was recently posted by an account you follow.”
TikTok – owned by the Chinese company ByteDance – says that the new feature brings “more context” and “purposeful transparency” to users.
The platform uses an algorithm to push videos onto users’ For You pages from accounts they don’t follow, amid accusations that this controversial process promotes dangerous content.
For example, teens may be “bombarded” with content inciting self-harm and suicide within minutes of joining the platform.
In a blog post, TikTok says: “We want people to feel empowered to create, connect, and share on our platform. That’s why we’re equipping creators and viewers with a set of features, tools, and resources so they can be in control of their experience. Today, we’re adding to this toolbox a feature that helps provide more context for the content recommended in the For You feed.”
In June 2020, TikTok published a detailed blog post explaining how the For You page works, as part of a push for greater transparency.
When TikTok users open the app, they are presented with a “For You” feed, which is a collection of videos recommended to users.
The For You feed is powered by an algorithm that makes personalized recommendations based on several metrics, including viewing behavior and interactions with videos.

Similar to the recommendations made by search engines, streaming services, and other social media platforms, the For You feed is designed to give users a more personalized experience.
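TikTok has not published its ranking code, so any concrete illustration is necessarily hypothetical. The sketch below shows, in simplified form, how a recommender of the kind described above might weight engagement signals to rank candidate videos; the field names, weights, and scoring formula are all assumptions for illustration, not TikTok’s actual system.

```python
from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    # Per-user engagement signals; all fields are illustrative,
    # not actual TikTok metrics.
    predicted_watch_fraction: float  # 0.0-1.0: share of the video the user is expected to watch
    like_rate: float                 # historical likes / views
    share_rate: float                # historical shares / views
    from_followed_account: bool      # whether the user follows the poster

# Hypothetical weights; a real system would learn these from data.
WEIGHTS = {"watch": 0.6, "like": 0.25, "share": 0.1, "follow": 0.05}

def score(video: Video) -> float:
    """Combine engagement signals into a single ranking score."""
    return (
        WEIGHTS["watch"] * video.predicted_watch_fraction
        + WEIGHTS["like"] * video.like_rate
        + WEIGHTS["share"] * video.share_rate
        + WEIGHTS["follow"] * float(video.from_followed_account)
    )

def rank_feed(candidates: list[Video], top_n: int = 10) -> list[Video]:
    """Return the top-N candidate videos by score, highest first."""
    return sorted(candidates, key=score, reverse=True)[:top_n]
```

A scorer along these lines also illustrates the feedback loop critics describe: signals from what a user has already watched raise the scores of similar videos, pushing more of the same content into the feed.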
Unfortunately, one of the problems inherent in social media algorithms is that they can funnel users toward a stream of posts similar to a video they have already watched.
A recent report by the Center for Countering Digital Hate (CCDH) found that the For You page was “bombarding vulnerable teens with dangerous content that may encourage self-harm, suicide and eating disorders”.
Researchers at the non-profit organization created accounts posing as 13-year-old girls and had those accounts like the harmful content. They found that videos alluding to suicide were presented to one account within 2.6 minutes, while eating disorder content surfaced within 8 minutes.
The CCDH said the For You page lacked “meaningful transparency” and that its algorithm operated in an “opaque way”.
The platform requires users to be at least 13 to create an account, checking via a date-of-birth prompt at sign-up, although this does not prevent children under that age from entering a false birthdate and using it anyway.