
A visit to the ‘TikTok Transparency and Accountability Center’ in Singapore
Sign-up limited to users aged 14 and older
Content on topics inappropriate for young users filtered out by age restrictions
Parental account-management feature for children introduced in 2019
Parents can directly manage children’s screen time, followers, and keywords
A TikTok official explains the platform’s features at the “TikTok Singapore Transparency and Accountability Center (TAC).” [Photo = TikTok]

Recently, discussions about the responsibility of social networking services (SNS) to protect youth have been heating up around the world. As calls to shield young people from harmful online content grow louder, corporate social responsibility is coming under greater emphasis. Amid this trend, TikTok, a global social media platform with more than 1 billion users, is strengthening its youth-safety measures and working to bolster the platform’s credibility.

“TikTok is doing its best to build a safe environment for young users by introducing a content-level system and a feature that lets parents manage their children’s accounts directly,” a TikTok official said at the TikTok Transparency and Accountability Center (TAC) in Singapore on the 13th.

TikTok operates the TAC to transparently disclose how content is reviewed and recommended and how the platform is secured, and to communicate its measures for strengthening social responsibility. The Singapore TAC, established in October 2023, is the third such center after Los Angeles (LA) and Dublin, Ireland, and covers the Asian region.

The official emphasized that TikTok applies strict youth-safety standards by age. In Korea, only teenagers aged 14 and older can use TikTok, in line with how domestic law defines minors. Accounts of users under 16 are set to private by default and are restricted from using direct messages (DMs), and live streaming is unavailable to anyone under 18.

TikTok youth protection features. [Photo = TikTok]

In addition, the content-level system filters out content on topics inappropriate for young users so that it is not shown to them. TikTok also applies a zero-tolerance policy to child sexual abuse material and the sexual exploitation of minors, reporting such content to local specialized agencies and law enforcement authorities as soon as it is found.

TikTok also introduced a “safety pairing” feature in 2019 that allows parents to manage their children’s accounts directly, and it continues to expand related functions. Parents can control their children’s daily screen time, schedule “take a break” periods (times when the app cannot be used), check whom their children follow and who follows them, and manage keywords that will not appear in their child’s recommendation feed.

This is broadly similar to the “Teen Accounts” feature Meta introduced on Instagram in September last year. Meta’s teen accounts likewise set profiles to private by default and allow messages only from people the teen is already in contact with, and parents can limit usage time.

TikTok says it rolled out these youth-protection features a step ahead of Meta. “Through safety pairing, we are helping teenagers have a positive and healthy digital experience and helping parents manage their children’s digital habits,” the TikTok official added.

Beyond these basic functions, TikTok also lets users enable the “STEM Feed,” a curated stream of educational content. Parents can turn it back on even if their children have switched it off. “The STEM feed has not yet been introduced in Korea, but it is a feature drawing attention abroad,” the TikTok official said, adding, “It can encourage a balance between entertainment and learning.”

Reporter Ahn Sun-je in Singapore
