Grieving parents asked TikTok for age verification, but got age ratings instead [Updated]

(Update, 3:15 p.m. ET: A TikTok spokesperson responded in more detail on how content tiers are assigned. Once a video is uploaded, TikTok reviews it to ensure it follows community guidelines. After the video passes this review and becomes viewable on TikTok, “it may be sent to our trust and safety team for moderation” if “its popularity increases or if a community member reports the video.” During moderation, a video can be removed for violating TikTok’s Terms of Service, or the moderator can “assign a content level to the video.”

Addressing concerns about users under the age of 13 on TikTok, the spokesperson also detailed steps TikTok has already taken to keep underage users off the platform: “While most people understand the importance of being honest about their age, some do not provide the correct information, which poses a challenge for many online services. Our commitment to enforcing our minimum age requirements therefore does not end at sign-up; we have implemented a number of additional methods and processes to identify and remove suspected underage account holders, which resulted in approximately 20 million accounts being removed worldwide in the first quarter of 2022.” TikTok also expects that “Content Levels will provide an additional level of protection for those community members who truthfully state that they are under the age of 18.”)

Original story: TikTok’s safety features were the focus of a recent lawsuit from parents who claim the app’s addictive design is responsible for the deaths of at least seven children, six of whom were too young to be on TikTok. These parents suggested TikTok take steps to protect young users, urging the platform to add an age verification process to restrict content or terminate the accounts of children under 13 — the minimum age required to participate on TikTok.

However, that’s not the direction TikTok has decided to go. At least not yet. Instead, TikTok announced on Wednesday that it is adding new safety features for all users to limit exposure to harmful content and give users more control over what appears in their feeds. This includes giving users the ability to block content that contains specific words, hashtags, or sounds.

TikTok is specifically focused on improving safety measures for its “teen community members” and “is working to build a new system to organize content based on topic maturity” – essentially creating age ratings for TikTok videos, like the ratings used for movies or video games.

“In the coming weeks, we will begin rolling out an early version to prevent content with apparent mature themes from reaching an audience aged 13-17,” wrote Cormac Keenan, TikTok’s Head of Trust and Safety, in a blog post.

Additionally, TikTok has provided an update on the previously announced steps it’s taking with its algorithm to protect users from endlessly scrolling through “potentially challenging or triggering viewing experiences.”

“Last year we started testing ways to avoid recommending a bunch of similar content on topics that might be fine as a single video but might be problematic when watched repeatedly, such as dieting, extreme fitness, sadness, and other well-being issues,” Keenan wrote. “We also tested ways to detect if our system was inadvertently recommending a narrower range of content to a viewer.”

Likely because TikTok’s community guidelines limit accounts to users 13 and older, TikTok has so far only discussed the intended impact of the new safety features on adult and teen users. That means TikTok has yet to address concerns parents have about the “hundreds of thousands of children as young as 6 years old” that the lawsuit alleges TikTok knows are “currently using its social media product,” with no safety features designed just for them.

TikTok didn’t immediately respond to a request for comment on the new safety features or how age ratings would work. It’s unclear if future safety features – like age verification – are planned to address growing concerns about children under the age of 13, who are reportedly being harmed when using the app.
