TikTok Allegedly Directs Children's Profiles to Explicit Material In Just a Few Taps
According to a new investigation, TikTok directs children's accounts to adult videos in just a few steps.
Testing Approach
The advocacy group Global Witness created simulated profiles using a 13-year-old's date of birth and switched on the platform's content restriction setting, which is meant to limit exposure to inappropriate content.
The researchers found that TikTok suggested sexualized and explicit search terms to seven test accounts, each set up on an unused smartphone with no search history.
Alarming Recommendation Features
Search phrases recommended under the "suggested searches" feature included "provocative attire" and "explicit content featuring women", before escalating to terms such as "graphic sexual content".
For three of the accounts, the sexualized recommendations appeared immediately.
Rapid Access to Explicit Content
After just a few taps, the research team encountered explicit material ranging from women flashing to explicit sexual intercourse.
Global Witness said the content appeared designed to evade moderation, usually by embedding the explicit video within an otherwise innocuous image or clip.
In one case, the process took just two taps after opening the app: one on the search feature and another on the suggested query.
Compliance Requirements
The organization, which investigates big tech's impact on public safety, said it carried out its tests in multiple phases.
Initial tests took place before child safety measures under the United Kingdom's Online Safety Act came into force on July 25th, with further tests after the rules took effect.
Serious Findings
The organization added that several of the clips appeared to show someone under 18, and that these had been reported to the child protection body that monitors harmful material involving minors.
Global Witness claimed that TikTok was in breach of the Online Safety Act, which requires digital platforms to prevent children from encountering harmful material such as pornography.
Regulator's Position
A spokesperson for Ofcom, which regulates the act, said: "We appreciate the effort behind this study and will review its findings."
Official guidance on complying with the act states that platforms posing a medium or high risk of showing harmful material must "modify their programming" to keep such content out of children's feeds.
The platform's rules ban pornographic content.
TikTok's Statement
The video platform said that, after being notified by the research group, it had removed the offending material and made changes to its search suggestion feature.
"As soon as we were made aware of these assertions, we took immediate action to examine the issue, remove content that violated our policies, and launch improvements to our recommendation system," said a official speaker.