TikTok’s algorithm recommends pornography and sexualised material to children, according to a report by a human rights campaign group. Researchers created fake child accounts and switched on safety settings, yet still received explicit search suggestions. These suggested searches led to videos showing simulated masturbation and even penetrative sex. The platform insists it took immediate action and remains committed to safe experiences.
Fake child accounts reveal hidden risks
In late July and early August, researchers from Global Witness set up four TikTok accounts, posing as 13-year-olds by entering false birth dates. The app did not request additional identity checks. The team activated TikTok’s “Restricted Mode”, a setting the company claims blocks mature themes, including sexually suggestive content. Yet investigators still saw highly sexual search suggestions in the “you may like” section. These led to clips of women flashing underwear, exposing breasts and simulating masturbation. At the most extreme, the platform showed explicit pornographic films. Some of these videos were embedded in innocent content to bypass moderation systems.
Global Witness sounds alarm
Ava Lee from Global Witness called the findings a “huge shock”. She said the platform not only fails to protect children but actively suggests harmful content. Global Witness normally investigates how technology influences democracy, human rights and climate change. The group stumbled across the pornographic content while conducting unrelated research in April.
TikTok claims swift removals
Researchers informed TikTok about their findings. The company said it removed the material and fixed the problem. But when Global Witness repeated the test weeks later, it found sexual content again. TikTok says it has more than 50 features to protect teenagers and claims it removes nine out of ten rule-breaking videos before anyone sees them. The company added that it launched improvements to its search suggestion system after the report.
Law demands stricter child protection
On 25 July, the Children’s Codes under the Online Safety Act came into force. These rules require platforms to use strong age checks and stop children from accessing pornography. Algorithms must also block content that promotes eating disorders, suicide or self-harm. Global Witness carried out its second study after the new rules began. Ava Lee urged regulators to act, saying everyone agrees children must be safe online.
Users raise questions
During the investigation, researchers also tracked how other users reacted. Some asked why their search recommendations suddenly became sexual. One user wrote: “can someone explain to me what is up with my search recs pls?” Another simply asked: “what’s wrong with this app?”