TikTok Accused of Recommending Pornographic Content to Children
A new investigation has alleged that TikTok’s algorithm promotes pornographic material to underage users, even when parental safety features are switched on.
Global Witness, a human rights campaign group, said its researchers created fake accounts posing as 13-year-olds in late July and early August. Despite enabling TikTok’s “restricted mode,” the accounts were served sexualised search suggestions that led to explicit material – including videos of simulated masturbation and, in some cases, full pornographic films embedded within other content.
The group said the findings came as a “huge shock.” Ava Lee, one of the researchers, noted: “TikTok isn’t just failing to prevent children from accessing inappropriate content — it’s actively suggesting it to them as soon as they create an account.”
TikTok said it acted immediately when informed of the issue earlier this year, removing the flagged videos and updating its search suggestion tools. The platform added that it has more than 50 safety features in place and says nine out of 10 videos that break its rules are removed before they are ever viewed. “We are fully committed to providing safe and age-appropriate experiences,” a spokesperson said.
The report comes just weeks after the children’s safety codes under the UK’s Online Safety Act took effect on 25 July, requiring platforms to use strict age checks and adjust their algorithms to block harmful material, including pornography.
Global Witness repeated its tests after the new rules were implemented and reported that inappropriate recommendations were still appearing, prompting calls for stronger regulatory oversight.
During the research, some TikTok users appeared to have noticed similar issues. One commenter asked: “What’s wrong with this app?” while another wrote: “Can someone explain to me what is up with my search recs pls?”