The explicit content your child is being shown on TikTok revealed
Videos that detail graphic sex acts, encourage teens to rate their sexual “purity” and urge them to make explicit videos are being shown to underage teenagers on TikTok, an exclusive investigation has revealed.
Young teens are being fed videos on TikTok which detail graphic sex acts, rate their sexual “purity” and encourage them to make explicit videos themselves, as concerned experts warn parents to protect teenagers from the “vortex” of inappropriate content.
An investigation by The Daily Telegraph has lifted the lid on the confronting content appearing on children’s devices, including trends in which young children garner millions of views on videos talking about their sex lives.
When The Telegraph approached the tech giant with the content, most of which had been live for at least a week and was created by overseas accounts, TikTok urgently removed the videos or restricted them to adult users.
The Telegraph set up a profile posing as a 16-year-old Australian girl which prompted the algorithm to flood the feed with videos shared by other users in a similar age bracket.
Liking just one video from a trend where teens recap and rate the places they had sex prompted the algorithm to send more such videos before it escalated to far more explicit content.
Among the most popular trends were teens recording their Rice Purity Score — an online test that scores people on how “innocent” they are, awarding status for not using a condom, filming sex acts, having a pregnancy scare, having a sexually transmitted infection, being paid for sex and other things too graphic to print.
Another trend predominantly featured videos of young girls talking about making a sex tape and posting suggestive pictures in bikinis.
Cybersecurity expert Susan McLean said TikTok’s claims of safety measures for children were “flawed”.
“TikTok is a cesspool. Despite its Ts and Cs about claims ‘we don’t host content that is damaging and dangerous’ … they do,” she said.
“You don’t have to search for cute kids on TikTok if you are a paedophile, you just have to like one video and the algorithm will send it your way.”
Ms McLean said the findings blew apart a myth among parents that TikTok is a safe platform because their own feeds did not feature inappropriate content.
“You have got all the viral challenges that occur on TikTok and the trends … underage sex is one level of content that should never be pitched towards children but there is the choking challenge... (and spraying) deodorant on your arms until there is a third degree burn,” she said.
“It is psychologically damaging to children … it does my head in that there is still a lack of willingness by parents to acknowledge this.
“Parents go ‘I use TikTok and I don’t have a problem’.”
She is pushing for children under the age of 16 to be protected from the app.
“They can do a far better job, they simply choose not to. What incentive do they have to change? There is none. People still go there, people flock there.”
A TikTok Australia spokesman said more than 40,000 trust and safety professionals had been engaged around the globe to keep the platform safe and enforce its guidelines.
“Videos that do not breach our guidelines, but contain content which may be inappropriate for our younger audience, are not included in their ‘For You’ feed. Regrettably, on occasion, such content may appear in their feed, but it is removed once we are aware of it.
“Of the videos reported, those that breached our guidelines are being, or have been removed, and we are working to ensure that the ones deemed inappropriate for under 18s will no longer be in their suggested feed ... Over a three month period in Australia, we actively removed nearly 550,000 videos from the platform; nearly 85 per cent of them had never been seen.”
Professor Adam Guastella, chair of child and youth mental health at Sydney University and The Children’s Hospital at Westmead, said parents need to be aware of the dangers of platforms like TikTok even if their own feeds are filled with innocuous videos.
“This is an issue I hear from parents all the time — how do you protect your kids from content that is really inappropriate and how do you stop kids being stuck into vortexes of inappropriate content,” he said.
He believes TikTok normalises behaviour online and encourages children to replicate what they are witnessing.
“You certainly see it in schools, where schools become far more sexualised in cohorts because of the sharing and normalisation of behaviour,” he said.
“There is strong evidence that girls in particular try to conform to sexual activity they see online so they think they have to participate in far more severe sexual activity than people around them. There is also good evidence that boys have a distorted view of what sex is.”