NewsBite

EXCLUSIVE

Number of sexualised ‘deepfake’ images of schoolchildren soars

The reported number of intimate, digitally altered or AI-generated images featuring under-18s has more than doubled in 18 months – and experts fear the true figure could be much higher.

The number of explicit deepfake images created of ­underage Australians has skyrocketed in 18 months, with experts warning the alarming surge is only the tip of the iceberg.

New figures from the federal eSafety office’s image-based abuse reporting line show that the number of intimate, digitally altered or AI-generated images – known as deepfakes – featuring under-18s has more than doubled in the past 18 months, compared with the seven years prior.

And eSafety Commissioner Julie Inman Grant has concerns the true number could be much higher.

“We suspect what is being reported to us is not the whole picture,” Ms Inman Grant said, citing the danger of so-called “nudify” apps which can digitally alter a picture to remove clothes or have a subject look like they are performing sexual acts.

“Anecdotally, we have heard from school leaders and education sector representatives that deepfake incidents are ­occurring more frequently, particularly as children are easily able to access and misuse nudify apps in school settings.”

Australia’s eSafety Commissioner Julie Inman Grant believes shocking statistics about deepfakes of children are just the tip of the iceberg. Picture: NewsWire/Martin Ollman

The massive jump in fake sexual images has prompted Ms Inman Grant to write to education ministers across the country to ensure schools are enforcing mandatory reporting obligations.

“With just one photo, these apps can nudify the image with the power of (AI) in seconds.

“Alarmingly, we have seen these apps used to humiliate, bully and sexually extort children in the schoolyard and beyond,” she said.

“There have also been reports that some of these images have been traded among schoolchildren in exchange for money.”

The new data also shows that four out of five reports involved female students being targeted.

Ms Inman Grant appealed to school authorities to report all deepfake incidents immediately so that material could be removed quickly.

Collective Shout director Melinda Tankard Reist. Picture: NCA Newswire/Gary Ramage

“It is clear from what is already in the public domain and from what we are hearing directly from the education sector, that this is not always happening,” she said.

The eSafety office has released a suite of measures to try to combat the growing threat of nudify apps, including new online safety advice for parents and an updated Toolkit for Schools to provide a step-by-step guide to dealing with deepfake incidents.

Last month, The Sunday Telegraph revealed young male students from a Sydney private school were caught selling explicit deepfake images of their female classmates in online group chats for less than $5, while girls at two other independent schools also had their photos used to make AI-generated “nudes”.

Collective Shout director Melinda Tankard Reist said there were more than 127 nudifying, undressing and face-swapping apps which could create vile images in “seconds”.

“These tech platforms are available for as little as $2 and boys are using them as we speak to morph any woman or girl into any type of porn they want,” she said.

“These are digital weapons of abuse.”


Original URL: https://www.thechronicle.com.au/news/nsw/number-of-sexualised-deepfake-images-of-schoolchildren-soars/news-story/1af39d36acd0607ae466fc3c9ff5a216