AI deepfake ‘nudifying’ and ‘undressing’ web apps blocked for Australian users over child abuse
Three of the world’s most notorious deepfake nudifying web apps are no longer accessible in Australia after the eSafety Commissioner threatened to take their UK parent company to court if they refused to shut them down.
English firm Itai Tech – which owns free ‘deepnude’ site Undress.cc and other similar sites – has agreed to block Australians’ access to three of its websites, following an official warning in September that it had enabled the creation of “synthetic child sexual abuse material”.
The web apps have been involved in multiple incidents in which school students created and distributed fake nude images of their classmates, including at Bacchus Marsh Grammar in Victoria.
Itai Tech’s failure to comply with Australia’s mandatory Codes and Standards in relation to child sexual abuse material meant it would be liable to court action if its sites remained live.
eSafety Commissioner Julie Inman Grant said she was “glad we’ve achieved what we wanted to” without resorting to legal action, but warned the Australian government would continue to pursue other undressing apps – even those hidden behind layers of shell companies.
“This is huge, and we will keep the pressure up,” she said.
Her office is receiving new reports of AI-generated deepfake image abuse at schools “on almost a weekly basis”, Ms Inman Grant said, and the three now-blocked apps alone were attracting an estimated 100,000 visits a month from Australians.
“I think to a certain degree (schools) didn’t understand that … the technologies that are being used are so hyper-realistic that you can’t tell by the naked eye that these videos are not of a school student engaging in a sexual act,” she said.
“It’s humiliating, it’s denigrating and extremely traumatic for the young girls, mostly, that (are) experiencing it.”
Deepnude providers have been openly marketing ‘sex mode’ and ‘schoolgirl’ image generation features, she said, along with an option to undress ‘any girl’.
“There is no positive use case. These (sites) are malicious, but they generate a lot of money and … interest,” Ms Inman Grant said.
“They shouldn’t exist at all.”
Generative AI model hosting platform Hugging Face has also changed its terms of service after the eSafety Commissioner raised concerns that users were misusing its hosted models, misuse that has led to the creation of child sexual exploitation and pro-terror material on other platforms.
Hugging Face will now be required to take action against users who upload AI models in breach of the new terms of service, or itself face enforcement action including penalties of up to $49.5 million.
