NSW women targeted in predatory AI deepfake forum
Men in sickening underground forums are offering cash in exchange for explicit AI-generated images and videos of women from NSW.
Men are offering cash for explicit AI-generated images and videos of women in several sickening underground forums.
The Sunday Telegraph was alerted to the websites, which contain hundreds of requests from men in NSW and other states seeking pornographic deepfakes of their ex-partners and other women known to them.
One of the forums, exposed last month as a haven for men requesting nude images of NSW women, has since changed its domain name and hosting provider in a bid to evade authorities.
Shockingly, the forums show men requesting that images and videos of women be “nudified”, including having their “underwear and bra removed”, among other vile requests.
The men ask for a “wizard”, the forum’s term for someone who can create AI deepfakes, to digitally manipulate the images or videos.
“AI wizards work your magic … undress, boob expansion,” one man wrote on the forum.
In some posts, men have even offered $100 for the creation of deepfakes.
The federal government last year introduced new laws banning the non-consensual sharing of sexually explicit deepfakes, with jail terms of up to seven years.
But in NSW, the ACT and other jurisdictions, creating such images is only a criminal offence if there is evidence the images have been circulated or that threats were made to share them online.
Victoria is the only state in Australia where the creation of adult deepfakes without a person’s consent is a criminal act.
Collective Shout has been calling for the creation of adult deepfake images to be made a stand-alone offence in NSW.
The not-for-profit organisation’s movement director, Melinda Tankard Reist, said: “These forums have turned into an AI-enabled predator’s playground.”
Cyber safety expert Susan McLean said forums promoting image-based abuse needed to be “urgently” investigated and shut down by the relevant authorities.
Macquarie University criminologist Dr Vincent Hurley said the images on these forums were contributing to the proliferation of violence against women.
A spokesman for Attorney-General Michael Daley said it was “shocking” that cowards who hid behind internet anonymity sought to intimidate and degrade innocent women in this way.
“This is rapidly changing technology that rules have not kept up with and we know that we must act,” he said.
“The NSW Government is waiting for the recommendations from the parliamentary inquiry (into harmful pornography including AI) and is looking closely at changes to the law to prevent this behaviour.”
An eSafety spokesman said the authority had a high success rate in removing harmful content when it was reported, and had been taking action against these rogue websites for many years.
“These reports are deeply concerning and we continue to encourage Australians to report further instances of image-based abuse (non-consensual sharing of intimate images) or doxing (non-consensual sharing of personal, identifying information) to us at eSafety.gov.au,” he said.
“Under the Act, a key factor in determining a breach is the absence of consent.
“We must be satisfied that the person depicted in the image did not consent to the image being shared.
“Without this confirmation, eSafety cannot take regulatory action under the law.”
A NSW Police spokeswoman said: “One of the most alarming trends police are seeing is the increase in generative AI being used to create deepfake sexual material both of strangers and people known to the offender.”
Any Australian experiencing online abuse can report it to: esafety.gov.au/report