Minns government to criminalise creation of sexually explicit deepfakes
New legislation protecting victims of sexually explicit deepfakes will be introduced to NSW parliament. Here’s what the new laws mean.
The creation of sexually explicit deepfakes will become a criminal offence in NSW under strengthened reforms from the Minns government.
The legislation, to be introduced to parliament on Thursday, will expand existing offences related to the production and distribution of intimate images without consent to cover those created entirely using artificial intelligence (AI).
It comes after The Telegraph exposed several incidents in recent months highlighting the legal loopholes around the creation of explicit deepfakes.
The new laws will make the creation of deepfakes an offence punishable by up to three years’ jail.
Sharing such images will also be a crime, even if the person sharing them did not create them.
It is already a crime in NSW to record or distribute intimate images of a person without their consent or to threaten to do so, which includes photos and videos that have been digitally altered.
The government will also expand existing offences to encompass sexually explicit audio, whether real or designed to sound like a real, identifiable person.
The amendments will bring NSW into line with other jurisdictions, including Victoria, that have criminalised the non-consensual production and distribution of sexually explicit material involving adults, regardless of how it is created.
Young male students from a Sydney private school had been caught selling explicit deepfake images of their female classmates in online group chats for less than $5.
In another shocking incident, at least 16 women – some working in high-level public service roles in Canberra – were depicted in more than 100 deepfake nude images created by a 23-year-old ACT man.
The incidents prompted Emma Mason, the mother of NSW schoolgirl Matilda “Tilly” Rosewarne, who took her own life after bullying that included a fake nude image of her being shared on Snapchat, to call for an urgent upgrade to image-based abuse laws to criminalise the creation of explicit deepfakes.
Ms Mason welcomed the deepfake reforms, saying it was “fantastic news” that would further protect young people from online harms.
“This goes towards addressing the harm that is created by these kinds of images as opposed to police not being able to manage it or deal with those events,” she said.
“It makes me feel there is movement towards protecting the victims of these types of crimes.”
A NSW parliamentary inquiry into harmful pornography, which began last year, has been looking into the impacts of deepfakes.
Several advocacy groups, including Collective Shout, had called for strengthened deepfake laws as part of the inquiry.
The group’s movement director Melinda Tankard Reist said she welcomed the news.
“It’s great news to have creation recognised as a stand-alone offence. It causes trauma to victims whether or not the image is shared,” Ms Tankard Reist said.
“This move by the Minns government sends a strong warning that morphing women into synthetic deepfake abuse is harmful on its own and hopefully the strengthening of the laws will have a strong deterrent effect.”
North Shore Liberal MP Felicity Wilson also introduced a similar bill to parliament earlier this week to criminalise the creation of deepfakes.
Attorney-General Michael Daley said the NSW government recognised the law needed to keep up with technology and was moving to better protect people, particularly young women, from image-based abuse.
“This bill closes a gap in NSW legislation that leaves women vulnerable to AI-generated sexual exploitation,” he said.
“We are ensuring that anyone who seeks to humiliate, intimidate or degrade someone using AI can be prosecuted.”
NSW Women’s Safety Commissioner Hannah Tonkin said the devastating impacts of image-based abuse should not be underestimated.
“Rapid developments in AI have made it easy to create extremely lifelike, sexually explicit depictions of real people,” she said.
“We know that women and girls are the main targets of deepfake images. This is terrifying technology, which can be weaponised to cause immense harm.
“It’s vital that the community understands that this form of abuse will not be tolerated – stronger legal protections help send this message.”
Full Stop Australia CEO Karen Bevan said: “The new law directly acknowledges the serious impacts that production and distribution of this non-consensual material have on victim-survivors.”
The new laws will not affect existing child abuse material offences that already criminalise the production, possession and dissemination of explicit material of a child.