
Experts concerned Australian legislators can’t keep up as cases of pornographic deepfakes rise

Experts have raised concerns Australian legislators may not be able to keep up as cases of a “degrading” act pop up across the country.


Experts are concerned legislators may be unable to keep up as cases of deepfake pornography skyrocket, with technological advances making it ever easier to create hyper-realistic, fake pornographic content.

A deepfake is digitally created or manipulated content that portrays someone doing something that never happened; when it is sexualised and shared without consent, it is a form of image-based abuse.

Taylor Swift fell victim to the abuse last year, while more recently in Australia a student from southwestern Sydney allegedly made deepfake pornography of female students using artificial intelligence and images sourced from social media.

The rapid advancement of technology has made it easier to create and distribute deepfake pornography. Picture: iStock.

It follows an incident at a school in Victoria’s northwest last June, in which a student allegedly created graphic nude images of about 50 girls from the school, and another in May, when fake sexual images of a female teacher were circulated around a school in Melbourne’s southeast.

Teach Us Consent founder Chanel Contos said the organisation was “constantly” receiving reports of deepfake pornography from young people.

“Deepfake technology is used as a form of bullying and other times as a form of humiliation, or that kind of satisfaction – mainly male entitlement – for a sexual reason,” Ms Contos told NewsWire.

Teach Us Consent founder Chanel Contos said the rise of deepfake pornography was fuelled by ‘male sexual entitlement’. Picture: NewsWire / Martin Ollman

eSafety Commissioner Julie Inman Grant said it was becoming “harder to tell the difference between what’s real and what’s fake”.

“The rapid deployment, increasing sophistication and popular uptake of generative AI means it no longer takes vast amounts of computing power or masses of content to create convincing deepfakes,” Ms Inman Grant said.

“We are already receiving reports containing synthetic (AI generated) child sexual abuse material, as well as deepfake images and videos created by teens to bully their peers and of course, ‘deepfake porn’ through our image-based abuse scheme.”

An Australian Federal Police spokesman said the increasing use of AI to “increase the volume of child abuse material available” could make it “harder for police to identify and protect real child victims”.

“It is harder to discern if an image involves a real child or is purely AI-generated child abuse material,” the spokesman said.

However, the spokesman said “anything that depicts the abuse of children — whether that’s videos, images, drawings or stories — is child abuse material”, including material generated by AI, and that it is illegal in Australia.


“Male sexual entitlement” and “lack of empathy” driving image-based abuse

RMIT professor of Information Sciences Lisa Given said public figures, including Andrew Tate – an influencer accused of human trafficking and rape, who gained a following by promoting his misogynistic interpretation of masculinity online – could empower men to believe this type of behaviour was OK.

“When these figures take on a life of their own, when you’ve got a lot of peer pressure around those messages, a lot of young people can feel they should be engaging in this kind of abuse,” Ms Given said.

“It can kind of snowball.”

However, Ms Contos said figures like Tate were not the “sole reason” for the uptick in deepfakes.

“Widespread access to porn is going to increase demand for people to see porn of their peers,” she said.

“It’s fuelled by sexual entitlement – particularly male sexual entitlement – and a lack of empathy.

“Consequences aren’t truly thought about … there’s that step of removal with deepfake technology.

“I think the dehumanisation of women and girls in general is what means this technology is being employed for this purpose, to satisfy male sexual entitlement and demand.”

eSafety Commissioner Julie Inman Grant said more pressure must be put on AI companies and platforms to address deepfakes. Picture: NewsWire / Martin Ollman

Can the law keep up?

A bill to tackle the rise of non-consensual deepfake pornography was passed in the Senate last year; however, Ms Contos was concerned legislators may not be able to keep up with the rapid pace at which technology is advancing.

“Not that long ago (creating deepfakes) would’ve required a high-processing computer … now it’s something that can be done on an iPhone,” she said.

“I (was) in school not long ago and this would’ve been completely unimaginable back then.

“The process of implementing legislation will never be able to keep up with the pace that technology can develop at and I don’t think that we can imagine what the next challenge will look like.”

Ms Given said legislation must be broad enough that it could remain relevant as new technology – and therefore, forms of image-based abuse – emerged.

“One of the challenges we have with a lot of legislation is that it can be slow to change, but often it’s written in a way that’s speaking about today rather than the future,” she said.

Attorney-General and Cabinet Secretary Mark Dreyfus said new criminal offences applied to ‘all forms of deepfake material that is shared without consent’. Picture: NewsWire / Martin Ollman

However, Attorney-General Mark Dreyfus offered his assurance that the new criminal offences to tackle the “damaging and deeply distressing form of abuse” applied to “all forms of deepfake material”.

“Overwhelmingly it is women and girls who are the target of this offensive and degrading behaviour,” Mr Dreyfus said.

“That’s why the Albanese government created serious criminal penalties for the sharing of digitally created sexually explicit material.

“The new criminal offences will apply to all forms of deepfake material that is shared without consent, no matter what technology is used to create them.”

The maximum penalty for sharing non-consensual sexually explicit deepfakes is six years’ imprisonment, while people can face up to seven years’ imprisonment if they also created the deepfake in question.

The AFP-led Australian Centre to Counter Child Exploitation (ACCCE) will also work to identify and respond to “evolving global threats and challenges, including AI”, via its role as chair of the Virtual Global Taskforce over the next three years.

Beyond legislation, Ms Contos said more must be done socially and culturally.

“Gender equality, teaching respect … (addressing the) initial culture that would make a person want to do that to another person,” she said.

Ms Inman Grant also flagged that “a greater burden must fall on the purveyors and profiteers of AI to take a more robust approach so they are engineering out misuse at the front end”.

She emphasised the importance of eSafety’s Safety by Design guidelines, which detail how technology companies can minimise online threats, and encouraged anyone concerned about the sharing of intimate images to report it to eSafety.
