NewsBite

‘Massive steps backwards’: AI deepfakes fed by online misogynistic silos

Sparked by a string of deepfake porn and sexually explicit images created at Australian high schools, experts say unfiltered AI tools have collided with misogyny.


Sexually doctored images targeting women are the product of online silos warped by misogyny, and there are no Band-Aid fixes for the use of AI in creating deepfake nudes, experts warn.

Last month, 50 girls in years nine to 12 from one Victorian high school were targeted by “incredibly graphic” fake naked images, made using AI and social media photos, including images from school formals.

The incident, slammed as “disgraceful” by Victorian Premier Jacinta Allan, happened to students at Bacchus Marsh Grammar, west of Melbourne. A 17-year-old male former student was cautioned by police and the investigation was closed.

A former Bacchus Marsh Grammar student was cautioned after police investigated the incident. Picture: Supplied

In a separate incident, a female teacher was targeted by fake sexual images circulated around the Catholic boys school Salesian College in Melbourne’s southeast in May. A 15-year-old male student was expelled.

Monash University associate professor of criminology Asher Flynn said these non-consensual incidents were unsurprising.

“Sadly, these recent harmful incidents in Victoria are not unexpected,” Dr Flynn said.

“A big concern is that the rapid spread of these tools is already normalising the practice of sexualising images, particularly of women and girls, without their consent.

“So there is a real risk here that we are taking some massive steps backwards in our efforts to address and prevent sexual harassment and abuse.”

Federal parliament is debating laws that would carry a maximum sentence of six years in prison for people who share non-consensual deepfake sexually explicit material and seven years if the offender created the content. Picture: iStock

There should be an onus on tech platforms to remove the thousands of applications that allow the creation of sexually explicit deepfakes, Dr Flynn said.

“We need to be careful not to absolve the people involved for what amounts to disrespectful, sexist and, in many cases, abusive behaviour. Just because the tool is there doesn’t mean we’re not responsible for choosing to use it to harm someone,” she said.

Software with the ability to create such images should carry a requirement to warn users of the harms of non-consensual sexualised deepfakes, Dr Flynn said.

eSafety Commissioner Julie Inman Grant has announced ‘world-leading’ new standards that would require social media platforms, cloud and messaging companies to detect and remove child abuse content. Picture: NewsWire / Martin Ollman

The eSafety Commissioner should compel organisations to provide evidence and report on how they had integrated safety-by-design in their technologies, with penalties applied if platforms did not remove non-consensual image-based sexual abuse content, Dr Flynn said.

But the eSafety Commissioner has said “we are not going to regulate or litigate our way out of this”, despite agreeing that safeguards need to be built into software design.

Federal legislation introduced in June creates new criminal offences to ban the sharing of non-consensual deepfake sexually explicit material.

Attorney-General Mark Dreyfus says digitally created and altered sexually explicit material that is shared without consent is a damaging and deeply distressing form of abuse. Picture: NewsWire / Martin Ollman

“Such acts are overwhelmingly targeted towards women and girls, perpetuating harmful gender stereotypes and contributing to gender-based violence,” Attorney-General Mark Dreyfus said at the time.

The gender aspect is something Tom Harkin sees all too often.

Mr Harkin goes into schools across the country educating boys and girls, both separately and in mixed sessions, about relationships.

The father of two says parents put enormous effort and thought into who their kids socialise with in person and which school they attend, but online the kids roam free.

Relationship consultant Tom Harkin works in schools across the country. Picture: Supplied

“They’re in these unregulated (online) environments, having their views about life, about gender, completely warped,” he said.

After an incident like the one at Bacchus Marsh, the focus was often on the male perpetrator, Mr Harkin said.

“What’s the lived experience for these young women on the other side of this, and how are their lives being impacted and are they OK?” he said.

Society was ignorant of the harmful pornography students were viewing, he said.

Tom Harkin (far left) says with young people being exposed to graphic porn, it is no wonder people making deepfakes act with disregard for their victims. Picture: Supplied

“We’ve all turned a blind eye to the mainstream use of PornHub as a sex education and gratification tool for young people. We know those environments are warped.

“Some teenage boys, multiple times a week, are consuming material that is dominant (of) the woman, abusive of the woman,” he said.

Rates of criminal strangulation during sex had increased, he said.

Victoria Police say they made extensive inquiries into the Bacchus Marsh incident, but no charges were laid. Picture: NewsWire / Sarah Matray

Mr Harkin has been a relationship educator in schools for 20 years and remembers an emerging trend of 15-, 16- and 17-year-old males trying to have anal sex with girls.

“And all of that was influenced through porn,” he said.

“Largely (in the porn videos) she’s responding with pleasurable reactions to that ‘power-over’ dynamic and the pain that’s been induced.

“And then we’re up in arms because a young man has used these tools and operated with entitlement and operated with a lack of regard towards 50 young women?”

Deepfake explicit images of Taylor Swift were widely shared on social media earlier this year. Picture: YouTube

The “horrible, horrific” Bacchus Marsh incident was probably the tip of an iceberg of similar unreported cases, Mr Harkin said.

“These things are indicative of young guys hanging out online, they’re not hanging out in the same environment as young women,” he said.

“Empathy drops, understanding drops … We’ve got a generation of young guys that don’t really know how to socialise.

“I don’t think parents are equipped. Teachers definitely aren’t equipped.”

Australian high schools this year have had scandals involving harmful male behaviour directed at female peers online. Picture: Supplied

The digital AI tools that can turn these misogynistic attitudes into criminal acts are nearly impossible to curtail.

Digital security expert David Fairman said parental or school controls on devices simply could not cover the roughly 20,000 AI tools on offer.

There was an “arms race” to continually track, classify and, if need be, censor new AI tools, the Netskope chief security officer said.

“There’s no silver bullet … So there’s always ways to exploit the system if you’re a malicious person and you have malicious intent,” he said.

Netskope chief Asia Pacific information and security officer David Fairman says there is no silver bullet to keep on top of malicious AI use. Picture: Supplied

Netskope works with companies and schools to identify what AI is being used on their networks, not necessarily to ban it but to help the organisation decide how AI fits into its practices.

Netskope has an ongoing cyber-bullying detection program it began two years ago.

“So yes, there are abilities out there that parents can start to research and start to leverage. But again, that’s just one piece of the bigger puzzle, right?” Mr Fairman said.

“I think there’s a big thing about teaching the future generation about morals and ethics.”

Mr Fairman sympathised with the victims of the “awful” Bacchus Marsh incident.

Melbourne Catholic boys school Salesian College expelled a male student earlier this year after he created images of a female teacher. Picture: Supplied

“Let’s think about responsible AI … are we using it for the purpose of good versus bad? Governance and putting guardrails and regulation is absolutely the right thing to do and should not be overlooked.

“There’s a lot of really good people with the right motivations and determination in government who want to do the right thing.”

Victoria passed deepfake porn laws in 2022, criminalising the creation and distribution of such material.

Victoria Police declined a request for an interview about the closed Bacchus Marsh case. Picture: NewsWire / Andrew Henshaw

Victoria Police’s Brimbank Sexual Offences and Child Abuse Investigation Team investigated the Bacchus Marsh incident. A police spokeswoman declined to elaborate on the unit’s “extensive investigations” in the case or facilitate an interview with the officer in charge.

At the time, Bacchus Marsh Grammar said the school was “taking this matter very seriously” and had contacted police.

“This is something that affects the 50-odd girls. But the reality is, it has reverberated throughout the community as well,” principal Andrew Neal said.

“This is not a Bacchus Marsh Grammar issue. This is not a school X or Y issue. It’s an issue for every school in the country, indeed every school anywhere.”

Originally published as ‘Massive steps backwards’: AI deepfakes fed by online misogynistic silos

Original URL: https://www.adelaidenow.com.au/technology/online/massive-steps-backwards-ai-deepfakes-fed-by-online-misogynistic-silos/news-story/1f0bc9ccc61c4419f8b13c68be76f779