
‘Kids are considering suicide’: Horrific ‘undressing’ apps ruining lives of teachers and teens in Aussie schools

Apps allowing teenagers to create AI-generated fake nudes of classmates and teachers are out of control in Aussie schools, as parents plead with politicians to take action.

The apps are being used inside the classroom. Picture: Supplied

After learning about “Nudify” apps, Jenny Branch-Allen quickly knew they would be a problem for Aussie parents.

The President of the Australian Parents Council later realised just how “horrifying” the impact was when parents started coming to her, saying their kids were considering suicide after becoming victims of the apps in the classroom.

“These kids are considering suicide because they can’t live with what’s been done to them,” she said.

The apps – which have risen in popularity in recent months – use AI to “undress” or generate a fake nude image using a picture of someone fully clothed.

But it’s their growing popularity within the classroom that has parents, teachers and politicians terrified.

The apps are also being used against teachers. Picture: Collective Shout
The horrifying apps are being used across the classroom. Picture: Collective Shout

Australia’s eSafety Commissioner said reports from children about “Nudify” apps have more than doubled in the past 18 months, with four out of five reports involving the targeting of females.

“We’re seeing it used by schoolchildren against other schoolchildren, and it’s devastating.

“It’s more than bullying, it’s demoralising. Some are doing these horrible acts against their siblings; it’s just traumatising to all of us.”

In September, a UK-based technology company received a warning from the eSafety Commissioner after it was found to have enabled the creation of child sexual exploitation material through its “Nudify” service.

Some of the apps even provide a financial incentive. Picture: Supplied

The services were attracting about 100,000 visitors per month and were being used to generate explicit deepfake images of students in Australian schools.

“Following reports to eSafety, we found both online services were used nefariously by Australian schoolchildren, to create deepfake image-based abuse of their peers,” eSafety Commissioner Julie Inman Grant said.

“The fidelity of this deepfake imagery is so high that it is near-impossible to tell that the image isn’t real.”

A student from southwestern Sydney allegedly made deepfake pornography of female students using artificial intelligence and images sourced from social media, while a student from a school in Victoria’s northwest allegedly created graphic nude images of about 50 girls from the school last June.

Fake sexual images of a female teacher were also circulated around another school in Melbourne’s southeast last May.

This week, a Louisiana middle school was forced to defend itself against legal action after it expelled a 13-year-old girl for hitting a male classmate who allegedly shared AI-generated nude photos of her.

The Lafourche Parish School District expelled her in August after the incident, and her family is now suing the school board.

PROMPTS INCLUDE ‘BEATEN’, ‘ABUSED’

Australian activist group Collective Shout conducted an analysis of 20 nudifying, deepfake and “virtual girlfriend” apps, to test their capabilities using an AI-generated image of a “non-existent young woman”.

Some of the apps not only undressed the image of the young woman but could also create pornographic scenarios that could be posted to public online galleries.

Some of the prompts used to generate the images included terms such as ‘abused’, ‘beaten’, ‘bruised’, ‘raped’ and ‘crying’.

Several apps even offered users financial incentives to invite friends and “trade nudified images”.

The clothes prompts in one of the apps. Picture: Collective Shout
The “mode” prompts in one of the Nudify apps. Picture: Collective Shout

In September, the federal government announced plans to restrict access to Nudify apps.

However, independent federal MP Kate Chaney wants to take the issue further – introducing a bill that would make it a criminal offence to download, access, supply or offer access to Nudify apps and other tools where the sole purpose is the creation of child sexual abuse material.

The federal government has not taken up the bill, preferring instead to crack down on the tech organisations themselves.

“This is a really specific evil that can be done using AI and addressing this quickly is important,” she said.

“This is a complete no-brainer. It’s a risk we can react to now.”

In September, the NSW government introduced laws against the sharing of AI-generated sexually explicit deepfakes.

The move came after a rise in “sextortion” schemes, in which perpetrators threaten to release further explicit content unless victims comply with demands.

Four out of five of the reports to the eSafety Commissioner targeted females. Picture: iStock

NSW Women’s Safety Commissioner Hannah Tonkin said the technology was “terrifying”.

“Women and girls are the main targets of deepfakes, and the impacts of their dissemination can be devastating,” she said at the time.

“This legislation sends an important message that image-based abuse will not be tolerated.”

Ms Chaney said data currently coming out of the International Centre for Missing & Exploited Children is “mind-blowing”, with the organisation now receiving 65,000 reports each year of online child sexual exploitation.

“They’re going through this huge cache of AI abuse material – and all of it starts with a child who is real.”

She also said that in the US, more than 30 per cent of children report having an “intimate” relationship with AI, whether romantic, sexual or a friendship.

“And the experts say Australia is about a year behind the US, so it gives an idea of just how things are changing,” she said.

Originally published as ‘Kids are considering suicide’: Horrific ‘undressing’ apps ruining lives of teachers and teens in Aussie schools

Original URL: https://www.heraldsun.com.au/education/support/technology-digital-safety/kids-are-considering-suicide-horrific-undressing-apps-ruining-lives-of-teachers-and-teens-in-aussie-schools/news-story/6970282b0d4c4366b6044d8596bb2c46