A sexual deepfake ripped away my dignity. It was the worst day of my life
A Victorian teen recounts the nightmare of finding their face in an explicit AI deepfake. Charges against teens for deepfake creation have more than doubled in a year — the harm is severe and the consequences few.
It was the worst day of my life.
When I found out someone at my school had used my face in an explicit deepfake, it was like time stopped and I had stepped into a nightmare.
I felt sick. I thought I was going to throw up, but I was totally speechless. I didn’t know what to do.
There are no words to describe how it felt to see fake images of myself in such a way. I obviously knew it wasn’t real, but I just felt so helpless.
Who has seen this? Do people know it’s fake? How do I get it off the internet? Am I going to be in trouble? How can I prove it’s not real? What if I can never get a job because of this? Will it come up if you search my name online?
These are some of the questions I couldn’t stop thinking about. I felt horror and disbelief. These fake photos took my dignity away and they robbed me of my privacy.
I’m scared this is going to impact me for the rest of my life. I’m scared it will have a negative effect on my future.
I felt like I never wanted to go to school ever again. It makes me so sad and angry, because the person who made the photos should be the one who feels like that.
They are the one who should be embarrassed and ashamed.
This is something I never thought would happen to me. There’s no way to prepare for it, or even stop it from happening.
The only way you could stop this from happening would be to never have a photo taken of yourself, which is not a solution. I think you have to be really messed up in the head to do this to a person.
I want people to know that this is what sexual assault looks like now. It’s not just physical stuff, it’s online stuff. Even though the person doesn’t physically touch you, it’s still assault.
You can’t victim-blame for something like this, because all I did was pose for a photo.
I don’t know how to fix this problem, but I think people with power need to do something to stop this from happening.
Victorian teens are creating AI explicit deepfakes at record rates. Schools are in crisis
Minors in Victoria are producing explicit intimate images of others at an all-time high, with image-based abuse production charges more than doubling since last year.
Data obtained by the Herald Sun has revealed a 175 per cent increase in the number of youths creating non-consensual explicit images since the offence was introduced in July 2023.
Eight youths were charged in the first year, all of whom were male. As of September 2025, 24 individuals have been charged, at least 22 of whom were male.
Despite a relatively low charge rate, a report from the federal eSafety Commissioner revealed complaints of explicit deepfake image abuse targeting underage Australians have doubled since the start of 2024.
The rise in image-based abuse has sparked major concerns from experts about the influence of AI and deepfake technology on youths.
It comes as Victorian schools face a crisis of deepfake abuse among students, with teens – overwhelmingly boys – creating fake pornographic pictures of their peers.
In February, about 60 students at Gladstone Park Secondary College were impacted by sexually explicit AI-generated images that were circulated online.
Similarly, students at Mount Scopus Memorial College and Bacchus Marsh Grammar were targeted by deepfake explicit images in 2024.
One Victorian teenager whose face was non-consensually used in a pornographic deepfake said they believe the crime is hugely underreported.
“When I found out what had happened, it was like time stopped and I had stepped into a nightmare,” they said.
“One of the worst things now is thinking about, well, even if someone hasn’t shared fake photos like these with anyone, what if they’ve still generated them? Just for personal use?
“I don’t know how to fix this problem, but I think people with power need to do something to stop this from happening.
“It’s a bigger problem than adults realise.”
Criminal defence lawyer Elizabeth McKinnon, an expert in image-based abuse law, said many teens believe they can act with impunity.
“In the Children’s Court, there are so few consequences. I’ve seen the most horrendous cases there,” Ms McKinnon said.
“I think the punishments have to be harsher. First of all, the child is already not identified – which is understandable, children make mistakes – but say that to the victim.
“She was identified in the crime, and has to live with the consequences of that.
“There’ll be a deepfake or maybe a real image of her that has been given to a number of people, and this material is like gas. Once it’s out, it fills a room. When it’s out, it’s out.
“And we can talk about awards of damages, but to sue someone, you need assets. What assets does a 15-year-old boy have? None. So, it’s something like a formal apology and off you go.
“The victim will write a victim impact statement, and the kid will go ‘yeah’, and that’s it. Some of them care, many of them don’t.”
In terms of regulation and enforcement, Ms McKinnon said there was a chronic lack of resourcing for e-crimes in Victoria.
“I think the police are dealing with a real resourcing issue. They need more people working in the IT area in e-crimes, going through everything.
“They’re doing such important work and they’re run off their feet. It’s extremely taxing.
“You can have a law, you can say don’t run a red light, don’t speed over 40, but if there’s no one there to deal with it, it’s pointless.”
She said Victorian law was keeping pace with the rapidly evolving technology, but that without adequate resources it was all moot.
“Restricting people’s access to AI tools could be a start, but these people always find a way. There are so many ways to communicate. It’s extremely difficult to regulate.
“The technology is so advanced. Everyone’s got a phone. The kids are way ahead of their parents with programs and hiding things. And younger people are notoriously more technologically savvy than older people.”
Professor Nicola Henry, socio-legal scholar and expert in the nature and impacts of online and offline sexual abuse, said technology-facilitated abuse was multifaceted, complex and difficult to regulate.
In particular, she said there is no one-size-fits-all solution to dealing with offending minors.
“In Victoria, young people can be charged with image-based abuse offences under the Crimes Act,” Prof Henry said.
“With children and young people, charging, prosecuting or convicting should always be a last resort. Absolutely, there should be consequences for image-based offences for young people, I’m not denying that.
“But we need to implement other non-punitive consequences like restorative justice, especially in the school context. We need to figure out ways in which young people can take responsibility for their behaviour.
“When we resort to criminal punishment for young people, we risk trapping them in the revolving door of prison and crime.”
Prof Henry said pouring resources into ongoing education and digital literacy is critical, and essential learning for young people in a digital age.
“This is a problem that needs proactive action from tech companies, the community, educators, and law enforcement,” she said.
“We shouldn’t place the blame solely on individuals. Platforms and tech developers — as well as society more broadly — also have a responsibility for creating and perpetuating this problem.”
A Victoria Police spokesperson said that while police do see unauthorised sharing of explicit images or videos, there have been only extremely isolated instances where digitally generated images have been created.
“There were fewer than 24 offences for producing an intimate image among people aged 12-18, with police continuing to work to protect young people from harm.
“Police, including detectives, visit schools across the state to ensure students and staff are aware of the consequences of non-consensual sharing or distribution of intimate images and know how and where to seek help if they are a victim.”
The offence for producing an intimate image of another person came into effect on 30 July 2023.
Recorded offences for the production, distribution and threatened distribution of intimate images among this age group are down 20 per cent year-on-year.
Originally published as A sexual deepfake ripped away my dignity. It was the worst day of my life