EXCLUSIVE

Experts reveal a terrifying new porn trend sweeping Aussie schools

A disturbing new porn trend is sweeping schools across the country, with experts warning that “no child is safe”.

Deepfake porn is becoming a problem in Aussie schools.

A terrifying new porn trend is sweeping Australian schools at an alarming rate.

Parents are being warned about new AI technology which allows users to seamlessly put one person’s face onto another’s body, known as ‘deepfakes’.

While it might sound like a simple bit of Snapchat or TikTok fun, the technology is being used maliciously and illegally – and it’s frighteningly simple to do.

Earlier this month, teens at a US school were reported to be using a deepfake app to create pornographic images of their classmates.

While technically fake, the photos appear strikingly real – often indistinguishable from genuine images – and are usually created to shame, humiliate and bully others.

They can even be used as a tool to manipulate and ‘sextort’ people, a practice of extorting money or sexual favours from a person by threatening to reveal intimate content.

Now the conversation has turned to Australia, where experts have warned of this terrifying trend infiltrating schools across the country.

There have been cases of school kids using AI technology to bully and humiliate others. Picture: iStock

Not only have there been cases of pictures of school kids being used in this way, but reports have also emerged of children creating deepfake pornographic images of their teachers.

A cybersecurity expert told news.com.au that the process of creating deepfake material is surprisingly easy.

“The first deepfakes were created in the film industry, where new technologies helped with special effects,” Tony Burnside, vice president of Netskope Asia Pacific, told news.com.au.

“Think about those scenes in Forrest Gump when he meets JFK or John Lennon, for example.”

He explained that for a long time, the huge cost of such technology meant it was limited to creative professionals.

“However, in recent years progress in Artificial Intelligence has made this task easier and malicious actors have seized this opportunity,” Mr Burnside explained.

“In the late 2010s, they started creating deepfakes for large-scale, mostly political, disinformation campaigns, where one fake picture or video could influence millions.

“Nowadays, you don’t need to be a cyber criminal or possess extensive skills to create deepfakes.”

The deepfake porn trend is sweeping Aussie schools. Picture: iStock

Australian kids are at risk

AI expert Anuska Bandara, founder of Melbourne-based Elegant Media, added that children and teenagers were particularly vulnerable to deepfake technologies.

“Since the advent of the AI hype in November 2022, marked by the emergence of OpenAI’s flagship product, ChatGPT, the conversation has taken an unsettling turn with the rise of deepfake technology,” Mr Bandara told news.com.au.

“This issue is poised to have far-reaching consequences for Australians, particularly children and teenagers who are increasingly vulnerable.

“The younger demographic have become avid followers of their favourite influencers, be they animated characters or sports personalities, often unquestioningly accepting their messages on social media.

“The peril lies in the fact that the real individuals have no control over what deepfakes, created using advanced AI techniques, might communicate. Exploiting this technology, scammers are leveraging deepfakes to influence unsuspecting individuals, leading them into dangerous situations or even engaging in the distribution of explicit content.

Once a photo is out there, it is very hard to get it back, experts warn. Picture: iStock

“The ramifications of this misuse pose a significant threat to the wellbeing and safety of the younger generation as they navigate the online landscape.”

Mr Bandara said that photographs of children could easily be used without their parents’ knowledge to create explicit content.

“This certainly can happen, especially with publicly available content online,” he said.

“It’s crucial to comprehend the privacy policies and settings associated with sharing online content featuring your children.”

He explained that because photos can be easily manipulated, even with more basic tools like Photoshop, parents need to be aware of where their children’s images appear and who can access them.

“Numerous tools are accessible for effortlessly creating deepfake videos. It’s essential to educate your kids on recognising such content,” Mr Bandara explained.

“Exercise caution with content from unverified sources and always trust material from reputable publishers, including mainstream media.”

Lifelong psychological impacts

Psychologist Katrina Lines, who is also CEO of Act For Kids, told news.com.au that with issues like sextortion on the rise, deepfake technology makes this a very frightening time.

She added that it was vital to educate both parents and children about the potential dangers of posting content online, no matter how benign it may seem.

“The issue of sextortion is increasing, and that is directly related to the sharing of content,” Ms Lines said.

“Some teens are easily duped into thinking they are sending an explicit picture to someone they know or someone their age.

Psychologist Katrina Lines warned of the emotional impact this can have on kids. Picture: Supplied

“But now the issue of deepfakes comes in, and it just makes everything more complicated.

“You have no control over it, and people think you’ve sent explicit material when you haven’t.

“It is sexual abuse, and it has lifelong psychological impacts.

“I know that in many parts of the dark web, existing child sexual exploitation material is being digitally altered and recirculated.

“This is just an ongoing sexual abuse of kids, and it is just awful.”

Ms Lines urged parents to be careful about what they are sharing online.

“We all like to post happy snaps of our family and things like this online, but it is so important to realise that once a photo is out there, you usually can’t get it back,” she warned.

“There is no real way to know if your child’s images are being used online. Most of the time, it does not exist on the normal web, but on the dark web, and it is harder for normal, everyday people to find it.”

Easier to inflict harm

Australia’s eSafety Commissioner, Julie Inman Grant, confirmed that her office had received a growing number of complaints about pornographic deepfakes since the start of the year.

She also said that the ease of creating deepfakes made it easier to “inflict harm” on others.

“The rapid deployment, increasing sophistication and popular uptake of generative AI means it no longer takes vast amounts of computing power or masses of content to create convincing deepfakes,” Ms Inman Grant told news.com.au.

“That means it’s becoming harder and harder to tell the difference between what’s real and what’s fake online. And it’s much easier to inflict great harm.

Child exploitation material is rife on the dark web. Picture: iStock

“eSafety has seen a small but growing number of complaints about explicit deepfakes since the beginning of the year through our image-based abuse scheme.

“We expect this number to grow as generative AI technology becomes more advanced and widely available – and as people find ever more creative ways to misuse it.

“We’ve also received a small number of cyberbullying deepfake reports where children have used the technology to bully other children online.

“That should give us all pause. And galvanise industry to take action to stem the tide of further misuse and abuse.”

Ms Inman Grant said that it can be “devastating” for someone to find out their image has been used in an explicit deepfake, and urged anyone in this predicament to report it online.

“Deepfakes, especially deepfake pornography, can be devastating to the person whose image is hijacked and sinisterly altered without their knowledge or consent,” she said.

“The availability of, and investment in, deepfake detection tools is sorely lagging, thereby denying victims any potential validation or remedy.

“We encourage Australians experiencing any kind of image-based abuse, including those involving deepfakes, to report it to eSafety.gov.au.

“Our investigators stand ready to support Australians dealing with this distressing abuse and have an 87 per cent success rate in removing this material.”

Original URL: https://www.news.com.au/lifestyle/parenting/school-life/experts-reveal-a-terrifying-new-porn-trend-sweeping-aussie-schools/news-story/f10b8316d70aead7c9c03d24df05609d