Our Watch report finds production of naked images and sexualised messages is fuelling ‘sextortion’ and blackmail
Female teachers are leaving the industry because they fear for their safety, according to research into the alarming rise in students creating deepfake nude images.
Artificial intelligence “nudify” technology that enables the generation of deepfake nude images has prompted calls for a new crackdown on “sextortion” in Victorian schools.
Female teachers and students are regular targets of the technology, with male students overwhelmingly the perpetrators, research shows.
Cyber safety expert Susan McLean has joined overseas calls for deepfake nude sites to be removed from app stores, saying such crimes are underreported by schools.
She said younger pupils in primary schools were “using AI video tools to create videos of teachers making racist comments, but by secondary school they are using nudify apps to make sexually explicit pornographic content”.
“If you are using these tools then it’s a criminal act as you are creating child sex abuse material,” she said.
A report by national violence prevention organisation Our Watch, released on Tuesday, says the production of naked images and sexualised messages is fuelling “sextortion” and blackmail as well as non-consensual tracking and stalking.
“It is deeply concerning that nearly every week there is a new incident in the school environment – whether that be young men sharing deepfake porn of young women they know or female teachers leaving the industry because they fear for their safety,” Our Watch chief executive Patty Kinnersly said.
Our Watch is calling for age-appropriate education about respectful relationships, including the impact of pornography and the use of AI to alter images, to be embedded in schools from prep to year 12.
New materials are based on a Victorian and Queensland pilot program that includes year 1 and 2 pupils learning about gender-stereotypical attitudes.
Schools are even offered scripts to address backlash from parents of year 1 or 2 pupils who may feel such teaching contravenes their values or cultural and religious beliefs.
Victorian schools have had respectful relationships classes since 2016 as part of the core curriculum.
Despite this, schools have been sites of a spate of abuse, including a rise in the creation of deepfake nudes, such as those created by a male student at Salesian College last year.
US expert Thorn says ease of access to AI technology allows children to create such images with “minimal skill or effort” and “introduces significant risks” in the fight against sexual abuse.
“Many female teachers state that their workplaces are no longer safe due to misogynistic behaviour directed towards them and female students,” the Our Watch report says.
Independent Education Union general secretary David Brear said such harassment could be challenging to address in schools “because the perpetrators of this harassment are, in most cases, teenage boys in the care of the school”.
Students have also been targeted by deepfakes at schools such as Bacchus Marsh Grammar and Gladstone Park Secondary College.
The state government was contacted for comment.