NewsBite

Artificial Intelligence experts, NSW Police warn of rise in ‘deepfake’ porn

Cybercrime specialists have revealed how AI technology is being used to create fake porn videos – first of celebrities, and now of everyday Australians. Here’s what it means for you.

Celebrities and advocates are speaking out over the rise of deepfake porn.

Artificial intelligence experts have revealed criminals are using basic audio and video data from social media profiles to “mimic” an unsuspecting victim’s appearance, speech, and behaviour in an effort to smear them online with deepfake porn.

A release from the NSW Bureau of Crime Statistics and Research (BOCSAR) that focused on five cybercrime categories – online image abuse, device offences, cyber-enabled fraud, identity theft and cyber-enabled abuse – revealed an almost 50 per cent increase in cybercrime over the past three years.

A deepfake is media that has been digitally manipulated to convincingly replace one person’s likeness with that of another.

In an age where people’s identities are plastered across the internet, experts say there is a very real risk of deepfakes being used to create lifelike intimate images without consent.

Macquarie University psychologist Dr Mic Moshel, author of ‘Are you for real? Decoding realistic AI-generated faces from neural activity’, said AI is able to “produce highly realistic counterfeit content”.

“These technologies enable the creation of face-swapped or artificially constructed pornography that is inexpensive, rapidly created, and hyper-realistic,” Dr Moshel said.

Macquarie University psychologist Dr Mic Moshel. Picture: supplied

Deepfake pornography has been employed in numerous ways, from a smear campaign targeting an Indian feminist journalist to the many famous actors depicted in fabricated pornographic scenarios.

A simple Google search for “deepfake [famous person’s name]” will reveal thousands of websites dedicated to fake pornography.

“The proliferation of fake porn, explicit content involving minors, and celebrity-related pornography prompted Reddit, one of the platforms where deepfakes initially emerged, to prohibit the sharing of fake depictions and lookalike pornography,” Dr Moshel said.

Kristen Bell told Vox of the trauma she felt after discovering her face had been digitally edited into a pornographic deepfake.

Kristen Bell was targeted.

“I was just shocked, because this is my face,” Bell said to Vox.

“It belongs to me! It’s hard to think about – that I’m being exploited.”

With entire websites dedicated to “deepfake pornography”, the implications have had profound effects on those targeted.

Scarlett Johansson told the Washington Post that fighting deepfake was a “lost cause”.

“The internet is just another place where sex sells and vulnerable people are preyed upon,” Johansson said.

Scarlett Johansson said fighting deepfake was a “lost cause”.

“Clearly this doesn’t affect me as much because people assume it’s not actually me in a porno, however demeaning it is. It’s a useless pursuit.”

Deepfake pornography was also used in a smear campaign targeting Indian journalist Rana Ayyub in 2018, after she spoke out about the rape of an eight-year-old girl in India.

Ayyub’s face was manipulated into pornographic videos; she later told the Huffington Post she struggled with mental health issues as a result.

Indian journalist Rana Ayyub.

Ayyub said the videos continued to resurface whenever she took a high-profile case.

Noelle Martin, Youth Australian of the Year in 2018, discovered at just 18 that criminals had used images taken from her social media to deepfake her face into pornographic videos – even including her name and address.

Ten years on, Martin is a dedicated advocate against online sexual abuse, and appeared in the SBS documentary “Asking For It”.

The feminist, law graduate and activist courageously took action and helped provide justice avenues for victims of image-based sexual abuse.

Her actions led to a new law introduced in NSW in 2017, making it a criminal offence to distribute non-consensual intimate images.

Dr Moshel said the tools required to generate deepfakes are openly available to the public and “only require a basic computer or smartphone”.

“The implications of this accessibility are deeply unsettling,” Dr Moshel said.

Dr Moshel said humans’ innate face-detection ability is under threat because AI-generated faces have become so lifelike.

“Humans have historically excelled at face detection (in comparison to computers) due to the specialised network in our brains known as the fusiform face area,” he said.


Director of the Centre of Applied Artificial Intelligence at Macquarie University, Professor Amin Beheshti, said AI had made “significant progress”.

“AI algorithms, trained on vast amounts of data, can mimic a target’s appearance, speech, and behaviour,” Professor Beheshti said.

He said creating high-quality deepfakes still requires technical expertise and computational resources, despite increased accessibility.

“The process involves collecting data, training AI models, and using specialised software. Off-the-shelf tools like DeepFaceLab, Faceswap, FakeApp and Avatarify are often used by more experienced users who are comfortable working with advanced software,” Professor Beheshti said.

Director of the Centre of Applied Artificial Intelligence at Macquarie University, Professor Amin Beheshti. Picture: Michael Amendolia

He said there is an immediate need for safety tools as AI technology seeps into everyday life.

“It’s crucial to have robust measures to protect sensitive information.”

Professor Beheshti said governments and organisations must start establishing ethical guidelines and frameworks for AI’s development and deployment.

Professor Jie Lu, of the Australian Artificial Intelligence Institute, said AI experts are combating deepfakes by developing and using advanced machine learning algorithms to detect fake media.

“Under a responsible AI framework, we’ve developed approaches to mitigate fake media,” Prof. Lu said.

Professor Jie Lu. Picture: supplied

Prof. Lu said laws and education should be implemented alongside technological safeguards to protect identities.

The NSW Police Cybercrime Squad was established in 2018 in direct response to an increase in cybercrime.

Detective Superintendent Matt Craft, commander of the Cybercrime Squad, said technological advancements had driven an increase in deepfakes since 2018 – a time when scams were of most concern.

“Better software is available now which pulls from huge data samples, including audio that can replicate voice and be used nefariously to perpetrate crimes,” Commander Craft said.

Detective Superintendent Matt Craft, commander of the Cybercrime Squad. Picture: supplied

He said now more than ever, parents need to be cautious about monitoring their children’s social media use.

Commander Craft said AI is used in limited circumstances by the police force, but legislation should be reviewed and updated as it is inconsistent across Australia.

“AI was developed with good intentions and AI use has endless value in the community. Like anything, criminals will seek to exploit technology to further a criminal act,” he said.

“There is no requirement to specifically legislate against AI due to the benefits it can bring – like any type of software.”


Original URL: https://www.dailytelegraph.com.au/newslocal/blacktown-advocate/artificial-intelligence-experts-nsw-police-warn-of-rise-in-deepfake-porn/news-story/ee1b249f15dc6298688391751215c8dc