
The rise of fake porn as blackmail — technology used to weaponise doctored sex videos

You might find yourself featuring in an explicit sex tape that could be seen by your loved ones and employer, even if you never actually filmed one.

The real threat of deepfakes

Imagine you receive a video of yourself engaged in an explicit sexual act, with a demand to hand over money or else the clip will be sent to your family, friends and co-workers.

Only, it’s not really you, but a shockingly convincing fake that looks and sounds just like you.

Your options are to pay up and hope the blackmailer keeps their word, or run the gauntlet of convincing everyone — including strangers who find it online — that you’re innocent.

It’s a worrying reality that’s not only possible thanks to rapidly evolving and widely available technology, but already playing out in a new trend called ‘malicious deepfakes’.

A victim in the United States told the Washington Post that she discovered a video of her face digitally stitched onto the body of a pornography actress circulating online.

“I feel violated — this icky kind of violation,” the woman, who is in her 40s, told the newspaper. “It’s this weird feeling, like you want to tear everything off the internet. But you know you can’t.”

Convincing fakes of innocent people in explicit videos are popping up online and authorities are struggling to combat the disturbing new trend.

The report said similar fakes, made using open source machine learning technology developed by Google, had been used to threaten, intimidate and extort women.

Hollywood megastar Scarlett Johansson has fallen victim to the sickening trend, with dozens of hard-to-spot fake sex tapes circulating online.

In one instance, a video described as a leaked sex tape has been viewed almost two million times.

“Nothing can stop someone from cutting and pasting my image or anyone else’s onto a different body and making it look as eerily realistic as desired,” Johansson said.

Other stars, from the singer Taylor Swift to Wonder Woman actress Gal Gadot, have been inserted into similarly vile videos.

But the technology has given those with malicious motives the opportunity to ruin someone’s life at the click of a mouse.

The technology developed by Google is open source, meaning it’s available to anyone.

HELD TO RANSOM

In 2016, a man in California was charged with targeting his ex-wife, while earlier this year Indian investigative journalist Rana Ayyub found herself the victim of a deepfake video.

It spread quickly via social media in apparent retaliation for a piece she had written exposing government corruption.

“The slut-shaming and hatred felt like being punished by a mob for my work as a journalist, an attempt to silence me,” Ms Ayyub wrote in an op-ed for The New York Times.

“It was aimed at humiliating me, breaking me by trying to define me as a ‘promiscuous’, ‘immoral woman’.”

Technology allows people to make convincing fake videos of almost anyone, just like this one of Barack Obama.

There are forums online devoted to deepfakes, where users can make requests for women they want inserted into unflattering and usually pornographic scenarios.

And creators charge too — usually about $20 per piece for those wanting fabricated videos of exes, co-workers, friends, enemies and classmates.

Researchers testing the technology created a video of Barack Obama delivering a speech, mapping his facial movements with thousands of available images and pieces of footage.

With the use of artificial intelligence, the end product looked and sounded just like the former US President, except he had never actually uttered those words.

And while the outcome was a harmless demonstration, it showed how easily someone could put the technology to malicious use.
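For readers curious about the mechanics, the face-swapping tools described in these reports are typically built around a single shared encoder paired with one decoder per person. The PyTorch sketch below is purely illustrative: the network sizes, layer choices and variable names are assumptions, not the code behind the Obama demonstration or any tool named in this story.

```python
# A minimal sketch of the shared-encoder/two-decoder autoencoder design that
# early open source deepfake tools popularised. All names and dimensions here
# are illustrative assumptions.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compresses a 64x64 RGB face crop into a shared latent code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs one specific person's face from the shared latent code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

# One encoder is trained on faces of BOTH people; each decoder learns only to
# rebuild its own person. Swapping decoders at inference maps person A's
# expression onto person B's face.
encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()

faces_a = torch.rand(8, 3, 64, 64)    # stand-in batch of person A's face crops
fake_b = decoder_b(encoder(faces_a))  # A's pose/expression rendered as B
print(fake_b.shape)                   # torch.Size([8, 3, 64, 64])
```

Because the encoder is trained on both faces, its latent code captures pose and expression rather than identity; each decoder then paints its own person’s appearance back on, which is what makes the swap possible.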

For the victim interviewed by the Washington Post, it emerged that the creator needed just 450 images of her, all sourced from search engines and social media.

Siwei Lyu, associate professor of computer science at the University at Albany, said there were some subtle clues that a video could be fake.

“When a deepfake algorithm is trained on face images of a person, it’s dependent on the photos that are available on the internet that can be used as training data,” he said.

“Even for people who are photographed often, few images are available online showing their eyes closed.

“Without training images of people blinking, deepfake algorithms are less likely to create faces that blink normally.”
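Professor Lyu’s blinking cue can be turned into a simple screening heuristic. The sketch below assumes eye landmarks have already been extracted from each video frame (a face-landmark library such as dlib could supply them); the eye aspect ratio measure follows Soukupová and Čech’s widely used blink-detection formula, and the threshold and numbers are illustrative.

```python
# A minimal sketch of a blink-rate check over per-frame eye landmarks,
# using the eye aspect ratio (EAR) of Soukupova & Cech. Landmark extraction
# is assumed to have happened elsewhere.
import numpy as np

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmarks around one eye, corners at indices 0 and 3.
    EAR falls sharply towards 0 when the eyelid closes."""
    eye = np.asarray(eye, dtype=float)
    vertical = (np.linalg.norm(eye[1] - eye[5]) +
                np.linalg.norm(eye[2] - eye[4]))
    horizontal = 2.0 * np.linalg.norm(eye[0] - eye[3])
    return vertical / horizontal

def blink_rate(ear_per_frame, fps, closed_thresh=0.2):
    """Counts eyelid closures (EAR dipping below the threshold) and returns
    blinks per minute. People typically blink roughly 15-20 times a minute,
    so a face that never blinks is a red flag."""
    ears = np.asarray(ear_per_frame)
    closed = ears < closed_thresh
    # A blink is a run of closed frames preceded by an open frame.
    blinks = np.count_nonzero(closed[1:] & ~closed[:-1]) + int(closed[0])
    minutes = len(ears) / fps / 60.0
    return blinks / minutes

# Toy example: 30 seconds of video at 30fps with two synthetic blinks.
ears = np.full(900, 0.3)
ears[100:104] = 0.1
ears[500:504] = 0.1
print(f"{blink_rate(ears, fps=30):.1f} blinks/min")  # ~4.0
```

A suspiciously low blink rate is only one signal among many, and, as the article notes, detection tools are in a constant race with the generators they try to catch.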

A team of cyber experts act as digital detectives, finding and stamping out acts of revenge porn in Australia, as part of efforts by the eSafety Commissioner. Picture: Brianne Makin

However, Professor Lyu admits software designed to scan for fakes is struggling to keep up with advances in the technology that creates them.

“People who want to confuse the public will get better at making false videos — and we and others in the technology community will need to continue to find ways to detect them.”

ENORMOUS RISKS

In a research paper published this year, Robert Chesney from the University of Texas and Danielle Citron from the University of Maryland said the damage could be “profound”.

“Victims may feel humiliated and scared,” they wrote.

“When victims discover that they have been used in fake sex videos, the psychological damage may be profound — whether or not this was the aim of the creator of the video.”

The technology is advancing rapidly, making the fakes increasingly difficult to spot.

They could be used to extort and threaten victims, to spread misinformation in the increasingly worrying era of fake news, or to bribe elected officials, the report warned.

Or these faked clips, which are difficult to detect, could be used to terrify the public — such as “emergency officials ‘announcing’ an impending missile strike on Los Angeles or an emergent pandemic in New York City, provoking panic and worse”.

In February, tough new laws against ‘revenge porn’ — including fabricated deepfakes — passed federal parliament, introducing six-figure financial penalties.

But policing the internet is a notoriously difficult task.

“Individuals and businesses will face novel forms of exploitation, intimidation, and personal sabotage,” Professors Chesney and Citron wrote. “The risks to our democracy and to national security are profound as well.

“Machine learning techniques are escalating the technology’s sophistication, making deep fakes ever more realistic and increasingly resistant to detection.”

Continue the conversation: shannon.molloy@news.com.au

Australians affected by image- and video-based abuse can report incidents to the eSafety Commissioner.


Original URL: https://www.news.com.au/technology/online/security/the-rise-of-fake-porn-as-blackmail-technology-used-to-weaponise-doctored-sex-videos/news-story/aa943ca0e1c4e16d7919b8062b2e2657