British woman becomes ‘deepfake’ victim after PornHub campaign

Kate Isaacs fought to get PornHub to delete millions of explicit videos, making her a target of a “terrifying” new sex crime that is soaring in prevalence.

After her friend’s explicit video was stolen and uploaded to PornHub, Kate Isaacs was determined to strip non-consensual footage from the adult website – but her campaign saw her become the target of a new sexual crime.

The British woman launched the “NotYourPorn” campaign in 2019 in a bid to protect non-consenting adults, sex workers and under-18s from image-based sexual abuse in the UK.

After mounting pressure, PornHub eventually deleted almost 10 million videos, admitting they could not verify whether the people in the removed videos were consenting or of age.

The major clean-up took the number of videos on the adult platform down from 13.5 million to less than 3 million, CNN reported at the time – and it also made Kate a target.

But the 30-year-old from London only became aware of the campaign against her when she uncovered a disturbing video of herself on Twitter.

Kate Isaacs became a ‘deepfake’ victim after campaigning for PornHub to remove non-consensual sexual content from its website. Picture: Instagram/KateFIsaacs

The clip, which was entirely fabricated, showed her appearing to engage in a sexual act – and while it wasn’t real, it was shockingly realistic.

So much so that Kate initially thought she had been filmed without her knowledge or consent.

Kate had in fact been “deepfaked”, a relatively new phenomenon in which artificial intelligence is used to digitally manipulate a person’s face onto someone else’s body – in this case, that of an adult actress.

“My heart sank. I couldn’t think clearly,” she told the BBC recently.

“It was so convincing, it even took me a few minutes to realise that it wasn’t me.

“I remember just feeling like this video was going to go everywhere – it was horrendous.”

While image-based sexual abuse of this nature is a relatively new crime, it is already so prolific that experts have warned we’re facing an “epidemic”.

Thanks to ever-improving technologies, it only takes a few minutes to superimpose someone’s face onto an explicit video, with untold ramifications for the victim’s life.

An innocent video of the 30-year-old was turned into pornographic footage using AI technology. Picture: Facebook/KateFIsaacs

The video of Kate was made using completely innocent footage of her taken from the internet, and she now speaks out to warn others of the dangers lurking online.

“It makes every woman who has an image of herself online a potential victim; that’s pretty much all of us, these days,” she told MailOnline.

But the activist said she was “powerless” to do anything about the explicit video, which led to her receiving even more “vile” abuse online.

“They’d used an interview I’d done with the BBC and made the video, putting it up on the internet,” she told online program My Big Story.

“It was very scary. For anyone to go on Twitter and see a video of themselves having sex with someone they don’t know, in a situation they don’t remember – as a woman, that is terrifying.

“Along with the deepfake, they found and posted my address on Twitter, and these men were basically sharing it and commenting that they were going to find me, follow me home from work, rape me and film it, then upload it to the internet.”

Tragically, it’s not just happening overseas, with the crime regularly occurring here in Australia too.

Perth lawyer Noelle Martin discovered her face had been overlaid onto pornographic videos after she successfully helped reform laws in 2018, criminalising the distribution of non-consensual intimate images in Australia.

The activist fought for change after becoming the victim of a relentless image-based sexual abuse campaign, which saw her photos stolen from her social media accounts and photoshopped onto the bodies of adult film stars.

The perpetrators were relentless, and as technology advanced, so did their crimes.

Ms Martin, who is now 28, told news.com.au recently that many still sadly don’t understand the “toll” it takes on a person who experiences this digital form of sexual abuse.

“This is something that you cannot escape from because it is a permanent, lifelong form of abuse,” she said.

“They are literally robbing your right to self-determination, effectively, because they are misappropriating you, your name and your image, and violating you permanently.

“You do not have control over the way that you’re represented, and the way that you present to the rest of the world, and that impacts everything, from your economic freedom, to your employability, to your interpersonal relationships, to your romantic relationships, to your physical and emotional wellbeing.”

Ms Martin was 18 when she became the target of image-based sexual abuse. Picture: Paul Kane
Pictures falsely depicting her in graphic sexual scenarios flooded multiple pornographic sites. Picture: SBS

Ms Martin appeared on the SBS docuseries Asking For It in April, which explores the importance of consent education amid persistently high rates of rape and sexual violence.

Every day, an average of 85 sexual assaults are reported in Australia, according to the Australian Bureau of Statistics.

That is estimated to be only a fraction of the total number that occur, with around 90 per cent of sexual assaults going unreported, according to a 2020 report by the Australian Institute of Health and Welfare.

One in 10 Australians has now experienced image-based abuse, according to a recent report – a figure Ms Martin believes could be lowered if more people understood consent.

“There’s a false distinction that what happens online is entirely separate from what happens in the real world,” she said.

“Our lives are completely fused in the digital age between what happens on the internet and what happens in real life.

“What people post – images of themselves, their likeness and their bodies, even in digital form – I would say is an extension of their body.”

Originally published as British woman becomes ‘deepfake’ victim after PornHub campaign