Blaire’s ‘adult’ video went viral – only problem was, it wasn’t her

As artificial intelligence becomes more commonplace, an “insidious” issue is on the rise – one victims say is as traumatic as a sexual assault.

Deepfake media – the term is a portmanteau of ‘deep learning’ and ‘fake’ – superimposes a person’s face or likeness onto an existing image or video. The technology has been used to enhance both the entertainment and gaming industries. But there are now mounting concerns over its potential to create sexually explicit content without the consent of those depicted.

Such was the experience of Blaire – a Twitch streamer who goes by ‘QTCinderella’ online – when she received dozens of messages in February about an “adult video” of her that had gone viral on a pornographic website.

The issue? The 28-year-old had never created such a video – and was distraught to learn her face had been pasted onto another woman’s body, making it look as if she was legitimately engaging in the depicted act. She had become a victim of deepfake porn.

While image-based abuse – commonly referred to as ‘revenge porn’ – remains a concern, especially among young women, experiences like Blaire’s show that sexual content no longer has to have been created in the first place for people to share it: it can simply be fabricated.

Non-consensual deepfake pornography “absolutely” sits on the spectrum of sexual violence, Dr Emma Jane, one of the world’s leading academic experts on digital misogyny and gender-based violence, told news.com.au.

Twitch streamer QTCinderella was a victim of deepfake pornography earlier this year. Picture: Twitch

It’s a form of abuse referred to in academia as technology-facilitated gender-based violence (TFGBV) – and one with a “depressingly long history”. Three decades after the first widely reported rape in the digital world – a 1993 incident in a small text-based online world called LambdaMOO – Dr Jane said “it’s incredibly dispiriting that we’ve made so little progress”.

‘An insidious form of victim-blaming’

Contributing to that lack of progress is the entrenched belief that TFGBV doesn’t constitute sexual abuse because it doesn’t occur in “real” life.

Considering “our daily interactions increasingly involve a merging of the digital and the physical”, Dr Jane said, “I no longer think it’s useful to try to make stark distinctions between our online and offline lives”.

“This is one reason it’s not OK to tell women who’ve been targeted for TFGBV (like deepfake pornography) that the solution is to simply ‘take a little break’ from using the internet,” she added.

“It’s on par with suggesting they stay locked down in their homes rather than risk venturing outdoors and constitutes an insidious form of victim-blaming.”

Telling women who’ve been targeted for non-consensual deepfake pornography that it’s not “real”, Dr Jane continued, could be considered a form of gaslighting.

Blaire said she was left feeling ‘so violated’ after the incident. Picture: Instagram
The 28-year-old said the harassment in the wake of the incident was relentless. Picture: Instagram

“Unfortunately, it has many parallels with the way sexual violence and gendered harassment is downplayed in our broader culture. Victim blaming and shaming is prevalent everywhere,” she said.

“Debates about whether sexual violence is ‘real’ if it occurs online are often used as an opportunity to further attack those who speak publicly about their experiences … TFGBV can have a severe – and embodied – impact on targets that affects their psychological health, their livelihoods, their reputation, and their physical safety.”

She pointed to a 2020 study by the Economist Intelligence Unit (EIU), which found that of women who had experienced online violence in the previous year, seven per cent lost or had to change their job, 35 per cent reported mental health issues, one in 10 experienced physical harm as a result of online threats, and almost nine in 10 restricted their online activity in ways that limited their access to employment, education, healthcare and community.

The sexual nature of deepfakes reflects rape culture on a global scale. As Blaire told VICE’s Motherboard, the harassment in the wake of the incident was relentless and resurfaced trauma from her past.

“You feel so violated … I was sexually assaulted as a child, and it was the same feeling,” she said.

“Where you feel guilty, you feel dirty, you feel like, ‘What just happened?’. And it’s bizarre that it makes that [trauma] resurface. I genuinely didn’t realise it would.”

Maya Higa – one of the streamers targeted in the deepfake porn Twitch creator Atrioc (Brendan Ewing) was recently caught viewing – shared a similar perspective in a harrowing statement on Twitter.

“In 2018, I was inebriated at a party and I was used for a man’s sexual gratification without my consent,” she wrote.

“Today, I have been used by hundreds of men for sexual gratification without my consent. The world calls my 2018 experience rape. The world is debating over the validity of my experience today.”

‘Not something we can simply arrest our way out of’

As is often the case with sexual abuse and harassment, the path to justice for victims of TFGBV is not an easy one. Not only does the costly, time-consuming burden of legal recourse fall on them; it’s further complicated by the fact that most people sharing abusive images online do so anonymously, making them far harder to pin down.

In the UK, the government recently amended its Online Safety Bill in a bid to crack down on deepfakes; in the US, only Virginia and California have laws that ban faked and deepfaked revenge porn.

At home, eSafety Commissioner Julie Inman Grant told news.com.au, “deepfakes are covered under eSafety’s image-based abuse scheme, which is the only scheme of its kind in the world”.

“Any Australian whose images or videos have been altered to appear intimate and are published online without consent can contact eSafety for help to have them removed … We have a 90 per cent success rate in achieving these takedowns, primarily from overseas sites,” Ms Inman Grant said.

Twitch streamer Atrioc was forced to apologise after viewers spotted deepfake pornography of fellow streamers open in his browser during a livestream. Picture: Twitch

Dr Jane said the issue of deepfake pornography is “not something we can simply arrest our way out of”.

“These are problems involving complex combinations of people and technology, so there’s never going to be a single simple solution,” she said.

“I do, however, think an important first step is for governments and individuals to take a good hard look at the way the business models of social media giants like Meta and Alphabet facilitate and exacerbate these sorts of problems.”

Ms Inman Grant agreed.

“Innovations to help identify, detect and confirm deepfakes are advancing and technology companies have a responsibility to incorporate these into their platforms and services,” she said.

“Antidotes to these risks need urgent investment and innovation now, as they are lagging behind the rapid proliferation of these technologies and the online harms they are likely to engender.”

Image-based abuse is a breach of the Online Safety Act 2021, and under the Act, perpetrators can be fined or, in some jurisdictions, face jail time. Any Australian whose images or videos have been altered to appear sexualised and are published online without consent can contact eSafety for help to have them removed.

Originally published as Blaire’s ‘adult’ video went viral – only problem was, it wasn’t her