
Facebook ‘can’t combat child pornography’ as moderators tell all

Former contractor lifts the lid on the online child pornography industry and the tech giant’s struggle to tackle it.

A number of former contractors plan to sue Facebook and its contracting agency in Ireland’s High Court, declaring they suffered ‘psychological trauma’ as a result of poor working conditions and a lack of proper training. Picture: AFP

A new lawsuit is casting fresh doubt on Facebook’s ability to block child pornography and support the people tasked with combating it, with one former contractor lifting the lid on the sheer size of the horrific industry he says the tech giant is ill-equipped to tackle.

A number of former contractors said overnight they are gearing up to sue Facebook and its contracting agency in Ireland’s High Court, declaring they suffered “psychological trauma” as a result of poor working conditions and a lack of proper training. The company is already facing a similar lawsuit in California brought by two former moderators.

The company is understood to have about 15,000 moderators who work for third-party companies on behalf of Facebook, whose job it is to keep abhorrent material off the platform.


One former US-based contractor, Colin*, speaking exclusively to The Australian, said he’d been given inadequate support to deal with the content it was his job to view and moderate, which included babies being sexually assaulted and bestiality involving children.

Colin, who could not use his real name for legal and contractual reasons, sifted through masses of photos every day, identifying which ones were child pornography then grading them based on the age of the subject and the content of the image.

He said around half of the images would be typical pornography, with the other half being extreme content.

If anything was labelled ‘A1’ it was sent to law enforcement and the National Center for Missing and Exploited Children (NCMEC) in the US.

“Because of bureaucracy and the fact that we didn’t actually work for Facebook, the most that our direct supervisors, [who were] Facebook full-timers, could get us was two free sessions with a counsellor,” he told The Australian.

“We were basically just told ‘yeah you can get two free sessions, email this person and they’ll help you out.’

“So then you had to go through about five people to be put in touch with a therapist close to you. Whether or not that therapist was trained to handle PTSD or secondary trauma of any kind, it was unclear.

“I showed up to my therapist and she was a normal family therapist, so it’s not clear how much trauma she was able to help me handle in my two free hours with her.”

The company is understood to have about 15,000 moderators who work for third-party companies on behalf of Facebook, whose job it is to keep abhorrent material off the platform. Picture: AFP

According to Colin, during his time at Facebook his colleagues had been hired straight out of college and were ill-equipped to handle the mammoth, growing industry that is child pornography.

“There’s someone in a former Soviet state right now working as a lighting technician on a child porn shoot,” he said. “That was probably commissioned by someone 5000 miles away.”

He added that the supervisors were overworked but did try to boost morale by, for a time, encouraging contractors to eat lunch away from their desks.

“The fact these contractors are being underpaid and asked to, without any kind of training or support network, somehow juggle all manner of horrors at once is pretty disgusting,” Colin told The Australian.

Some of the Dublin-based contractors suing Facebook are claiming to suffer from Type 2 PTSD, which can result in symptoms including suicidal thoughts and panic attacks.

“In terms of symptoms, I was definitely a lot more sullen and stressed out, I had a few nightmares and I think my anxiety overall has increased since the job,” Colin said.

“Overall since 2012 I’ve had a hard time both focusing and allowing myself to relax. I have to be listening to something, or doing something even if it’s completely mindless. And it’s almost like if I stop, my brain’s memory banks will unlock and all this stuff will just flood out.”

A Facebook spokesman told The Australian in a statement the company is ‘committed to providing support for those that review content for Facebook’. Picture: AFP

Colin, who left Facebook after his 12-month contract ended, said the only skill he and others gained from the job was learning how to detect child pornography.

“I think a few people ended up doing the same job at Twitter and YouTube,” he said.

“Thankfully my contract expired the week that they introduced videos.

“The advice we got was ‘don’t have the sound on’.”

A Facebook spokesman told The Australian in a statement the company is “committed to providing support for those that review content for Facebook as we recognise that reviewing certain types of content can sometimes be difficult”.

“Everyone who reviews content for Facebook goes through an in-depth, multi-week training program on our Community Standards and has access to extensive psychological support to ensure their wellbeing.

“This includes 24/7 on-site support with trained practitioners, an on-call service, and access to private healthcare from the first day of employment. We are also employing technical solutions to limit their exposure to graphic material as much as possible.

“This is an important issue, and we are committed to getting this right.”

The spokesman added on the issue of child exploitation that “for years we’ve been tackling this issue with the most advanced technologies, industry collaboration and partnerships with child safety NGOs.

“What separates us from every other company is that we deploy sophisticated technology across all of our platforms to proactively find and remove as much child exploitative content as we can and work with local and international law enforcement to take action on perpetrators.

“Our advanced technology now removes 99.5% of content before anyone reports it, oftentimes as soon as someone tries to upload it.”

Original URL: https://www.theaustralian.com.au/business/technology/facebook-cant-combat-child-pornography-as-moderators-tell-all-in-court/news-story/952b79ef989343428ffc008f0fc8a71a