Moderator sues Tik Tok after watching cannibalism and other depraved acts as part of job left her with PTSD
WARNING GRAPHIC CONTENT. Tik Tok has been sued after a content moderator was allegedly forced to watch cannibalism and other horrific acts as part of her job.
A content moderator at Tik Tok is suing the social media giant for allegedly failing to take measures to protect her mental health after she watched hours of traumatic videos including cannibalism, rapes, and suicides.
In the proposed class-action lawsuit seen by news.com.au, Candie Frazier, who worked for a third-party company, said Tik Tok had not followed industry standards regarding the protection of moderators.
Moderators typically work 12-hour shifts with one hour off for lunch and two 15-minute breaks.
Ms Frazier claimed Tik Tok failed to implement technical safeguards for moderators, including blurring or reducing the resolution of videos the moderators have to watch.
The court filing said Ms Frazier was forced to watch depraved acts as part of her job.
“For example, Plaintiff witnessed videos of: a smashed open skull with people eating from it; a woman who was kidnapped and beheaded by a cartel; a person’s head being run over by a tank; a man eating the head off a rat; a fox being skinned alive; a man falling to his death off a roof that included audio of the impact of his body hitting the ground; school shootings including dead bodies of children; a politician shooting himself; backyard abortions; child abuse; and child sexual assault,” it said.
Ms Frazier is also suing Tik Tok’s parent company ByteDance.
The documents filed in the United States District Court also revealed the huge earnings of the parent company.
“In fiscal year 2020, ByteDance made approximately (US) $34.3 billion in advertising revenue. In 2019, that number was $17 billion, and in 2018 that number was $7.4 billion. ByteDance accomplished this in part due to the popularity of its Tik Tok App,” it said.
The documents also said that, due to the strict monitoring of the moderators’ activity, there is increased pressure on staff to watch as many videos as possible.
Tik Tok implements software that tracks moderators online and via camera. It also tells moderators that they should only review 25 seconds of each video.
The intense pressure moderators are placed under, as well as the content they are required to watch, makes them more likely to suffer post-traumatic stress disorder, Ms Frazier alleges.
Because of her work, Ms Frazier said she has developed panic attacks and depression as well as symptoms associated with anxiety and post-traumatic stress disorder.
She said she has trouble sleeping and, when she does manage to sleep, suffers from nightmares about the content she has watched.
Ms Frazier wants Tik Tok to compensate her and others for the psychological injuries they have suffered, and wants the court to force the company to set up a medical fund for content moderators.
She has demanded the matter be heard before a jury.
A Tik Tok spokesman told news.com.au in a statement that the company is continuing to expand care for its moderators.
“While we do not comment on ongoing litigation, we strive to promote a caring working environment for our employees and contractors. Our Safety team partners with third party firms on the critical work of helping to protect the Tik Tok platform and community, and we continue to expand on a range of wellness services so that moderators feel supported mentally and emotionally,” he said.
Originally published as Moderator sues Tik Tok after watching cannibalism and other depraved acts as part of job left her with PTSD