TikTok content moderator sues company, alleging trauma from hours reviewing videos of rape and killings

Published Dec 29, 2021


A content moderator who reviewed videos for TikTok is suing the social media company, alleging that it did not protect her from suffering psychological trauma after "constant" exposure to violent videos that showed sexual assault, beheadings, suicide and other graphic scenes.

For as long as 12 hours a day, Candie Frazier and other moderators reviewed "extreme and graphic violence", including videos of "genocide in Myanmar, mass shootings, children being raped, and animals being mutilated", in an effort to keep such content from being viewed by TikTok users, according to the lawsuit.

The legal action was filed in federal court in California last week against TikTok and its parent company, ByteDance.

Frazier developed "significant psychological trauma including anxiety, depression, and post-traumatic stress disorder" as a result of her exposure to the videos, according to the lawsuit, which is seeking class-action status.

The legal challenge, which alleges that TikTok violated California labour law by failing to provide a "safe work environment", requests compensation for moderators who were exposed to the material. It also asks that TikTok and ByteDance provide mental health support and treatment to former and current moderators.

Frazier is not a TikTok employee – she works for Telus International, a firm that provides workers to other businesses – but the lawsuit alleges that "ByteDance and TikTok control the means and manner in which content moderation occurred".

TikTok said the company did "not comment on ongoing litigation".

The company said that at TikTok, "we strive to promote a caring working environment for our employees and our contractors".

It added that it would "continue to expand on a range of wellness services so that moderators feel supported mentally and emotionally".

A spokesperson for Telus International, which is not a defendant in the suit, said in a statement that the company was "proud of the valuable work our teams perform to support a positive online environment", adding that the company has a "robust resiliency and mental health programme in place".

The lawsuit comes as content moderation practices at one of the world's most popular social media platforms are under scrutiny. TikTok revealed in September that it had more than a billion global users.

School districts nationwide were on alert this month after authorities raised concerns over what they said were threats of violence spread on the platform, following a shooting at a high school in Michigan that left four people dead.

TikTok denied that the threats had spread widely on its platform, although gun-control advocates called for the company to improve content regulation.

TikTok also said this month that it would adjust its algorithm after an investigation by the Wall Street Journal found that the technology could feed users streams of content focused on subjects such as depression and extreme dieting.

Moderators are made to view as many as 10 videos simultaneously, while being pushed by TikTok software "to review videos faster and faster", according to Frazier's lawsuit. During a 12-hour shift, workers are allowed two 15-minute breaks and an hour for lunch, the suit says.

Telus said its employees could raise concerns through "several internal channels" but that Frazier had never done so. "Her allegations are entirely inconsistent with our policies and practices," the company said.

An attorney for Frazier did not immediately respond to a request for comment.

Similar allegations have been made by content moderators for other social media companies, including Facebook. Last year, Facebook (whose parent company is now called Meta) agreed to a $52 million settlement with thousands of moderators, after a lawsuit alleged that Facebook failed to protect them from traumatic content.
