
TikTok content moderator sues over nightmares, panic attacks from graphic content

The lawsuit comes on the heels of Facebook's settlement of similar claims by its content moderators for $52 million.

LOS ANGELES (CN) — A TikTok content moderator sued the social media platform, claiming she suffers from horrific nightmares and panic attacks because of the videos of graphic violence and sexual assaults she must watch for her job.

Candie Frazier, who has worked as a content moderator through a third-party contractor since early 2018, accuses TikTok and its Chinese parent company ByteDance of negligence. She also seeks to represent the companies' other content moderators in a class action.

"During her employment as a content moderator, plaintiff was exposed to thousands of graphic and objectionable videos," according to the complaint filed Dec. 23 in Los Angeles federal court. "For example, plaintiff witnessed videos of: a smashed open skull with people eating from it; a woman who was kidnapped and beheaded by a cartel; a person’s head being run over by a tank; a man eating the head off a rat; a fox being skinned alive."

TikTok is the latest social media platform to be accused of exposing its content moderators to all sorts of disturbing posts without concern for their mental health. The lawsuit follows a $52 million settlement this year by Facebook over similar claims brought by its moderators. The same law firm that brought the Facebook lawsuit represents Frazier and is also involved in a case against YouTube now pending in Oakland. A federal judge in July dismissed the YouTube lawsuit but allowed the plaintiffs to try to fix the shortcomings she identified in her ruling.

"While we do not comment on ongoing litigation, we strive to promote a caring working environment for our employees and contractors," a TikTok spokesperson said. "Our safety team partners with third-party firms on the critical work of helping to protect the TikTok platform and community, and we continue to expand on a range of wellness services so that moderators feel supported mentally and emotionally." 

TikTok's content moderators review hundreds of graphic and disturbing videos each week, according to the complaint. The app's videos aren't scrutinized before they are uploaded, and the company relies on users to report inappropriate content, which is then reviewed by moderators and removed if it violates TikTok's terms of use, according to the complaint.

"ByteDance and TikTok require content moderators to review hundreds of thousands if not millions of potentially rule-breaking posts per week via ByteDance’s and TikTok’s review software," Frazier said. "Due to the sheer volume of content, content moderators are permitted no more than 25 seconds per video, and simultaneously view three to ten videos at the same time."

TikTok hasn't implemented workplace safety measures that other internet companies have, according to the lawsuit. Microsoft, for example, blurs photos, renders them in black and white, displays them only at thumbnail size and removes audio from videos, according to the complaint. Microsoft also uses filtering technology to distort images, and its content moderators are given mandatory psychological counseling, the complaint states.

TikTok's failure to use such tools to protect its moderators is particularly "glaring" because they aren't difficult to implement and the company has full control over the software the third-party contractor uses, Frazier says. Blurring images and videos and providing tags for ultra-graphic violence would take little time to implement and could provide significant benefits to the health and safety of the moderators, she adds.
