
Judge advances TikTok moderators’ suit over harm from disturbing content

A federal judge ruled TikTok's parent company may have breached its duty of reasonable care to content moderators by setting unreasonable productivity standards for weeding through disturbing content.

SAN FRANCISCO (CN) — TikTok cannot escape claims that its software may have exacerbated harm to contract content moderators, who spent many hours viewing disturbing content, including child pornography. 

In a lawsuit brought against the social media app’s parent company ByteDance, the moderators say the company did not adopt reasonable measures to mitigate harm from having to watch disturbing content. As part of their jobs, Reece Young and Ashley Velez say they watched hours of disturbing videos showing necrophilia, bestiality and violence against children, while working for independent “content moderation” firms ByteDance hired. 

The plaintiffs claim they were unable to take breaks from graphic videos because queues that were not supposed to contain graphic content often did, and that the resulting harms were exacerbated by ByteDance's strict productivity standards. Young and Velez's claims include common law negligence and violation of California's Unfair Competition Law.

ByteDance moved to dismiss the plaintiffs' third amended complaint, arguing it takes content moderation seriously but cannot be liable for injuries where workers’ compensation applies. 

“Plaintiffs still do not and cannot allege that defendants prevented their employers from implementing the safety measures they propose,” ByteDance said in its motion. “The allegations simply reiterate that defendants could have taken those measures themselves.”

U.S. District Judge Vince Chhabria dismissed the plaintiffs’ unfair competition law claim with prejudice Monday, finding the moderators worked outside California and the state law couldn't apply to them. But he found the question of whether ByteDance could have adopted safety measures to prevent harms that workers say they faced is one the plaintiffs have adequately raised.

“The primary question here is whether the complaint adequately alleges that TikTok controlled the moderators’ work to such a degree that it can be held liable for its alleged negligence, even though Young and Velez were directly employed by other companies,” Chhabria said. “The answer is yes.”

The plaintiffs invoked two exceptions to that rule under California law: the "retained control" exception, under which a hirer is liable to contract workers if its control over their work contributed to their injuries, and the "unsafe equipment" exception, under which a hirer is liable if equipment it supplied was unsafe and contributed to a contract worker's injuries.

According to the complaint, ByteDance required all moderators to use its proprietary software, retaining full control over how videos were displayed and how audio was streamed. Young and Velez plausibly claimed that the software was defective, contributing to their harm because it prevented them from escaping harmful content. 

Chhabria rejected ByteDance's argument that "a hirer's alleged failure to implement precautionary measures does not constitute affirmative contribution to harm" because the California Supreme Court has ruled that "neither 'actual exercise' nor 'affirmative contribution' requires that the hirer's negligence (if any) consist of an affirmative act."

The judge said the company could have followed the National Center for Missing and Exploited Children’s guidelines, like blurring images or muting audio. ByteDance could have followed recommendations from the Technology Coalition — an industry group that counts ByteDance as a member — by limiting time moderators spend viewing disturbing content to no more than four consecutive hours. It also could have better sorted videos into graphic and non-graphic categories. But the company’s queue sorting was “highly inaccurate,” and moderators got “no respite from seeing graphic content.” 

While Chhabria noted that contract firms provided “minimal therapeutic options,” he added: “It seems unlikely that the California Supreme Court would allow a hirer to interfere with a contractor’s ability to adopt obvious safety measures, and then escape liability just because the contractor could adopt some unrelated measure that might treat or reduce the severity of the resulting harm after the fact.”

The judge said that once ByteDance exercised control over one aspect of moderators’ work, it had a duty to use reasonable care — something it may not have done if it set unreasonable productivity standards.

He disagreed with ByteDance's argument that contract workers can only invoke exceptions if they claim physical harm, saying that a hirer’s liability to contract workers for negligence cannot be limited to physical injuries.

"Under California law, a company that hires a contractor is ordinarily not liable for the injuries of that contractor’s employees,” Chhabria said. 

“But taking the well-pleaded allegations in the complaint as true, TikTok exercised a high degree of control over its content moderators’ work (and provided them with unsafe equipment), and it’s plausible that those decisions contributed to the plaintiffs’ injuries. For that reason, California law allows the plaintiffs’ negligence claim to go forward."

The parties return to court June 16.

"We are looking forward to the opportunity to prove our claims in court, and we are especially pleased that the effort to impose forced arbitration was denied," the plaintiffs' attorney Steve Williams said. "Access to justice is important for all, especially those tasked with keeping the internet and social media safe for the public."

ByteDance did not respond to a request for comment by press time.
