Friday, April 19, 2024
Courthouse News Service

Once again, social media companies attempt to avoid trial over claims they harmed child users

Ahead of trial, Meta, ByteDance and other social media giants have once again attempted to toss claims that they designed their apps to target children.

OAKLAND, Calif. (CN) — Meta, TikTok and other social media companies returned to court on Friday in another effort to avoid trial over class-action claims that they are liable for the harmful effects their apps have on young users. 

Ahead of a trial set for October 2025, the companies are seeking to iron out how many plaintiffs are advancing claims in the California class action. The social media giants also sought to dismiss some claims in the case after they previously tried to convince U.S. District Judge Yvonne Gonzalez Rogers to throw them out in February. The companies say they do not knowingly design their apps to be addictive to children.

The plaintiffs, including children and parents, claim Meta, Google and others make it impossible for parents to intervene to ensure that children do not engage with harmful or dangerous content.

Their arguments echo those made by California Attorney General Rob Bonta and other state attorneys general that Meta deliberately made its social media platforms addictive to children and teens and breached its duty to prevent harm. Rogers, an Obama appointee, considered both sides' arguments on Friday but did not indicate how she might rule.

Representing Meta, which owns Instagram and Facebook, attorney Paul Schmidt said that the plaintiffs have not properly pleaded a claim for violating the Children's Online Privacy Protection Act, which regulates operators of websites or online services directed to children under 13 years old. Meta does not consider third-party content directed at children as part of its own service, Schmidt said — a view he said was grounded in American legal precedent.

Schmidt also invoked Section 230 of the Communications Decency Act, which largely grants social media companies immunity for content published by third parties, and disputed claims under the failure-to-warn standard, which concerns whether manufacturers adequately warn consumers about the risks of their products.

The plaintiffs in the case argue that the harms caused by social media companies include depression, eating disorders, thoughts of self-harm and suicide. Representing Meta, attorney Timothy Hester pushed back, saying consumer protections do not cover “subjective examples of harm such as emotional stress.”

Rogers asked how Meta could classify some social media impacts on children as subjective.

“It costs school districts money to deal with the fact that children have experienced these effects," she said. "How can you ignore that?”

In response, Hester said the negative effects claimed by the plaintiffs are more subjective than measurable evidence of harm.

“It may be new technology, but the kinds of harms being alleged here need jurisdictional prudence,” he said. “There’s not an allegation of documented economic harm.”

Also representing Meta, attorney Ashley Simonsen expanded on this point, arguing that the plaintiffs’ claims about the safety of the platforms are not specific or measurable enough to show negligence.

“It’s clear that 20 or so statements plaintiffs submitted before Congress are in the heartland of First Amendment protection,” she said, referring to statements some plaintiffs have made to legislators in favor of stricter regulation of social media companies.

Anna Smith, assistant attorney general for South Carolina, cited Ninth Circuit Judge Lucy Koh’s ruling in an Arizona case, in which Koh determined that location data is valuable.

Social media companies offer free services in exchange for “tremendous amounts of a young person’s valuable data," Smith continued. The plaintiffs say this is evidence of the financial damages they are entitled to.

Judge Rogers grew frustrated with both sides throughout the five-hour hearing, cautioning them: “No interruption and no hyperbole.” 

“If something extraordinary happens, you can bring a motion — but the bar is very high. And it may have ripple effects that you don’t appreciate,” she said.

The judge already dismissed claims naming Meta CEO Mark Zuckerberg as a defendant earlier this week. Although he is one of the world's most recognizable corporate leaders — someone knowledgeable about the workings of Meta services like Facebook and Instagram — Rogers said she was not convinced he can be found personally liable for misrepresenting his company or causing emotional harm to children.

Representing ByteDance, which owns TikTok, attorney Andrea Pierson said that some plaintiffs indicated they could withdraw their claims.

She asked that within the next week, each plaintiff confirm whether they plan to move forward with their claims. Judge Rogers agreed to this request.

Rogers dismissed claims brought under the First Amendment and under Section 230 in November 2023. The companies had argued in October that the law shields platforms from liability over content from third parties. But Rogers said the defendants know that young users are core to their platforms and found that Section 230 does not grant immunity from a negligence per se claim.
