Social media giants try to duck claims their apps hook adolescents

In a packed Oakland courtroom, a judge weighed key questions about what harm social media can actually cause to adolescents — and whether companies are protected from liability for it.

OAKLAND, Calif. (CN) — A federal judge will decide whether to greenlight a sprawling class action accusing Meta, TikTok and other social media companies of making their apps addictive and harmful for children in the name of profit.

U.S. District Judge Yvonne Gonzalez Rogers heard arguments Friday in what could be a blockbuster case about social media's effects on children, amid multiple similar lawsuits underway against the tech giants. The plaintiffs claim Meta, Google and others knowingly design their apps to be addictive to children, accusing the companies in a 288-page complaint of making it impossible for parents to intervene as primary caretakers and ensure that their children do not engage with harmful or dangerous content.

The hearing in a packed courtroom Friday dove into whether social media companies can be held liable for harms to young users, and whether they should implement certain protections for adolescents. The companies' attorneys, as in their motion to dismiss, focused on Section 230 of the Communications Decency Act — which allows web operators to moderate user speech and content as they see fit — and argued that it shields the companies from all civil liability.

Rogers asked defense attorney Paul Schmidt whether Section 230 allows companies to publish potentially harmful content for child users, even when parents do not want them to have an account. Schmidt responded that labeling content as possibly harmful in any way amounts to limiting speech.

The plaintiffs’ attorney Previn Warren said Section 230 was intended to empower parents to restrict certain content, adding, “The notion that it is now being weaponized is to me a perversion of what the statute was intended to do.”

Warren said his side has a "profoundly different view" of the case law cited on purported immunity. He said the design and use of the companies' apps and algorithms are what is actually at issue, as computers pull data from children and decide how to keep them engaged for as long as possible.

“We’re not here to be the content police,” Warren said. “We’re looking at the structure of how they use operating machinery to hook kids, irrespective of what’s being shown. The defendants know it will hook kids and keep them from going outside.”

Rogers asked whether Section 230 provides immunity for computer code designing algorithms, saying “To think that Congress anticipated this is also a stretch.”

She questioned attorneys Brian Willen, for the defense, and Lexi Hizam, for the plaintiffs, about how far First Amendment protections extend to the companies.

Willen said that the plaintiffs’ claims that algorithms expose children to harmful content are liability claims that call for speech regulation. When he argued that push notifications are likewise protected speech and do not physically compel users, Rogers countered that a feature that stimulates the brain has a physical effect.

“Why isn’t the injury to the brain, and the brain’s interaction with the device, physical?” Rogers asked. 

Plaintiffs’ attorney Elizabeth Cabraser said that such features are designed to psychologically manipulate the user and therefore aren’t protected.

“It’s seeking to override someone else’s decision making while exploiting cognitive emotional vulnerabilities,” she said. “It’s not human speech. People need to be able to look away, to disengage. If we can’t get away from it, then the First Amendment does not protect that violation of our rights and interests.”

Rogers asked both parties what the companies owe their users, including children, to prevent any foreseeable harm.

Geoffrey Drake, representing the defense, said the question of what duty is owed to child users is “amorphous.”

“I’m not sure what duty the companies would owe, based on any case I’ve been able to find,” he said.

“You seem to suggest that you have no duty,” Rogers said. 

The judge joked about inviting several members of California Attorney General Rob Bonta’s office, who attended the hearing, to weigh in. Bonta joined 33 other state attorneys general in filing similar claims against Meta, saying the company designed Instagram and Facebook to be addictive to children and teens and arguing that it has a duty to prevent harm.

While Rogers did not indicate how she may rule, she expressed frustration with both sides’ reliance on causation arguments.

“It’s not clear to me that the entire thing is thrown out,” she said. She said she anticipates ruling on the matter by mid-November.

California has had a law protecting children online on the books since 2022. Assembly Bill 2273 requires web platforms to implement safety settings that protect kids' data and prohibits companies providing online services from collecting, retaining or using a child’s personal information or geolocation.
