Ninth Circuit Considers Liability of Social Networks in Terror Attacks

SAN FRANCISCO (CN) – Questions posed by a Ninth Circuit panel Thursday suggest a new legal theory could open social media companies to liability for letting terrorists use their networks to spread propaganda.

Multiple courts have shot down lawsuits seeking to hold social media giants liable for violent acts of terror. In 2018, the Ninth Circuit found Twitter could not be liable for an Islamic State terror attack that killed two U.S. contractors in Amman, Jordan, in 2015, citing the lack of a direct link between the use of Twitter’s service and the act of violence.

On Thursday, a three-judge Ninth Circuit panel heard arguments in a case claiming Google played a role in the death of a 23-year-old woman killed in the 2015 Paris attacks that claimed 130 lives. The panel also heard arguments in two other cases seeking to hold Google, Facebook and Twitter liable for the Reina nightclub attack in Istanbul that killed 39 people on Jan. 1, 2017, and a terror attack in San Bernardino, California, that left 14 people dead in December 2015. All three lawsuits were dismissed by different federal judges.

The social media giants say the Communications Decency Act of 1996 shields them from liability based on content posted to their platforms by third parties. However, the plaintiffs have developed a novel theory related to each social network’s algorithms that recommend videos to watch and users to follow. The plaintiffs argue those suggestions are content created by the social media companies and therefore they can be held liable for publishing them.

Representing Google and its subsidiary YouTube, attorney Brian Willen said his client remains shielded from liability because the “selection, arrangement and recommendation of content” are traditional publisher functions.

U.S. Circuit Judge Marsha Berzon, a Bill Clinton appointee, disagreed, noting that recommending a book to a reader based on one the reader just purchased could be considered more of a business or promotional function.

“If you go look at an ISIS video, they’re going to tell you here are some other essentially ISIS videos you might be interested in and connect you more to ISIS than you were before and draw you in and have you ultimately become part of this network,” Berzon said.

Willen replied that because the algorithm for making video recommendations is designed in a neutral way, YouTube’s use of the algorithm cannot be viewed as “knowingly providing substantial assistance” to a terrorist group.

Berzon challenged him on that.

“If they knew about it, then why does the neutral or non-neutral matter,” she asked. “If someone had a policy of sending donations to every charity in the world, and one of them was Hamas, they would still have a material support claim.”

Representing Facebook, attorney Kristin Linsley compared the recommended content arrangement to the way The New York Times prioritizes articles differently in its western edition to focus more on West Coast news.

“The arranging of content and recommending of content by arranging it is exactly what publishers have done for eons,” she said.

Berzon pointed out that The New York Times is a content provider, not just a publisher.

“The question is whether by doing what they are doing, they are now a content provider and not just a publisher,” Berzon said.

The plaintiffs also claim that each social network’s revenue-sharing arrangements, which allow users to collect a portion of the ad revenue generated by their content, equate to providing material support to terrorists.

Willen argued that even if the Islamic State earned ad money from a YouTube video, there is no plausible allegation that the money had any direct connection to the Paris attack.

“This is an allegation about a platform-wide revenue sharing program that applies to millions of accounts,” Willen said. “They’re alleging in this one instance there was an ISIS video that got caught up in this program.”

Willen further insisted that the plaintiffs failed to plausibly allege YouTube provided “substantial assistance” to terrorists in the form of ad revenue.

Representing plaintiffs in lawsuits over the Istanbul and San Bernardino attacks, attorney Daniel Weininger argued that the case should be allowed to proceed to discovery so his clients can find out how much ad money went to terrorists.

“That’s a question for discovery,” Weininger said. “It could be hundreds of thousands or perhaps millions of dollars. We don’t know.”

Before ending the hearing, defense attorney Linsley made her final pitch on why the panel should not revive the defeated lawsuits. She cited the lack of a direct connection between the social networks’ services and specific attacks that claimed the lives of plaintiffs’ family members.

“They don’t allege any assistance by the defendants in any actual terrorist attacks,” Linsley said. “Plaintiffs never provide any substantiation to the idea that any of defendants’ services were used in any fashion in connection with these attacks.”

After three hours of debate, the panel took the arguments under submission.

U.S. Circuit Judge Ronald Gould, a Clinton appointee, and Morgan Christen, a Barack Obama appointee, joined Berzon on the panel.
