MANHATTAN (CN) – The Second Circuit handed Facebook a major win Wednesday, ruling that online publishers cannot be held liable for what users post, even if those users are members of a terrorist group.
Closing the door on a suit brought on behalf of people killed or injured in Hamas terror attacks, the Manhattan-based federal appeals court found that Facebook is entitled to immunity with respect to the algorithm that determines what content appears on users' newsfeeds. Led by the relatives of Taylor Force, a graduate student stabbed to death in Tel Aviv, the plaintiffs accused Facebook of giving Hamas a platform to encourage terrorist attacks in Israel between 2014 and 2016.
Affirming dismissal of the suit, the Second Circuit split 2-1 in finding so-called information matchmaking content-neutral.
Writing on behalf of the majority, U.S. Circuit Judge Christopher Droney said such effects have been an aspect of the internet “since its beginning.”
Droney also pointed to congressional statements extending "immunity to interactive computer services when they arrange and transmit information provided by others."
Chief U.S. Circuit Judge Robert Katzmann argued in a partial dissent, however, that Facebook is far more than a publisher because it uses its algorithms “to create and communicate its own message: that it thinks you, the reader — you, specifically — will like this content.”
Laying out his case, Katzmann posited a hypothetical scenario in which a mutual friend, having read the entire bodies of work of two authors, calls one of them and passes along the other's contact information.
“I think you’d really get along,” the hypothetical friend says, and follows up a few times with the names of other authors she should know.
“Now, you might say your acquaintance fancies himself a matchmaker,” Katzmann wrote.
“But would you say he’s acting as the publisher of the other authors’ work?” the opinion continues (emphasis in original). “Facebook and the majority would have us answer this question ‘yes.’ I, however, cannot do so. For the scenario I have just described is little different from how Facebook’s algorithms allegedly work.”
Arguing that Facebook’s newsfeed suggestions contribute to the formation of “real-world social networks,” Katzmann also quoted news reports that said Facebook’s friend-suggestion feature may have introduced “thousands” of Islamic State sympathizers to each other.
When Congress passed the Communications Decency Act in 1996, Katzmann wrote, it “could not have anticipated the pernicious spread of hate and violence that the rise of social media likely has since fomented.”
Katzmann also called on Congress to take action, recalling how social media helped Russia to meddle in the 2016 U.S. presidential election.
“I do not think we should foreclose the possibility of relief in future cases if victims can plausibly allege that a website knowingly brought terrorists together and that an attack occurred as a direct result of the site’s actions,” Katzmann wrote.
Katzmann concurred in the majority’s decision not to extend jurisdiction for the claims against Facebook under foreign law. U.S. Circuit Judge Richard Sullivan concurred with Droney in full.
At February appellate arguments, Facebook’s general counsel detailed the steps it takes to ban hate speech and remove content that praises terrorist activity.
In addition to matching new posts against an inventory of known terrorist content, Facebook employs thousands of people to wade through and respond to reports of inappropriate content, maintains a 150-person team of counterterrorism specialists, and is experimenting with artificial intelligence.
In determining that Facebook cannot be considered the provider of any Hamas-related content, Droney noted that, according to its terms of service, it does not edit or suggest edits to what its users publish.
Representatives at Facebook did not respond to a request for comment, nor did Robert Tolchin, an attorney for the plaintiffs at Berkman Law.