WASHINGTON (CN) — For years, lawmakers on both sides of the political aisle have raised alarm about the proliferation of dangerous content online, with little to show for it. Come Tuesday, the Supreme Court takes up the mantle with a case that could reshape online regulation and perhaps alter the internet itself.
In Gonzalez v. Google, the Supreme Court will examine the scope of Section 230 of the Communications Decency Act — a law passed by Congress in 1996 to protect internet providers from liability for what third parties post on their platforms. Instead of looking at the content-moderation standards that have dominated much of the scorn for Section 230, the justices will examine the algorithmic recommendations that internet companies say are essential to organizing content for users.
“If the court rules in favor of Gonzalez, I can see the outcome being all of a sudden the social media companies especially, but online content providers [generally] will have a choice to make,” Tom Romanoff, director of the Technology Project at the Bipartisan Policy Center, said in an interview. “Either they moderate everything and take that liability risk on if something slips through or they moderate nothing, in which the internet becomes 4chan pretty quickly.”
In the 26 years since, the internet has changed drastically, creating new problems that have outgrown Congress' solutions. Algorithms — the complex recommendation systems internet providers use to serve users relevant content — are one such problem. Every internet provider uses its own algorithms, tuned to prioritize its own goals. For example, Twitter and Facebook can connect different users to each other based on their followers and likes.
As relevant in this case, YouTube uses algorithms to suggest videos users should watch based on what they have already viewed. The platform — which is owned by Google — is accused of allowing the Islamic State group to post videos inciting violence and recruiting supporters, and of then recommending those ISIS videos to other users by way of its algorithm.
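The recommendation logic at issue can be illustrated with a toy sketch. This is not YouTube's actual system — its algorithms are proprietary and far more complex — but a minimal, hypothetical example of the general technique the briefs describe: scoring unseen videos by how closely their topics overlap with a user's watch history. All names and the tag scheme here are illustrative assumptions.

```python
from collections import Counter

def recommend(watch_history, catalog, top_n=3):
    """Hypothetical content-based recommender: rank each unseen video by
    how many of its topic tags overlap with the user's watch history."""
    # Tally topic tags across everything the user has already watched.
    seen_tags = Counter(tag for video in watch_history for tag in catalog[video])
    scores = {}
    for video, tags in catalog.items():
        if video in watch_history:
            continue  # skip videos the user has already seen
        # A video scores higher the more its tags match the user's history.
        scores[video] = sum(seen_tags[tag] for tag in tags)
    # Return the highest-overlap videos first.
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

# Illustrative catalog: video IDs mapped to topic tags.
catalog = {
    "v1": {"cooking", "baking"},
    "v2": {"cooking", "travel"},
    "v3": {"travel", "vlog"},
    "v4": {"gaming"},
}
print(recommend({"v1"}, catalog))  # a user who watched "v1" is steered toward "v2"
```

The legal question in the case is, roughly, whether this kind of automated selection — choosing which users see which third-party content — is itself protected "publishing" under Section 230 or conduct the platform can be held liable for.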
After ISIS claimed responsibility for terrorist attacks in Paris, France, that killed 129 people, the family of one victim, 23-year-old U.S. citizen Nohemi Gonzalez, sued Google for aiding and abetting ISIS in violation of the Antiterrorism Act. The Gonzalez family argues YouTube was instrumental in inciting the attack by recommending ISIS videos.
“YouTube selected the users to whom it would recommend ISIS videos based on what YouTube knew about each of the millions of YouTube viewers, targeting users whose characteristics suggested they would be interested in ISIS videos,” Eric Schnapper, an attorney at the University of Washington School of Law representing the family, wrote in their brief. “The selection of the users to whom ISIS videos were recommended was determined by computer algorithms created and implemented by YouTube. Because of those recommendations, users ‘[we]re able to locate other videos and accounts related to ISIS even if they did not know the correct identifier or if the original YouTube account had been replaced.’”
A federal judge dismissed the case, agreeing with Google that Section 230 foreclosed such claims. The Ninth Circuit affirmed the dismissal and declined to rehear the case en banc. Now the family has turned to the Supreme Court.
They argue that, while lower-court precedent has applied Section 230 to recommendations, the text of the law itself never created a standard for how it should be applied. Under the three-part test the family proposes, Section 230 would protect some but not all recommendations. The test the Gonzalez family favors would narrow the interpretation of Section 230 so that cases like their own fall outside its liability protection.
“The approach of some lower courts to this and other issues arising under section 230(c)(1) has been shaped by a belief that section 230(c)(1) must be broadly construed,” Schnapper wrote. “But this Court has made clear the interpretation of statutes should not be shaped by judicial efforts to advance unstated policy goals. A presumption in favor of broadly construing section 230(c)(1) would be particularly inappropriate, because section 230(c)(1) when applicable preempts state law, and there is ordinarily a presumption in favor of the narrow construction of such preemptive measures.”