
Thursday, April 18, 2024
Courthouse News Service

The case that could change the internet

With congressional lawmakers at a standstill over how to rein in Big Tech, the Supreme Court is stepping in to take its own swing at internet regulation.

WASHINGTON (CN) — For years, the proliferation of dangerous content online has stoked alarm from lawmakers on both sides of the political aisle, with little to show for it. Come Tuesday, the Supreme Court takes up the mantle with a case that could reshape online regulation and perhaps alter the internet itself.

In Gonzalez v. Google, the Supreme Court will examine the scope of Section 230 of the Communications Decency Act — a law passed by Congress in 1996 to protect internet providers from liability for what third parties post on their platforms. Instead of looking at the content-moderation standards that have dominated much of the scorn for Section 230, the justices will examine the algorithmic recommendations that internet companies say are essential to organize content for users.

“If the court rules in favor of Gonzalez, I can see the outcome being all of a sudden the social media companies especially, but online content providers [generally] will have a choice to make,” Tom Romanoff, director of the Technology Project at the Bipartisan Policy Center, said in an interview. “Either they moderate everything and take that liability risk on if something slips through or they moderate nothing, in which the internet becomes 4chan pretty quickly.”

In the last 26 years, the internet has changed drastically, creating new problems that have outgrown Congress’ solutions. Algorithms — the complex recommendation systems internet providers use to serve users relevant content — are one such problem. Every internet provider uses its own algorithms, tuned to its own goals. Twitter and Facebook, for example, can connect users to one another based on their followers and likes.

As relevant in this case, YouTube uses algorithms to suggest videos based on what users have already viewed. The platform — which is owned by Google — is accused of allowing the Islamic State group to post videos inciting violence and recruiting supporters, and then of recommending those ISIS videos to other users by way of its algorithm.

After ISIS claimed responsibility for terrorist attacks in Paris, France, that killed 129 people, the family of one victim, 23-year-old U.S. citizen Nohemi Gonzalez, sued Google for aiding and abetting ISIS in violation of the Antiterrorism Act. The Gonzalez family argues YouTube was instrumental in inciting the attack by recommending ISIS videos.

“YouTube selected the users to whom it would recommend ISIS videos based on what YouTube knew about each of the millions of YouTube viewers, targeting users whose characteristics suggested they would be interested in ISIS videos,” Eric Schnapper, an attorney at the University of Washington School of Law representing the family, wrote in their brief. “The selection of the users to whom ISIS videos were recommended was determined by computer algorithms created and implemented by YouTube. Because of those recommendations, users ‘[we]re able to locate other videos and accounts related to ISIS even if they did not know the correct identifier or if the original YouTube account had been replaced.’” 

A federal judge dismissed the case, agreeing with Google that Section 230 foreclosed such claims. The Ninth Circuit affirmed the dismissal and declined to rehear the case en banc. Now the family has turned to the Supreme Court. 

They argue that, while some lower-court precedent supports applying Section 230 to recommendations, the text of the law itself does not set a standard for how it should be applied. According to the three-part test the family proposes, Section 230 protects some but not all recommendations. The test the Gonzalez family favors would narrow the interpretation of Section 230 to exclude cases like their own from liability protection.

“The approach of some lower courts to this and other issues arising under section 230(c)(1) has been shaped by a belief that section 230(c)(1) must be broadly construed,” Schnapper wrote. “But this Court has made clear the interpretation of statutes should not be shaped by judicial efforts to advance unstated policy goals. A presumption in favor of broadly construing section 230(c)(1) would be particularly inappropriate, because section 230(c)(1) when applicable preempts state law, and there is ordinarily a presumption in favor of the narrow construction of such preemptive measures.” 

Google credits recommendation algorithms with making it possible to find “the needles in humanity’s largest haystack.” It says the argument advanced by the Gonzalez family treats YouTube like a publisher of ISIS’ videos.

“The sorting and grouping of videos is quintessential publishing,” Lisa Blatt, an attorney with Williams & Connolly representing Google, wrote in the company’s brief. “Every website that displays third-party content must select and organize that content. If Section 230(c)(1) does not apply to how YouTube organizes third-party videos, petitioners and the government have no coherent theory that would save search recommendations and other basic software tools that organize an otherwise unnavigable flood of websites, videos, comments, messages, product listings, files, and other information.” 

Section 230 has caught more than the attention of lawmakers. Justice Clarence Thomas expressed interest last March in taking up the issue. But even as it became increasingly clear the court would soon weigh in, the choice of this particular case surprised some experts in the field.

“There's been a number of attempts to limit Section 230,” Romanoff said over the phone. “This is the first one that I've seen in the courts where they're going after the algorithmic recommendations. I would have expected to see content moderation Section 230 cases — especially with a court that's leaning on the conservative side — to go after political censorship issues or conservative bias against conservative thoughts.” 

It’s possible one of those content-moderation cases will reach the high court next term. Laws from Texas and Florida targeting social media moderation have prompted suits now before the justices. Last month the justices asked the Biden administration to weigh in on the issue — a signal that they are interested in hearing the appeals.

The justices are likely to run into many of the same problems lawmakers have faced when weighing Section 230 reform. Experts say one of those problems is the mismatch between the pace of technological innovation and the pace of government regulation. The Gonzalez family’s suit originated in 2015; YouTube’s algorithms have changed drastically since then.

“The case at play now is should Google be liable for promoting ISIS recruitment videos,” Romanoff said. “You can go online now and you won't see that, and that's because the company's algorithms have progressed to the point where they’re much more efficient at identifying those issues and much more responsive to taking them down.” 

Lawmakers have struggled to bridge this gap. Bills tend to be written for old technology, and even when they clear the hurdles to become law, implementation takes another year or two. By that time, technology has likely already outgrown the regulations.

“Let's say I get a bill passed in Congress around algorithms; that's going to take at least a year maybe two years to get passed,” Romanoff said. “Technology has already progressed. Companies can call it something else. There's different codes that make it technically not apply to the regulation.” 

Experts say regulating algorithms is made even more difficult because companies are hesitant to share how they work — if they even know themselves. 

“Some of the platforms don't want to share how their algorithms work,” Romanoff said. “They see it as their secret sauce.” 

It’s not clear if or how the justices will overcome these hurdles. Gonzalez will be a test, however, of how far the justices want to go in limiting Section 230. Experts say the court could use the case to signal that Congress should pursue legislation in this area. If lawmakers fail to heed that signal, the justices will have a shot at further limiting Section 230 next term with the Texas and Florida cases.

There is interest among lawmakers in moving on Section 230 legislation, but what could gain majority support remains unclear.

“We're tracking 56 bills over the last two Congresses,” Romanoff said. “Everything from repeal Section 230 to the algorithmic recommendation issues to what to do with content such as 3D guns being printed off on websites. So there's a clear interest there from the legislation to address this. I think that with the court stepping in it's going to circumvent the legislative democratic process.”

Follow @KelseyReichmann
