
Thursday, April 18, 2024
Courthouse News Service

Supreme Court review of red state social media laws carries big changes for online platforms

The Supreme Court will decide if conservative states are violating the First Amendment by regulating content on social media platforms.

WASHINGTON (CN) — Social media companies have flocked to the Supreme Court ahead of next week’s First Amendment case, warning the justices that upholding content moderation laws from Texas and Florida would lead to an existential crisis for how people connect online. 

Reddit said the laws would strip the company and its users of the tools used to define their communities. Bluesky described the laws as fracturing the internet, giving companies the impossible task of pleasing “multiple masters” with conflicting views. And Discord forewarned of abusive content running wild with no ability to police harmful posts. 

Yelp gave the justices an example connected to their 2018 ruling in Masterpiece Cakeshop v. Colorado Civil Rights Commission. After the Supreme Court said a Colorado bakery did not have to make a wedding cake for a gay couple, the bakery was bombarded with negative reviews online. 

Yelp said the attacks ranged from creative uses of culinary language to criticize the bakery to explicit political attacks. The website used its content-moderation policies to limit reviews to users who had actually visited the bakery, versus those who were angry about the political views of the company. 

That is precisely the kind of action that would be banned under the laws passed by Texas and Florida. 

Following Donald Trump’s ban from Twitter (now X) after the Jan. 6, 2021, insurrection, Texas and Florida accused social media platforms of systematically suppressing conservative speech online, and implemented new regulations limiting how platforms can moderate content. 

Texas’ version of the law prohibits platforms from censoring content based on a user’s viewpoint, limiting social media companies’ decisions about how to display user content. The law also classifies social media companies as common carriers, requiring them to disclose their moderation standards. In the few circumstances where social media companies would still be allowed to remove users’ content, the platforms would have to provide users with an explanation for their decision. 

Texas hasn’t been able to enforce its law, which U.S. District Judge Robert Pitman paused. The Lone Star State later received a favorable ruling from the Fifth Circuit, but the Supreme Court kept the law on ice while litigation continued. 

Florida’s law also targets content moderation tools like censoring or de-platforming users. The law restricts how platforms may use these tools against any user and bars their use entirely against certain accounts, such as those of journalists or political candidates. 

Social media companies would not only be responsible for providing explanations to users for their editorial decisions, but platforms would also be forced to disclose their content moderation standards. 

A judge also blocked Florida’s law from enforcement. The 11th Circuit then intervened, issuing a split ruling that allowed the state to implement the law’s disclosure requirements. 

Both Florida’s and Texas’ laws can be enforced by the state or by private individuals. In Florida, companies face fines of $250,000 per day for removing political candidates’ speech. 

The laws were challenged by a trade group representing Facebook, YouTube, Etsy, Instagram, Pinterest and X, formerly known as Twitter. Social media companies claim the laws violate the First Amendment by compelling websites to host certain types of speech. The group described these requirements as substituting the states’ judgment for that of private companies. 

“Just as Florida may not tell the New York Times what opinion pieces to publish or Fox News what interviews to air, it may not tell Facebook and YouTube what content to disseminate,” Paul Clement, an attorney with Clement & Murphy, wrote in the group’s brief. “When it comes to disseminating speech, decisions about what messages to include and exclude are for private parties — not the government — to make.” 

The trade group said it would be impossible for social media platforms to provide individual explanations for the billions of times each year they decide to remove users’ posts. According to the group, Facebook, Google and Twitter took action on 5 billion posts during one six-month period in 2018. Some of these cases related to spam, while others dealt with pornography, child safety concerns and hate speech. 

Florida claims its law is a first-in-the-nation approach to reining in the enormous power of internet platforms. The Sunshine State compares the websites to telephone companies, where regulations have been implemented to prevent discrimination. Florida argues its law regulates conduct, not expression.

“In hosting billions of speakers and petabytes of content, the platforms are engaged in business activity — conduct — that may be regulated in the public interest,” Henry Whitaker, Florida’s solicitor general, wrote in the state’s brief. “The First Amendment does not afford those who host third-party speech a right to silence the hosted speakers or to treat them arbitrarily.”

Texas reached further into history to defend its law, comparing platforms to “the telegraph companies of yore.” The Lone Star state thinks of social media as the modern public square, requiring government intervention to maintain freedom of speech. 

The court will hear the cases on Feb. 26. 

Follow @KelseyReichmann
Categories / Appeals, First Amendment, Media, Politics, Technology
