Supreme Court report
Supreme Court’s lack of internet expertise will be tested in social media content litigation
During oral arguments at the U.S. Supreme Court last year, in one of two thorny cases over whether social media companies could be held liable for aiding and abetting terrorist groups, Justice Elena Kagan said: “We’re a court. We really don’t know about these things.”
“You know, these are not, like, the nine greatest experts on the internet,” she continued, gesturing toward her colleagues on the bench and drawing laughter in the courtroom.
But this term, the justices will receive a crash course in some of the most salient regulatory and constitutional issues surrounding social media.
In October, the court heard arguments in two cases that will help determine whether public officials can block critics from their personal social media accounts. (Decisions in those cases are pending.) And in March, the court will take up a lawsuit alleging that the federal government coerced social media sites into suppressing some users’ posts containing misinformation about the coronavirus and the 2020 presidential election.
Next week, the court will consider what may be the highest-stakes battle for websites and social media companies, at least this term. A pair of cases, NetChoice v. Paxton and Moody v. NetChoice, asks whether states can dictate content moderation standards and require individualized explanations when social media platforms remove or alter users’ posts.
Some legal experts are a little worried about how well nine non-experts on the internet will handle the latest social media battle.
“I don’t think there is any way for them to ‘understand’ the set of issues raised in these cases, because the issues are so novel that even the most expert experts are still evolving their thinking,” says Daphne Keller, a lecturer at Stanford Law School and director of the Program on Platform Regulation at the Stanford Cyber Policy Center.
Motivated by perceived hostility toward conservative views
The cases stem from laws passed in 2021 by the Florida and Texas legislatures aimed at regulating major social media platforms such as Facebook, YouTube and X (formerly Twitter). Although the laws differ in their details, both contain two key provisions. One concerns content moderation, limiting what the companies can do to curate the user-generated content displayed on their platforms. The other requires individualized explanations for certain content moderation decisions.
Opponents of the laws say state lawmakers made no secret that the measures were aimed at promoting conservative speech and combating perceived censorship by tech companies.
“It is now law that conservative viewpoints in Texas cannot be banned on social media,” Texas Gov. Greg Abbott, a Republican, said in signing House Bill 20.
The Texas measure applies only to platforms with more than 50 million monthly active users in the United States. The key provision at issue prohibits platforms from censoring users based on their viewpoints, although platforms may still ban content in categories such as violence and pornography. The explanation provision requires platforms to notify users whose content has been removed and explain why.
In Florida, Republican Gov. Ron DeSantis signed Senate Bill 7072, saying the measure would protect the state’s residents from “Silicon Valley elites” and “big tech oligarchy.”
The Sunshine State’s law applies to internet platforms with more than $100 million in annual gross revenue or 100 million or more monthly users. It prohibits censorship (such as limiting or editing a user’s posts), shadow banning (limiting the reach of a user’s content) and deplatforming (banning a user for more than 14 days). The law also includes language specifically barring the deplatforming of candidates for public office.
Both laws were challenged by two internet industry groups, both based in Washington, D.C.: the 52-year-old Computer & Communications Industry Association and the 23-year-old NetChoice, whose members include Meta (the parent company of Facebook and Instagram), Google (the owner of YouTube) and X.
“Our case is about whether the government can control or dictate the rules of the road for online speech,” said Chris Marchese, director of the NetChoice Litigation Center. “The government cannot force private entities to publish speech they don’t want published. If the Supreme Court rules against us, every government official in this country will try to control the internet. It would also jeopardize everyone’s First Amendment rights.”
In a pre-enforcement challenge to the Florida law, the Atlanta-based U.S. Court of Appeals for the Eleventh Circuit held that content moderation is speech protected by the First Amendment and determined that the state’s restrictions were unlikely to survive even intermediate scrutiny, a standard of review less rigorous than strict scrutiny. The court also held that the individualized-explanation requirement would chill social media platforms’ exercise of editorial discretion.
In the Texas case, the New Orleans-based 5th Circuit Court of Appeals held that content moderation is not speech but “censorship” that the state may regulate. One judge on the three-judge panel, writing only for himself, said content moderation regulations are similar to “common carrier” rules imposed on railroads, telephone companies and, more recently, internet service providers.
The losing parties in both cases appealed to the Supreme Court, which agreed to hear the challenges to both the Florida and Texas laws.
Florida Attorney General Ashley Moody argued in a brief that social media platforms “articulate no message in the vast and disparate content provided by their users.” Unlike newspapers and bookstores, she wrote, “platforms are not, to say the least, selective about the people and content they allow on their sites. Virtually anyone can sign up and post almost any content.”
The Florida law, Moody wrote, simply requires the platforms to adhere to the common carrier business practice of remaining open to all comers and their content, which “is the way it has worked for centuries.”
Texas Attorney General Ken Paxton said in a brief that the state’s content moderation provisions “enable voluntary communication between willing speakers and willing listeners on the world’s largest telecommunications platforms” and treat the platforms like telegraph or telephone companies.
Adam Candeub, a law professor and director of the Intellectual Property, Information and Communications Law Program at Michigan State University, notes that the old Ma Bell telephone monopoly (and its modern successors) could not refuse to connect calls based on their subject matter or on disfavored sources, just as Western Union could not pick and choose among the telegrams sent between its offices.
“The question is whether social media platforms are similar to telephone companies, telegraph companies, the post office and even cable television systems, which have to convey messages they don’t like, or whether they are creative works,” says Candeub, who co-wrote an amicus brief supporting the states. “If it is the former, they can be regulated in the same way as telecommunications carriers.”
Fitting 21st-century technology onto old wires and tracks
The internet industry groups reject the claim that social media platforms are common carriers.
“Websites like Facebook and YouTube are not common carriers, and the government cannot force private entities to become carriers of third-party speech because they exercise editorial discretion in ways the government dislikes,” they wrote in their brief in the Texas case.
According to NetChoice’s Marchese, the common law has historically imposed a duty on common carriers, innkeepers and traditional places of public accommodation, such as ferries, stagecoaches and railroads, to serve the public without discrimination. But there is no comparable common law tradition of imposing carrier-like regulations on private entities that distribute curated collections of speech, he says.
To be sure, there are other issues in these cases, and dozens of amicus briefs present a variety of opinions from all sides. Reddit, a platform that hosts discussions on hundreds of thousands of subjects, filed a brief in support of the industry groups, expressing concern about how the state laws would affect its volunteer moderators.
In fact, before the Texas law was enjoined, Reddit was sued under it by a user who had been banned from a Star Trek subreddit for violating the forum’s simple rule to “be kind.”
“The state laws would repeatedly force Reddit into court to defend the idiosyncratic, subreddit-level rules that individual users create to organize Reddit’s many communities,” the company said in a prepared statement.
Meanwhile, Florida and Texas have received support from multiple sources, including former President Donald J. Trump, who sued Twitter, Meta and YouTube over takedowns and restrictions imposed on him.
Keller, the Stanford law lecturer, helped prepare an amicus brief in support of the industry groups. Along with her concerns about the justices’ understanding of the world of social media, she worries about the “sloppy legislative process” behind the Florida and Texas laws, which she says lacked “a real deliberative process.”
“And there was nothing like the kind of judicial tire-kicking in the lower courts that new issues usually get before Supreme Court review,” she says. “So this is a particularly dangerous area for the court…to issue a decision that states could use as a blueprint for their next bills.”
But, she adds, “it seems likely that there will be a consequential ruling either way.”