Cross Section

Big Tech’s Big Shield Goes to the High Court

The Supreme Court is going to get a crack at Section 230 of the Communications Decency Act—and the internet may never be the same.


The Supreme Court said on Monday that it will take up two cases on Section 230, the bedrock federal law that shields internet companies from most content-related lawsuits. While Section 230 has become a hot topic in Congress and the courts in recent years, the justices have not yet ruled on its scope. Their decision could have sweeping ramifications for the nature of the internet itself—and for the companies, large and small, that have relied on the provision to escape legal liability for user content for the last quarter-century.

In Gonzalez v. Google, the court will consider a lawsuit filed by the family of Nohemi Gonzalez, a 23-year-old American college student killed when ISIS-linked militants launched a series of terrorist attacks across Paris in 2015. Gonzalez’s family alleged that YouTube, a Google subsidiary, gave “material assistance” to and “aided and abetted” the group because the website had “knowingly permitted” ISIS and its supporters to host recruitment and radicalization videos. They also claimed YouTube facilitated the spread of that material through its recommendation algorithm.

Accordingly, they brought a civil lawsuit under the federal Anti-Terrorism Act. Google, in a brief for the justices, strongly denied the factual basis for the lawsuit. “[The plaintiffs] do not allege that Google had any role in encouraging or committing the Paris attack,” the company argued. “Nor do [they] allege that any of Ms. Gonzalez’s attackers were recruited via YouTube or used YouTube to plan the Paris attack. The only alleged link between the attackers and YouTube was that one ‘was an active user of social media, including YouTube,’ who once appeared in an ISIS propaganda video.”

Google asked a federal district court judge to dismiss the lawsuit by invoking the company’s liability shield under Section 230. That provision comes from the Communications Decency Act of 1996, one of the earliest federal laws dealing with internet content. It states that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” In practice, this means that internet companies usually cannot be held liable for content that users post on their websites.

Section 230 is often seen as foundational to the modern internet because it allows companies to create social media websites like Facebook or video-hosting platforms like YouTube without facing legal risks in most circumstances for what users publish there. Not everybody who uses these websites posts vacation photos or family recipes, however. Congress has considered multiple bills in recent years that would repeal or limit Section 230 immunity in certain circumstances, usually when violent political extremism or sexual crimes are involved.

There is also a degree of bipartisan support for Section 230 reform. Both President Joe Biden and his predecessor, Donald Trump, have voiced support at various times for repealing the provision, with Biden citing the companies’ alleged failure to prevent the spread of false and extremist content. Trump and other conservatives, on the other hand, have frequently criticized Silicon Valley for purportedly suppressing conservative speech online. Last month, the Supreme Court blocked a Texas law that would have punished social media companies for censoring their users.

Some judges have also wondered whether the courts’ approach to the 1996 provision should be rethought in light of modern developments. (The Supreme Court, for its part, has never ruled on the scope or nature of the law.) In 2020, Justice Clarence Thomas voted with his colleagues to turn away a Section 230-related case but voiced support for revisiting the issue in the future. He wrote that a narrower interpretation of the law, one that hinged on the degree of control that tech companies exert over the content itself, might be more suitable for the social media era.

“Traditionally, laws governing illegal content distinguished between publishers or speakers (like newspapers) and distributors (like newsstands and libraries),” Thomas wrote. “Publishers or speakers were subjected to a higher standard because they exercised editorial control. They could be strictly liable for transmitting illegal content. But distributors were different. They acted as a mere conduit without exercising editorial control, and they often transmitted far more content than they could be expected to review. Distributors were thus liable only when they knew (or constructively knew) that content was illegal.” He suggested that he would apply the same framework to internet companies.

In the lower courts, however, the status quo for Section 230 reigns supreme. A federal district court judge rejected the Gonzalez family’s lawsuit on immunity grounds, citing precedents in the Ninth Circuit Court of Appeals. The Ninth Circuit itself also dismissed the lawsuit, but the three-judge panel that heard the case expressed some doubts about the result. The 2–1 decision in Google’s favor came about because one of the panelists, Judge Marsha Berzon, felt bound by precedent that she nonetheless criticized. The Ninth Circuit later declined to rehear the case before a larger panel.

The precise question for the Supreme Court is whether “recommendation” algorithms—the ones that suggest videos similar to those you’ve already watched on YouTube, for example—fall under the scope of Section 230’s immunity. “The text of section 230 clearly distinguishes between a system that provides to a user information that the user is actually seeking (as does a search engine) and a system utilized by an internet company to direct at a user information (such as a recommendation) that the company wants the user to have,” the Gonzalez family argued, referring to lower court cases that wrestled with Section 230’s scope.

Google sharply disagreed with that analysis when it urged the justices to let the Ninth Circuit’s ruling stand. “YouTube does not produce its own reviews of books or videos or tell users that a given video is ‘terrific,’” the company claimed. “What petitioners challenge is YouTube’s display of content responsive to user inputs—the internet version of a newspaper putting a story of interest to international readers on the cover of the international edition or a book publisher offering three popular mystery novels together as a bundle.” It also argued that the rest of Section 230, when read in context, supported immunity for its approach.

The other case, Twitter v. Taamneh, comes from similar origins and makes similar legal arguments. It was brought by the American family of Nawras Alassaf, a Jordanian national who died in a 2017 shooting at an Istanbul nightclub by a gunman allegedly inspired by ISIS. (Twitter, like Google, disputes the factual ties between its website and the attack.) The case takes a slightly different procedural path, however, since the same Ninth Circuit panel that ruled in Gonzalez let some of the Anti-Terrorism Act claims go forward in Taamneh without considering Section 230 at all. Twitter’s lawyers asked the justices to resolve this case simultaneously with Gonzalez if the court decided to hear it.

Since Section 230 is so foundational to twenty-first-century digital life, even modest changes to it could have far-reaching ramifications. “This Court should not lightly adopt a reading of section 230 that would threaten the basic organizational decisions of the modern internet,” Google argued, noting that websites often rely on algorithmic recommendations to ensure that users are “not overwhelmed with irrelevant or unwanted information.” At the same time, the zeitgeist of holding major Silicon Valley companies accountable for the harms caused by their products—harms that, according to the plaintiffs in these cases, had fatal consequences for their families—may prove more influential among at least some of the justices. Oral arguments have yet to be scheduled, but a decision is likely to come by next June.