U.S. Supreme Court Rulings Spared the Wikimedia Model, for Now

Published in Wikimedia Policy · 5 min read · May 22, 2023

Three takeaways for Wikipedia and free knowledge advocates from the Gonzalez v. Google and Twitter, Inc. v. Taamneh et al. rulings.

Panorama of the west facade of the Supreme Court of the United States building in Washington, D.C., at dusk. Image by Joe Ravi, CC-BY-SA 3.0, via Wikimedia Commons.

Written by the Wikimedia Foundation’s Stan Adams, Lead Public Policy Specialist for North America; Allison Davenport, Public Policy Research and Analysis Manager; and Jacob Rogers, Associate General Counsel.
(With special thanks to Leighanna Mixter, Senior Legal Manager, for ongoing work on this case.)

On 18 May 2023, the Supreme Court of the United States released its opinions in a pair of related cases with important implications for Wikipedia and other volunteer-run Wikimedia projects. Both cases bear on Section 230 of the Communications Decency Act of 1996 (CDA) and the future of online platforms that enable people to share content on the internet.

In both cases, Gonzalez v. Google, LLC and Twitter, Inc. v. Taamneh, et al., families of victims of terrorism sued the online platforms (YouTube and Twitter, respectively) for damages, arguing that the platforms’ practices for sorting and recommending content led to the promotion of content from the terrorist organization ISIS. They claimed that this violated the Justice Against Sponsors of Terrorism Act of 2016 (JASTA) and that the platforms had “aided and abetted” acts of terrorism. Gonzalez also asked whether Section 230 would protect such platforms from the claims brought under JASTA.

The Court ruled in favor of the platforms on the first issue, finding that the plaintiffs’ allegations did not support a claim for liability under JASTA, in large part because the platforms’ algorithms treated this content exactly as they treated any other content uploaded by their billions of users. Because the plaintiffs’ claims were not viable, the Court did not need to answer the second question, and it declined to interpret Section 230’s applicability in this context, leaving the law’s liability protections intact for now.

There are three takeaways from the ruling for Wikipedia, other Wikimedia projects, and free knowledge advocates worldwide.

1. The Court’s ruling means Wikimedia volunteers can continue to share free knowledge globally

The liability protections in Section 230 are what enable the nonprofit Wikimedia Foundation to host and support free knowledge projects like Wikipedia. Section 230 protects the Foundation from civil liability for content uploaded by contributors to the Wikimedia projects. It protects volunteer editors from liability when they collaborate with others to create and edit content they did not write themselves. Importantly, Section 230 also protects the Foundation and volunteer editors from being sued when content is changed or removed in line with the rules that volunteer communities have developed for the projects. Without Section 230’s protections, the Foundation could face an unmanageable stream of lawsuits seeking damages for the actions of users, such as posting defamatory content to the website. Section 230 is thus essential to maintaining the Wikimedia volunteer-led model of content self-governance: many of the project policies are created and enforced directly by our volunteer communities.

The Foundation and many others were concerned that a ruling by the Court in favor of the Gonzalez family in Gonzalez v. Google might undermine, narrow, or weaken internet platforms’ Section 230 protections, jeopardizing free knowledge projects like Wikipedia and chilling free expression across the internet. By declining to rule on the Section 230 question, the Court preserved decades of legal precedent and the ability of volunteer-run projects like Wikipedia to allow users to contribute to, edit, and curate the content on the website.

2. Threats remain to Section 230 protections for Wikipedia and other Wikimedia projects

As the Foundation explained in our friend-of-the-court brief, limiting internet platforms’ protections under Section 230 based on the content-sorting and recommendation practices identified by the plaintiffs would have far-reaching consequences, potentially impacting other helpful features such as anti-spam filters or the “relevant articles” links that appear on Wikipedia pages. Because of these potential implications, the Gonzalez v. Google decision is not just a win for the social media websites in question; it is also a victory for thousands of other websites that allow users to upload content, including the Wikimedia projects.

Despite this positive result, legal and legislative threats to Section 230 remain on the horizon. Elements of the Court’s decision in Twitter v. Taamneh provide insight into how the Justices may approach cases with different facts in the future. The judicial record in Gonzalez and Taamneh contained only limited facts, which did not give the Court a complete picture of how different kinds of projects in the internet ecosystem function.

For example, the Court refers to how some platforms monetize user engagement through the sale of targeted advertising. For the most part, its opinion accurately describes the basic relationships among users’ interactions with online content, the algorithms that typically generate recommendations on large platforms, and advertising on some platforms. However, the Court did not acknowledge the wide variety of other platform structures that could have been affected by a reinterpretation of Section 230, including the Wikimedia projects. The Court’s narrow focus on the practices of for-profit social media platforms that depend on targeted advertising revenue was necessary in these two cases: both were appealed very early in the litigation process, before any discovery or factual development. This means that the Court’s consideration was limited to the information contained in the plaintiffs’ complaints and the defendants’ responses, which it had to accept at face value.

However, in a future case where the Court does decide to address questions related to speech and content moderation online, the Foundation hopes that it will consider the potential impacts of any ruling on a wider array of platform models, including Wikimedia’s nonprofit, public interest model and its volunteer-led approach to content governance. For example, a question presented in another case that may come before the Court, NetChoice, LLC v. Paxton, relates to the First Amendment rights that Section 230 helps protect, such as a platform’s ability to set rules for where, when, and how users can contribute content to the website.

3. When considering Section 230 changes, do not forget Wikipedia

Section 230 provides the foundational liability protections that helped build today’s internet. When considering any changes to the law, courts, lawmakers, and regulators should think beyond the largest commercial social media platforms and take into account the impact on the wide range of websites that allow users to contribute content and govern their own communities. The law should protect the ability of platforms to give people the power to create and share knowledge on the internet. Volunteer-run free knowledge projects like Wikipedia, and the public interest, depend on it.


Wikimedia Policy: Stories by the Wikimedia Foundation's Global Advocacy team.