Early Impressions: How Europe’s Proposed Digital Services Act Can Preserve Wikimedia, or Let It Get Constantly Trolled
The Wikimedia Foundation’s take on the DSA: Cautious optimism, but also concerns about empowering bad-faith actors
The European Commission recently released its proposal for the Digital Services Act (DSA), a law that will change the legal underpinnings of online life in Europe, and, by extension, the world. One of the main components of the proposal creates a framework of obligations for online hosts — a group which includes the Wikimedia Foundation in its role as the host of Wikipedia.
The current law on the liability of hosts, governed by Article 14 of the e-Commerce Directive, says that online hosts aren’t liable for what their users post if they don’t know about any illegal activity, and if they act upon illegal activity once they know about it. Article 15, meanwhile, says that a host can’t be legally required to monitor its services, hunting for any potentially illegal activity by its users.
There’s a lot to analyze and consider in the DSA proposal, but we wanted to share a few early impressions. First, we’re glad to see that the DSA preserves these provisions of the e-Commerce Directive, which ensure that the Foundation can continue hosting the knowledge of countless editors and contributors. Unique, user-driven platforms like Wikipedia thrive under strong intermediary liability protections, and we are happy to see this recognition from the Commission. The DSA also introduces many new provisions intended to encourage more effective and responsive content moderation. Some of these improve transparency and promote fundamental rights, such as making it easier for people to understand why they see a certain piece of information. Others, however, could actually make some hostile aspects of the internet worse if they are applied poorly.
Another concern arises from Article 14 of the DSA, which says that an online provider will be presumed to know about illegal content, and thus be liable for it, once anyone sends a notice alleging that such content exists. There are a number of ways that ambiguities in this section can create problems, including potentially contradicting the prohibition on general monitoring obligations. For example, if the Foundation received a notice from someone alleging they had been defamed in one article, what would the Foundation be responsible for if the alleged defamation was referenced in or spread across multiple articles, or talk pages, that the notifier had not specified? This provision needs significantly more clarity if it is to operate as intended without posing an undue burden on platforms.
Finally, we want to make sure that the particular structure, mission, operation, and community self-governance of Wikimedia projects and other participatory platforms are accounted for in this piece of legislation, which was likely designed with very different kinds of platforms in mind. We still see some gaps and omissions in the Commission’s proposal. We look forward to working with lawmakers, alongside colleagues and members of the Wikimedia movement in Europe (with a particular shout-out to the tireless work of the Free Knowledge Advocacy Group EU), to ensure that the law can support and foster the kind of free, open, collaborative, and collegial space that is the best of the Wikimedia movement.
Sherwin Siy, Lead Public Policy Manager, Wikimedia Foundation
Allison Davenport, Senior Public Policy Counsel, Wikimedia Foundation
Jan Gerlach, Lead Public Policy Manager, Wikimedia Foundation