The Wikimedia Foundation’s Perspective on the DSA and its Global Implications

Published in Wikimedia Policy · December 12, 2023

World map of submarine telecommunications cables as of July 2015. Image by OpenStreetMap contributors and data by Greg Mahlknecht, CC BY-SA 2.0, via Wikimedia Commons.

Written by Jan Gerlach, Director of Public Policy at the Wikimedia Foundation, and Dimitar Dimitrov, Policy Director at Wikimedia Europe

On 25 August 2023, the European Union’s (EU) new Digital Services Act (DSA) — a law that aims to establish common rules governing online content moderation — started to apply to the most heavily used platforms and search engines in the EU, including Wikipedia. The regulation will have a significant impact on the online encyclopedia, the only website designated under the DSA as a very large online platform (VLOP) that is hosted by a not-for-profit organization and is largely community-governed.

A subset of the DSA’s rules will apply to other services from 17 February 2024. Policymakers, regulators, and various other stakeholders around the world are watching how internet platforms respond to the new law. While the DSA will have global implications, we warn legislators against simply copy-pasting it elsewhere without adequate protections for free expression and user rights.

The DSA’s obligations for platforms have the potential not only to improve people’s experience, but also to protect their rights on the internet. However, if governments and regulators aren’t careful about the way in which they implement new requirements, they might harm smaller community-governed platforms. We call on governments to promote and protect communities’ ability to own and govern online spaces together.

Our engagement in the DSA legislative process

The Wikimedia Foundation (the global nonprofit organization that hosts Wikipedia and other Wikimedia free knowledge projects) and Wikimedia Europe (the association of European Wikimedia affiliates) have been engaging with policymakers on the DSA since before the European Commission (now the DSA’s key regulator for VLOPs) introduced the draft proposal. We have supported the DSA’s aim to make content moderation more accountable and transparent.

However, we have also cautioned that modeling rules around the business practices of big, for-profit websites could lead to the opposite of what EU authorities intended to achieve: it could lock in specific models instead of diversifying the online platform ecosystem. In addition, if not carefully worded, such rules could pose a threat to the functioning of Wikipedia. For these reasons, we asked policymakers to recognize and protect community-governed public interest projects.

Obligations that will apply to Wikimedia projects

The DSA establishes various obligations, systems, and measures with which the Wikimedia Foundation and its projects will have to comply. We discuss some of them below.

1. New “notice and action” framework

The DSA harmonizes the “notice-and-action” system for intermediary liability protections across the EU (i.e., the rules under which internet platforms can allow users to upload content without having to evaluate the content’s legality first). Platforms aren’t liable for hosting illegal content as long as they follow a defined process, under which they review and potentially remove allegedly illegal content once notified. This protects users’ ability to make decisions about content together. Community-driven online projects like Wikipedia can only function when the law refrains from pushing all responsibility for content onto platforms, and instead allows for decentralized content moderation by users.

With the DSA, the EU now has a harmonized set of rules around that “notice-and-action” system: it applies across all member states and to any company with a substantial number of users in those states. It also gives both users and online platforms a specific path with clear rules to follow when a platform’s content moderation decisions are contested. Importantly, EU legislators clearly recognized that these rules and procedures should apply to actions taken by the platform — but not to actions by communities of users, who also moderate content and ban users, among other things. On Wikipedia, this happens hundreds of times a minute and is vital for its functioning and the quality of information it offers: holding the platform operator liable for those user actions would contradict the nature of Wikipedia to such an extent that its very existence would be at stake. Thankfully, the DSA’s drafting carefully avoids that problem.

2. Transparency

The DSA strives to improve transparency about content moderation, user numbers, and business models across different platforms. We endorse this effort, which will benefit the entire information ecosystem. However, we caution that differences between platforms and moderation models may produce data that is not always precisely comparable across platforms. We encourage regulators to allow for flexibility in how this data is presented, so that platforms can describe moderation in ways that make sense in their own context, rather than being forced into a single way of thinking about transparency that flattens every approach.

Wikimedia projects have long embraced openness by default. This extends to the Foundation, which publishes twice-yearly transparency reports that include information about requests to alter or remove content from the projects, or for information about Wikimedia project users. We have already released our first DSA Transparency Report, and will fold this information into our regular reports in the future.

3. Systemic risk and mitigation obligations

Wikipedia’s VLOP status brings mandatory “systemic risk and mitigation” obligations to that Wikimedia project specifically.

This means that the Foundation will need to report on systemic risks to which Wikipedia may be linked in the EU. The Foundation is looking very carefully at risks such as electoral disinformation, harassment of users, and risks to child safety. Where appropriate, the report must also include mitigation measures, that is, actions to address such risks. This report will be updated at least once a year, but we expect it to be more of a living document, rooted in our broader (i.e., global) work around the Foundation’s Human Rights Policy.

We see value in having major online platforms assess their role in society and what they can and should do to mitigate systemic risks. However, we caution that if every jurisdiction mandates separate risk assessments, the administrative burden of such exercises will quickly become overwhelming. The Foundation would have to spend a substantial amount of time and budget preparing them — rather than working to mitigate the systemic risks themselves. The DSA’s risk assessment and mitigation provisions are not specifically worded legal obligations, but regulators may use them over the next several years to bring legal actions against VLOPs if they believe there is a problem. It will be very important that regulators stay away from loosely framed demands and refrain from dictating politicized measures that would be prone to abuse.

4. Crisis response mechanism

The DSA includes an additional obligation for VLOPs: assessing whether their service significantly contributes to a given crisis, and taking measures to prevent, eliminate, or limit that contribution. This obligation can be triggered by a decision of the European Commission, which in turn would need to act upon a recommendation from the board of national regulatory authorities, officially called the European Board for Digital Services.

Political pressures around the COVID-19 pandemic and the Russian invasion of Ukraine led the EU to add this provision to the DSA in the final stages of the legislative process. However, many digital rights groups and Wikimedia communities are concerned that authorities will be tempted to use this new mechanism to dictate how projects like Wikipedia handle content. This could force the Foundation to interfere with the user community’s decentralized content moderation. As a result, the governance model of Wikipedia and other projects would be severely disrupted.

The Foundation will be able to decide on the substance of the measures, but the obligation to take these measures will be binding. It is hard to foresee how this will play out in practice once an actual crisis occurs. This is certainly a provision that raises worrisome questions and creates uncertainties as to how far the power of the regulators goes.

The DSA will have global implications, but it would not function everywhere

Policymakers in Brussels have made no secret of their ambition for the DSA to become a global benchmark for online content moderation regulation. Zooming out beyond the EU, we can indeed observe that the DSA is significantly influencing a wave of laws rolling out worldwide that seek to make platforms more accountable.

The DSA gets many things and principles right. Its transparency, accountability, and user rights provisions are welcome. Given our early conversations with the European Commission and the way the provisions were drafted, we are hopeful that they will prove to be well-balanced and will continue to improve as they are put into practice.

However, the DSA is only fully coming into force now, and many new norms, including new powers for regulators, have not been tested in practice yet. We frankly do not know how the crisis response mechanism, for instance, will work.

In the EU, two factors ease our concerns about limitations to freedom of expression and community-led decision-making about online content.

  • Most EU member states have a solid democratic tradition, an effective legal system and rule of law, and are supported by EU institutions (including the Court of Justice of the European Union [CJEU]) that safeguard fundamental rights.
  • The DSA was, in many ways, an evolution of an existing, time-tested system: it builds on what EU member states have already done nationally under the 2000 E-Commerce Directive, which has always worked reasonably well for Wikipedia.

In contrast, in regions of the world where there aren’t strong protections for democratic and fundamental rights, and where the legal system is more prone to subversion, some aspects of the DSA might be outright dangerous.

Many principles and balances that exist within the DSA — including explicit assurances against obligations to actively monitor content and a so-called “Good Samaritan” clause — could (and perhaps even should) be adopted globally. However, the concrete provisions and tools that the DSA uses cannot and should not be automatically copy-pasted into other jurisdictions that lack the legal frameworks and institutions to support them.

Respecting both human rights and user rights

The DSA is both a challenge and an opportunity. While compliance will require considerable resources and a constant effort by the Foundation and Wikimedia communities to adapt to its requirements and obligations, we also think it can help content moderation online as a whole become more transparent, more predictable, and fairer to users. Similarly, it should help the online ecosystem become more respectful of both human rights and user rights.

Any improvements, though, depend not only on the DSA’s legal provisions, but also on the actions that platforms and regulators take in the first years after those provisions come into force. As the only not-for-profit organization hosting a VLOP, we want to demonstrate that the DSA can be a valuable instrument to strengthen and support people and their fundamental and user rights online.

Achieving that across the internet, however, will require the support of regulators and governments, in the EU and around the world, to promote access to knowledge and to protect communities’ ability to own and operate online spaces and make decisions about content together.
