The European Parliament has adopted the Digital Services Act (DSA), a landmark regulation that sets rules for internet platforms across the European Union.
The DSA is a promising step forward in respecting rights online. It introduces important measures to increase transparency by requiring platforms to explain to users how they moderate content, how automated tools are used, and how many content moderators they use for each official EU language. The act also aims to subject large platforms to external scrutiny by providing access to data for researchers, including from nongovernmental organizations, and requiring annual third-party audits to assess their compliance with the regulation.
The DSA should have significant consequences for human rights online and offline in the EU, and potentially beyond, as it could inspire legislation in other regions and set standards that companies may apply globally.
The new rules begin to address some of the more systemic harms posed by dominant platforms. They require platforms to assess and mitigate systemic risks, both actual and foreseeable, of some human rights harms stemming from the design, algorithms, functioning, and use of their services in the EU. The DSA also takes steps to rein in some of the most invasive forms of surveillance-based advertising, which is at the core of dominant platforms’ business model.
While the DSA avoids some typical regulatory pitfalls, such as unrealistically short timeframes for platforms to remove potentially illegal content, it also falls short in important ways.
As Human Rights Watch outlined, the DSA should have been more ambitious and introduced stronger safeguards to protect people’s rights by closing loopholes that could expand government censorship, more directly taking on the surveillance-based business model of dominant platforms, and taking a more comprehensive approach to due diligence. The final text also contains problematic carve-outs, such as allowing “trade secrets” to be invoked to justify denying researchers access to data.
Despite these shortcomings, the DSA has the potential to better protect people’s rights, but only if enforcement is robust. In particular, the European Commission, which will be responsible for supervising, investigating, and monitoring large online platforms and for enforcement, including imposing penalties, will need significant resources and expertise.
Finally, since the DSA will require very large companies like Meta and Google to be more transparent and provide remedies to EU users, the measures put in place to comply with the new rules should be extended to people who rely on their platforms and services around the world.