Digital Services Act


The Digital Services Act is an EU regulation that entered into force in 2022, establishing a comprehensive legal framework for digital services accountability, content moderation, and platform transparency across the European Union. It significantly updates the Electronic Commerce Directive 2000 in EU law by introducing graduated obligations based on service size and risk levels, and was proposed alongside the Digital Markets Act.
The DSA applies to all digital intermediary services, including hosting services, online platforms, and search engines. It establishes a tiered regulatory approach: basic obligations for all services, enhanced duties for online platforms, and the most stringent requirements for Very Large Online Platforms and Very Large Online Search Engines with over 45 million monthly active users in the EU.
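The size-based tiering can be illustrated with a short sketch. The 45 million monthly-active-user threshold is the figure stated above; the classifier below is purely illustrative and is not an official tool, and in practice VLOP/VLOSE status is conferred by a formal designation decision of the European Commission rather than automatically.

```python
# Illustrative sketch of the DSA's size-based tiering (not legal advice).
# The 45 million EU monthly-active-user threshold comes from the Act; the
# helper below is a hypothetical classifier, and actual VLOP/VLOSE status
# follows from a designation decision by the European Commission.

VLOP_THRESHOLD_EU_MAU = 45_000_000  # roughly 10% of the EU population

def dsa_tier(is_online_platform: bool, is_search_engine: bool,
             eu_monthly_active_users: int) -> str:
    """Return a simplified DSA obligation tier for a service."""
    if eu_monthly_active_users >= VLOP_THRESHOLD_EU_MAU:
        if is_search_engine:
            return "VLOSE: most stringent obligations (subject to designation)"
        if is_online_platform:
            return "VLOP: most stringent obligations (subject to designation)"
    if is_online_platform or is_search_engine:
        return "Online platform / search engine: enhanced duties"
    return "Intermediary service: basic obligations"

# Example: a social network reporting 50 million EU users would be eligible
# for designation as a Very Large Online Platform.
print(dsa_tier(is_online_platform=True, is_search_engine=False,
               eu_monthly_active_users=50_000_000))
```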

Objectives

proposed a "new Digital Services Act" in her 2019 bid for the European Commission's presidency.
The expressed purpose of the DSA was to update the European Union's legal framework for illegal content on intermediaries, in particular by modernising the e-Commerce Directive that had been adopted in 2000. In doing so, the DSA aimed to harmonise different national laws in the European Union that have emerged to address illegal content at national level. Most prominent amongst these laws was the German NetzDG, and similar laws in Austria and France. With the adoption of the Digital Services Act at European level, those national laws were planned to be overridden and would have to be amended.
In practice, this would lead to new legislation regarding illegal content, transparent advertising and disinformation.

Rules

New obligations on platform companies

The DSA is meant to "govern the content moderation practices of social media platforms" and address illegal content. It is organised in five chapters, with the most important chapters regulating the liability exemption of intermediaries, the obligations on intermediaries, and the cooperation and enforcement framework between the Commission and national authorities.
The DSA proposal maintains the current rule according to which companies that host others' data become liable when informed that this data is illegal. This so-called "conditional liability exemption" is fundamentally different from the broad immunities given to intermediaries under the equivalent rule in the United States.
The DSA applies to intermediary service providers that offer their services to users based in the European Union, irrespective of whether the intermediary service provider is established in the European Union.
In addition to the liability exemptions, the DSA introduces a wide-ranging set of new obligations on platforms, including obligations to disclose to regulators how their algorithms work, to be transparent about how decisions to remove content are taken, and to explain how advertisers target users. The European Centre for Algorithmic Transparency was created by the European Commission to aid enforcement of these provisions. The European Commission also hosts a DSA Transparency Database, to which platforms submit explanations of their moderation decisions as required under the Act.
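As a rough illustration of how the Transparency Database's collected "statements of reasons" might be retrieved programmatically, the snippet below assumes a hypothetical JSON endpoint, parameter names, and response fields; the database's actual API paths, authentication requirements, and schema should be checked against its official documentation before use.

```python
# Hedged sketch: querying the DSA Transparency Database for statements of reasons.
# The endpoint path, query parameters, and response fields are assumptions made
# for illustration only; consult the database's official API documentation.
import requests

BASE_URL = "https://transparency.dsa.ec.europa.eu"  # database hosted by the Commission

def fetch_statements(platform_name: str, limit: int = 10) -> list[dict]:
    """Fetch a page of moderation decisions ("statements of reasons") for one platform."""
    response = requests.get(
        f"{BASE_URL}/api/v1/statements",            # hypothetical path
        params={"platform_name": platform_name,     # hypothetical parameter names
                "limit": limit},
        timeout=30,
    )
    response.raise_for_status()
    return response.json().get("data", [])          # assumed response envelope

for statement in fetch_statements("TikTok"):
    # Each statement is expected to explain the ground for the decision and the
    # type of restriction applied (assumed field names).
    print(statement.get("decision_ground"), statement.get("decision_visibility"))
```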
Article 40 of the DSA requires platforms to grant data access to researchers and non-profit organisations in order to detect, identify and understand systemic risks in the European Union. The delegated act specifying procedures for data access entered into force on 29 October 2025, and researchers can now apply. Implementation of Article 40 and compliance with data access requests is being monitored by the Data Access Collaboratory, a joint project of the European New School of Digital Studies at the European University Viadrina and the Weizenbaum Institute in Germany.
A December 2020 Time article said that while many of its provisions only apply to platforms with more than 45 million users in the European Union, the Act could have repercussions beyond Europe. Platforms including Facebook, Twitter, TikTok, and Google's subsidiary YouTube would meet that threshold and be subject to the new obligations.
A 16 November 2021 Internet Policy Review article listed some of the new obligations, including mandatory "notice-and-action" requirements, respect for fundamental rights, mandatory redress for content removal decisions, and a comprehensive risk management and audit framework.
Companies that do not comply with the new obligations risk fines of up to 6% of their global annual turnover. In addition, the Commission can apply periodic penalties of up to 5% of the average daily worldwide turnover for each day of delay in complying with remedies, interim measures, and commitments. As a last-resort measure, if the infringement persists, causes serious harm to users, and entails criminal offences involving a threat to persons' life or safety, the Commission can request the temporary suspension of the service.
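To put these penalty ceilings in concrete terms, the short calculation below applies the 6% and 5% caps named above to a hypothetical company; the turnover figure is invented purely for illustration.

```python
# Worked example of the DSA penalty ceilings, using an invented turnover figure.
global_annual_turnover_eur = 80_000_000_000          # hypothetical: EUR 80 billion per year
average_daily_turnover_eur = global_annual_turnover_eur / 365

max_fine = 0.06 * global_annual_turnover_eur                 # up to 6% of global annual turnover
daily_periodic_penalty = 0.05 * average_daily_turnover_eur   # up to 5% of average daily turnover,
                                                             # per day of delay

print(f"Maximum fine:             EUR {max_fine:,.0f}")               # EUR 4,800,000,000
print(f"Max periodic penalty/day: EUR {daily_periodic_penalty:,.0f}") # about EUR 11 million
```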

New rights for users

Users can contest, in several ways, moderation decisions by online platforms that restrict their accounts or sanction their content. This right also applies to notices of illegal content that were rejected by the platform. According to the DSA, users may appeal through the internal complaint-handling system of platforms, and platforms are required to review their decisions promptly.
However, the DSA also gives users the right to turn to out-of-court dispute settlement bodies if they think a decision by an online platform was wrong. One such body is the Appeals Centre Europe, which reviews challenges to decisions by social media platforms at no charge to users. The platforms it currently covers are Facebook, Instagram, TikTok, Pinterest, Threads and YouTube.

How out-of-court dispute settlement bodies work

Users may select any entity that has been certified as a dispute settlement body in the EU for their type of dispute and request a review of a platform's content moderation decision.
For the user, dispute settlement will usually be available free of charge or at a low cost. If the body settles the dispute in favour of the user, the online platform shall bear all the fees. Users should always review information on applicable fees on the website of the respective body before lodging a request for dispute settlement.
Dispute settlement bodies do not have the power to impose a binding settlement of the dispute on the parties, but platforms and users are required to engage with them in good faith.

Legislative history

The European Commission submitted the DSA alongside the Digital Markets Act to the European Parliament and the Council on 15 December 2020. The DSA was prepared by von der Leyen Commission members Margrethe Vestager and Thierry Breton.
The Digital Services Act builds in large part on the non-binding Commission Recommendation (EU) 2018/334 of 1 March 2018 when it comes to illegal content on platforms. However, it goes further in addressing topics such as disinformation and other risks, especially on very large online platforms. As part of the preparatory phase, the European Commission launched a public consultation on the package to gather evidence between July and September 2020. An impact assessment with the relevant evidence base was published alongside the proposal on 15 December 2020.
The European Parliament appointed Danish Social Democrat Christel Schaldemose as rapporteur for the Digital Services Act. On 20 January 2022, the Parliament voted to introduce amendments to the DSA providing for tracking-free advertising and a ban on using a minor's data for targeted ads, as well as a new right for users to seek compensation for damages. In the wake of the Facebook Files revelations and a hearing of Facebook whistleblower Frances Haugen in the European Parliament, the Parliament also strengthened the rules on fighting disinformation and harmful content and introduced tougher auditing requirements.
The Council of the European Union adopted its position on 25 November 2021. The most significant changes introduced by the Member States are to entrust the European Commission with the enforcement of the new rules on VLOPs and VLOSEs, in the wake of allegations and complaints that the Irish Data Protection Commissioner was not effectively enforcing the EU's data protection rules against many platform companies domiciled in Ireland.
With Russia using social media platforms to spread misinformation about the 2022 Russian invasion of Ukraine, European policymakers felt a greater sense of urgency to move the legislation forward to ensure that major tech platforms were transparent and properly regulated, according to The Washington Post. On 22 April 2022, the Council of the European Union and the European Parliament reached a deal on the Digital Services Act in Brussels following sixteen hours of negotiations. According to The Washington Post, the agreement reached in Brussels solidifies the two-bill plan comprising the Digital Services Act and the Digital Markets Act, a law regulating competition. The latter is aimed at preventing larger "gatekeepers" from abusing their power against smaller competitors.
On 5 July 2022, the European Parliament approved both the DSA and the DMA. Following this, on 4 October 2022, the Council gave its final approval to the DSA. The DSA was adopted on 19 October 2022 and was published in the Official Journal of the European Union on 27 October 2022. It came into force on 16 November 2022. Most services were given 15 months to comply with its provisions. However, VLOPs and VLOSEs, after their designation as such, had only four months to comply.

Influence of the European Court of Human Rights

The DSA was passed alongside the Digital Markets Act and the Democracy Action Plan. The last of these is focused on addressing the nuanced legal interpretation of free speech on digital platforms, a fundamental right that has been extensively guided by the European Court of Human Rights and the European Convention on Human Rights. Accordingly, the Democracy Action Plan, and subsequently the DSA, were strongly influenced by the Delfi AS v. Estonia and Magyar Tartalomszolgáltatók Egyesülete and Index.hu Zrt v. Hungary ECtHR cases, which outlined a framework for assessing intermediary liability on digital platforms.
In Delfi AS v. Estonia, the ECtHR applied proportionality analysis when considering whether the Estonian courts' decision to hold the online platform Delfi liable for hate speech posted by its users was a proportionate restriction on Delfi's right to freedom of expression. The court found that, given the serious nature of the hate speech, the Estonian courts' actions were justified to protect the rights of others. In other words, the ECtHR upheld the liability of online platforms for hate speech posted by their users, underlining that platforms could be expected to take proactive steps to control content when there is a clear risk of harm from unlawful comments. This case highlighted the responsibilities of platforms to prevent the spread of harmful content.
On the other hand, the MTE and Index.hu v. Hungary case illustrated the nuanced limits of freedom of speech on digital platforms. In its application of proportionality analysis, the ECtHR found that the Hungarian courts had failed to strike a fair balance between protecting reputation and ensuring freedom of expression. The Hungarian courts had imposed strict liability on the platforms for user comments that were offensive but did not constitute hate speech, which amounted to a disproportionate interference with the platforms' right to freedom of expression. The ECtHR ruled that imposing strict liability on platforms for user comments, without consideration of the nature of the comments or the context in which they were made, could infringe on freedom of expression. This judgment emphasised the need for a balance between protecting reputation and upholding free speech on digital platforms.
These decisions by the ECtHR provided critical legal precedents that shaped the EU's decision-making process on the framework of the DSA. In particular, the DSA drew from the ECtHR's distinction between different types of illegal content, as well as its proportionality analysis in both cases, by incorporating nuanced rules on intermediary liability and ensuring that measures taken by platforms do not unreasonably restrict users' freedom of expression and information.