The Digital Services Act (Regulation (EU) 2022/2065, “DSA”) regulates online intermediaries and platforms such as marketplaces, social networks, content sharing platforms, app stores and more. Its main objective is to prevent illegal and harmful activities online and the spread of disinformation. It aims to ensure user safety, protect fundamental rights, and create a fair and open environment for online platforms.
Online intermediary services include very large online platforms and search engines (see the full list of the designated VLOPs and VLOSEs here), online platforms (e.g. online marketplaces, app stores), hosting services (e.g. cloud and web hosting services) and intermediary services (e.g. internet access providers and domain name registrars).
The DSA became fully applicable as of February 17, 2024; however, since the end of August 2023, its rules had already applied to designated platforms with more than 45 million users in the EU, the so-called very large online platforms (VLOPs) and very large online search engines (VLOSEs).
The Commission enforces the DSA together with national authorities, who supervise the compliance of the platforms established in their territory. The Commission is primarily responsible for the monitoring and enforcement of the additional obligations applying to VLOPs and VLOSEs, such as the measures to mitigate systemic risks.
More than a year has passed since the DSA came into force, and there is a growing stream of news about its implementation and enforcement, as well as more resources on the application of the Regulation. In this newsletter, which I plan to publish from time to time on developments in the implementation of the DSA, I will summarize the most important events and news related to its application.
Regulatory & enforcement actions
The European Commission finds five EU Member States in breach of the DSA. What does this mean? The European Commission decided to refer Czechia, Spain, Cyprus, Poland and Portugal to the Court of Justice of the European Union (CJEU) for failing to designate and/or empower a national Digital Services Coordinator (DSC) and to lay down the rules on penalties applicable to infringements under the DSA.
What is the background to this? “The DSA required Member States to designate one or more competent authorities for the supervision and enforcement of the DSA, and to designate one of them as their national DSC by 17 February 2024. Member States are also required to empower their DSCs to enable them to carry out their tasks under the DSA, and to lay down rules on penalties applicable to infringements of that Regulation. DSCs are essential in supervising and enforcing the DSA and in ensuring the uniform application of that Regulation across the Union, working in cooperation with the Commission. Poland failed to designate and empower the DSC to carry out its tasks under the DSA. Although Czechia, Cyprus, Spain and Portugal each designated a DSC, they have failed to entrust them with the necessary powers to carry out their tasks under the DSA. The DSA also requires Member States to lay down the rules on penalties applicable to infringements of that Regulation, which all the aforementioned Member States have failed to do.”
The European Commission called on Bulgaria to comply with the DSA.
What does this mean? “The European Commission decided to send a reasoned opinion to Bulgaria for failing to empower a national Digital Services Coordinator (DSC) under the Digital Services Act and for failure to lay down the rules on penalties applicable to infringements of that Regulation.”
What is the background to this? “Bulgaria now has two months to respond and address the shortcomings raised by the Commission. In the absence of a satisfactory response, the Commission may decide to refer Bulgaria to the Court of Justice of the European Union.”
Guidelines, opinions, reports & more
The European Commission seeks feedback on the guidelines on protection of minors online under the DSA (“Commission guidelines on measures to ensure a high level of privacy, safety and security for minors online pursuant to Article 28(4) of Regulation (EU) 2022/2065”).
What does this mean? “The guidelines aim to support platforms accessible by minors in ensuring a high level of privacy, safety, and security for children, as required by DSA. The draft guidelines are open for final public feedback until 10 June 2025. The Commission is seeking contributions of all stakeholders, including children, parents and guardians, national authorities, online platform providers, and experts. The publication of the guidelines is expected by the summer of 2025.”
“In parallel, the Commission is working on an age-verification app, intended to provide an interim solution until the EU Digital Identity Wallet becomes available by the end of 2026. This app, based on the same technology as the EU Wallet, will enable online service providers to verify if users are 18 years or older without compromising their privacy, further enhancing the protection of minors online. The aim of the project is to develop an EU harmonised privacy-preserving age verification solution, including a white-label open-source app by summer 2025. The first version of the technical specifications and the beta version are already available on GitHub.”
What is the background to this? “The guidelines outline a non-exhaustive list of measures that all platforms, with the exception of micro and small enterprises, can implement to protect minors, using a default approach that is guided by privacy by design. The guidelines adopt the same risk-based approach that underpins the DSA, recognising that different platforms pose varying levels of risks to minors. This ensures that platforms can tailor their measures to their specific services, avoiding undue restrictions on children’s rights to participation, information, and freedom of expression.”
“The development of these guidelines is the result of extensive research, consultations and workshops with various stakeholders, including children via Better Internet for Kids (BIK+), online platform providers and experts from civil society and academia. The Commission has also collaborated with the Digital Services Coordinators through the European Board for Digital Services and its working group on the protection of minors.”
The European Commission presented a new best-practice election toolkit on the DSA.
What does this mean? “The Commission published an elections toolkit, providing practical details on how the Digital Services Act (DSA) Election Guidelines can be applied during electoral processes. Aimed at national regulators – known as Digital Services Coordinators – the toolkit provides advice and guidance on how they can be implemented in practice.”
What is the background to this? “The toolkit offers recommended practices and suggestions in four key areas: stakeholder management, communication and media literacy, incident response, and monitoring and analysis of election-related risks. By providing this guidance, the toolkit reinforces the Commission's and Member States' ongoing efforts to help safeguard the integrity of electoral processes in the EU.
The DSA Election Toolkit builds on the Election Guidelines for VLOPs and VLOSEs published in March 2024, as well as on the experience gained in the implementation of the Code of Practice on Disinformation and the DSA election integrity readiness dialogues that the Commission has held with public authorities, VLOPs, VLOSEs and other stakeholders since September 2023.”
The NATO Strategic Communications Centre of Excellence published a study on the impact of the Digital Services Act on the spread of harmful content on social media (“Impact of the Digital Services Act: A Facebook Case Study”).
What does this mean? “As a result, the study demonstrated that despite certain improvements the platform made in creating a safe online environment, we could not claim an overall enhancement after DSA enforcement. Additionally, our results highlight the current vulnerabilities and areas for improvement that the platform should address.”
What is the background to this? “The aim of this research was to measure the effects of the DSA in curbing the spread of harmful content on social media. As measuring the results of such a broad goal was challenging, our study focused on one of the dominant social media platforms: Facebook. To assess the impact of the DSA, we compared the share of harmful content published by Polish and Lithuanian accounts on Facebook before and after the DSA entered into force. Our multi-stage approach involved using a small AI model, GPT-4o mini, to initially flag harmful content, followed by applying larger models for validating and in-depth reasoning. In total we classified 959 harmful posts from 2023 and 1,392 posts from 2024.”
The Communications Authority in Austria (KommAustria) published the first study in a series of KommAustria publications on the DSA: “Der Schutz der Meinungsäusserungsfreiheit im Digital Services Act” [The Protection of Freedom of Expression in the Digital Services Act] (authors: Matthias C. Kettemann, Caroline Böck, Martin Müller). (The full study is available here in German. Key findings from the summary are available here in English.)
What is the background to this? “The Digital Services Act (DSA) aims to ensure freedom of expression on the internet. It prohibits platform operators from arbitrarily deleting content and obliges them to align their community guidelines and general terms and conditions with freedom of expression.” KommAustria is the authority responsible for monitoring compliance with the DSA in Austria.
Further DSA resources