DSA tracker #8
Developments in enforcing the Digital Services Act
Developments around the implementation and enforcement of the DSA continue to accumulate, and more resources on the application of this Regulation are becoming available. In this newsletter, I summarize the most important events and news related to the application of the DSA.
Regulatory & enforcement actions
The European Commission has opened formal proceedings to investigate whether Snapchat ensures a high level of safety, privacy and security for children online, in compliance with the DSA.
What does this mean? The investigation will focus on five areas:
Age assurance
Grooming and recruitment of minors for criminal activities
Inadequate default account settings
Dissemination of information on the sale of prohibited products
Reporting of illegal content
What is the background to this? “The Commission’s opening of an investigation is based on the analysis of Snapchat’s risk assessment reports from 2023, 2024 and 2025, as well as on the replies to the request for information sent on 10 October 2025. This request sought details on Snapchat’s age verification system, as well as on the measures to prevent users from accessing illegal products, including drugs, and minors from accessing age-restricted products, such as vapes. The Commission has taken into account the information gathered by ACM in its investigation of Snapchat’s compliance with the DSA with regard to the sale of vapes to minors in the Netherlands as well as information provided by the Federal Network Agency for Electricity, Gas, Telecommunications, Post and Railways (BNetzA), the DSC for Germany. The Commission also received input from academic researchers, civil society organisations and other public authorities.”
What are the next steps? “The Commission will now carry out an in-depth investigation. This involves gathering further evidence, for example by sending requests for information to Snapchat and conducting interviews or inspections.
The opening of formal proceedings empowers the Commission to take further enforcement steps, such as adopting interim measures and a non-compliance decision. The Commission is also empowered to accept commitments from Snapchat to remedy the issues raised in the proceedings. Today’s opening of formal proceedings means that the Commission takes charge of the investigation that the Dutch Digital Services Coordinator (DSC), the Authority for Consumers and Markets (ACM), launched on 9 September 2025 into the sales of vapes to minors on Snapchat. ACM will be associated to the Commission’s investigation and will continue to support it.”
The European Commission has opened formal proceedings against Shein, under the DSA, for its addictive design, the lack of transparency of recommender systems, as well as the sale of illegal products, including child sexual abuse material.
What does this mean? “[…] the investigation will focus on the following areas:
The systems Shein has in place to limit the sale of illegal products in the European Union, including content which could constitute child sexual abuse material, such as child-like sex dolls.
The risks linked to the addictive design of the service, including giving consumers points or rewards for engagement, as well as the systems Shein has in place to mitigate such risks. Addictive features could have a negative impact on users’ wellbeing and consumer protection online.
The transparency of the recommender systems that Shein uses to propose content and products to users. Under the DSA, Shein must disclose the main parameters used in its recommender systems and it must provide users with at least one easily accessible option that is not based on profiling for each recommender system.”
What is the background to this? The “decision follows preliminary analyses of the risk assessment reports provided by Shein, the replies to the Commission’s formal requests for information, as well as information shared by third parties. The Commission sent three requests for information to Shein on 28 June 2024, 6 February 2025 and 26 November 2025 seeking more information on the company’s compliance with the DSA, in particular in relation to consumers’ and minors’ protection, and on the transparency of its recommender systems.”
What are the next steps? “After the formal opening of proceedings, the Commission will continue to gather evidence, for example by sending additional requests for information to Shein or third parties or conducting monitoring actions or interviews. The opening of formal proceedings empowers the Commission to take further enforcement steps, including interim measures or the adoption of a non-compliance decision. The Commission is also empowered to accept commitments made by Shein to remedy matters subject to the proceeding. The DSA does not set any legal deadline for bringing formal proceedings to an end. […]”
The Commission and EUIPO sign agreement to support DSA enforcement on intellectual property.
What does this mean? “The European Commission has partnered with the EU Intellectual Property Office to strengthen enforcement of digital rules targeting counterfeit goods and online piracy. […] Under the agreement, the EUIPO will expand its role in supporting enforcement efforts. This includes providing assistance to judicial and enforcement authorities, as well as to online intermediaries that are not classified as very large platforms under the DSA framework. Intellectual property rights holders are also expected to be involved in addressing infringement risks.”
What is the background to this? “The Digital Services Act establishes a set of rules aimed at improving safety and transparency in the EU’s digital environment. Cooperation between EU institutions and specialised bodies such as the EUIPO is part of how these rules are implemented in practice.”
The European Board for Digital Services convened in Brussels for its 18th meeting on 15 April 2026.
What does this mean? The Board “reaffirmed its commitment to the protection of minors online, discussed ways to streamline its working methods, and exchanged on the latest enforcement activities.”
The Irish Media Authority (Coimisiún na Meán) has commenced two separate formal investigations into Meta, the provider of the Instagram and Facebook platforms, under the EU Digital Services Act (DSA).
Why does this matter? “The investigations will examine:
Whether users can select and modify their preferred recommender system and the functionality to do so (which is required to be directly and easily accessible), through Facebook and Instagram’s interfaces – Article 27(3),
Whether the Facebook and Instagram online interfaces deceive or manipulate users away from choosing a recommender system feed that is not based on profiling of their personal data – Article 25(1).”
What is the background to this? “Following reviews by An Coimisiún’s Platform Supervision team, and an assessment of complaints, concerns arose in relation to potential ‘dark patterns’, or manipulative and deceptive interface designs, which may prevent people from exercising their right to choose a recommender system feed which is not based on profiling. This concern includes the possible inability of users to select and modify a recommender system feed not based on profiling in a direct and easily accessible way, at any time. An Coimisiún has also closely co-operated with the European Commission and other Digital Services Coordinators (national regulators) across the EU on this matter. A recommender system feed based on profiling is a list of posts, videos, products, or articles a person sees that has been chosen and ranked for the user by a system that learns from what they like, interact with, or spend time on. Profiling is the use of automated systems to personalise content or ads based on patterns in a person’s data or behaviour.”
What are the next steps? “The investigations commenced today by the Investigations Team will be conducted pursuant to Part 8B of the Broadcasting Act 2009, as amended. If a platform is found in violation of the DSA, Coimisiún na Meán can apply an administrative financial sanction, including a fine of up to 6% of turnover.”
The European Commission has adopted a recommendation that sets out a common approach for EU-wide age verification technologies.
What does this mean? “The recommendation lays down the actions the Commission encourages Member States to take to make sure that all EU citizens have access to robust and privacy-preserving age verification by 31 December 2026. It also defines an EU-wide governance approach to enable further availability of such technologies.”
During the fourth meeting of the EU-Japan Digital Partnership Council in Brussels, the Commission services responsible for the enforcement of the Digital Services Act (DSA) signed a cooperation arrangement with Japan’s Ministry of Internal Affairs and Communications (MIC), which also serves as Japan’s platform regulator.
What does this mean? “The arrangement will support the Commission's and MIC's supervisory work towards online platforms, under the EU's DSA and Japan's Information Distribution Platform Act respectively. Areas of common interest are transparency requirements and notice and action mechanisms of online platforms. The cooperation will be carried out through technical expert dialogues, joint training of technical staff, sharing of best practices, joint studies, and coordinated research projects.”
Guidelines, opinions, reports & more
At the website of the Atlas Institute for International Affairs, Pierce Leslie published an article titled “DSA Enforcement is the New Regulatory Shock: Mapping the First Wave of Platform Risk in 2026”.
What is it about? “Digital service providers must increasingly navigate the commercial risk of EU regulations and their enforcement.”
An insight into the implementation of the DSA in Poland has been published: “Too Much Time on Minitrue: Implementing the Digital Services Act in Poland” (authors: Katarzyna Łakomiec and Mateusz Grochowski).
What is it about? “Is entrusting the protection of citizens’ rights to private actors truly preferable to administrative oversight? The question arises naturally when following the recent debates over the DSA in Poland and it is worth reflecting on. Poland’s implementation struggle – remarkably unusual in comparison with other EU Member States – is not entirely a domestic matter. As we argue, it also raises broader, more troubling questions for the protection system the DSA aims to create.”
The DSA Observatory published an analysis, titled “What makes a risk ‘systemic’? The CJEU’s first interpretation of systemic risks under the Digital Services Act” (by Andrea Palumbo).
What is it about? “This post analyses the first major CJEU interpretation of “systemic risks” under the Digital Services Act in Amazon v. European Commission (2025). It argues that the judgment clarifies systemic risks as large-scale societal risks, rejects analogies with financial systemic risk regulation, and provides some guidance on the scope of Article 34 DSA.”
The DSA Observatory published an analysis: “If at first you don’t succeed: Reflections on a rejected Art. 40 DSA data access request” (by Catalina Goanta & Anda Iamnitchi).
What is it about? “Article 40 of the Digital Services Act was hailed as a breakthrough for platform research. But what does the procedure look like in practice? Drawing on their own rejected data access request, the authors reflect candidly on early lessons for the first wave of Article 40 applications, and what researchers should know before applying for access to platform data. Readers are also invited to contribute to an ongoing researcher survey and join a webinar on 20 March to unpack more lessons learned about DSA data access.”
An article by Cecilia Isola, titled “How Have Platforms Addressed Addictive Design Under DSA” was also published by the DSA Observatory.
What is it about? “[…] the analysis suggests that addictive design, although not explicitly defined in the DSA, qualifies as a systemic risk within its regulatory framework and must therefore be addressed under Articles 34–35. Yet, it remains substantively under-addressed in VLOPs’ systemic risk assessments.”
The NATO Strategic Communications Centre of Excellence published a study about Russian information influence operations.
What is it about? “This report examines Russian Information Influence Operations targeting audiences in Ukraine and neighbouring regions, including Ukrainian civilians and defence forces, civilians in nearby states, and European pro-Kremlin groups. […] The aim is to test and refine the Information Influence Attribution Framework (IIAF) by applying it to real-world Russian campaigns, in a context where EU sanctions on Russian state media, the Foreign Information Manipulation and Interference (FIMI) policy framework, and the Digital Services Act (DSA) are raising evidential standards for attribution. As Information Influence Operations increasingly involve both governmental and civil-society actors, the report focuses on clarifying practical evidential thresholds and confidence levels that can withstand prospective legal and regulatory scrutiny.”
Upcoming webinars regarding the enforcement of the DSA (EU DisinfoLAB):
Further DSA resources

