DSA tracker #6
Developments in enforcing the Digital Services Act
Developments regarding the implementation and enforcement of the DSA continue to accumulate, and more and more resources are becoming available on the application of the Regulation. In this newsletter, I summarize the most important events and news related to the application of the DSA.
Regulatory & enforcement actions
In December 2025, the European Commission issued the first fine under the Digital Services Act (DSA), in the amount of EUR 120 million, to X for breaching its transparency obligations under the Regulation.
What does this mean? “The breaches include the deceptive design of its ‘blue checkmark’, the lack of transparency of its advertising repository, and the failure to provide access to public data for researchers.”
What is the background to this? …
What are the next steps? “X now has 60 working days to inform the Commission of the specific measures it intends to take to bring to an end the infringement of Article 25 (1) DSA, related to the deceptive use of blue checkmarks. X has 90 working days to submit to the Commission an action plan setting out the necessary measures to address the infringements of Articles 39 and 40(12) DSA, relating to the advertising repository and to the access to public data for researchers. The Board of Digital Services will have one month from receipt of X’s action plan to give its opinion. The Commission will have another month to give its final decision and set a reasonable implementation period. Failure to comply with the non-compliance decision may lead to periodic penalty payments. The Commission continues to engage with X to ensure compliance with the decision and with the DSA more generally.”
Euractiv reported that the Polish president, Nawrocki, vetoed the national DSA implementation law.
What does this mean? “The Commission has already referred Poland to the EU Court of Justice in May for failing to designate a national Digital Services Act coordinator, and officials are expected to pursue infringement proceedings as long as the DSA remains unenforced at national level.” Without the approval of the Polish implementation law, the national digital services coordinator cannot be designated.
What is the background to this? “The law would have amended Poland’s Act on the Provision of Electronic Services to bring it into line with the Digital Services Act (DSA). […] The vetoed bill would also have enabled Poland to designate a national digital services coordinator to enforce the rules and liaise with the European Commission and other member states.”
“Nawrocki argued that the state should not be allowed to decide what its citizens are permitted to say or think.” He argued “that the bill granted excessive powers to government officials rather than independent courts to decide what content is allowed online. He also criticised what he called insufficient judicial oversight and overly short deadlines for citizens to challenge administrative decisions.”
What are the next steps? “Under Poland’s constitution, parliament can override a presidential veto with a three-fifths majority in the Sejm, but lacking that margin, the government would need opposition support, pass a revised bill, or wait for a change of president.”
According to Euractiv reports, WhatsApp could be designated as a very large online platform (VLOP) by the Commission under the DSA as soon as the end of January.
What does this mean? “WhatsApp would become the first messaging app service to fall under the EU’s online governance rules for large platforms […]”
What is the background to this? “The DSA classifies platforms or search engines that have more than 45 million users per month in the EU as very large online platforms (VLOPs) or very large online search engines (VLOSEs). The Commission has begun to designate VLOPs or VLOSEs based on user numbers provided by platforms and search engines, which regardless of size, they were required to publish by 17 February 2023. Platforms and search engines will need to update these figures at least every 6 months as explained on DSA: Guidance on the requirement to publish user numbers. […] Once the Commission designates a platform as a VLOP or a search engine as a VLOSE, the designated online service has 4 months to comply with the DSA.”
Based on the “Information on WhatsApp Channels Average Monthly Active Recipients in the European Union” published by WhatsApp in August 2025, “for the six month period from 1 January 2025 to 30 June 2025, there were approximately 51.7 million average monthly active recipients of WhatsApp Channels in the EU.”
What are the next steps? The Commission will formally adopt the designation decision.
Guidelines, opinions, reports & more
The DSA Observatory published an analysis on “The Missing Metrics in DSA Content Moderation Transparency”.
What is it about? “The Digital Services Act makes platform transparency reporting mandatory and standardised, but the metrics it requires still fall short of what is needed for real accountability. Counts of removals and appeals alone cannot tell us whether content moderation systems are accurate, proportionate, or effective, making the absence of evaluation metrics such as precision and recall increasingly difficult to justify under the DSA’s risk-based logic. While these metrics are unlikely to surface in baseline transparency reports under Articles 15 and 24, the post argues they may yet emerge through heightened scrutiny of the largest online platforms and search engines (VLOPSEs), as regulatory expectations take shape through enforcement, systemic risk reporting, audits, and related obligations.”
The DSA Observatory also published an analysis titled “What are DSA audits doing for systemic risk enforcement? The case of X”.
What is it about? “Of the nineteen service providers initially designated as VLOPSEs under the DSA, X’s first compliance audit stands apart. Its auditor, FTI Consulting, broke from industry peers by offering relatively critical opinions — including findings that were unfavourable to the platform on obligations under active Commission investigation. How did X respond? Rather than work toward implementing the auditor’s recommendations, X simply reshuffled the deck: it went out and hired a new auditor (BDO). The move raises a deeper question about what the DSA audit regime is actually doing — and how seriously the Commission treats audits as part of systemic-risk enforcement, which, in principle, relies on auditors to provide an additional, independent layer of scrutiny.”
In another publication by the DSA Observatory, the enforcement of the DSA was analysed: “Waiting for the DSA’s Big Enforcement Moment”.
What is it about? “This blog post explores the issue of DSA enforcement by the European Commission, focusing on the law’s systemic risk management provisions. It first briefly sketches the Commission’s role in regulatory oversight of the systemic risk framework and then sums up enforcement efforts to date, considering also the role of geopolitics in the Commission’s enforcement calculus.”
In a publication, Pietro Mattioli examines the complexity of the DSA’s enforcement framework: “Navigating the Complexities of the DSA’s Enforcement Framework: Sincere Cooperation in Action?”.
What is it about? “The Digital Services Act (DSA) represents an important development in the EU regulation of online services, tackling online disinformation and harmful content. However, its success depends on effective enforcement across the European Union. As the DSA enters its implementation phase, its enforcement regime is under scrutiny, particularly the cooperation mechanisms between Member States’ competent authorities. It appears that Member States, while acting in compliance with their institutional autonomy, may have unintentionally created an enforcement structure that is not well-suited for cooperation. Therefore, this article examines the role of the principle of sincere cooperation in the EU legal order and how this principle could enhance cooperation among national competent authorities also in the context of the DSA.”
On 3rd October, the Chair on Online Content Moderation brought together dozens of experts on the DSA for the “Hack the DSA” workshop.
What is the background to this? “The DSA introduces more than 80 transparency obligations which involve the publication of a variety of documents: transparency reports, reports on systemic risks, audit reports, reports by national authorities or out-of-court dispute settlement bodies… […] the aim was to collectively explore these documents in order to understand and analyse their content, compare their different approaches and methodologies, identify good and bad practices, and then formulate recommendations. […]
“The participants were divided into five teams, each of which dealt with a different aspect of the DSA:
inter-transparency (i.e. a cross-analysis of different transparency reports),
data archiving,
online political advertising,
reports on systemic risks,
and the role of national authorities.”
For the summary of key findings and recommendations, please see this article. You can also find a summary on Verfassungsblog.
Experts from the law firm Fieldfisher, in their blog post, analysed whether online shops fall within the scope of the DSA.
What is it about? “The classification of an online shop depends on the specific circumstances of each case. We recommend a thorough analysis of which service category applies to avoid unnecessary implementation of DSA obligations that may not be relevant. Classification as a hosting service or online platform can trigger extensive legal duties: operators would need to establish a contact point for users and authorities, implement internal processes for effective reporting and remedy of illegal content, publish annual transparency reports, and adapt their terms and conditions to meet the DSA’s transparency requirements. Conversely, non-compliance with DSA obligations can result in substantial fines, depending on the nature of the violation. Compliance measures should therefore be carefully documented to demonstrate adherence to the DSA.”
According to news reports, TikTok will begin to roll out new age-verification technology across the EU.
What is the background to this? “ByteDance-owned TikTok, and other major platforms popular with young people such as YouTube, are coming under increasing pressure to better identify and remove accounts belonging to children. The system, which has been quietly piloted in the EU over the past year, analyses profile information, posted videos and behavioural signals to predict whether an account may belong to a user under the age of 13. As well as analysing information the account holder provides about themselves, the technology looks at behaviour such as the videos a user publishes, and “other on-platform behaviour”. […] The rollout of the system comes as European authorities scrutinise how platforms verify users’ ages under data protection rules.”
The Weizenbaum Institute published a policy paper last September, titled “Data Access for Researchers under the Digital Services Act: From Policy to Practice”.
What is it about? “[…] This paper provides an overview of researchers’ initial practical experience with access to publicly available data based on Art. 40(12) DSA as well as an in-depth description of procedure for access as set out in Art. 40(4) DSA, thereby comprehensively characterising the data access options outlined in the DSA and DA [Delegated Act on data access]. We outline key provisions and their underlying rationales to provide an overview of the goals, procedures and limits of DSA-based data access, as well as an account of external factors likely to weigh in its realisation. […]”
Further DSA resources