More than a year has passed since the DSA came into force, and news about its implementation and enforcement, as well as resources on the application of the Regulation, continues to grow. In this newsletter, I summarize the most important recent events and news related to the application of the DSA.
Regulatory & enforcement actions
The European Commission has opened formal proceedings against Pornhub, Stripchat, XNXX, and XVideos for suspected breaches of the Digital Services Act. In parallel, Member States are taking a coordinated action against smaller pornographic platforms.
What does this mean? “These actions will reinforce the Commission’s effort to protect minors from harmful content online, both as regards very large adult platforms supervised by the Commission and smaller ones that fall under the supervision of the Digital Services Coordinators.”
What is the background to this? “The Commission's investigations into Pornhub, Stripchat, XNXX, and XVideos focus on the risks for the protection of minors, including those linked to the absence of effective age verification measures. The Commission preliminarily found that the platforms do not comply with putting in place: (i) Appropriate and proportionate measures to ensure a high level of privacy, safety and security for minors, in particular with age verification tools to safeguard minors from adult content. (ii) Risk assessment and mitigation measures of any negative effects on the rights of the child, the mental and physical well-being of users, and to prevent minors from accessing adult content, notably via appropriate age verification tools.
Protecting young users online is one of the key enforcement priorities under the DSA. Online platforms must ensure that the rights and best interests of children are central to the design and functioning of their services. If proven, failure to comply with these requirements would constitute infringements of the DSA. […]”
What are the next steps? “The Commission will now carry out an in-depth investigation as a matter of priority and will continue to gather evidence, which can include sending additional requests for information, conducting interviews or inspections. The opening of formal proceedings empowers the Commission to take further enforcement steps, such as adopting interim measures and non-compliance decisions. The Commission is also empowered to accept commitments made by Pornhub, Stripchat, XNXX and XVideos to remedy the issues raised in the proceedings.”
The European Commission preliminarily found TikTok's ad repository in breach of the DSA.
What does this mean? “On May 15, the Commission informed TikTok of its preliminary view that the company does not fulfil the Digital Services Act (DSA)'s obligation to publish an advertisement repository. Such an advertising repository is critical for researchers and civil society to detect scam advertisements, hybrid threat campaigns, as well as coordinated information operations and fake advertisements, including in the context of elections. The Commission has found that TikTok does not provide the necessary information about the content of the advertisements, the users targeted by the ads, and who paid for the advertisements. Moreover, TikTok's advertisement repository does not allow the public to search comprehensively for advertisements on the basis of this information, thereby limiting the usefulness of the tool.”
What is the background to this? “On 19 February 2024, the Commission opened formal proceedings to assess whether TikTok may have breached the Digital Services Act. In addition to advertising transparency, the opening of proceedings also covered the negative effects stemming from the design of TikTok's algorithmic systems (such as ‘rabbit hole effects' and behavioural addiction), age assurance, its obligation to ensure a high level of privacy, safety and security for minors, and data access for researchers, for which the investigation continues. The Commission has also opened formal proceedings against TikTok in December 2024 on its management of risks related to elections and civic discourse, for which the investigation continues.”
What are the next steps? “TikTok now has the possibility to exercise its rights of defence by examining the documents in the Commission's investigation file and by replying in writing to the Commission's preliminary findings. In parallel, the European Board for Digital Services will be consulted. If the Commission's preliminary views were to be ultimately confirmed, the Commission may issue a non-compliance decision, which may trigger a fine of up to 6% of the total worldwide annual turnover of the provider as well as an enhanced supervision period to ensure compliance with the measures the provider intends to take to remedy the breach. The Commission can also impose periodic penalty payments to compel a platform to comply.”
Poland appointed its (interim) Digital Services Coordinator under the DSA. (A brief summary in English is available from Mateusz Kupiec on LinkedIn.)
What does this mean? A few weeks ago, the European Commission decided to refer Czechia, Spain, Cyprus, Poland and Portugal to the Court of Justice of the European Union (CJEU) for failing to designate and/or empower a national Digital Services Coordinator (DSC) and to lay down the rules on penalties applicable to infringements under the DSA. Now, Poland has made the (interim) appointment of its national DSC to comply with its obligations under the DSA.
Please see the previous issue of this newsletter regarding the Commission's action against the Member States failing to comply with their obligations under the DSA:
What is the background to this? The deadline for appointing Digital Services Coordinators (DSCs) under the DSA was February 17, 2024.
Coimisiún na Meán (the Digital Services Coordinator in Ireland) announced its decision to grant Trusted Flagger status to the Central Bank of Ireland.
What does this mean? “The Central Bank of Ireland have been granted the Trusted Flagger status for three years, from 2 April 2025 to 2 April 2028. Their designated area of expertise is financial scams and fraud, including the provision and/or offer of financial services without authorisation. Upon the expiry of the accreditation period the Trusted Flagger status is reassessed and, where appropriate, re-granted.” The Central Bank of Ireland is the first entity to be given the status of “trusted flagger” in Ireland.
What is the background to this? “Trusted Flaggers are empowered to identify, detect and notify illegal content within their area of expertise to online platforms. Providers of online platforms are then legally obliged to ensure that notices of the presence of illegal content, reported by Trusted Flaggers are given priority and decided upon without undue delay.” Under Article 22 of the DSA, the status of “trusted flagger” shall be awarded to an entity that meets the following conditions: “(a) it has particular expertise and competence for the purposes of detecting, identifying and notifying illegal content; (b) it is independent from any provider of online platforms; (c) it carries out its activities for the purposes of submitting notices diligently, accurately and objectively.” (See Article 22 of the DSA about trusted flaggers.)
The full list of “trusted flaggers” is available here. (A study on trusted flaggers titled “Article 22 Digital Services Act: Building trust with trusted flaggers”, van de Kerkhof, J. (2025). Internet Policy Review, 14(1) is available here.)
The Belgian Institute of Post and Telecommunications (BIPT) initiated proceedings against Telegram to check whether the platform complies with EU rules combating terrorist content online; BIPT is also checking Telegram's compliance with the EU's online platform rules, the Digital Services Act (DSA). (See also BIPT's Annual Report for 2024.)
What is the background to this? “Telegram claims it has 41 million monthly average users in the EU, which is below the threshold of 45 million per month required to be qualified as a Very Large Online Platform (VLOP) and be checked by the European Commission.” On this basis, BIPT has the authority to check Telegram's compliance with the DSA.
Guidelines, opinions, reports & more
The Institute for Information Law (IViR, University of Amsterdam) and the DSA Observatory published a report on “Pathways to Private Enforcement of the Digital Services Act (DSA)”.
What does this mean? “By way of illustration, we discuss key provisions across four categories of due diligence obligations —content moderation, risk governance, transparency and design obligations. Through this analysis, we raise new questions, and develop some preliminary answers, for the field of private enforcement in this important new field of EU law. […] Looking forward, the private enforcement potential of the DSA will involve strategic choices along various parameters including (1) the regulatory obligation at issue; (2) the cause of action (e.g. direct effect, contract or tort); (3) the type of claimant (individual user, representative action, collective action); (4) remedies sought (e.g., damages, injunctive relief); and (5) the appropriate jurisdiction. Selecting the best possible targets involves normative judgements about the most urgent policy priorities, combined with pragmatic, tactical judgements about the viability of different litigation strategies.”
What is the background to this? “The DSA contains various provisions that clearly signal an intention to convey individual rights and enable private enforcement (most notably Article 54 on the right to compensation, Article 86 on representative actions and Article 90 on collective redress). Not every DSA obligation is equally enforceable through private means, however. The options depend on whether direct effect can be established, or, in the alternative, indirect effect via tort or contractual causes of action under Member State law. National law also retains procedural autonomy regarding the procedures for enforcement, and jurisdiction may also vary based on the specific circumstances of each case.”
The DSA Observatory published an analysis by Taylor Annabell (Utrecht University) on “DSA Audits: How do platforms compare on influencer marketing disclosures?”
What is the background to this? “Under the DSA, social media platforms must provide clear tools for influencers to disclose paid content. But how well do they meet this obligation—and how rigorously is compliance assessed? This post compares eight DSA audit reports on influencer marketing disclosures and finds striking inconsistencies in how audits were conducted, what was measured, and how “compliance” was defined. The findings raise broader concerns about audit transparency, platform-defined standards, and the need for clearer guidance on what adequate disclosure—and meaningful oversight—should look like.”
The DSA Observatory published a summary of the results of a workshop held in March 2025 by Magdalena Jóźwiak, which focused on the DSA's systemic risk framework.
What is the background to this? “The DSA introduced some genuine regulatory firsts. Among the most significant is the legal requirement for Very Large Online Platforms and Search Engines (in short, VLOs) to manage systemic risks linked to how their services are designed, operated, and used. This framework raises considerable questions – not only due to its legal and technical complexity, but also because of the political context surrounding its enforcement. With key operational guidance still forthcoming – including a best practices report from the European Board for Digital Services (Art. 35(2) DSA) and a Commission-funded study by Open Evidence – this moment represents a critical juncture in the DSA’s implementation. Against this backdrop, the workshop explored three interlinked themes: (1) the DSA’s legal framework for systemic risk management, (2) the enforcement challenges ahead, and (3) the role of researchers and civil society in shaping its future.”
The Institute of International and European Affairs held a panel discussion on the implementation of the DSA to date and assessed the ongoing debates relating to various features of the DSA. (The recorded discussion is available on YouTube or on Spotify.)
Further DSA resources
List of designated very large online platforms and search engines under DSA
The Code of conduct on countering illegal hate speech online +
You can find the previous issue of the DSA tracker here: