DSA tracker #7
Developments in enforcing the Digital Services Act
Developments around the implementation and enforcement of the DSA continue to accumulate, and more resources are becoming available on the application of this Regulation. In this newsletter, I summarize the most important events and news related to the application of the DSA.
Regulatory & enforcement actions
The European Commission preliminarily found TikTok in breach of the Digital Services Act for its addictive design. This includes features such as infinite scroll, autoplay, push notifications, and its highly personalised recommender system.
What does this mean? “The Commission’s investigation preliminarily indicates that TikTok did not adequately assess how these addictive features could harm the physical and mental wellbeing of its users, including minors and vulnerable adults. […] Additionally, in its assessment, TikTok disregarded important indicators of compulsive use of the app, such as the time that minors spend on TikTok at night, the frequency with which users open the app, and other potential indicators. […] TikTok seems to fail to implement reasonable, proportionate and effective measures to mitigate risks stemming from its addictive design. […] At this stage, the Commission considers that TikTok needs to change the basic design of its service. For instance, by disabling key addictive features such as ‘infinite scroll' over time, implementing effective ‘screen time breaks', including during the night, and adapting its recommender system.”
What is the background to this? “The Commission's preliminary views are based on an in-depth investigation that included an analysis of TikTok's risk assessments reports, internal data and documents and TikTok's responses to multiple requests for information, a review of the extensive scientific research on this topic, and interviews with experts in multiple fields, including behavioural addiction.”
What are the next steps? “TikTok now has the possibility to exercise its right to defence. It may examine the documents in the Commission’s investigation files and reply in writing to the Commission’s preliminary findings. In parallel, the European Board for Digital Services will be consulted. If the Commission’s views are ultimately confirmed, the Commission may issue a non-compliance decision, which can trigger a fine proportionate to the nature, gravity, recurrence and duration of the infringement and reach up to but not more than 6% of the total worldwide annual turnover of the provider.”
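The fine cap quoted above (up to 6% of a provider's total worldwide annual turnover) can be sketched as a simple calculation. This is a minimal illustration only; the turnover figure used below is hypothetical, not TikTok's actual turnover.

```python
def max_dsa_fine(worldwide_annual_turnover_eur: float) -> float:
    """Upper bound on a DSA non-compliance fine: 6% of the provider's
    total worldwide annual turnover (Art. 74 DSA)."""
    return 0.06 * worldwide_annual_turnover_eur

# Hypothetical provider with EUR 20 billion in worldwide annual turnover:
print(max_dsa_fine(20_000_000_000))  # EUR 1.2 billion ceiling
```

Note that 6% is only the ceiling; an actual fine would be set in proportion to the nature, gravity, recurrence and duration of the infringement.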
The European Commission has formally designated WhatsApp as a Very Large Online Platform (VLOP) under the DSA, as its ‘Channels’ feature reaches the designation threshold of at least 45 million users in the EU. (The updated overview of the designated Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) supervised by the Commission is available here.)
What does this mean? “Following its designation as a very large online platform, the Commission will be competent to supervise WhatsApp’s compliance with the DSA, in cooperation with Coimisiún na Meán, the Irish Digital Services Coordinator.”
What is the background to this? “WhatsApp is a hybrid service comprising features of private messaging and of an online platform. WhatsApp Channels, the feature of WhatsApp that allows recipients to disseminate information, updates and announcements to a broad audience of WhatsApp users, falls under the definition of an online platform service and is therefore already subject to the general DSA obligations that online platforms in the EU must respect. WhatsApp’s private messaging service enabling users to send text messages, voice notes, photos, videos, documents, and make voice and video calls to other users remains explicitly excluded from the application of the DSA.”
What are the next steps? “Following the designation, Meta, the provider of WhatsApp, has four months, i.e. by mid-May 2026, to ensure WhatsApp complies with the additional DSA obligations for VLOPs. These obligations include duly assessing and mitigating any systemic risks, such as violations of fundamental human rights and freedom of expression, electoral manipulation, the dissemination of illegal content and privacy concerns, stemming from its services.”
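The designation rule behind the WhatsApp decision can be sketched as a simple threshold check. This is a minimal illustration, not the Commission's actual designation procedure; the user counts below are hypothetical.

```python
# Art. 33 DSA: a service is in scope for VLOP/VLOSE designation once its
# average monthly active recipients in the EU reach at least 45 million.
VLOP_THRESHOLD = 45_000_000

def meets_vlop_threshold(avg_monthly_eu_users: int) -> bool:
    """True if the reported EU user figure reaches the designation threshold."""
    return avg_monthly_eu_users >= VLOP_THRESHOLD

print(meets_vlop_threshold(46_000_000))  # hypothetical figure above threshold: True
print(meets_vlop_threshold(30_000_000))  # hypothetical figure below threshold: False
```

In practice, designation is a formal Commission decision based on the user numbers providers must publish under the DSA, and it triggers the additional VLOP obligations within four months.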
The European Commission has launched a new formal investigation against X under the DSA. The new investigation will assess whether the company properly assessed and mitigated risks associated with the deployment of Grok’s functionalities into X in the EU. In parallel, the Commission extended its ongoing investigation launched in December 2023 into X’s compliance with its recommender systems risk management obligations.
What does this mean? “The new investigation will assess whether the company properly assessed and mitigated risks associated with the deployment of Grok’s functionalities into X in the EU. This includes risks related to the dissemination of illegal content in the EU, such as manipulated sexually explicit images, including content that may amount to child sexual abuse material. These risks seem to have materialised, exposing citizens in the EU to serious harm. In light of this, the Commission will further investigate whether X complies with its DSA obligations to:
Diligently assess and mitigate systemic risks, including of the dissemination of illegal content, negative effects in relation to gender-based violence, and serious negative consequences to physical and mental well-being stemming from deployments of Grok’s functionalities into its platform.
Conduct and transmit to the Commission an ad hoc risk assessment report for Grok’s functionalities in the X service with a critical impact on X’s risk profile prior to their deployment.”
What is the background to this? “As a designated very large online platform (VLOP) under the DSA, X has the obligations to assess and mitigate any potential systemic risks related to its services in the EU. These risks include the spread of illegal content and potential threats to fundamental rights, including of minors, posed by its platform and features. This investigation complements and extends the investigation launched on 18 December 2023, which focuses on the functioning of X’s notice and action mechanism, its mitigation measures against illegal content, such as terrorist material, in the EU, and risks associated with its recommender systems. These proceedings covered also the use of deceptive design, the lack of advertising transparency and insufficient data access for researchers, for which the Commission adopted a non-compliance decision on 5 December 2025, fining X €120 million. On 19 September, the Commission sent to X a request for information related to Grok, including also questions in relation to the antisemitic content generated by @grok in mid-2025.”
What are the next steps? “The Commission will continue to gather evidence, for example by sending additional requests for information, conducting interviews or inspections, and may impose interim measures in the absence of meaningful adjustments to the X service. […] The opening of formal proceedings relieves Digital Services Coordinators, or any other competent authority of EU Member States, of their powers to supervise and enforce the DSA in relation to the suspected infringements.”
The full text of the Commission’s decision imposing the first fine under the Digital Services Act (DSA), in the amount of EUR 120 million, on X is now available.
Why does this matter? “The breaches include the deceptive design of its ‘blue checkmark’, the lack of transparency of its advertising repository, and the failure to provide access to public data for researchers.”
More details regarding the first fine under the DSA can be found in the last issue of the newsletter:
According to POLITICO’s report, Democracy Reporting International (DRI) is suing X to gain access to data on Hungary’s upcoming elections to assess the risk of interference.
What does this mean? “The lawsuit by Democracy Reporting International (DRI) comes after the civil society group, in November, applied for access to X data to study risks to the Hungarian election, including from disinformation. After X rejected their request, the researchers took the case to the Berlin Regional Court, which said it is not competent to rule on the case.”
What are the next steps? “DRI — with the support of the Society for Civil Rights and law firm Hausfeld — is now appealing to a higher Berlin court, which has set a hearing date of Feb. 17.”
What is the background to this? “Under article 40 of the Digital Services Act (DSA), vetted researchers will be able to request data from very large online platforms (VLOPs) and search engines (VLOSEs) to conduct research on systemic risks in the EU.”
“Providing access to data to researchers according to Article 40 DSA is a legal obligation for VLOPs and VLOSEs. […] If researchers consider they comply with the requirements set out in Article 40.12 of the DSA and have submitted a request to access data to a VLOP or VLOSE, they should reach out either to DSCs or to the Commission, explaining the situation. The Commission, as enforcer of the DSA for VLOPs and VLOSEs, will not express an opinion on the single cases, but it will assess whether there is a suspicion of systemic infringement and, in this case, may decide to take action in line with the provisions included in the DSA. In this sense, feedback from researchers is particularly useful to monitor compliance with the obligations under Article 40 of the DSA.”
More information about data access for researchers under the DSA is also available in the policy paper published by the Weizenbaum Institute last September (“Data Access for Researchers under the Digital Services Act: From Policy to Practice”).
Please also see the recent blog post published by RWTH Aachen University about the rules of data access: “Online Access Begins: Data Access for Research Under the Digital Services Act”.
Guidelines, opinions, reports & more
The U.S. House of Representatives published a detailed report, titled “The Foreign Censorship Threat, Part II: Europe’s Decade-Long Campaign to Censor the Global Internet and How It Harms American Speech In The United States”. The press release is available here. (Part I of the Report, titled “The Foreign Censorship Threat: How the European Union’s Digital Services Act Compels Global Censorship and Infringes on American Free Speech” is available here.)
What is it about? “The Committee on the Judiciary of the U.S. House of Representatives is investigating how and to what extent foreign laws, regulations, and judicial orders compel, coerce, or influence companies to censor speech in the United States. As part of this oversight, the Committee has issued document subpoenas to ten technology companies, requiring them to produce communications with foreign governments, including the European Commission and European Union (EU) Member States, regarding content moderation. In July 2025, the Committee published a report detailing how the European Commission—the executive arm of the EU—weaponizes the Digital Services Act (DSA), a law regulating online speech, to impose global online censorship requirements on political speech, humor, and satire. Since then, pursuant to subpoena, technology companies have produced to the Committee thousands of internal documents and communications with the European Commission. These documents show the extent—and success—of the European Commission’s global censorship campaign.”
Why does this matter? The report reflects a very strong position in the United States, which frames European digital regulatory measures, including the DSA, as a weapon (!) against political freedom of speech. Among other things, the report states that “the DSA is the culmination of a decade-long European effort to silence political opposition and suppress online narratives that criticize the political establishment.”
The Report also states that “the European Commission successfully pressured major social media platforms to change their global content moderation rules, directly infringing on American online speech in the United States.”
Part I of the Report (published in July 2025) concludes that “the threat to American speech is clear: European regulators define political speech, humor, and other First Amendment-protected content as disinformation and hate speech, and then require platforms to change their global content moderation policies to censor it. The Commission classifies important conversations on key political topics as "hate speech" that must be censored under the DSA. Then, it warns platforms that they must change their global content moderation policies to comply with the DSA's mandates.”
The Judiciary Committee held a hearing on February 4: “Europe’s Threat to American Speech and Innovation: Part II”. (The video recording of the hearing is available here.)
The Irish Minister for Foreign Affairs, Trade and Defence, Helen McEntee, has also held discussions about the DSA with US officials.
What is it about? “The meeting in Washington DC took place as the House Judiciary Committee was holding a hearing on the Digital Services Act in which it was alleged the legal framework infringes on American’s First Amendment freedom of speech rights.”
“Ms McEntee said the DSA is designed to protect consumers and children, adding it did not undermine free speech in Europe or elsewhere.”
A short post on the website of the Information Technology & Innovation Foundation (ITIF), a US-based science and technology policy think tank, presents the U.S. perspective on the EU’s digital regulations (especially the DSA) and on similar regulations in other countries (such as Argentina, Brazil, South Korea, and India).
Why does this matter? It is worth reading this post to see some of the recent measures and statements of the US related to digital regulation in a broader context.
What is it about? “By holding leading U.S. companies to a higher regulatory standard than their own firms, the EU has set a precedent that could reshape how major digital platforms operate worldwide. The treatment of VLOPs and VLOSEs under the DSA highlights the growing tension between the EU’s regulatory ambitions and the interests of American technology firms. In this evolving digital landscape, U.S. policymakers should more proactively resist these types of policies by raising them directly and systematically in trade negotiations, where regulatory discrimination against foreign firms can be addressed as a non-tariff attack. Trade negotiations provide a practical and credible forum for linking digital regulation to broader economic commitments, giving the United States leverage to achieve these changes. Unless U.S. policymakers make clear to foreign policymakers that the United States will not tolerate this type of unfair treatment of their firms, countries will continue to adopt and implement DSA-like laws. Moreover, without engagement, the U.S. risks ceding influence over the laws and norms governing digital platforms, including on transparency, safety, and content moderation. Such engagement is therefore both instrumental and critical for U.S national interests including digital innovation, global competitiveness, and technological leadership.”
A recent publication in Science highlights why the EU’s digital regulations, and the DSA in particular, may be the target of fierce criticism from the U.S. government and some online platforms, and why it is important to hold such platforms accountable: “Internet platforms must be held accountable for their actions” (author: Stephan Lewandowsky, University of Bristol, UK).
What is it about? “There are legitimate reasons to be critical of the DSA—one might question, for example, the lack of guidance given to platforms about how to moderate content—but Rubio’s claim that its architects are part of a “global censorship-industrial complex” inverts the policy’s true nature. If anything, such claims illustrate the first risk factor identified at the outset of this column, namely that democratic backsliding is tied to norm violations and dishonesty by elites. In this instance, the misleading claims about European censorship are best understood through the lens of the administration’s new National Security Strategy, which is explicitly hostile to the European Union and which the Russian government sees as being aligned with its own views on Europe.”
According to the Financial Times, the European Union plans to enforce the Digital Services Act (DSA) and the Digital Markets Act (DMA) more strictly in 2026. (See the summary of the article here on Heise Online.)
What is it about? “[…] they intend to continue the investigations against Meta and Google initiated in December. In these, the EU is examining whether Meta is preventing competing AI providers from accessing WhatsApp and whether Google is using online content for training AI models. However, the focus is on discreet work and less on sensational sanctions, individuals involved in the implementation of the digital legislation told the British newspaper.”
The European Board for Digital Services held its 17th regular meeting in early February. (The press statement of the Board, published after the meeting, is available here.)
What is it about? “It discussed the impact of the use of AI chatbots, latest developments and enforcement activities as well as future actions. This meeting of the Board marks its second anniversary. Digital Services Coordinators (‘DSCs’) and the Commission used this meeting to look ahead to future deliverables and new challenges, building on the past experiences and achievements.”
What is the background to this? “The European Board for Digital Services is an independent advisory group that has been established by the Digital Services Act […]. The European Board for Digital Services (the “Board”) is composed of the Member States’ Digital Services Coordinators and chaired by the European Commission. […] the Board advises the Digital Services Coordinators and the European Commission in accordance with the Digital Services Act to achieve the following objectives:
contributing to the consistent application of the Digital Services Act and effective cooperation of the Digital Services Coordinators and the European Commission with regard to matters covered by the Digital Services Act;
coordinating and contributing to guidelines and analysis of the European Commission and Digital Services Coordinators and other competent authorities on emerging issues across the internal market with regard to matters covered by the Digital Services Act; and
assisting the Digital Services Coordinators and the European Commission in the supervision of very large online platforms and very large online search engines.”
The DSA Observatory at the University of Amsterdam will host its second international conference on ‘The DSA and Platform Regulation’ on 16-17 February 2026. The programme of the conference is available here.
Further DSA resources


This is the shift: regulating architecture, not chasing individual posts. When design becomes enforceable, “engagement” stops being a vibe and starts being liability.