April 2026: The Enforcement Dial Turns Further
The European Board for Digital Services convened its 18th meeting on April 15, 2026. According to the official meeting statement, the Board “reaffirmed its commitment to the protection of minors online” and discussed coordinating enforcement activities across national Digital Services Coordinators (DSCs) and the European Commission. The statement reflects an enforcement posture that has moved from investigation-building (2024) to active proceedings (2025) to coordinated multi-platform action (2026).
The Digital Services Act, which became applicable to very large online platforms (VLOPs) and very large online search engines (VLOSEs) in August 2023, established a two-tier enforcement structure. The European Commission holds exclusive enforcement jurisdiction over the largest platforms — those with more than 45 million monthly active EU users. National DSCs enforce the Act for smaller platforms in their jurisdictions. The Board coordinates the two tiers, promotes consistent application of the Act, and can issue opinions and recommendations; its role is advisory, with binding enforcement decisions resting with the Commission and the national DSCs.
The first enforcement wave (late 2023 through 2024) focused on establishing the procedural framework: formal designations of VLOPs, audit requirements, algorithmic transparency reporting, and investigations into illegal content moderation. The second wave — the one unfolding in 2026 — is focused on a specific harm category that has drawn the most political attention across EU member states: the protection of children and minors from algorithmic harm, addictive design, and inadequate age verification.
Five Platforms, One Month: The Enforcement Picture in March 2026
The scale and coordination of March 2026 enforcement actions mark a qualitative shift in DSA implementation. Within a single month, the Commission took action against at least five platforms across different categories of minors-protection violations.
TikTok received preliminary findings in February 2026 for algorithmic design features identified as “intrinsically addictive.” The Commission’s specific concerns included infinite scroll, autoplay video, push notifications, and personalized recommendation intensity — features that the Commission argued are engineered to maximize engagement at the expense of minors’ mental health and autonomy. TikTok has 60-90 working days to respond to the preliminary findings before the Commission issues a final decision, which could include structural remedies and fines.
Snapchat became subject to a formal Commission investigation in March 2026, originating from a referral by the Dutch Digital Services Coordinator. The investigation’s focus includes Snapchat’s age-verification approach — specifically, its reliance on self-declaration by users, which the Commission deemed insufficient to meet the platform’s DSA obligations for minors. Concerns about inadequate moderation of illicit sales and facilitation of harmful interactions with minors were also cited.
Four adult-content platforms — Pornhub, Stripchat, XNXX, and XVideos — received Commission preliminary findings in March 2026, all citing inadequate safeguards for minors. These platforms had been designated as VLOPs in 2023, triggering the full DSA compliance burden. Their failure to implement robust age verification and to prevent access by minors represents the clearest-cut category of violation: the content is harmful to minors by definition, and the platforms’ verification systems were found to be structurally inadequate.
The precedent-setting enforcement action was the €120 million fine levied against X (formerly Twitter) in December 2025 for DSA violations related to advertising practices and content moderation transparency. This fine — while not the largest possible under the regulation’s 6% of global turnover ceiling — established the enforcement credibility that had been questioned when the Act first came into force.
What “Tightening” Means: Four Platform Obligations Gaining Enforcement Focus
For platform operators and their compliance teams, understanding which specific obligations are driving the 2026 enforcement acceleration is more useful than tracking individual investigations. Based on Commission communications and enforcement actions through April 2026, four obligation categories are receiving the most active scrutiny.
1. Age Verification: Moving from Self-Declaration to Verified Systems
The Snapchat investigation and the adult-content platform findings both center on the inadequacy of self-declaration as an age verification mechanism. DSA Article 28 requires providers of online platforms accessible to minors (not only VLOPs) to put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security for minors. The Commission’s emerging enforcement position is that self-declaration — asking users to enter a birth date without verification — does not meet this standard when the platform hosts content that is harmful to minors.
The practical implication is that platforms will need to move toward verified age assurance mechanisms: document verification, device-level signals, payment card verification, or third-party age assurance services. The UK’s Online Safety Act has already established a comparable requirement in the British market. EU enforcement is now converging on the same standard, and platforms operating across both jurisdictions will need a unified verification architecture.
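As a concrete illustration, the sketch below shows a default-deny age-assurance gate in TypeScript. All type and function names (AgeSignal, accessTier) are hypothetical, and the logic is an assumption about how a platform might operationalize the Commission’s position, not a reference implementation or any vendor’s API:

```typescript
// Hypothetical age-assurance gate. Self-declaration is recorded but never
// sufficient on its own; only a positive signal from a stronger method
// (document check, payment card, third-party assurance) unlocks adult access.

type AssuranceMethod =
  | "self_declaration"
  | "document_check"
  | "payment_card"
  | "third_party_service";

interface AgeSignal {
  method: AssuranceMethod;
  verifiedAdult: boolean; // true only if the method positively confirmed 18+
}

function isVerifiedAdult(signals: AgeSignal[]): boolean {
  return signals.some(
    (s) => s.method !== "self_declaration" && s.verifiedAdult,
  );
}

// Default-deny: any user not positively verified as an adult is treated as a
// potential minor and routed to the restricted experience.
function accessTier(signals: AgeSignal[]): "adult" | "restricted" {
  return isVerifiedAdult(signals) ? "adult" : "restricted";
}
```

The design choice worth noting is the default-deny posture: a single tier computed in one place is also what makes a unified verification architecture reusable across the EU and UK regimes mentioned above.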
2. Algorithmic Design: Addictive Features as a Compliance Risk
The TikTok preliminary findings introduce a new enforcement frontier: treating certain algorithmic design choices as inherently non-compliant, independent of content. Infinite scroll, autoplay, and notification intensity are not neutral technical features — the Commission’s position is that when deployed to minors, they create engagement patterns that override minors’ autonomy and constitute a systemic risk under Article 34 of the DSA.
This has broad implications beyond TikTok. Any VLOP that uses infinite scroll, autoplay, or high-frequency push notifications directed at users who may be minors now faces potential scrutiny. Compliance teams should audit which platform features are designed to maximize session length or return frequency, assess whether those features are adequately restricted for minor users, and document the design choices and mitigation measures that were considered.
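One way to make such an audit actionable is a per-tier feature configuration. The sketch below is illustrative only; the feature list mirrors the ones cited in the TikTok findings, but the config shape, names, and threshold values are assumptions:

```typescript
// Hypothetical per-tier configuration for engagement-maximizing features.
// "restricted" covers minor-flagged and unverified-age accounts.

interface EngagementFeatures {
  infiniteScroll: boolean;
  autoplay: boolean;
  pushNotificationsPerDay: number; // frequency cap rather than a toggle
}

const ADULT_DEFAULTS: EngagementFeatures = {
  infiniteScroll: true,
  autoplay: true,
  pushNotificationsPerDay: 20, // illustrative value
};

const MINOR_SAFE_DEFAULTS: EngagementFeatures = {
  infiniteScroll: false, // paginated feed with a natural stopping point
  autoplay: false, // explicit tap-to-play
  pushNotificationsPerDay: 2, // illustrative value
};

function featuresFor(tier: "adult" | "restricted"): EngagementFeatures {
  return tier === "adult" ? ADULT_DEFAULTS : MINOR_SAFE_DEFAULTS;
}
```

Centralizing the gating in one function also produces the documentation trail the audit obligation calls for: the restricted values and their rationale live in one reviewable place.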
3. Recommender System Transparency and Opt-Out
DSA Article 38 requires VLOPs and VLOSEs to offer users at least one recommender system option that is not based on profiling. For minors, the Commission’s enforcement position goes further: the non-profiling option should be the default. Platforms that profile minor users’ behavior to drive personalized content recommendations, without making the non-profiling option the default, are exposed to findings of non-compliance.
Audit findings and investigation notices consistently cite recommender system non-compliance as a co-occurring violation alongside age verification failures. Platforms should ensure that minor-flagged or unverified-age accounts are defaulted to non-profiling recommendations and that the opt-out mechanism for profiling is prominently accessible to all users.
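A minimal sketch of that default logic, again with hypothetical names and the same two-tier model as the earlier sketches:

```typescript
// Hypothetical recommender-mode selection. Potential minors are pinned to
// the non-profiling system; verified adults get their recorded choice.

type RecommenderMode = "profiling" | "non_profiling";

interface RecommenderChoice {
  optedIntoProfiling: boolean | null; // null = no explicit choice recorded
}

function recommenderMode(
  tier: "adult" | "restricted",
  choice: RecommenderChoice,
): RecommenderMode {
  // Minor-flagged and unverified accounts never receive profiling-based
  // recommendations, regardless of any stored preference.
  if (tier === "restricted") return "non_profiling";
  // Adults default to non-profiling until they explicitly opt in. Article 38
  // only requires that the non-profiling option exist, so this default is a
  // conservative design choice, not a legal requirement.
  return choice.optedIntoProfiling === true ? "profiling" : "non_profiling";
}
```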
4. Targeted Advertising to Minors: Hard Prohibition
DSA Article 28(2) contains a hard prohibition on presenting advertisements based on profiling using a minor’s personal data, and it applies to all online platforms, not only VLOPs: no exceptions, no opt-in alternatives, no legitimate-interest justification. This is among the most absolute obligations in the regulation. Yet enforcement findings consistently note advertising targeting practices that were not reliably suppressed for minor-age users.
For ad-supported platforms, this requires a technically robust system for identifying minor users across all surfaces (web, mobile, embedded players), suppressing the personalization pipeline for those users, and ensuring that age-verified adult users are not inadvertently excluded from advertising while unverified-minor users remain exposed. The technical architecture to implement clean advertising suppression for minors is more complex than a policy flag: it requires integration between age assurance, consent management, and ad-serving systems.
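The sketch below illustrates one way to enforce that suppression at serve time rather than as an upstream policy flag. The integration points and names are assumptions about a typical ad stack, not any specific platform’s architecture:

```typescript
// Hypothetical serve-time ad decision. Suppression for potential minors is
// enforced at the last point before ad selection, on every surface, so an
// upstream flag that fails to propagate cannot re-expose a minor.

interface AdRequest {
  userId: string;
  surface: "web" | "mobile" | "embedded_player";
}

interface AdDecision {
  personalized: boolean;
  profilingSignalsSent: boolean; // whether personal data leaves the request path
  targetingKey: string; // for minors: contextual only, never user identity
}

function decideAd(req: AdRequest, tier: "adult" | "restricted"): AdDecision {
  if (tier === "restricted") {
    // Article 28(2): no profiling-based advertising for minors. Contextual
    // targeting on the surface alone remains possible; no profiling signals
    // are emitted.
    return {
      personalized: false,
      profilingSignalsSent: false,
      targetingKey: `contextual:${req.surface}`,
    };
  }
  // Verified adults pass through the normal personalization pipeline,
  // subject to whatever consent the consent-management system recorded.
  return {
    personalized: true,
    profilingSignalsSent: true,
    targetingKey: `profile:${req.userId}`,
  };
}
```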
The Bigger Picture: From GDPR to DSA as the Second Compliance Wave
The enforcement trajectory of the DSA in 2026 mirrors the trajectory of GDPR between 2018 and 2022 — an initial period of framework-building followed by an acceleration of substantive enforcement once the regulator had built institutional capacity and enforcement precedent. The €120 million X fine plays the same role that early GDPR fines played: signaling that the regulation has teeth, that enforcement will not be indefinitely deferred, and that the largest platforms will not receive special forbearance.
The difference from GDPR is the subject matter. GDPR enforcement focused on data flows, consent, and information rights — abstract issues that rarely generated public pressure. DSA enforcement on minors protection is politically charged: every EU government has constituents who are parents, and every national DSC operates under political pressure to show results on child safety. This political dimension accelerates enforcement timelines in ways that GDPR’s more abstract violations did not.
For compliance teams at consumer platforms — whether very large or smaller, whether operating under Commission or national DSC jurisdiction — the message from April 2026 is clear: the standard for minors protection has moved from aspirational to enforceable, and the coordination between Commission and national authorities means there is no safe harbor in jurisdictional ambiguity.
Frequently Asked Questions
What threshold makes a platform subject to EU Commission (rather than national) DSA enforcement?
Platforms with more than 45 million average monthly active users in the EU are designated as Very Large Online Platforms (VLOPs) or Very Large Online Search Engines (VLOSEs) and fall under direct European Commission enforcement jurisdiction. Smaller platforms are subject to enforcement by the national Digital Services Coordinator in each EU member state where they operate. Designation changes both the enforcement authority and the scope of obligations: the DSA’s baseline platform rules apply to all in-scope platforms, but VLOPs and VLOSEs carry additional due-diligence duties, including systemic risk assessments, independent audits, and data access for vetted researchers.
How does DSA enforcement on minors protection interact with national laws like France’s age verification requirements?
The DSA sets a minimum standard that applies across all EU member states, but it does not prevent member states from enacting stricter national rules for minors protection, provided those rules comply with EU law. France’s digital majority law (2023) requires age verification and parental consent for social media account creation by users under 15, stricter than the DSA’s “appropriate and proportionate measures” standard. UK Online Safety Act requirements are more prescriptive still. Platforms operating across multiple jurisdictions should identify the most stringent applicable standard and build to it; a system that satisfies the stricter French rule will generally also satisfy the DSA minimum.
What fine did X receive in December 2025 and what violations triggered it?
X (formerly Twitter) received a €120 million fine from the European Commission in December 2025 for DSA violations. The violations cited related to advertising targeting practices and content moderation transparency failures: specifically, X’s advertising system was found to be using user data in ways inconsistent with DSA transparency and user-choice obligations. The fine established the precedent that Commission enforcement is operational and that violations identified through the VLOP audit process will result in material financial penalties.
Sources & Further Reading
- Digital Services Act — European Commission
- DSA Enforcement is the New Regulatory Shock: Mapping the First Wave of Platform Risk in 2026 — Atlas Institute
- European Commission Fines X €120M for DSA Violations — IAPP
- DSA Enforcement and Penalties — EDAA
- EU Prepares Tougher Tech Enforcement in 2026 — European Business Magazine
- Digital Markets Act and DSA Enforcement State of Play — EP Think Tank