Children’s Digital Safety: The European Commission Issues Guidelines under the DSA

Key Takeaways

The European Commission’s guidelines under the DSA establish an unprecedented framework for protecting minors online, imposing broad obligations on digital platforms, including age assurance, safety by design, enhanced content moderation, and accountable governance.


On July 14, 2025, the European Commission published guidelines pursuant to Article 28 of the Digital Services Act (DSA), aimed at strengthening the protection of minors in the digital environment. (1) These guidelines mark a major step in the EU’s policy framework, complementing the “Better Internet for Kids” strategy (2), and seek to ensure a high level of privacy, safety, and security for minors online.

Although not legally binding, the guidelines set a reference framework intended to guide regulatory authorities, serve as a basis for the enforcement of the DSA and, in practice, become the compliance standard across the EU.

After recalling the main objectives of these guidelines, we explain which entities are subject to them and provide a summary of the key recommendations.


1. The Objective: A Safer Digital Environment for Children

The primary purpose of the guidelines is to provide online platforms with a structured framework to comply with Article 28 of the DSA, which requires them to implement appropriate and proportionate measures to ensure “a high level of privacy, safety, and security of minors, on their service.” The guidelines translate this obligation into a series of practical recommendations for providers of online platforms.

More specifically, the guidelines aim to clarify the obligations set out in the DSA, particularly the concept of a “high level of privacy, safety, and security,” thereby reducing uncertainty for providers of online platforms and regulators. They are intended to serve as a reference document for Digital Services Coordinators and national authorities when assessing the compliance of platforms accessible to minors.

The guidelines follow a risk-based methodology: they embed the protection of minors into the design of digital services so that those services are tailored to minors’ age ranges.

By pursuing these objectives, the Commission seeks not only to mitigate the immediate risks associated with the digital environment but also to foster long-term trust in online services and promote a safer ecosystem for children. Although voluntary, the guidelines are designed to establish a benchmark standard for the protection of minors online.


2. Scope of the Guidelines

The guidelines apply to all online platforms, as defined in Article 3(i) of the DSA, that are accessible to minors. However, exemptions exist for micro and small enterprises, in line with Article 19 of the DSA.

The notion of accessibility is interpreted broadly: a platform is considered accessible to minors if its terms and conditions, or terms of service (ToS), allow minors to use the service, if it is targeted at minors or predominantly used by them, or if the provider is aware that some users are minors.

It is important to note that a provider cannot evade these obligations simply by prohibiting minors’ access in its terms of service; effective access restrictions must be implemented to substantiate such a claim.

By clarifying the scope of Article 28 of the DSA, the guidelines ensure that the obligations to protect minors extend across the entire digital ecosystem, covering both mainstream and niche services likely to be used by minors. The guidelines therefore encompass a wide range of platforms, including:

    - Social media platforms, video-sharing services, and online communities where minors can share content and interact;

    - App stores and online marketplaces offering digital goods and services;

    - Gaming and streaming platforms that integrate interactive or commercial features appealing to minors;

    - Digital services incorporating AI, such as chatbots and metaverses, where minors may be exposed to increased risks.

Although the implementation of the guidelines is not mandatory, in practice they serve as an interpretive and supervisory tool for Digital Services Coordinators (in France, ARCOM) and the European Board for Digital Services, guiding them in their monitoring and enforcement activities.


3. Key Recommendations of the Guidelines

    3.1 Age Assurance

Platforms must adopt robust, accurate, and proportionate age assurance mechanisms to prevent minors from being exposed to harmful or age-inappropriate content, such as pornography, gambling, or certain addictive services.

The guidelines distinguish between age estimation (an approximate assessment) and age verification (checks based on ID documents or European digital identity wallets). Self-declaration alone is deemed insufficient.

Providers must carry out an ex-ante analysis of the necessity and proportionality of the chosen age assurance method, ensure compliance with the GDPR, and publish the results of that assessment.

    3.2 Registration and Account Settings

The registration process is the first step to integrate enhanced protection measures. In this regard, platforms are required to:

    - Use child-friendly language, understandable and accessible, to explain the risks and benefits of registration, and to allow children to easily delete their accounts;

    - Discourage registration by minors below the legal age and obtain parental consent where required by law;

    - Configure minors’ accounts as “private” by default so that their personal information, data, and social media content are visible only to approved contacts;

    - Prevent any design practices that would encourage minors to lower their protection settings;

    - Restrict or disable by default features that encourage excessive use of the service, such as autoplay, push notifications during rest periods, “streaks” (rewards for consecutive daily use), and algorithmic filters that could negatively affect body image.

    3.3 Interface Design

Interfaces must be age-appropriate, accessible (including for minors with disabilities), and provide time management tools (e.g., reminders, usage dashboards).

Platforms must avoid “dark patterns” such as infinite scrolling, urgency cues, or persuasive reward loops.

AI-based features, such as chatbots, must not be enabled by default, must include clear, child-appropriate warnings, and must respect transparency requirements throughout the interaction.

    3.4 Commercial Practices

The vulnerability of minors to manipulative, misleading, or aggressive commercial practices is a major concern. To limit minors’ exposure to such practices, platforms must:

    - Prohibit advertising based on the profiling of minors, in accordance with Article 28(2) of the DSA, whenever the platform knows “with reasonable certainty” that the service recipient is a minor;

    - Ensure transparency and clear identification of commercial content;

    - Prevent minors’ exposure to products or services harmful to them, such as gambling, weight-loss products, or adult services;

    - Avoid deceptive monetization practices, including loot boxes, virtual currencies, or “pay-to-win” models;

    - Display prices in the national currency and implement parental control mechanisms for approving or limiting spending.

These rules complement the Unfair Commercial Practices Directive (Directive 2005/29/EC of 11 May 2005) and consumer protection regulations.

    3.5 Moderation and Reporting Tools

The guidelines set out enhanced moderation requirements for minors to reduce the risk of exposure to harmful content and behaviors. In this respect, platforms must:

    - Define harmful or illegal behaviors in clear terms that minors can understand;

    - Implement a moderation service available 24/7, combining automated detection with human oversight;

    - Establish protective measures for AI-generated content and recommendation systems;

    - Provide visible, accessible, and child-friendly reporting tools;

    - Offer user feedback options (such as “show less” or “this makes me uncomfortable”) that directly influence recommendation systems, thereby empowering children to better control content feeds;

    - Provide access to dedicated support resources;

    - Ensure that minors can block or disable users, manage comments, and give explicit consent before joining groups.

Parental control tools should complement (not replace) these protective measures.

    3.6 Governance and Monitoring

The principle of child protection must now be embedded in the corporate governance of digital platform providers. Accordingly, platforms should:

    - Appoint dedicated officers for children’s digital safety and train staff on children’s rights and risks specific to minors;

    - Conduct regular risk assessments (such as impact assessments), including an annual evaluation of the impact on children’s rights, and involve minors, their guardians, and experts in the risk assessment process. VLOPs and VLOSEs may integrate this process into their systemic risk assessment obligation under Article 34 of the DSA;

    - Monitor and adapt protective measures in line with evolving risks, technological changes, and stakeholder feedback.

More broadly, platforms must respect transparency obligations by drafting clear, understandable, and child-appropriate terms and conditions.


       The Commission’s guidelines on the protection of minors under Article 28 of the DSA provide a comprehensive framework for integrating children’s rights into the architecture of digital services. By combining risk-based assessments, privacy- and safety-by-design principles, and age-appropriate service standards, they offer both a roadmap for compliance and a forward-looking vision for online child protection.

Although not legally binding, these measures are expected to become the benchmark standard for Digital Services Coordinators and national authorities in enforcement.

The European Commission has announced that it will review these guidelines within 12 months and adapt them if necessary, based on practical experience and technological developments.


      Please don’t hesitate to contact us should you need legal guidance on DSA compliance and child protection measures.

* * * * * * * * * * *


(1) Guidelines on measures to ensure a high level of privacy, safety and security for minors online, pursuant to Article 28(4) of Regulation (EU) 2022/2065

(2) The “Better Internet for Kids” (BIK+) strategy, launched by the European Commission in May 2022, aims to improve the digital environment and ensure that children have access to safe, age-appropriate online services.


Bénédicte DELEPORTE
Avocat

Deleporte Wentz Avocat
www.dwavocat.com

September 2025