
May 16, 2025

On 13th May 2025, the European Commission (EC) published its long-awaited draft guidelines on the protection of minors online under Article 28 of the Digital Services Act (DSA). This marks a pivotal moment for the evolving field of age assurance, setting expectations for how online platforms should assess and manage risks to minors.

The guidelines reaffirm platform-level accountability, resisting calls to shift responsibility to devices or operating systems. The Commission is clear: platforms are best placed to understand their products and user experiences, and therefore to tailor protections based on the content, services and risks specific to their offerings.

A tiered, risk-based framework

Consistent with the DSA, the guidelines adopt a risk-based approach to age assurance. Rather than mandating a one-size-fits-all solution, they recognise that the appropriate level of intervention will depend on the nature of the service and the potential risk to minors. 

For high-risk services – including access to pornography, gambling, or alcohol, or platforms that self-impose an 18+ threshold due to inherent risks – age verification is the required standard. This reflects the Commission’s view that where significant risks exist, robust methods relying on verified identity documents or physical identifiers are necessary to ensure compliance and protection.

In medium- and lower-risk contexts, such as platforms with under-18 age limits or general-use services like social media, age estimation is acknowledged as an appropriate and proportionate alternative. This is especially relevant when the goal is to tailor rather than restrict access – for example, modifying recommender systems, limiting harmful content exposure, or enabling safer interactions.

Embracing age estimation

Age estimation technologies can be just as effective in practice as age verification, especially when measured against the Commission’s own criteria. While the guidance differentiates between the two in terms of preferred use cases, both methods are expected to meet the same bar for effectiveness.

The guidelines require that any age assurance method – whether estimation or verification – be assessed against five key criteria:

  • Accuracy: the ability to correctly estimate a user’s age or age range
  • Reliability: consistent performance in real-world conditions
  • Robustness: resistance to circumvention
  • Non-intrusiveness: minimal impact on privacy and user experience
  • Non-discrimination: equitable access for all users regardless of background or ability

Where age estimation meets regulatory standards, it can deliver a level of assurance sufficient to satisfy compliance requirements – while offering significant advantages over traditional age verification techniques, which are often more intrusive, more discriminatory, and more burdensome for users. Age estimation typically enables faster, lower-friction user experiences and better aligns with privacy, accessibility, and inclusion principles. In many real-world scenarios, this makes age estimation not just a viable option but the preferable approach for achieving the DSA’s objectives effectively and at scale.

Crucially, self-declaration is explicitly ruled out as inadequate – a welcome clarification that raises the bar across the industry.
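The tiered framework described above can be summarised as a simple decision rule. The sketch below is purely illustrative and not part of the guidelines themselves; the tier names, method labels, and the `is_acceptable` helper are assumptions made for this example.

```python
from enum import Enum

class RiskTier(Enum):
    HIGH = "high"      # e.g. pornography, gambling, alcohol, or self-imposed 18+
    MEDIUM = "medium"  # e.g. platforms with under-18 age limits
    LOW = "low"        # e.g. general-use services such as social media

# Methods the draft guidelines treat as acceptable per tier.
# Self-declaration is explicitly ruled out at every tier.
ACCEPTABLE_METHODS = {
    RiskTier.HIGH: {"verification"},
    RiskTier.MEDIUM: {"verification", "estimation"},
    RiskTier.LOW: {"verification", "estimation"},
}

def is_acceptable(tier: RiskTier, method: str) -> bool:
    """Return True if the given age assurance method meets the tier's standard."""
    return method in ACCEPTABLE_METHODS[tier]

print(is_acceptable(RiskTier.MEDIUM, "estimation"))     # True
print(is_acceptable(RiskTier.HIGH, "estimation"))       # False under the draft text
print(is_acceptable(RiskTier.HIGH, "self-declaration")) # False
```

Note that under the draft text as published, estimation does not satisfy the high-risk tier; the article argues below that highly effective estimation should be recognised there too.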

One of our own approaches, which uses just an email address for age estimation, is an example of how data already collected by platforms can be leveraged to deliver fast, privacy-preserving and inclusive age assurance. With 99.94% of active online users also holding an active email account, this method inherently supports broad accessibility and inclusivity. This makes it not only compliant, but also a commercially viable option that supports adoption across diverse user bases.

Pragmatic & harmonised implementation for success

The guidelines also emphasise user choice and redress. Where age assurance is required, platforms should offer more than one method, such as a combination of verification and estimation, to avoid excluding eligible users. Mechanisms must also be in place for users to challenge incorrect age assessments – an important safeguard for transparency and trust.

While we welcome the Commission’s emphasis on proportionality, guidelines alone won’t be enough. Their success will depend on consistent enforcement, regulatory clarity, and a harmonised approach across Member States. Crucially, highly effective age estimation methods must be recognised as valid for high-risk services (such as pornographic content) to ensure both strong protection for children and alignment with privacy and inclusion principles. Fragmentation in approach would risk undermining both compliance and user safety.

We also note that the Commission is working on an age-verification app, intended to provide an interim solution until the EU Digital Identity Wallet becomes available by the end of 2026. While such innovation may offer value, it must remain optional, not mandatory, and should uphold the same principles of proportionality, privacy, and user choice.

Looking ahead

The draft guidelines are open for public feedback until 10th June 2025. We encourage all stakeholders — from industry to civil society, parents to young people themselves — to engage in this process. With the right tools and a clear regulatory framework, we can better protect minors online without compromising their rights or experiences. The publication of the guidelines is expected by the summer of 2025.

About the author

Lina Ghazal

Lina is Head of Regulatory & Public Affairs at Verifymy, with over 10 years of experience working across media and tech, in both the public and private sectors — including at Ofcom, TF1, and Meta. Lina specialises in building impactful policy initiatives and partnerships, and has worked closely with regulators, industry leaders, and civil society across Europe, the Middle East, Africa, and the US to help shape the future of online safety.

