Regulation


3 min read
October 20, 2025

Brazil recently took a significant step toward protecting minors in digital spaces. On September 17, 2025, the country’s president signed Bill 2628/2022, known as the Digital ECA (Estatuto Digital da Criança e do Adolescente). The law will take effect on March 17, 2026, and will be enforced by the ANPD, Brazil’s data protection authority.

The Digital ECA introduces some of the world’s strictest rules for age assurance, parental supervision, and youth-oriented monetisation. It applies to any digital product or service “aimed at” or “likely to be accessed” by children or adolescents in Brazil, regardless of the provider’s location.

Who must comply

The Digital ECA applies broadly to any provider of digital products or services directed to or likely to be accessed by children and adolescents in Brazil. This includes social networks, messaging apps, video-sharing platforms, games, app stores, streaming services, and adult content websites.

The test for being “likely to be accessed” considers how attractive or accessible the service is to minors and whether it presents risks to their privacy, safety, or development – especially where social interaction is involved.

In practice, this means almost any platform or online service that can be used by children or teenagers in Brazil will be covered by the new law, regardless of where the company is based.

Age assurance: No more self-declaration

A core principle of the Digital ECA is that companies must provide “effective and reliable” age-verification mechanisms to ensure age-appropriate experiences. Platforms can no longer rely on self-declaration to restrict access to adult or otherwise unsuitable content.

Even if app stores or operating systems supply “age signals,” the platform itself remains responsible for implementing its own verification processes. Providers will need to use robust, auditable methods to distinguish adults from minors.

Adult sites and pornography require strict age verification

The law introduces explicit obligations for providers of adult or pornographic content. They must adopt effective technical measures to prevent access by anyone under 18. Simple disclaimers or self-certification are expressly prohibited.

This means pornography sites and other adult-only services must integrate age-verification systems that are highly effective at confirming a user is legally an adult, using reliable, privacy-preserving technologies.

Bans on profiling, targeted advertising and loot boxes

The Digital ECA also targets commercial practices that exploit or monetise minors.

  • Profiling and targeted advertising are prohibited for anyone under 18. Platforms cannot use behavioural data, emotional analysis, or immersive tools such as augmented or virtual reality to target or personalise ads for children or adolescents.
  • In the gaming sector, the law bans paid loot boxes (randomised in-game purchases that function like gambling mechanics) in any product accessible to minors. The definition covers any paid feature that offers a random virtual reward of uncertain value.
  • Developers must also avoid “pay-to-win” features that provide unfair gameplay advantages through purchases when their games are accessible to under-18s.

Social media: Parental linkage for under-16s

Social networks face new requirements for young users. Accounts for users under 16 must be linked to a parent or legal guardian’s account. The parent must be able to view, configure and manage the child’s privacy, communication, and usage settings.

If no guardian account is linked, platforms must restrict access or prevent account creation. The aim is to give parents visibility and control over their child’s online interactions while ensuring social platforms maintain protective default settings.
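The linkage rule above boils down to a simple gate at account creation. As a minimal, purely illustrative sketch (the class and field names are hypothetical, not part of the law or any real platform's API), the logic might look like this:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SignupRequest:
    verified_age: int  # age established via an age-assurance check, not self-declaration
    guardian_account_id: Optional[str] = None  # linked parent/guardian account, if any

def can_create_account(req: SignupRequest) -> bool:
    """Gate account creation per the guardian-linkage rule:
    users under 16 may only hold an account linked to a parent
    or legal guardian's account; otherwise creation is blocked."""
    if req.verified_age < 16:
        return req.guardian_account_id is not None
    return True
```

Note that the age fed into such a check must itself come from a robust age-assurance method, since the law rules out self-declared ages.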

Parental supervision tools

The Digital ECA mandates that platforms provide easy-to-use parental supervision tools, configured by default to the highest protection level. Required features include:

  • Time-limit and activity monitoring controls
  • Restrictions on communication with unauthorised users
  • Purchase and financial-transaction limits
  • Control over personalised recommendations and geolocation
  • Metrics showing total time spent online
  • All parental tools and notices available in Portuguese

These features are designed to make online services safer and more transparent for families, while giving guardians active control over children’s digital experiences.
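The "highest protection by default" requirement is essentially a rule about initial configuration: a minor's account starts fully locked down, and only a guardian relaxes settings explicitly. A hypothetical sketch (all setting names are illustrative assumptions, not prescribed by the law):

```python
# Illustrative default configuration for a minor's account, starting at the
# most protective values. Only a linked guardian should relax these.
PROTECTIVE_DEFAULTS = {
    "daily_time_limit_minutes": 60,        # time-limit controls on
    "activity_monitoring": True,           # guardians can review activity
    "messages_from_unknown_users": False,  # block unauthorised contacts
    "in_app_purchases": False,             # financial transactions off
    "personalised_recommendations": False,
    "geolocation_sharing": False,
    "interface_language": "pt-BR",         # tools and notices in Portuguese
}

def new_minor_account_settings(guardian_overrides=None):
    """Create a minor's settings at maximum protection; apply only
    explicit, guardian-approved overrides on top of the defaults."""
    settings = dict(PROTECTIVE_DEFAULTS)
    settings.update(guardian_overrides or {})
    return settings
```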

Enforcement and penalties

The ANPD will oversee compliance and can issue warnings, fines, or service suspensions. Fines can reach 10% of a provider's Brazilian revenue, capped at BRL 50 million per violation.

Providers with over one million minor users must also publish semi-annual transparency reports in Portuguese detailing complaints, moderation measures, and age-assurance practices.

Preparing for 2026

With the Digital ECA taking effect in March 2026, global companies offering digital services in Brazil should act now to:

  • Integrate robust age-verification systems for adult and youth audiences
  • Disable targeted advertising and profiling for under-18s
  • Remove or redesign loot boxes and similar randomised monetisation models
  • Build guardian-linked accounts for users under 16
  • Ensure parental control features are active by default and fully localised

The Digital ECA marks a pivotal shift in how Brazil governs children’s digital rights. It moves beyond general data protection into a comprehensive youth safety-by-design approach, embedding child protection and age assurance at the heart of providers’ products and services.

By March 2026, online services operating in Brazil will need not only technical upgrades but also a cultural shift – treating child protection and age assurance as core compliance priorities rather than optional safeguards.

About the author

Verifymy

Verifymy is a safety technology provider on a mission to safeguard children and society online.
