April 15, 2026

ECA Digital: new decrees regulate the protection of children and adolescents in the digital environment

On March 18, 2026, Decrees No. 12,880, 12,881 and 12,882 were published, regulating the ECA Digital (Law No. 15,211/2025), the legislation that established an unprecedented regulatory framework for the protection of children and adolescents in the online environment. The law imposes specific rules and duties on providers of information technology products and services that are aimed at children and adolescents, or that are likely to be accessed by them, in Brazil, regardless of where those products and services are developed, manufactured, supplied, commercialized or operated, or where the provider is located.

This newsletter presents the main points of the new regulation, which details the practical application of the law and directly impacts digital platforms, social networks, electronic games, app stores, software and computer-program providers, as well as the digital advertising market in general.

Decree No. 12,880/2026

  1. Risk-based approach: the main regulatory decree adopts a risk-based approach as the cornerstone of the obligations imposed on companies. This means that the requirements vary according to the degree of exposure of the service to children and adolescents:

For prohibited content (e.g. pornography, gambling, weapons, loot boxes, alcoholic beverages, tobacco products), providers must:

  • Implement effective age verification mechanisms with a high degree of reliability;
  • Effectively prevent access, enjoyment or consumption by minors;
  • In certain cases, prohibit the creation of profiles by children and adolescents and remove existing accounts;
  • For certain products and services, block by default their acquisition by minor users, with unblocking by self-declaration prohibited.

For improper or inappropriate content (content that may pose a risk to the privacy, safety, psychosocial development, mental and physical health, or well-being of children and adolescents), less intrusive measures are permitted, such as:

  • Indicative age rating;
  • Safe settings by default;
  • Parental supervision tools.
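As an illustration only, the risk-tiered logic above can be sketched as a simple mapping from content category to the measures a provider would apply. The decree does not prescribe any implementation; all category names and measure labels below are hypothetical:

```python
# Hypothetical sketch of the decree's risk-based approach: the protective
# measures depend on how the content is categorized. The labels are
# illustrative assumptions, not terms defined by the decree.

REQUIRED_MEASURES = {
    "prohibited": [            # e.g. pornography, gambling, loot boxes
        "high_reliability_age_verification",
        "block_access_by_minors",
        "remove_minor_accounts_where_required",
        "block_purchases_by_default",   # self-declaration cannot unblock
    ],
    "inappropriate": [         # risks to privacy, safety, development
        "indicative_age_rating",
        "safe_settings_by_default",
        "parental_supervision_tools",
    ],
}

def measures_for(category: str) -> list[str]:
    """Return the protective measures expected for a content category."""
    return REQUIRED_MEASURES.get(category, [])

print(measures_for("inappropriate"))
```

The point of the sketch is the asymmetry: the prohibited tier triggers hard blocking measures, while the inappropriate tier allows the less intrusive ones listed above.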

 

  2. Age verification vs. age measurement: the decree expressly distinguishes these two concepts, and the distinction is fundamental for companies to correctly size the mechanisms to be adopted in each scenario:
  • Age verification: a specific procedure for confirming a user’s age or age group with a high degree of reliability, through technical or documentary mechanisms; it is required for prohibited content and services and for social networks.
  • Age measurement: a broader set of procedures for verifying, estimating or inferring the user’s age or age group, applied in proportion to the risk of the service.

 

  3. “Age signals” and shared responsibility: the decree introduces the concept of “age signals” – information about the user’s age or age range that operating systems and app stores must share, free of charge, with suppliers of technology products or services, promoting interoperability. Age signals must not reveal additional personal data and are limited to the data strictly necessary to confirm the minimum age required to access the product or service. Moreover, receiving an age signal does not exempt the supplier from responsibility for the effectiveness of its own age-assurance and protection measures.
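Conceptually, an age signal is a minimal answer to a yes/no question, not a disclosure of the user’s data. The sketch below illustrates that data-minimization idea; the type and function names are hypothetical assumptions, since the decree defines no API:

```python
# Hypothetical illustration of an "age signal": the operating system or app
# store confirms only whether the user meets a service's minimum age, without
# revealing the birth date or any other personal data. All names here are
# illustrative, not defined by the decree.

from dataclasses import dataclass
from datetime import date

@dataclass
class AgeSignal:
    meets_minimum_age: bool   # the only datum shared with the supplier
    minimum_age_checked: int  # the threshold the signal answers for

def issue_age_signal(birth_date: date, minimum_age: int,
                     today: date) -> AgeSignal:
    """OS/app-store side: derive a minimal signal from data it already holds."""
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    # Data minimization: the birth date itself never leaves this function.
    return AgeSignal(meets_minimum_age=age >= minimum_age,
                     minimum_age_checked=minimum_age)

signal = issue_age_signal(date(2012, 6, 1), minimum_age=16,
                          today=date(2026, 3, 18))
print(signal.meets_minimum_age)  # prints False: a 13-year-old does not meet 16+
```

Note that the receiving supplier sees only the boolean, which mirrors the decree’s limit of sharing only the data strictly necessary to confirm the minimum age, and, as stated above, the supplier remains responsible for its own protection measures regardless of the signal.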

 

  4. Personal data protection as a central pillar: age measurement mechanisms must respect principles such as proportionality and data minimization, and excessively invasive practices are prohibited, including solutions that entail massive or disproportionate surveillance. This reinforces the necessary compatibility between the protection of children and adolescents and fundamental privacy rights.

  

  5. Prevention of manipulative practices and excessive use: suppliers of technology products or services must implement mechanisms to prevent excessive, problematic or compulsive use, and must not adopt practices such as offering rewards for time of use, sending excessive notifications, hiding natural stopping points, or automatically activating content. AI-based systems that interact with minors must also adopt specific safeguards. In addition, the National Data Protection Authority (ANPD) will regulate minimum default security requirements and act to curb manipulative, deceptive or coercive practices, defined as tactics that interfere with the user’s decision-making autonomy or that exploit cognitive vulnerabilities of children and adolescents.

 

  6. Age rating for electronic games and digital applications: the age rating policy now considers risks related to interactive features, loot boxes (prohibited for children and adolescents), encouragement of problematic or excessive use (especially through functionalities that induce compulsive engagement), microtransactions, manipulative practices and health impacts. The rating must be presented in a clear, standardized and easily identifiable manner, under the terms defined by the Minister of State for Justice and Public Security.

 

  7. Prohibitions on advertising: the decree reinforces and details the advertising restrictions provided for in the ECA Digital by confirming that all advertising that exploits children’s lack of discernment and experience is abusive, and by prohibiting suppliers from using techniques such as profiling, emotional analysis, and augmented, extended or virtual reality tools in advertising aimed at children and adolescents. It also reiterates that the ANPD will regulate the forms and minimum requirements to prevent and mitigate children’s exposure to the promotion or commercialization of prohibited products and services, such as gambling and betting, tobacco products, alcoholic beverages and narcotics.

 

  8. Artistic activity: the decree regulates the artistic activity of children and adolescents in digital environments by requiring suppliers to demand judicial authorization, under the terms of the ECA, for monetized or boosted content that habitually exploits the image or routine of a child or adolescent. In the absence of such authorization, the content must be removed immediately; the rule applies to new monetization or boosting after a 90-day transition period. The decree also expressly prohibits the publication, monetization or promotion of content that exposes minors to violating, vexatious or degrading situations.

 

  9. Prevention and combating of serious violations in the digital environment: the Federal Police was designated as the authority responsible for the centralized receipt, processing, screening and management of notifications of content with indications of crimes and infractions related to the sexual exploitation and abuse, kidnapping and enticement of children and adolescents. The decree also authorized the creation of the National Notification Screening Center, whose function is to receive, validate, store, screen and process content notifications sent by suppliers, with the objective of identifying suspects and forwarding the information to the competent judicial police. Suppliers may be held liable for non-compliance with the duties to remove and report such content when a repeated failure in their moderation mechanisms is found, characterized by negligence or by the insufficiency of the measures adopted.

 

  10. Transitional provisions: finally, the decree provides that the ANPD will define the stages of implementation of age measurement solutions, adopting a responsive approach that considers the functionalities and risk level of each product, service and content, as well as technological evolution. In parallel with the decrees, the ANPD published an official schedule for the implementation of the ECA Digital, confirming that enforcement will occur progressively rather than immediately and all at once.

 

Decree No. 12,881/2026

  1. Role of the ANPD and institutional framework: the National Data Protection Authority (ANPD) is confirmed as the main regulatory and supervisory authority for the ECA Digital, with competence to issue complementary rules, apply sanctions and evaluate the effectiveness of the mechanisms adopted. The ANPD’s actions must observe the principle of minimum intervention and adopt an approach proportional to the size and risk of each agent’s activities.

 

Decree No. 12,882/2026

  1. Administrative adjustments: the third decree promotes a restructuring within the Federal Public Administration, especially in the Ministry of Justice and Public Security, to enable it to carry out its new attributions.

 

What changes in practice?

The new rules represent a significant advance in the regulation of Brazil’s digital environment and will require companies across various sectors to review their data governance policies, content classification practices, age verification mechanisms and child protection tools.

It is important to note that non-compliance with the ECA Digital may lead to administrative sanctions, including a warning with a deadline for adopting corrective measures, a simple fine of up to 10% of the economic group’s revenue in Brazil, and the temporary suspension or even prohibition of activities related to the non-compliant products or services.

 

In case of any doubts about this subject, please do not hesitate to contact us.
