EU Commission: Meta Accused of Failing to Protect Children Under 13

EU Commission finds DSA violations as Instagram and Facebook fail to block children under 13. Meta faces potential fines of up to 6% of its global revenue.

The European Commission has released the preliminary results of a two-year investigation, accusing Meta of lacking effective tools to prevent children under 13 from accessing Instagram and Facebook. The violation reportedly concerns the Digital Services Act (DSA), the EU regulation governing digital services and the protection of vulnerable users. According to the EU executive, the controls currently implemented by the company do not work effectively, despite the terms of use clearly establishing 13 as the minimum age for platform access.

What the European Commission Found

The preliminary conclusions of the investigation are unequivocal: Instagram and Facebook are doing very little to prevent children under 13 from using their services. Henna Virkkunen, European Commissioner for Technological Sovereignty, Security and Democracy, stated that the failure to comply with EU legislation to protect minors from social network risks represents a violation of the DSA. The investigation, launched two years ago, examined the age verification procedures implemented by Meta and judged them insufficient.

The Commission has called on Meta for a greater commitment to reducing the presence of users under 13 on its social networks. The company can respond to the accusations and modify its practices in line with the requests made. In the event of non-compliance, the European institution could impose a penalty of up to 6% of Meta's global annual revenue; with revenue of approximately $200 billion, that would amount to a fine of roughly $12 billion.

Meta’s Response and the Age Verification Problem

Meta responded to the accusations by emphasizing that it continues to invest in technologies to identify and remove users who do not meet the minimum age requirement. The company announced that next week it will share further information on new tools currently being implemented. Meta also described age verification as an industry-wide challenge that requires a systemic solution, expressing its willingness to work constructively with the European Commission.

The technical problem of age verification remains central to the debate. Methods based solely on self-declaration—asking the user to enter their date of birth during registration—have proven easy to bypass. Minors can enter false dates without the system being able to verify their authenticity. However, more robust solutions, such as integration with identity documents or third-party verification systems, involve issues related to privacy and the management of sensitive data.

The Role of the European Commission App

In the statement regarding the accusations, the European Commission promoted its own app dedicated to protecting minors online. The tool, developed as part of digital security initiatives, is presented as an alternative or complement to ensure greater protection for younger users. The application is part of a broader framework of tools provided by the European institution to support parents and educators in managing access to digital services.

The mention of the app by the Commission is significant in the context of the regulatory action against Meta. While the European institution penalizes the shortcomings of large private operators, it also presents its own tools as a possible model or reference for best practices. The implicit message is that technically workable solutions exist and can be implemented.

Implications of the Digital Services Act

The Meta case represents one of the first significant applications of the Digital Services Act in the social media sector. The legislation, which came into force with the aim of creating a safer digital environment, imposes strict obligations on platforms designated as "very large" based on their number of users. These obligations include the protection of minors and transparency in access mechanisms.

The preliminary violation specifically concerns the provision of the DSA that requires providers of online platforms accessible to minors to adopt appropriate measures to ensure a high level of privacy, safety, and security for minors. The European Commission has determined that Meta's current measures do not meet this requirement, opening the way for potentially significant economic sanctions.

The proceedings will now follow their formal course. Meta will have the opportunity to present its arguments and, if necessary, introduce changes to its age verification systems. The Commission will evaluate the responses before issuing a final decision. In the meantime, the case marks a step forward in the concrete application of the European regulatory framework to Big Tech.

Frequently Asked Questions

What is the minimum age to use Instagram and Facebook?
According to Meta's terms of use, the minimum age to access Instagram and Facebook is 13. The European Commission has found that the controls to enforce this rule are not effective.
What is the maximum penalty Meta faces?
If it does not comply with the European Commission's requests, Meta risks a fine of up to 6% of its global annual turnover. With annual revenue of approximately $200 billion, the maximum penalty would be on the order of $12 billion.
What is the Digital Services Act?
The Digital Services Act (DSA) is the EU law that regulates digital services in the European Union. It imposes specific obligations on large online platforms, including the protection of minors and transparency in access mechanisms.
