The European Commission has formally accused Meta of breaching the European Union's Digital Services Act (DSA) by failing to adequately prevent children under 13 from accessing Facebook and Instagram. The action marks one of the most significant regulatory confrontations between EU authorities and Meta since the DSA came into full force for very large online platforms.
The Commission's Findings
According to the Commission, Meta's platforms "failed to diligently identify, assess and mitigate the risks of minors under 13 years old accessing their services." Under the DSA, very large online platforms — those with more than 45 million monthly active users in the EU — are required to conduct annual risk assessments covering systemic risks including harms to minors, and to implement reasonable mitigation measures proportionate to those risks.
The Commission's preliminary findings indicate Meta did not meet this obligation across Facebook and Instagram, despite the platforms having age restriction policies nominally prohibiting access by children under 13. The gap between stated policy and effective enforcement is at the heart of the regulatory action.
Digital Services Act Framework
The DSA, which came into force for the largest platforms in August 2023, represents the EU's most comprehensive attempt to regulate digital platforms' obligations to users and society. Key requirements relevant to this case include:
- Systemic risk assessment — Platforms must identify and evaluate risks to fundamental rights, public security, and vulnerable groups, including minors
- Mitigation measures — Platforms must implement reasonable, effective measures to address identified risks
- Independent auditing — Compliance with DSA obligations is subject to annual independent audits
- Transparency reporting — Platforms must publish transparency reports covering their content moderation and risk mitigation activities
Failure to comply can result in fines of up to 6% of a company's global annual turnover, and repeated infringements can lead to temporary service suspensions within the EU market.
Context: Child Safety and Big Tech
The action against Meta comes amid intensifying regulatory focus on how social media platforms handle underage users across multiple jurisdictions:
- The UK's Online Safety Act imposes similar child protection duties on platforms operating in Britain
- The US Federal Trade Commission proposed significant updates to the Children's Online Privacy Protection Act (COPPA) to strengthen protections for minors
- Australia passed landmark legislation in 2024 banning children under 16 from social media platforms
- Multiple EU member states have enacted or proposed supplementary national measures targeting minors' online access
Meta has invested in features such as age-gating, parental supervision tools, and "Teen Accounts" on Instagram, but critics and regulators have repeatedly questioned whether these measures are meaningfully effective at preventing underage access or merely create compliance optics.
Meta's Position
Meta has consistently argued that age verification at scale is technically challenging without introducing privacy risks for all users, and that the company has implemented industry-leading protections for younger users on its platforms. The company is expected to contest the Commission's preliminary findings during the formal proceedings.
What Comes Next
The Commission's formal accusation initiates a process that could ultimately result in binding compliance orders and substantial financial penalties. The next steps include:
- Preliminary findings notification — Meta has been formally notified of the Commission's preliminary assessment
- Right of defense — Meta will have the opportunity to respond to the findings and present its case before the Commission
- Final decision — Following the response period, the Commission will issue a final determination
- Remedies and fines — If violations are confirmed, the Commission can order specific compliance measures and levy fines proportionate to the severity and duration of the breach
The outcome of this case will set a significant precedent for how DSA child safety obligations are interpreted and enforced across the EU's digital regulatory regime — with implications not only for Meta but for every large platform operating in the European market.