COSMICBYTEZLABS
NEWS

European Commission Accuses Meta of Breaching Child Safety Rules

The European Commission has formally accused Meta of violating the Digital Services Act by failing to adequately protect children under 13 from accessing Facebook and Instagram.

Dylan H.

News Desk

April 30, 2026
4 min read

The European Commission has formally accused Meta of breaching the European Union's Digital Services Act (DSA) over its failure to adequately protect children under 13 years old from accessing Facebook and Instagram. The action marks one of the most significant regulatory confrontations between EU authorities and Meta since the DSA came into full force for very large online platforms.

The Commission's Findings

According to the Commission, Meta's platforms "failed to diligently identify, assess and mitigate the risks of minors under 13 years old accessing their services." Under the DSA, very large online platforms — those with more than 45 million monthly active users in the EU — are required to conduct annual risk assessments covering systemic risks including harms to minors, and to implement reasonable mitigation measures proportionate to those risks.

The Commission's preliminary findings indicate Meta did not meet this obligation across Facebook and Instagram, despite the platforms having age restriction policies nominally prohibiting access by children under 13. The gap between stated policy and effective enforcement is at the heart of the regulatory action.

Digital Services Act Framework

The DSA, which came into force for the largest platforms in August 2023, represents the EU's most comprehensive attempt to regulate digital platforms' obligations to users and society. Key requirements relevant to this case include:

  • Systemic risk assessment — Platforms must identify and evaluate risks to fundamental rights, public security, and vulnerable groups, including minors
  • Mitigation measures — Platforms must implement reasonable, effective measures to address identified risks
  • Independent auditing — Compliance with DSA obligations is subject to annual independent audits
  • Transparency reporting — Platforms must publish transparency reports covering their content moderation and risk mitigation activities

Failure to comply can result in fines of up to 6% of a company's global annual turnover, and repeated infringements can lead to temporary service suspensions within the EU market.
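To put the 6% ceiling in perspective, a minimal sketch of the fine-cap arithmetic (the turnover figure below is hypothetical and illustrative only, not Meta's actual revenue):

```python
def dsa_max_fine(global_annual_turnover: float) -> float:
    """Return the upper bound on a DSA fine: 6% of global annual turnover."""
    return 0.06 * global_annual_turnover

# Hypothetical global turnover of $150 billion (illustrative figure):
cap = dsa_max_fine(150e9)
print(f"Maximum DSA fine: ${cap / 1e9:.1f}B")  # Maximum DSA fine: $9.0B
```

Even a fraction of that ceiling would rank among the largest regulatory penalties ever levied against a platform, which is why the DSA's fining power is seen as a credible deterrent.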

Context: Child Safety and Big Tech

The action against Meta comes amid intensifying regulatory focus on how social media platforms handle underage users across multiple jurisdictions:

  • The UK's Online Safety Act imposes similar child protection duties on platforms operating in Britain
  • The US Federal Trade Commission proposed significant updates to the Children's Online Privacy Protection Act (COPPA) to strengthen protections for minors
  • Australia passed landmark legislation in 2024 banning children under 16 from social media platforms
  • Multiple EU member states have enacted or proposed national supplementary measures targeting minors' online access

Meta has invested in features such as age-gating, parental supervision tools, and "Teen Accounts" on Instagram, but critics and regulators have repeatedly questioned whether these measures are meaningfully effective at preventing underage access or merely create compliance optics.

Meta's Position

Meta has consistently argued that age verification at scale is technically challenging without introducing privacy risks for all users, and that the company has implemented industry-leading protections for younger users on its platforms. The company is expected to contest the Commission's preliminary findings during the formal proceedings.

What Comes Next

The Commission's formal accusation initiates a process that could ultimately result in binding compliance orders and substantial financial penalties. The next steps include:

  1. Preliminary findings notification — Meta has been formally notified of the Commission's preliminary assessment
  2. Right of defense — Meta will have the opportunity to respond to the findings and present its case before the Commission
  3. Final decision — Following the response period, the Commission will issue a final determination
  4. Remedies and fines — If violations are confirmed, the Commission can order specific compliance measures and levy fines proportionate to the severity and duration of the breach

The outcome of this case will set a significant precedent for how DSA child safety obligations are interpreted and enforced across the EU's digital regulatory regime — with implications not only for Meta but for every large platform operating in the European market.

#Regulation #Privacy #EU #Meta #ChildSafety
