Brussels Finds Meta in Breach of Digital Services Act Over Underage Users
The European Commission has issued a preliminary finding that Meta, the parent company of Facebook and Instagram, violated the Digital Services Act (DSA) by failing to adequately prevent children under 13 from accessing its platforms. The finding, announced on April 29, 2026, could ultimately result in a fine of up to 6% of Meta's global annual turnover, potentially exceeding $12 billion based on the company's $201 billion revenue in 2025.
The investigation, launched in 2024, concluded that Meta's age-verification measures are insufficient. Children under 13 can easily bypass the minimum-age requirement by entering a false birth date when creating an account, with no robust system in place to verify the accuracy of self-declared information. The Commission also found that Meta's tool for reporting underage accounts is cumbersome, requiring up to seven clicks to access, and that the company fails to act promptly on such reports.
What Are the Specific Violations Under the DSA?
The DSA, which came into force in 2022, imposes strict obligations on Very Large Online Platforms (VLOPs) like Facebook and Instagram to assess and mitigate systemic risks, including those posed to minors. The European Commission's preliminary findings highlight three key areas of non-compliance by Meta:
- Inadequate age verification: Meta does not effectively prevent children under 13 from creating accounts. Users can simply lie about their age during registration, and no reliable controls exist to detect false birth dates.
- Failure to remove underage accounts: Even when underage users are identified, Meta lacks an efficient process to remove them. The reporting mechanism is difficult to navigate, and the company does not prioritize these reports.
- Insufficient risk assessment: Meta's risk assessment under the DSA did not adequately address the specific vulnerabilities of minors, including exposure to harmful content and the addictive design of its platforms.
The Commission cited research indicating that between 10% and 12% of European children under 13 have an account on at least one Meta platform. This represents millions of minors who may be exposed to age-inappropriate content, cyberbullying, and predatory behavior.
Meta Responds: An 'Industry-Wide Challenge'
Meta has pushed back against the Commission's findings, describing the detection of underage users as an "industry-wide challenge." In a statement, the company said it continues to invest heavily in technology to identify and remove underage users, including artificial intelligence and age-estimation tools. The European Commission, however, dismissed these efforts as insufficient, noting that Meta has not provided a timeline for implementing more robust measures.
The preliminary finding is not a final decision. Meta now has the opportunity to respond and propose corrective actions. If the Commission deems the response inadequate, it will issue a binding order requiring Meta to overhaul its child-safety protocols — and impose a fine that could reach up to 6% of Meta's global annual revenue.
Broader EU Crackdown on Social Media and Child Safety
This action against Meta is part of a wider European push to protect children online. Earlier in 2026, the Commission also found TikTok in breach of the DSA for its addictive design, and in April 2026, it unveiled a new age-verification app that would require users to present a valid ID to prove their age. The EU's Digital Services Act enforcement is rapidly becoming a cornerstone of digital regulation worldwide.
Member states are also taking national action. Several EU countries, including France and Germany, have debated or introduced legislation to restrict or ban social media access for children under 16. The European Commission's actions signal that Brussels is prepared to use its regulatory powers aggressively to force tech giants to comply.
What Happens Next? Timeline and Potential Outcomes
The coming weeks will be critical. Meta must submit a formal response to the Commission's preliminary findings. The Commission will then decide whether to issue a final infringement decision. If Meta is found to have violated the DSA, the company could face:
- A fine of up to 6% of global annual turnover — approximately $12 billion based on 2025 revenue.
- A binding order to implement effective age-verification measures, such as mandatory ID checks or biometric age estimation.
- Enhanced oversight, including regular audits by the Commission or third-party experts.
The case also sets a precedent for other social media platforms. YouTube, Snapchat, and X (formerly Twitter) could face similar scrutiny if they fail to protect minors. The TikTok case already established that the Commission is willing to take enforcement action against addictive platform designs.
FAQ: Meta's DSA Breach and Child Safety
What is the Digital Services Act (DSA)?
The DSA is an EU regulation that sets strict rules for digital platforms to ensure user safety, transparency, and accountability. It requires Very Large Online Platforms to assess and mitigate systemic risks, including risks to minors.
Why is Meta being penalized?
The European Commission's preliminary finding concluded that Meta fails to prevent children under 13 from accessing Facebook and Instagram: the company lacks effective age verification, does not adequately remove reported underage accounts, and did not properly address risks to minors in its DSA risk assessment.
How much could the fine be?
Meta faces a fine of up to 6% of its global annual turnover. Based on 2025 revenue of $201 billion, the maximum fine would be approximately $12 billion.
What will Meta have to change?
Meta must implement robust age-verification measures, improve its reporting and removal process for underage accounts, and conduct a proper risk assessment for minors. Failure to do so will result in fines and binding orders.
How does this affect other social media platforms?
The DSA applies to all VLOPs. Other platforms like TikTok, YouTube, and Snapchat are also under scrutiny and may face similar enforcement actions if they fail to protect minors.