
The European Union caught tech giant Meta allowing millions of children under 13 to access Facebook and Instagram despite the company's claims to enforce age restrictions. The finding exposes Meta to billions in fines and raises serious questions about whether Silicon Valley platforms can be trusted to protect our kids.
Story Snapshot
- European Commission finds Meta violated Digital Services Act by failing to prevent underage users from creating accounts with false birth dates
- Evidence shows 10-12% of EU users on Facebook and Instagram are children under 13, accessing platforms designed for teens and adults
- Meta faces potential fines up to 6% of global annual turnover, amounting to billions of dollars
- Commission officials blast Meta’s “incomplete and arbitrary” safety measures as mere words without concrete action to protect vulnerable children
Meta’s Age Verification System Falls Apart
The European Commission released preliminary findings on April 29, 2026, determining that Meta systematically breached the Digital Services Act through inadequate age verification systems on Facebook and Instagram. Children under 13 can easily create accounts by entering false birth dates during registration, with no meaningful verification checks to stop them. The Commission’s investigation revealed that between 10 and 12 percent of current EU users on both platforms are underage children, directly contradicting Meta’s stated minimum age requirement of 13 years old.
> EU Finds Meta Failing To Keep Under-13s Off Facebook, Instagram
>
> The EU said on Wednesday Meta is failing to prevent children under 13 using Facebook and Instagram, potentially exposing them to inappropriate content — and putting the tech giant at risk of a massive fine.…
>
> — Channels Television (@channelstv) April 29, 2026
Executive Vice-President Henna Virkkunen delivered a sharp rebuke to the California-based corporation, stating that terms and conditions should not be mere written statements but rather the basis for concrete action. Her criticism highlighted a frustrating pattern familiar to Americans watching big tech companies prioritize profits over principles. Meta’s platforms rely entirely on user-reported birth dates without secondary verification, making the age restriction essentially voluntary and unenforceable for determined children.
Parents Left Without Effective Tools
The Commission identified severe deficiencies in Meta’s parental reporting mechanisms, finding them difficult to use and largely ineffective. Parents attempting to report underage accounts encounter bureaucratic obstacles and unclear processes that rarely result in removal. This failure leaves families without practical recourse when they discover their young children have accessed platforms filled with content designed for older audiences. The findings underscore a broader concern among Americans across the political spectrum: unaccountable corporations operating platforms that affect millions of families while resisting meaningful oversight.
Meta’s risk assessments came under particular scrutiny for ignoring scientific research on the vulnerabilities of younger children to online harms. The Commission characterized these assessments as incomplete and arbitrary, suggesting the company prioritized user growth metrics over child safety considerations. This pattern reflects the kind of corporate behavior that frustrates citizens who feel powerless against wealthy elites more concerned with stock prices than protecting society’s most vulnerable members.
Billion-Dollar Consequences and Precedent
Meta now faces potential fines of up to 6 percent of its global annual turnover under the Digital Services Act, which became fully applicable in 2024. Based on the company’s revenue, such penalties could amount to billions of dollars, representing one of the most significant enforcement actions against a U.S. tech giant by European regulators. The preliminary findings give Meta an opportunity to respond, review the evidence, and propose remedies before a final non-compliance decision is issued. However, the strength of the evidence and the consistency of the findings suggest the company faces an uphill battle defending its practices.
The investigation, which formally began in May 2024, represents the first Digital Services Act breach finding specifically targeting age enforcement failures. This case establishes a precedent that could affect other platforms like TikTok and accelerate development of age verification technologies, including an EU-designed verification app currently in development. The enforcement action reflects broader tensions between American tech companies and international regulators, with implications for how social media platforms operate globally. While the EU pursues what it calls tech sovereignty, the fundamental issue transcends borders: can parents trust these companies to keep their children safe, or do profits always come first?
Sources:
Facebook, Instagram charged with breaching rules, must do more to protect kids below 13, EU says
Commission finds that Meta’s Facebook and Instagram breach the Digital Services Act
EU finds Meta failing to keep under-13s off Facebook, Instagram
