Meta Faces EU Scrutiny Over Child Safety as Regulators Cite Violations of Digital Services Act
Meta Platforms is facing intensified scrutiny from the European Commission after regulators preliminarily concluded that the company has not done enough to enforce minimum age requirements on Facebook and Instagram. The findings stem from an ongoing investigation under the Digital Services Act, which governs online platform accountability across the European Union.
According to the Commission, Meta’s current systems allow minors to easily bypass age restrictions by entering false birth dates during account registration, with insufficient verification mechanisms in place. Regulators also flagged the platform’s reporting tools as overly complex, noting that users must navigate multiple steps to report underage accounts, and even then, enforcement actions are often inconsistent or ineffective.
The Commission emphasized that Meta needs to overhaul its risk assessment framework to better identify and mitigate harms faced by young users across its platforms in the EU. The case highlights growing concern among regulators about the impact of social media on minors, particularly in areas such as safety, exposure to harmful content, and mental well-being.
Meta has pushed back against the findings, stating that its platforms are designed for users aged 13 and above and that it already deploys technologies to detect and remove underage accounts. The company also acknowledged that age verification remains a broader industry challenge and said it plans to introduce additional measures in the near future.
The investigation is still ongoing, and Meta now has the opportunity to respond to the Commission’s conclusions. However, if the violations are confirmed, the company could face fines of up to 6% of its global annual revenue, marking a potentially significant regulatory and financial setback.
The EU action comes amid wider global pressure on Meta, including recent U.S. court rulings questioning the company’s handling of child safety and the potential impact of its platform design on younger users. Together, these developments underscore the increasing regulatory focus on Big Tech’s responsibility to protect vulnerable audiences online.