Meta’s Facebook & Instagram Charged by EU Over Kids’ Safety

The European Commission has charged Meta Platforms with breaching key provisions of the Digital Services Act, saying the company has failed to adequately prevent children under the age of 13 from accessing Facebook and Instagram.
The preliminary findings follow a two-year investigation into Meta’s compliance with the EU’s flagship digital regulation, which requires large online platforms to take stronger measures against harmful content and risks to vulnerable users, including minors.
Weak Enforcement of Age Restrictions
According to regulators, Meta’s existing safeguards are insufficient to enforce its own minimum age requirement. While both platforms prohibit users under 13, the Commission found that the company lacks effective systems to verify users’ ages and remove underage accounts.
Officials said the process for reporting suspected underage accounts is cumbersome and often ineffective: users must navigate multiple steps to submit a report, and follow-up action is inconsistent. As a result, accounts belonging to underage users can remain active without meaningful review.
The Commission estimates that between 10 and 12 percent of children under 13 in the European Union are currently using Facebook and Instagram, suggesting widespread circumvention of the platforms’ rules.
“Instagram and Facebook are doing very little to prevent children below this age from accessing their services,” said Henna Virkkunen, the Commission’s executive vice president for tech sovereignty, security, and democracy. She added that platform policies must translate into concrete enforcement, not just written terms.
Regulatory Expectations and Potential Penalties
The Commission has instructed Meta to revise its risk assessment framework and strengthen its systems for detecting, preventing, and removing underage users. These changes are expected to address both account creation and post-registration monitoring.
The charges do not constitute a final ruling. Meta has the opportunity to respond and propose corrective measures before the Commission reaches a formal decision. The process could take more than a year.
If found in violation, Meta could face fines of up to 6 percent of its global annual turnover, one of the most significant enforcement tools available under the Digital Services Act.
Meta’s Response
Meta said it disagrees with the Commission’s findings, describing age verification as an industry-wide challenge. The company stated that Facebook and Instagram are designed for users aged 13 and older and that it has systems in place to detect and remove underage accounts.
“We continue to invest in technologies to find and remove underage users,” the company said, adding that additional measures are expected to be announced in the coming weeks.
Broader Regulatory Pressure
The case reflects a broader push by European regulators to tighten oversight of social media platforms, particularly in relation to child safety. Other companies, including Snap Inc. and TikTok, are also facing scrutiny under the Digital Services Act and related national initiatives.
Several EU member states, including Spain, France, and Denmark, are considering additional restrictions on minors’ access to social media, including stricter age verification requirements.
Separately, the Commission is investigating Meta over other concerns, including the potential addictive nature of its platform design and the role of its recommendation systems in shaping user behavior.