The EU said on Wednesday Meta is failing to prevent children under 13 from using Facebook and Instagram, potentially exposing them to inappropriate content — and putting the tech giant at risk of a massive fine.
The European Union has in recent months stepped up efforts to protect children online, with several member countries considering social media bans for under-16s.
The EU executive is also exploring a possible bloc-wide age limit on social media after coming under intense pressure to take broader action following Australia’s groundbreaking ban on such platforms for under-16s.
In its latest move to enhance protections for children online, the EU said a probe showed Meta broke digital content rules, and told the US firm to “strengthen” its measures to prevent, detect and remove under-13s on Facebook and Instagram.
Under Meta’s own terms and conditions, the minimum age to access its social media platforms is 13.
In its preliminary view, the EU found Meta’s measures to enforce its own age restrictions on Facebook and Instagram to be ineffective.
“Terms and conditions should not be mere written statements, but rather the basis for concrete action to protect users — including children,” said EU tech tsar Henna Virkkunen.
If the regulator’s preliminary findings are confirmed, the EU can impose a fine of up to six percent of Meta’s total worldwide annual turnover.
Meta disagreed with the EU’s findings.
“We’re clear that Instagram and Facebook are intended for people aged 13 and older and we have measures in place to detect and remove accounts from anyone under that age,” a Meta spokesperson said, adding the company would continue to engage with the EU.