Meta under EU investigation for child safety risks

  • The European Commission says Meta’s algorithms may contribute to behavioural addictions in children and create “rabbit-hole effects,” where users are led to increasingly harmful content.
  • Meta has said it is ready to cooperate with the European Commission and share information about its safety measures.
  • This probe adds to Meta’s EU challenges, with ongoing scrutiny over election disinformation ahead of the upcoming European Parliament elections.

Meta Platforms, the parent company of Facebook and Instagram, is being investigated for potential breaches of EU online content rules relating to child safety, EU regulators said on Thursday, a move that could lead to hefty fines.

Child safety concerns

The European Commission has launched an in-depth investigation into Facebook and Instagram, citing concerns that Meta has not adequately mitigated risks to children. According to the Commission, the algorithms used by these platforms may contribute to behavioural addictions in children and create “rabbit-hole effects,” where users are led to increasingly harmful content.


The investigation will also scrutinise Meta’s age-assurance and verification methods, focusing on how effectively the company prevents children from accessing inappropriate content. These concerns were prompted by a risk assessment report submitted by Meta in September.

Meta’s response

Meta has emphasised its commitment to child safety, highlighting its extensive efforts to develop tools and policies designed to protect young users. “We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools and policies designed to protect them,” a Meta spokesperson stated. The company expressed its willingness to cooperate with the European Commission and share details of its safety measures.

Potential consequences under DSA

The Digital Services Act (DSA), which came into effect last year, holds tech companies accountable for tackling harmful and illegal content. Non-compliance can result in fines of up to 6% of a company’s annual global turnover. This investigation adds to Meta’s existing challenges in the EU, where it is also under scrutiny for issues related to election disinformation, particularly with European Parliament elections approaching next month.


Jinny Xu

Jinny Xu is an intern reporter at Blue Tech Wave specialising in Fintech and AI. She graduated from Chongqing Institute of Foreign Studies.
