• Ofcom is introducing a comprehensive online child safety code, aiming to regulate digital platforms and enforce measures to protect young users from harm.
  • The proposed code includes stringent age verification processes, improved content filters, and 40 additional checks designed to keep children safe online.
  • Ofcom’s initiative responds to growing concerns over children’s exposure to harmful content and online risks.

Ofcom, the UK’s communications regulator, is set to introduce a robust set of measures to improve online safety for children.

Regulatory intervention

The new online child safety code represents a proactive approach to the growing concerns surrounding children’s exposure to harmful content and online risks. Under the proposed regulations, online platforms will be required to implement a series of stringent measures to ensure the safety and well-being of young users.


Key safety measures

Central to the new code are measures to bolster age verification, restricting access to age-inappropriate content and services. Online platforms will be mandated to implement robust processes to confirm users’ ages, ensuring that minors are not exposed to harmful material.

Furthermore, the code emphasises effective content filtering to prevent children from accessing potentially harmful or inappropriate material. Platforms will be required to deploy advanced filtering technologies capable of identifying and blocking harmful content, including but not limited to violence, pornography, and hate speech.

In addition to age verification and content filtering, the code sets out 40 further checks covering areas such as privacy protection, cybersecurity, and parental controls, creating a comprehensive framework for safeguarding young users online.


Addressing concerns

Ofcom’s initiative comes in response to growing concerns over the impact of digital technology on children’s mental health and well-being, as well as the increasing prevalence of online risks such as cyberbullying, grooming, and exposure to inappropriate content. By introducing stringent regulations and accountability measures, Ofcom seeks to mitigate these risks and create a safer online environment for children to explore and learn.