- Platforms must remove illegal content promptly
- Non-compliance may result in substantial fines
What happened: Online platforms now required to combat illegal content
As of 17 March 2025, online platforms operating in the UK must take measures to protect users from illegal content, as the illegal content duties of the Online Safety Act came into force. The legislation obliges tech companies, including social media platforms and file-sharing services, to carry out risk assessments and put systems in place to swiftly detect and remove unlawful material, such as child sexual abuse material (CSAM). The Office of Communications (Ofcom), the UK’s communications regulator, has launched an enforcement programme to monitor compliance across the industry. Companies that fail to meet these requirements face significant penalties, including fines of up to £18 million or 10% of their global turnover, whichever is greater.
Why it’s important
The enforcement of the Online Safety Act marks a major step in holding online platforms accountable for the content shared on their services. By mandating proactive measures against illegal material, the Act aims to create a safer digital environment for users, particularly vulnerable groups such as children. Ofcom’s power to impose substantial fines underscores the seriousness of these obligations and acts as a deterrent against non-compliance. This regulatory approach reflects a broader global trend towards strengthening online safety and protecting individuals from digital harms.