- Ofcom says tech companies must do more to verify users’ ages under the UK’s Online Safety framework.
- Platforms that host adult or harmful content may face penalties if they fail to implement stronger safeguards.
What Happened
The UK communications regulator Ofcom has ordered technology companies to introduce stronger age-verification measures to protect children online. The warning comes as the regulator begins enforcing new requirements under the UK’s Online Safety Act.
Ofcom said platforms that host pornographic or otherwise harmful content must ensure children cannot access such material. Companies will need to deploy more robust age-assurance systems rather than relying on weak checks such as simple self-declaration.
The regulator said firms must take proactive steps to assess risks and implement tools capable of verifying whether users are adults. This could include technologies such as age estimation, identity checks, or other technical verification methods. Ofcom warned that companies that fail to comply could face enforcement action once the new rules take full effect.
The move forms part of the wider implementation of the Online Safety Act, a major UK law that aims to hold internet platforms responsible for harmful content hosted on their services. Ofcom now acts as the regulator responsible for overseeing compliance and ensuring companies meet the law’s safety obligations.
In its guidance, the regulator emphasized that protecting children from harmful online material remains a priority as digital platforms continue to grow in scale and influence.
Why It’s Important
The policy marks one of the most concrete attempts by a national regulator to force large internet platforms to adopt stricter age-verification systems. Governments across Europe and elsewhere have increasingly questioned whether voluntary safety measures from tech firms provide enough protection for minors.
The UK’s Online Safety Act gives Ofcom significant powers, including the ability to impose fines of up to £18 million or 10% of a company’s global annual revenue, whichever is greater, for serious breaches. Penalties on that scale could have a material impact on even the largest global platforms if they fail to meet their safety obligations.
Yet the approach raises several practical questions. Age-verification systems can be technically complex and typically require users to hand over personal information, such as identity documents or facial images. Critics have warned that stronger checks could create privacy risks or increase data collection by platforms.
There is also uncertainty over how effective these measures will prove in practice, since determined users often find ways to bypass online restrictions. Regulators therefore face the challenge of balancing child protection against privacy, freedom of expression, and technical feasibility.
For technology companies operating in the UK, the message from Ofcom is clear: the era of minimal age checks is ending, and regulatory scrutiny of online safety is intensifying.
