LONDON (Reuters) - Britain’s tech regulator said on Thursday that online platforms hosting pornographic and other content considered harmful must implement “highly effective” age checks by July 25 to stop children from accessing the material.
Ofcom published its children’s safety codes, which are meant to protect UK users under the age of 18 from viewing harmful content on subjects such as suicide, self-harm and pornography, as its next step in enforcing Britain’s Online Safety law.
Consequences for services that fail to implement these measures include fines of up to 10% of their global revenue or 18 million pounds ($23.89 million), whichever is higher, it said.
The regulator added that for ongoing non-compliance, it could seek a court order to compel banks, internet service providers and other third parties to take actions that could disrupt the service, such as limiting payment options or restricting access to the service itself.
The Online Safety Act, which became law in 2023, sets tougher standards for platforms to tackle criminal activity, with an emphasis on child protection and illegal content.
($1 = 0.7535 pounds)
(Reporting by Chandini Monnappa in Bengaluru and Paul Sandle in London; Editing by Varun H K and Saad Sayeed)