Websites that host pornography and other types of ‘harmful content’ will have to introduce checks that validate the age of their users by July, Ofcom has said, as it clamps down on online risks posed to children and teenagers.
The media regulator, whose remit covers the internet, has said websites that host such content will need to use “highly effective age assurance” to make sure no children are able to view it.
These methods include age verification, age estimation or a combination of both and must be “highly effective” at correctly determining a user’s age to comply with regulation.
The new rules extend beyond websites dedicated to hosting pornography or other ‘harmful content’ to include social media platforms and other sites.
Ofcom argues that its approach will protect users’ privacy while still allowing adults to access legal pornography, describing it as “flexible, tech-neutral, and future-proof”.
Ofcom boss Melanie Dawes said: “For too long, many online services which allow porn and other harmful material have ignored the fact that children are accessing their services.”
“Today, this starts to change. We’ll be monitoring the response from industry closely. Those companies that fail to meet these new requirements can expect to face enforcement action from Ofcom.”
The protections come as analysis has shown that children are being exposed to online pornography early in life – an average age of 13, with more than a quarter viewing it by age 11.
Dedicated pornography websites, including those using AI to generate the content, will be required to take steps to introduce age checks immediately and must have a ‘highly effective’ process in place by July at the latest.
Other websites where the content could be found will have until April to carry out an assessment to establish whether their services are likely to be accessed by children. They will then have until July to conduct a children’s risk assessment, followed by a requirement to implement measures to protect children using their services.
Open banking, photo ID matching, and email-based age estimation have been named among a list of ‘highly effective’ age verification methods by the regulator, while self-declaration of age and online payments will not be acceptable.
Lina Ghazal, Head of Regulatory and Public Affairs at email-based age verification firm Verifymy, said Ofcom’s announcement today “is a pivotal moment in the fight to make the internet a safer place, particularly for children.”
“The regulator’s long-awaited guidance on age assurance means adult content providers now have the clarity they need to get their houses in order and put in place robust and reliable methods to keep explicit material well away from underage users.”
The new rules come as part of the Online Safety Act starts to be implemented. It aims to “protect children and adults” by making social media companies and search services more responsible for their users’ safety.
Apart from age verification, the act also places new duties on firms to act against illegal content on their platforms. It will also require firms to remove suicide and self-harm content, and makes encouraging or assisting self-harm a criminal offence.
The act was passed in 2023, but similar legislation had been under discussion for many years before that, with Theresa May’s government proposing an Online Harms Bill in 2019.