Ofcom has finalised child safety measures for sites and apps, to be introduced from July this year.
The UK's regulator for communication services has set out more than 40 practical measures that tech firms must implement to meet their duties under the Online Safety Act. These will apply to sites and apps used by children in the UK in areas such as social media, search and gaming.
The steps laid out by the regulator include preventing minors from encountering the most harmful content relating to suicide, self-harm, eating disorders and pornography.
Ofcom chief executive Dame Melanie Dawes outlined the far-reaching impact these safety measures will have. “These changes are a reset for children online,” she said. “They will mean safer social media feeds with less harmful and dangerous content, protections from being contacted by strangers and effective age checks on adult content.
“Ofcom has been tasked with bringing about a safer generation of children online, and if companies fail to act they will face enforcement.”
These measures were created following consultation and research involving tens of thousands of children, parents, companies and experts.
The research found that three in five teenage children (59%) reported encountering potentially harmful content online over a four-week period, and 30% of 13- to 17-year-olds encountered online harm from scrolling through their feed or via a ‘For You’ page in the last month.
The codes of practice finalised today demand a ‘safety-first’ approach to how tech firms design and operate their services in the UK. These measures include safer feeds, effective age checks, easier reporting and strong governance.
Providers of services likely to be accessed by children in the UK have until 24 July to finalise and record their assessment of the risk their service poses to children, which Ofcom may request.
From 25 July, they must then apply the safety measures set out in Ofcom’s codes to mitigate those risks.
If companies fail to comply with their new duties, Ofcom has the power to impose fines and – in the most serious cases – apply for a court order to prevent the site or app from being available in the UK.
Julie Dawson, chief regulatory and policy officer at Yoti, said, “A key part of the Online Safety Act involves protecting children from legal but harmful content.
“We welcome the Children’s Risk Assessment Guidance and the first version of the Protection of Children Codes of Practice under the UK’s Online Safety Act published today, and praise the work of the Ofcom team and the sector over recent years in preparing for this.”
Dawson hopes that Ofcom is prepared for the scale of enforcement required, and for maintaining a level playing field, given that approximately 150,000 companies are in scope of the legislation. “It would be a shame for good actors to be penalised financially for compliance if it takes months for non-compliant companies to come into compliance,” she added.
Yoti works with a third of the largest global platforms, undertaking one million age checks daily, including for social media, dating, adult, vaping and gaming sites.
“Online age checking is no longer optional; it’s a necessary step to create safer, age-appropriate experiences online,” Dawson concluded.