CMOtech UK - Technology news for CMOs & marketing decision-makers

Ofcom sets strict new online safety rules to protect children


Ofcom has unveiled a comprehensive set of new rules under the Online Safety Act, aiming to enhance online protections for children across the United Kingdom. The regulations, which will apply to technology firms running platforms frequented by young users—including social media, search engines, and gaming sites—are designed to address growing concerns over children's exposure to harmful online content.

The measures outlined by the communications regulator require tech companies to take a "safety-first" approach in both the design and operation of their services. Among the 40 practical actions are requirements to prevent exposure to content related to self-harm, eating disorders, pornography, and suicide. The rules also target misogynistic, violent or abusive material, online bullying, and dangerous viral challenges. Ofcom has instructed providers to produce safer content feeds, implement robust age-verification checks, and act swiftly to remove harmful material. Additionally, children must be given more autonomy and support, with clearer processes for reporting and resolving issues, and platforms are expected to demonstrate strong governance in applying these policies.

Terry Green, Social Media partner at law firm Katten Muchin Rosenman UK LLP, commented on the significance of these changes: "Tech firms that run sites and apps used by UK children…will now have to act to prevent children from seeing any harmful content from July." He noted that the window for compliance is tight, as "providers now have until 24th July to finalise and record their assessments of risks and implement safety measures to mitigate these. Sites should start this process soon as Ofcom could knock on their door immediately that date arrives asking for the assessment." Green also warned of substantial consequences for non-compliance, emphasising that failure to abide by the rules could lead to hefty fines and, in severe cases, the removal of access to UK users.

The legal implications of these regulations were underlined by Monika Sobiecki, Media Partner at Bindmans, who highlighted the requirement for platforms to create detailed written assessments of the risks their services pose to children. Sobiecki observed that, while the Online Safety Act does not grant a specific right for civil claims regarding the codes, the documentation produced may serve as critical evidence of failures to comply with duties of care if litigation becomes necessary in the future. "The codes incidentally do create a source of fresh evidence of any failures by tech companies to comply with their duties of care, in the event that future litigation is necessary to vindicate any claims for harm caused to children," Sobiecki stated.

Iona Silverman, IP and Media Partner at Freeths, offered a broader perspective on the challenge facing regulators and society: "The government needs to think bigger: this is a problem that requires a cultural shift, and also requires legislation to be one step ahead of, rather than behind, technology." She drew attention to findings from the Advertising Standards Authority's "100 Children Report", which highlighted that younger children can easily bypass minimum age requirements on social platforms and are routinely exposed to inappropriate content and advertisements. Silverman called on technology firms to take greater responsibility, arguing that traditional claims of inability to police content are no longer sustainable. Ofcom, she suggested, must act with urgency and consider imposing significant penalties where necessary to give the Online Safety Act sufficient enforcement power.

Ofcom has the authority to fine companies up to 10 percent of their worldwide revenue or £18 million, whichever is greater, for breaches. With technology continuing to evolve rapidly, and AI presenting new challenges and opportunities, there is a consensus among commentators that the regulator must adopt a forward-thinking and flexible approach. Silverman warned that if meaningful progress is not made, further restrictions, such as a blanket ban on social media for under-16s similar to recent developments in Australia, could be considered.

The renewed focus on online harms also arrives as public consciousness grows, partly fuelled by current cultural touchpoints such as Netflix's "Adolescence", which has shone a light on the serious impacts that exposure to extreme content can have on young people. With these new codes in place and public scrutiny intensifying, observers are calling on Ofcom and technology companies alike to demonstrate sustained commitment to keeping children safe in an increasingly digital world.
