
UK unveils Online Safety Act to protect digital users


The UK has introduced the Online Safety Act to ensure digital platforms are held accountable for protecting users from harmful content.

Daniel McAfee, Head of Legal Operations at Lawhive, elaborates on the significance of the legislation, stating, "The Online Safety Act represents the UK's most comprehensive attempt to regulate online content and establish legal responsibility for digital platforms. The law aims to protect users, particularly children, from harmful content by requiring platforms to take proactive measures to identify and mitigate risks. Unlike previous approaches, it establishes a 'duty of care' that makes platforms responsible for their systems and processes, not just individual content decisions."

The new regulatory framework requires platforms to conduct risk assessments, remove illegal content quickly, and implement age verification systems to shield children from inappropriate material. "Platforms must conduct risk assessments, remove illegal content promptly, implement age verification systems to protect children, maintain transparent content moderation policies, establish accessible reporting mechanisms, and keep detailed records of compliance activities. The Act creates differentiated obligations based on platform size, with larger 'Category 1' services facing additional requirements around legal but potentially harmful content," McAfee explains.

Companies could face substantial financial penalties for non-compliance: fines of up to £18 million or 10% of global annual turnover, whichever is greater. "Companies face fines of up to £18 million or 10% of global annual turnover (whichever is higher), potential criminal liability for senior managers who fail to comply with information requests, and possible service restriction in the UK for persistent violations. The enforcement approach is expected to be graduated, with initial focus on systemic failures rather than individual content decisions," McAfee notes.
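As a rough sketch of how that cap scales, the maximum exposure is simply the larger of the two figures; the turnover amount below is hypothetical and not drawn from the Act or any real company.

# Illustrative only: the Act caps fines at the greater of £18 million
# or 10% of global annual turnover. The turnover figure is hypothetical.
FIXED_CAP_GBP = 18_000_000
TURNOVER_SHARE = 0.10

def maximum_fine(global_annual_turnover_gbp: float) -> float:
    """Return the maximum possible fine for a given global annual turnover."""
    return max(FIXED_CAP_GBP, TURNOVER_SHARE * global_annual_turnover_gbp)

# A hypothetical business turning over £1 billion a year:
print(f"£{maximum_fine(1_000_000_000):,.0f}")  # £100,000,000

For a smaller business, the £18 million floor applies instead, since 10% of its turnover would fall below that figure.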

The Act also calls for robust content moderation systems for services that host user-generated content. "Services hosting user-generated content must implement effective content moderation, clear community guidelines, accessible reporting mechanisms, and transparent appeals processes. The Act requires platforms to consider freedom of expression, particularly for content of 'democratic importance' and 'journalistic content.' Businesses should document their decision-making processes to demonstrate they've balanced safety with other important values," McAfee advises.

Key measures within the Act include prioritising the removal of illegal content, enhancing age verification protocols, and refining processes for addressing anonymous abuse. "The Act establishes requirements for platforms to remove illegal content quickly, implement age verification to protect children, provide clearer reporting mechanisms, and address anonymous abuse more effectively. It creates a regulatory framework that shifts responsibility from individual users to the platforms that design and operate digital services, while also ensuring greater transparency around content moderation," McAfee adds.

Addressing how platforms should handle reports of harmful content, McAfee says, "Platforms must provide clearly accessible reporting mechanisms for users to flag harmful content. These reporting systems should be easy to find, simple to use, and responsive. If platforms fail to address legitimate concerns, users can file complaints with Ofcom, which has powers to investigate systemic issues."

In relation to user rights, McAfee says, "Users have the right to receive a clear explanation for content removal or account suspension, appeal the decision through the platform's appeals process, request human review of automated decisions, and submit complaints to Ofcom if platform processes fail. The Act requires that platforms implement fair and transparent appeals processes and consider impacts on freedom of expression."

Ongoing consultations will define the practical implementation of the Act's provisions, with McAfee noting, "The Act's practical implementation will evolve significantly through Ofcom's codes of practice, which are still being developed through consultation. Early enforcement actions will establish important precedents about how key provisions are interpreted. We may see refinements to categorisation criteria, definitional boundaries, and compliance standards as the regulatory system matures. The UK's approach may also influence (and be influenced by) parallel regulatory developments globally."

McAfee highlights areas of controversy relating to the Act, such as "concerns about potential impacts on freedom of expression, questions about the practicality and privacy implications of age verification systems, debates about the definition of 'legal but harmful' content, concerns about disproportionate burdens on smaller platforms, and discussions about how to balance protection from harm with access to information and diverse perspectives."

Comparing the UK initiative to other regulatory approaches, McAfee states, "While both the UK's Online Safety Act and the EU's Digital Services Act establish new responsibilities for platforms, they differ in several key areas. The UK law places greater emphasis on protecting children, includes more specific provisions around 'legal but harmful' content, grants more extensive powers to Ofcom, introduces personal liability for senior managers in certain circumstances, and includes specific provisions around democratic content. These differences create compliance challenges for companies operating across both jurisdictions."

Overall, McAfee stresses the importance of this legal development, noting, "The Online Safety Act marks a significant shift in how we regulate digital spaces in the UK. By establishing a duty of care for online platforms, the legislation moves responsibility for user safety upstream to the system level, rather than focusing solely on individual content moderation decisions. While the Act's effectiveness will depend largely on how Ofcom develops and enforces the codes of practice, it represents a clear signal that the era of limited platform liability is evolving. Organisations now face concrete legal obligations to assess and mitigate foreseeable risks, with meaningful consequences for non-compliance."
