LLMpedia: The first transparent, open encyclopedia generated by LLMs

Online Safety Act

Generated by DeepSeek V3.2
Note: This article was automatically generated by a large language model (LLM) from purely parametric knowledge (no retrieval). It may contain inaccuracies or hallucinations. This encyclopedia is part of a research project currently under review.
Article Genealogy
Parent: WhatsApp (hop 4)
Expansion funnel: 40 extracted → 0 after dedup → 0 after NER → 0 enqueued

Online Safety Act. The Online Safety Act 2023 is a United Kingdom law that establishes a duty of care for online platforms regarding user-generated content. It aims to protect individuals, particularly children, from a range of online harms, including illegal content and material that is legal but harmful to children. The act grants significant new powers to Ofcom, the United Kingdom's communications regulator, to oversee and enforce compliance among a wide array of digital services.

Overview

The framework was developed in response to growing societal and parliamentary concern over the prevalence of cyberbullying, disinformation, and terrorist propaganda on major platforms such as Facebook, Instagram, and Twitter. Its scope covers user-to-user services, such as social media sites and online forums, as well as search services such as Google. The legislation mandates that in-scope companies assess the risks associated with their services and implement proportionate systems and processes to mitigate them. This approach is often compared to established safety regimes in other sectors, drawing on concepts from fields such as public health and consumer protection.

Key provisions

Central to the legislation is the establishment of a duty of care, requiring companies to take proactive steps to protect users. Specific duties include conducting thorough risk assessments for illegal content, such as material related to terrorism or child sexual abuse and exploitation, and for content harmful to children, such as the promotion of self-harm or eating disorders. For the largest platforms, additional codes of practice enforced by Ofcom cover areas such as fraudulent advertising and the spread of disinformation. The act also contains provisions protecting content of democratic importance, including news publisher content and contributions to political debate, together with duties to protect journalistic content and to have regard to freedom of expression as protected under the Human Rights Act 1998.

Legislative history

The legislative journey began with the Online Harms White Paper, published by the government in April 2019, which outlined initial proposals for online safety regulation. A draft bill was subsequently scrutinized by a joint committee of Parliament, with members from both the House of Commons and the House of Lords, which published a report with numerous recommendations in December 2021. The bill was formally introduced and underwent significant debate and amendment during its passage, receiving royal assent in October 2023. Key figures in its development included successive Secretaries of State for Digital, Culture, Media and Sport and, later, ministers from the Department for Science, Innovation and Technology.

Implementation and enforcement

Ofcom is designated as the independent regulator responsible for implementation. Its phased approach involves extensive consultation periods to develop detailed codes of practice and guidance for regulated services. Enforcement powers granted to Ofcom are substantial, including the ability to issue fines of up to £18 million or ten percent of a company's qualifying worldwide revenue, whichever is greater, and, in extreme cases, to pursue criminal sanctions against senior managers. The regulator can also apply to the courts for service restriction orders, which could require Apple and Google to remove non-compliant apps from their respective stores, the App Store and Google Play.
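As a rough illustration of how this penalty ceiling works, the following minimal Python sketch computes the maximum fine under the commonly reported formula of the greater of £18 million or ten percent of qualifying worldwide revenue; the exact statutory definition of qualifying revenue is a detail of the act and secondary legislation, so the figures here are illustrative assumptions rather than a statement of the law:

    # Sketch of the Online Safety Act penalty ceiling: the greater of a
    # fixed floor (£18 million, as commonly reported) and 10% of
    # qualifying worldwide revenue. Figures are illustrative assumptions.
    FIXED_FLOOR_GBP = 18_000_000
    REVENUE_SHARE = 0.10

    def max_penalty(qualifying_worldwide_revenue_gbp: float) -> float:
        """Return the maximum fine Ofcom could impose under this formula."""
        return max(FIXED_FLOOR_GBP, REVENUE_SHARE * qualifying_worldwide_revenue_gbp)

    # Example: a platform with £5 billion in qualifying worldwide revenue
    # faces a ceiling of £500 million, since 10% of revenue exceeds the floor.
    print(f"£{max_penalty(5_000_000_000):,.0f}")  # £500,000,000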

Criticism and controversy

The act has faced significant criticism from various quarters. Free speech advocates, including organizations such as Article 19 and Index on Censorship, argued that its proposed "legal but harmful" duties for adults could lead to excessive censorship and infringe upon rights protected under the European Convention on Human Rights; those adult duties were removed during the bill's passage and replaced with user empowerment requirements. Technology companies have expressed concerns about the technical feasibility and potential overreach of the duties, warning of impacts on innovation and encryption. Legal experts have also debated potential conflicts with other frameworks, such as the General Data Protection Regulation and the Digital Markets, Competition and Consumers Bill.

International comparisons

The legislation is part of a global trend towards more stringent platform regulation. It is often contrasted with the European Union's Digital Services Act, which focuses on systemic risk and transparency, and Germany's Network Enforcement Act (NetzDG), which mandates rapid removal of illegal content. Other comparable regimes include Australia's Online Safety Act 2021, overseen by the eSafety Commissioner, Ireland's Online Safety and Media Regulation Act 2022, and proposed legislation in Canada. The United Kingdom's approach is seen as particularly expansive in its scope and in the discretionary power granted to its national regulator.

Category:Internet law
Category:British laws