UK's Online Safety Act Slammed for Overzealous Compliance and Content Restrictions

The act's stringent regulations have led to overly broad restrictions, impacting user anonymity and potentially contradicting user preferences. Stakeholders urge clearer rules and enforcement.

The UK's Online Safety Act, designed to protect children from harmful online content, is facing criticism abroad and pushback at home. Its stringent requirements have prompted overzealous compliance, unnecessary content restrictions, and even the shutdown of some online forums.

The act, which mandates age verification and access restrictions for certain categories of material, has pressured online services into imposing overly broad restrictions to avoid fines. As a result, even legitimate support communities now require government ID verification, undermining user anonymity.

Platforms also struggle with unclear definitions of harmful content, leading to overbroad restrictions on legal, innocuous material. The lack of clear guidance creates an incentive for platforms to over-restrict content to avoid regulatory penalties, often against users' own preferences.

The act has also introduced new criminal offenses covering cyberflashing, intimate image abuse, and epilepsy trolling. However, its current implementation has drawn criticism over perceived censorship and inconsistent compliance by foreign companies.

To address these issues, the UK Parliament and Ofcom are urged to establish clear, standardized rules and enforce mandatory data privacy certifications. Stakeholders call on them to prioritize proportionality and transparency in enforcement, engage civil rights groups and other stakeholders, and provide clearer definitions of harmful content, remediation periods, and judicial review for content restrictions. These steps aim to refine the act's measures, prevent over-blocking, and ensure that safety interventions do not lead to excessive surveillance.
