
Online Safety Act Enactment in UK: Child Protection Codes Now Active

On 25 July 2025, the Children's Safety Codes of Practice (COPs), a set of measures aimed at safeguarding children on user-to-user and search services, officially came into force under the UK's Online Safety Act 2023 (OSA). This development marks the conclusion of Phase 2 of Ofcom's implementation of the OSA.


The UK's Online Safety Act 2023, a landmark piece of legislation, has introduced the Children's Safety Codes of Practice (COPs) to ensure a safer online environment for children. These COPs, which came into effect on 25 July 2025, demand robust measures from online service providers to protect children from harmful content.

Ofcom, the UK's communications regulator, has been tasked with overseeing the implementation and enforcement of these COPs.

Under the COPs, providers must comply with a set of children's safety duties, including implementing measures to protect children from harmful content. The required measures vary from service to service, depending on factors such as user base size and the assessed risk level for different types of harmful content.

Robust Age Assurance

Services hosting or disseminating high-risk content, such as pornography or other "primary priority content," must implement "highly effective age assurance" systems. Examples include credit card checks or photographic ID verification to prevent underage access.
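
To illustrate where such a check sits in a service's request flow, the short Python sketch below gates access to age-restricted content on the outcome of an age-assurance check rather than on self-declared age. The result structure and field names are assumptions made for illustration, not any vendor's API or an Ofcom-prescribed schema.

    # Illustrative sketch only: the fields below are hypothetical, not a real vendor API.
    from dataclasses import dataclass

    @dataclass
    class AgeCheckResult:
        completed: bool   # the age check ran to completion
        adult: bool       # the user was assessed as 18 or over
        method: str       # e.g. "photo_id" or "credit_card"

    def may_serve_restricted_content(result: AgeCheckResult) -> bool:
        # Fail closed: deny access unless the check completed and confirmed adulthood.
        return result.completed and result.adult

    print(may_serve_restricted_content(AgeCheckResult(False, False, "credit_card")))  # False
    print(may_serve_restricted_content(AgeCheckResult(True, True, "photo_id")))       # True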

Age-Appropriate Content Filtering and Algorithmic Safety

Services assessed as medium or high risk for specific harmful content must configure content recommender systems to exclude or minimize the prominence of potentially harmful material in children's content feeds.
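
As a minimal sketch of what "exclude or minimize the prominence" could look like inside a recommender pipeline, the Python below assumes each candidate item already carries a risk label from the service's own content classification; the labels and the down-weighting factor are invented for illustration and are not taken from the codes.

    # Illustrative sketch: risk labels and weights are invented for illustration.
    from typing import List, Tuple

    EXCLUDE_FOR_CHILDREN = {"pornography", "suicide_and_self_harm"}   # never recommend to children
    DOWNRANK_FOR_CHILDREN = {"violent", "abusive"}                    # reduce prominence in children's feeds

    def rank_for_child_feed(candidates: List[Tuple[str, str, float]]) -> List[str]:
        """candidates are (item_id, risk_label, base_score); returns item_ids by adjusted score."""
        adjusted = []
        for item_id, risk_label, score in candidates:
            if risk_label in EXCLUDE_FOR_CHILDREN:
                continue              # excluded entirely from the child's feed
            if risk_label in DOWNRANK_FOR_CHILDREN:
                score *= 0.1          # sharply reduce prominence
            adjusted.append((score, item_id))
        return [item_id for _, item_id in sorted(adjusted, reverse=True)]

    print(rank_for_child_feed([("a", "none", 0.9), ("b", "violent", 0.95), ("c", "pornography", 0.99)]))
    # -> ['a', 'b']  ("c" is excluded; "b" is down-ranked below "a")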

Effective Content Moderation Policies

Larger services (over seven million UK monthly users) or those identified as multi-risk (medium or high risk for multiple harmful content types) must have internal moderation policies that define how harmful content is identified, managed, and enforced on their platforms.
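
One way such a policy can be kept enforceable is to hold it as structured configuration that the moderation tooling reads directly, so that the documented policy and the policy actually applied cannot drift apart. The categories, actions and default behaviour below are placeholders, not wording from the codes.

    # Illustrative sketch: category names, actions and the escalation default are placeholders.
    MODERATION_POLICY = {
        "pornography":       {"action": "remove",   "review": "automated_then_human"},
        "violent_content":   {"action": "restrict", "review": "human"},
        "bullying_or_abuse": {"action": "restrict", "review": "human"},
    }

    def decide(category: str) -> dict:
        # Unknown categories are escalated to a human reviewer rather than ignored.
        return MODERATION_POLICY.get(category, {"action": "escalate", "review": "human"})

    print(decide("pornography"))   # {'action': 'remove', 'review': 'automated_then_human'}
    print(decide("unlabelled"))    # {'action': 'escalate', 'review': 'human'}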

User Reporting and Redress Mechanisms

Services are required to maintain clear user reporting channels and provide effective mechanisms for users to complain and seek redress concerning moderation decisions.
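
In practice this tends to mean tracking each report as a record with an auditable status and an appeal route, so the reporter can be told what was decided and can contest it. The fields and status values below are assumptions for illustration, not a prescribed format.

    # Illustrative sketch: field names and status values are assumptions, not a prescribed schema.
    from dataclasses import dataclass, field
    from datetime import datetime, timezone
    from typing import Optional

    @dataclass
    class ContentReport:
        report_id: str
        content_id: str
        reporter_id: str
        reason: str                        # e.g. "harmful_to_children"
        status: str = "received"           # received -> under_review -> resolved
        decision: Optional[str] = None     # e.g. "removed", "restricted", "no_action"
        appeal_open: bool = True           # redress: the reporter can contest the decision
        created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    report = ContentReport("r-1", "post-42", "user-7", reason="harmful_to_children")
    report.status, report.decision = "resolved", "removed"
    print(report.report_id, report.status, report.decision, report.appeal_open)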

Prohibition of Exploitative Design Patterns

Providers must avoid and remove exploitative or manipulative design features, such as "dark patterns", that could harm child users.

Transparency and Accountability

Platforms must ensure transparency over moderation practices, algorithmic processes affecting content visibility, and age verification approaches.

Compliance Deadlines and Enforcement

These measures are mandatory for services likely to be accessed by children, with the codes of practice in effect from 25 July 2025. Ofcom holds enforcement powers, including fines of up to £18 million or 10% of qualifying worldwide revenue (whichever is greater), along with service restriction measures for non-compliance.
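
As a quick worked example of how the penalty cap scales (the greater of £18 million or 10% of qualifying worldwide revenue), with made-up revenue figures:

    # Illustrative arithmetic only; the revenue figures are made up.
    def max_penalty_gbp(worldwide_revenue_gbp: float) -> float:
        # The cap is the greater of GBP 18 million or 10% of qualifying worldwide revenue.
        return max(18_000_000.0, 0.10 * worldwide_revenue_gbp)

    print(max_penalty_gbp(50_000_000))     # 18000000.0  -> the GBP 18m floor applies
    print(max_penalty_gbp(2_000_000_000))  # 200000000.0 -> 10% of revenue applies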

Ofcom has already announced formal investigations into four providers of pornography sites that may not have implemented highly effective age assurance. The regulator has emphasized that it will take enforcement action against providers that do not comply with the age-check requirements for services allowing pornographic content.

Ofcom's Supervision Team is establishing relationships with the largest and riskiest service providers to drive early compliance with the children's safety duties. The remainder of 2025 will be busy for Ofcom as it works towards the milestones of 'Phase 3' of its OSA implementation, which focuses on 'categorised' services. Ofcom will publish a register designating which services are categorised services.

In summary, the children's safety COPs under the Online Safety Act 2023 demand strong age verification, safer content filtering and recommendation algorithms, robust content moderation, user empowerment through reporting, and transparency to safeguard children online in the UK.


  1. Under the Children's Safety Codes of Practice, online service providers must comply with the regulations introduced by the UK's Online Safety Act 2023 to ensure a safer online environment for children.
  2. Ofcom, as the UK's communications regulator, oversees the implementation of these codes and is responsible for enforcing compliance by in-scope services.
  3. To protect children from harmful content, service providers must implement robust measures, with the specific measures varying based on factors like user base size and risk levels.
  4. For services disseminating high-risk content, 'highly effective age assurance' systems, such as credit card checks or photographic ID verification, are required to prevent underage access.
  5. Services assessed as medium or high risk for specific harmful content must configure content recommender systems to exclude or minimize the prominence of potentially harmful material in children's content feeds.
  6. Larger services or those identified as multi-risk must have comprehensive internal moderation policies that define how harmful content is identified, managed, and enforced on their platforms.
  7. Services must maintain clear user reporting channels and provide effective mechanisms for users to complain and seek redress concerning moderation decisions.
  8. Providers are prohibited from using exploitative or manipulative design features that could harm child users, such as "dark patterns."
  9. Platforms must ensure transparency over moderation practices, algorithmic processes affecting content visibility, and age verification approaches, and are subject to fines and service restrictions for non-compliance.
