Ofcom Unveils First Guidelines For Tech Companies Under Online Safety Act

The telecoms regulator, Ofcom, has published its first set of guidelines for tech companies to follow in order to comply with the Online Safety Act. The guidelines highlight the importance of tackling illegal content, particularly child sexual abuse and online grooming. The draft code of practice marks a crucial step in ensuring online safety for young users.

Startling figures reveal that more than one in ten 11- to 18-year-olds have received explicit images, prompting Ofcom to propose several measures. These include compelling major platforms to adjust default settings so that children are not automatically added to suggested friends lists.

Additionally, safeguards must be implemented to shield children’s location information from being disclosed in their profiles or posts, and to block messages from contacts not in their approved list.

Content moderation teams must also be adequately resourced to effectively tackle online safety concerns. Ofcom will further mandate certain platforms to employ hash-matching technology to identify child sexual abuse material (CSAM).

This process converts images into unique numerical codes (hashes) and cross-references them with a database of known CSAM hashes. If a match is found, it signifies the presence of a known CSAM image.
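The matching step described above can be sketched in a few lines. Note this is a simplified illustration only: real CSAM-detection systems such as PhotoDNA use perceptual hashes that tolerate resizing and re-encoding, whereas this sketch uses an ordinary cryptographic hash, which only matches byte-identical files. The database contents here are placeholder values, not real data.

```python
import hashlib

# Hypothetical database of hashes of known images (placeholder values for
# illustration; a real system would hold hashes supplied by bodies such as
# the Internet Watch Foundation).
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def hash_image(image_bytes: bytes) -> str:
    """Convert an image's raw bytes into a fixed-length code (its hash)."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_match(image_bytes: bytes) -> bool:
    """Cross-reference the image's hash against the database of known hashes."""
    return hash_image(image_bytes) in KNOWN_HASHES

# A byte-identical copy of a known image matches; a different image does not.
print(is_known_match(b"example-known-image-bytes"))  # True
print(is_known_match(b"some-other-image-bytes"))     # False
```

Because only hashes are compared, the platform never needs to store or transmit the original known images, only their numerical codes.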

While hash-matching proves effective for public content, it won’t apply to private or encrypted messages. Ofcom emphasizes that its guidance does not propose any measures that would compromise encryption.

The Online Safety Act’s provisions that could potentially authorize scanning of private messages for CSAM will not be under consideration until 2024 and are unlikely to be enforced until around 2025.

Ofcom’s CEO, Dame Melanie Dawes, acknowledges the complexity of the challenge, with the guidance spanning over 1,500 pages and potentially affecting over 100,000 services, many of which operate beyond the UK’s borders. It is estimated that approximately 20,000 small businesses may need to adhere to the regulations.

Managing public and advocacy group expectations presents another significant hurdle for Ofcom. Whether its approach to tech platforms is judged too lenient or too stringent, criticism is inevitable.

Dame Melanie emphasizes that while being universally liked is an unattainable goal, ensuring proportionate, evidence-based regulatory measures remains the priority.

One misconception that Ofcom aims to dispel is the notion that harmful content should be directly reported to the regulator. Instead, the focus is on ensuring that tech platforms have robust systems in place for users to report illegal or harmful content.

The release of this guidance underscores Ofcom’s commitment to fostering a safer online environment, with the ultimate goal of protecting children and vulnerable users from exposure to harmful content and online exploitation.