British regulator brings first online safety codes for tech firms into force
British communications regulator Ofcom said it has brought the country’s online safety regulation into force after publishing its first codes of practice and guidance for technology companies.
Ofcom said tech firms are now legally required to start taking action to tackle criminal activity on their platforms, and make them safer by design.
The first-edition codes of practice are aimed at tackling illegal harms under the U.K.’s Online Safety Act, such as terrorism, hate, fraud, child sexual abuse, and assisting or encouraging suicide.
Every site and app in scope of the new laws has until March 16, 2025, to complete an assessment of the risks that illegal content poses to children and adults on its platform. Ofcom sets out over 40 safety measures for platforms to introduce.
Ofcom said the Act places new safety duties on social media firms, search engines, messaging, gaming and dating apps, and pornography and file-sharing sites. Before it can enforce these duties, it is required to produce codes of practice and industry guidance to help firms to comply, following a period of public consultation.
Subject to the codes completing the Parliamentary process, sites and apps will need to start implementing safety measures from March 17, 2025, to mitigate those risks, such as better moderation, easier reporting and stronger protections for children from sexual abuse and exploitation online.
“We’ll be watching the industry closely to ensure firms match up to the strict safety standards set for them under our first codes and guidance, with further requirements to follow swiftly in the first half of next year,” said Dame Melanie Dawes, Ofcom’s Chief Executive. “Those that come up short can expect Ofcom to use the full extent of our enforcement powers against them.”
Under the new codes, tech firms will need to ensure their moderation teams are appropriately resourced and trained, and are set performance targets, so they can remove illegal material quickly when they become aware of it.
Ofcom noted that reporting and complaints functions will be easier to find and use, with appropriate action taken in response.
Relevant providers will also need to improve the testing of their algorithms to make illegal content harder to disseminate, the regulator noted.
The codes also expect high-risk providers to use automated tools, such as hash-matching and URL detection, to find child sexual abuse material.
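For context, hash-matching works by comparing a fingerprint of uploaded content against a database of fingerprints of known illegal material. The following is a minimal conceptual sketch in Python; the blocklist entries and function names are hypothetical, and production systems typically rely on perceptual hashing (such as Microsoft's PhotoDNA), which tolerates resizing and re-encoding, rather than the exact cryptographic hash used here for simplicity.

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known illegal images,
# standing in for the hash databases maintained by child-safety bodies.
KNOWN_BAD_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def matches_known_content(file_bytes: bytes) -> bool:
    """Return True if the file's hash appears in the blocklist.

    An exact cryptographic hash only catches byte-identical copies;
    real deployments use perceptual hashes to catch near-duplicates.
    """
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES

# Example: screen an upload before it is published.
if matches_known_content(b"...uploaded file bytes..."):
    print("Block the upload and escalate for review")
```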
Ofcom said it has the power to fine companies up to £18M or 10% of their qualifying worldwide revenue, whichever is greater, and in very serious cases it can apply for a court order to block a site in the U.K.
In addition, Ofcom said more duties and consultations are on the way: in January 2025, final age assurance guidance for publishers of pornographic material, along with children's access assessments; in February 2025, draft guidance on protecting women and girls; and in April 2025, additional protections for children from harmful content promoting, among other things, suicide, self-harm, eating disorders and cyberbullying.
Some services from companies such as Meta Platforms (NASDAQ:META), Alphabet’s Google (GOOG) (GOOGL) and TikTok parent ByteDance (BDNCE) could come under the new codes.
Meta, Google and TikTok have not yet responded to a request for comment from Seeking Alpha.