Ofcom’s draft guidelines on illegal online content set stringent rules for big tech


UK regulator Ofcom has published its first draft codes of practice under the Online Safety Act, which recently gained Royal Assent, as it looks to tackle harmful online content. 

The draft codes unveiled by Ofcom seek to crack down on illegal content distributed online, including terrorist and extremist material, child abuse material, and fraudulent content. 

Technology secretary Michelle Donelan said the new rules will obligate online businesses, such as social media firms, to address long-standing issues around harmful materials online. 

"Before the Bill became law, we worked with Ofcom to make sure they could act swiftly to tackle the most harmful illegal content first," she said. 

"By working with companies to set out how they can comply with these duties, the first of their kind anywhere in the world, the process of implementation starts today."

Ofcom research shows that three in five secondary school children have been contacted online in a way that made them feel uncomfortable. In response, the new rules propose, for example, measures to control friend requests on social media platforms. 

'Larger and higher-risk services' should not present children with lists of suggested friends; children shouldn't appear as suggested friends; they shouldn't be visible in other users' connection lists; and their own connection lists shouldn't be visible.

Accounts outside a child’s connection list should not be able to send them direct messages, and their location information shouldn't be visible to anybody else.

Meanwhile, these larger platforms should use hash matching to check uploads against a database of known child sexual abuse material (CSAM), and use automated tools to detect URLs that have been identified as hosting CSAM. 
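For illustration, here is a minimal Python sketch of what such hash matching involves; every name in it is hypothetical, and production systems rely on perceptual hashes such as Microsoft's PhotoDNA rather than a plain cryptographic digest, which only catches byte-identical copies.

```python
import hashlib

# Hypothetical stand-in for a database of known-CSAM hashes supplied by a
# body such as the Internet Watch Foundation. Real deployments match on
# perceptual hashes (e.g. PhotoDNA) that survive resizing and re-encoding.
known_image_hashes: set[str] = set()

def matches_known_database(image_bytes: bytes) -> bool:
    """Hash an upload and check it against the known-image database."""
    return hashlib.sha256(image_bytes).hexdigest() in known_image_hashes

def handle_upload(image_bytes: bytes) -> str:
    # A match means the upload is blocked and reported rather than published.
    return "blocked" if matches_known_database(image_bytes) else "published"
```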

All services should block accounts run by banned terrorist organizations.

With regard to fraudulent content, services should use keyword detection to find and remove posts linked to the sale of stolen credentials, such as credit card details. 
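As a rough sketch of what keyword detection could look like in Python, the patterns below are invented examples rather than a real blocklist, and any deployed system would combine such matching with classifiers, behavioural signals, and human review.

```python
import re

# Invented example patterns associated with listings of stolen card data.
FRAUD_PATTERNS = [
    re.compile(r"\bfullz\b", re.IGNORECASE),  # slang for complete card records
    re.compile(r"\bcvvs?\b.*\b(for sale|dumps?)\b", re.IGNORECASE),
    re.compile(r"\bcarding\b", re.IGNORECASE),
]

def flag_for_review(post_text: str) -> bool:
    """Return True if a post matches any fraud-related pattern."""
    return any(pattern.search(post_text) for pattern in FRAUD_PATTERNS)

# A flagged post would be queued for moderator review and removal.
print(flag_for_review("Fresh CVV dumps for sale, DM me"))  # True
```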

Under the new guidelines, services that offer to verify accounts will also be forced to explain how they do it.

Ofcom’s move receives industry approval

Stakeholders such as Which? have given the move their seal of approval. The consumer rights group said the new guidelines will force online businesses to adhere to more stringent rules and better protect users. 

"The Online Safety Act becoming law is a vital moment in the fight back against fraud," said director of policy and advocacy Rocio Concha.

"It should force tech giants to take more responsibility for tackling fraudulent adverts on their platforms, and it is positive Ofcom is progressing with the regulatory codes so quickly to make this happen."


The codes also include a series of requirements already laid out in the Online Safety Act. All services will have to name somebody responsible for compliance with their duties on illegal content, reporting and complaints.

Content and search moderation teams must be well resourced and trained, performance targets must be set, and progress monitored, Ofcom said. 

In addition, the regulator will require organizations to draft and implement policies on how content is reviewed to ensure transparency. 

Ofcom seeks user-led feedback

A key aspect of the new rules highlighted by Ofcom is the emphasis on user-led feedback and reporting of harmful content. 

Users will be able to report harmful content to businesses easily, the regulator said. This includes making complaints, blocking other users on social media sites, and disabling comments on posts. 

The draft codes are now open for consultation and are expected to come into force at the end of 2024. Later this year, Ofcom said, it will propose guidance on how adult sites should ensure children cannot access pornographic content. 

In spring 2024, it will also launch a consultation on additional protections for children from harmful content relating to suicide, self-harm, eating disorders, and cyberbullying. 

Ross Kelly
News and Analysis Editor
