Massive fines headed for tech giants that host harmful online content

Video sharing platforms such as Facebook, Instagram and YouTube could soon face fines of millions of pounds for hosting harmful videos, as part of the government's ongoing commitment to enforce EU law.

The government will hand communications regulator Ofcom new policing and sanctioning powers to protect children from violent, abusive and pornographic content, according to the Telegraph.

The tech giants could face fines of up to 5% of their revenues, and could even see their platforms banned in the UK, if they fail to comply with Ofcom's rulings.

The handover of powers to Ofcom is being made to comply with the UK's current obligations to the EU, specifically its Audiovisual Media Services Directive (AVMSD), which aims to deliver greater protections for children, as well as preserving cultural diversity and guaranteeing the independence of national media regulators.

However, the regulator may never get to exercise these powers, as Ofcom's new role is not proposed to begin until 19 September 2020, beyond the current date for the UK's withdrawal from the EU. Once it leaves the bloc, the UK will no longer be legally obliged to enforce the AVMSD.

"The implementation of the AVMSD is required as part of the United Kingdom's obligations arising from its membership of the European Union and until the UK formally leaves the European Union all of its obligations remain in force," said a spokesman for the Department for Digital, Culture, Media and Sport to the BBC.

"If the UK leaves the European Union without a deal, we will not be bound to transpose the AVMSD into UK law."

Under the same rules, the apps in question will also face fines if they fail to implement robust age verification systems and parental controls on videos.

Social media platforms have faced heightened scrutiny this year after a number of incidents in which terrorist attacks were broadcast over online platforms. Operators, including Facebook, have been accused of failing to remove such videos quickly enough.

"1.5 million copies of the video had to be removed by Facebook - and could still be found on Youtube for as long as eight hours after it was first posted - is a stark reminder that we need to do more both to remove this content, and stop it going online in the first place," said former Prime Minister Theresa May at the Online Extremism Summit in Paris.

Facebook, Twitter and YouTube all faced harsh criticism after the New Zealand shooter's video evaded the harmful content detection algorithms of all three sites.

Google, which owns YouTube, has previously boasted impressive figures for the accuracy of its machine learning algorithms, which were first deployed on YouTube's platform in 2017.

Within a year of their deployment, the majority of violent or extremist content was being removed from the site with fewer than 10 views.

Connor Jones
Contributor
