
Facebook hit with $150 billion lawsuit over Myanmar hate speech

The tech giant has been accused of failing to prevent disinformation from thriving on its platform

Meta, formerly known as Facebook, could be forced to pay more than $150 billion (£113bn) for its alleged role in the 2017 Rohingya genocide.

Legal complaints filed in the UK and US on behalf of the Rohingya refugees claim that the tech giant failed to prevent anti-Rohingya hate speech and disinformation from thriving on its platform.

This included thousands of posts describing the community as animals and foreign invaders, falsely accusing them of crimes, and calling for them to be killed. 

Facebook’s algorithm allegedly amplified such hateful posts on users’ news feeds, while the company failed to hire enough Burmese-speaking content moderators despite record-breaking profits that year.

The spread of anti-Rohingya propaganda ultimately resulted in real-life violence that cost the lives of 24,000 people and displaced up to a million, forcing them into “abject poverty”, according to the class-action complaint filed in California by law firms Edelson and Fields.

The US lawsuit is seeking damages “in excess of $150 billion”. The legal notice to Meta’s London offices has not been made publicly available.


The lawsuit references claims made by a former Facebook employee, who said that the company’s executives “were fully aware that posts ordering hits by the Myanmar government on the minority Muslim Rohingya were spreading wildly on Facebook”, and that “the issue of the Rohingya being targeted on Facebook was well known inside the company for years”.

The claims echo testimony from another former employee turned whistleblower, Frances Haugen, who in October told members of the US Congress that Facebook was “literally fanning” ethnic violence in developing countries.

Weeks later, Haugen told UK MPs that, due to shortages of moderators, Facebook had been unable to police harmful content in multiple languages around the world, contributing to civil unrest in Myanmar in 2017 as well as in Ethiopia in 2021.

However, the issue also affects the UK, she added, because Facebook's AI is unable to detect online abuse in British English.

Despite the widely reported anti-Rohingya violence in Myanmar, the tech giant also failed to prevent the spread of anti-Muslim hate speech on its platform in the Assam region of northeast India, the lawsuit alleges.

Meta didn’t respond to IT Pro’s request for comment, but it has previously admitted to being “too slow to prevent misinformation and hate” in Myanmar.

