Facebook hit with $150 billion lawsuit over Myanmar hate speech

[Image: A smartphone showing the Meta company logo in front of a large Facebook logo. Image credit: Getty Images]

Meta, formerly known as Facebook, could be forced to pay more than $150 billion (£113bn) for its alleged role in the 2017 Rohingya genocide.

Legal complaints filed in the UK and US on behalf of Rohingya refugees claim that the tech giant failed to prevent anti-Rohingya hate speech and disinformation from thriving on its platform.

This included thousands of posts describing the community as animals and foreign invaders, falsely accusing them of crimes, and calling for them to be killed.

Facebook’s algorithm allegedly amplified such hateful posts in users’ news feeds, while the company failed to hire enough Burmese-speaking content moderators despite record-breaking profits that year.

The spread of anti-Rohingya propaganda ultimately resulted in real-life violence that claimed the lives of 24,000 people and displaced up to a million more, forcing them into “abject poverty”, according to the class-action complaint filed in California by law firms Edelson and Fields.

The US lawsuit is seeking damages “in excess of $150 billion”. The legal notice to Meta’s London offices has not been made publicly available.


The lawsuit references claims made by a former Facebook employee, who said that the company’s executives “were fully aware that posts ordering hits by the Myanmar government on the minority Muslim Rohingya were spreading wildly on Facebook”, and that “the issue of the Rohingya being targeted on Facebook was well known inside the company for years”.

The claims echo testimony from another former-employee-turned-whistleblower, Frances Haugen, who in October told members of the US Congress that Facebook was “literally fanning” ethnic violence in developing countries.

Weeks later, Haugen told UK MPs that, due to moderator shortages, Facebook had been unable to police harmful content in multiple languages around the world, leading to civil unrest in Myanmar in 2017 as well as in Ethiopia in 2021.

However, the issue also affects the UK, she added, because Facebook's AI is unable to detect online abuse in British English.

According to the lawsuit, despite the widely reported anti-Rohingya violence in Myanmar, the tech giant also failed to prevent the spread of anti-Muslim hate speech on its platform in the Assam region of northeast India.

Meta didn’t respond to IT Pro’s request for comment, but has previously admitted to being “too slow to prevent misinformation and hate” in Myanmar.

Sabina Weston

Since graduating from City University in 2019, Sabina has demonstrated her abilities as a keen writer and effective journalist. Currently a content writer for Drapers, Sabina spent a number of years writing for ITPro, specialising in networking and telecommunications, as well as charting the efforts of technology companies to improve their inclusion and diversity strategies, a topic close to her heart.

Sabina has also held a number of editorial roles at Harper's Bazaar, Cube Collective, and HighClouds.