(CN) — A class action lawsuit filed Monday in San Mateo County Superior Court accuses Meta Platforms, the company formerly known as Facebook, of knowingly helping the spread of "hate speech, misinformation, and incitement of violence" in Myanmar, which led to the genocide of the Muslim Rohingya people.
"At the core of this complaint is the realization that Facebook was willing to trade the lives of the Rohingya people for better market penetration in a small country in Southeast Asia," the complaint states.
The lead plaintiff in the suit is listed only as "Jane Doe," and is identified as a Rohingya Muslim refugee now living in Illinois. She and her attorneys are seeking more than $150 billion in damages.
"We’re appalled by the crimes committed against the Rohingya people in Myanmar," said Meta spokesperson Emily Cain in a written statement. She did not comment on the suit, but claimed that Meta has "taken action on harmful misinformation to help keep people safe," and has banned Myanmar's armed forces, the Tatmadaw, from its platforms.
Also on Monday, lawyers based in the United Kingdom sent a letter to Facebook's UK office, stating their intentions to file a claim in British High Court on behalf of Rohingya living in the UK and in refugee camps in Bangladesh.
"Despite Facebook’s recognition of its culpability and its pronouncements about its role in the world, there has not been a single penny of compensation, nor any other form of reparations or support, offered to any survivor," the letter reads.
Myanmar, formerly known as Burma, is the largest country on mainland Southeast Asia, a bit larger than France, with a population of more than 55 million. Nearly 90% of the country is Buddhist. The Rohingya make up roughly 2% and have been the subject of state-sponsored discrimination for decades, having been denied citizenship since the early 1980s.
"While other ethnic and religious minorities are accepted, at least in theory, as belonging to the nation under their 'national race' status, the Rohingya’s lack of status has dramatically increased their vulnerability and contributed to the extreme scale and intensity of the violence against them," a 2018 report from the UN Human Rights Council states.
The report details mass killings conducted by Myanmar's armed forces and argues that Myanmar officials should be tried for genocide. It tallies the killings at no fewer than 25,000. More than 700,000 Rohingya have fled the country, most of whom are still living as refugees in neighboring Bangladesh.
Marzuki Darusman, chairman of the U.N. Independent International Fact-Finding Mission on Myanmar, has said that Facebook has played a "determining role" in the genocide, adding that it "substantively contributed to the level of acrimony and dissension and conflict, if you will, within the public."
Facebook, wildly popular in Myanmar, became a forum for hundreds, possibly even thousands, of comments denigrating the Rohingya as less than human.
"We must fight them the way Hitler did the Jews," one user wrote in 2013.
But much of the campaign to incite hate came from the government. Myanmar military officials created fake profiles to spread misinformation and hate speech directed against Muslims living in Myanmar. Facebook's head of cybersecurity policy, Nathaniel Gleicher, told the New York Times in 2018 that the company had found "clear and deliberate attempts to covertly spread propaganda that were directly linked to the Myanmar military.” Though Facebook took down the offending accounts, the damage had been done.
Toward the end of 2018, Alex Warofka, Facebook's product policy manager, admitted: "We weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence. We agree that we can and should do more."
"Facebook has long been aware that hateful, outraged, and politically extreme content (especially content attacking a perceived 'out-group') is oxygen to the company's blood," the class action complaint states. "The more horrendous the content, the more it generates 'engagement' — a measure of users' interaction with content on the system ('likes,' 'shares,' comments, etc.). As Facebook has determined through years of study and analysis: hate and toxicity fuel its growth far more effectively than updates about a user's favorite type of latte."
Today, Facebook is facing similar allegations over its failure to curb hate speech and inflammatory content in Ethiopia, which is currently embroiled in a civil war and an escalating ethnic conflict that some say has already crossed over into ethnic cleansing — "again with the help of a Facebook-fueled misinformation and hate-speech campaign," according to the complaint.
One of the law firms representing the plaintiffs in the suit, Edelson PC, has sued Facebook a number of times before. In February, a judge approved a $650 million settlement between the company and a 1.6 million-member class in Illinois, the result of a suit over Facebook's facial recognition tool. The firm is also representing plaintiffs suing Facebook for hosting "social casinos," online games where users "gamble" with fake currency.
Edelson PC attorneys could not immediately be reached for comment.