Amnesty International has released a hard-hitting report holding Facebook responsible for its role in promoting violence against the Rohingya in Myanmar.
The report, "The Social Atrocity: Meta and the right to remedy for the Rohingya", details how Meta knew or should have known that Facebook's algorithmic systems were supercharging the spread of harmful anti-Rohingya content in Myanmar, yet the company still failed to act.
“In 2017, Rohingya people were killed, tortured, raped and displaced by the thousands as part of the Myanmar security forces’ campaign of ethnic cleansing. In the months and years leading up to the atrocities, Facebook’s algorithms intensified a storm of hate against the Rohingya, which contributed to real-world violence,” said Agnès Callamard, Secretary General of Amnesty International.
The report argues that Facebook served as an echo chamber for the Myanmar government's atrocities against the Rohingya.
“While the Myanmar military was committing crimes against humanity against the Rohingya, Meta was profiting from the echo chamber of hate created by its hate-spiralling algorithms.
“Meta must be held accountable. The company now has a responsibility to provide reparations to all those who suffered the violent consequences of its reckless actions.”
Sawyeddollah, a 21-year-old Rohingya refugee, told Amnesty International: “I saw a lot of horrible things on Facebook. And I thought that the people who posted that were mean… Then I realized that it’s not just these people, the posters, but Facebook is also responsible. Facebook is helping them by not taking care of their platform.”
The Rohingya are a predominantly Muslim ethnic minority based in Myanmar's northern Rakhine State. In August 2017, more than 700,000 Rohingya fled Rakhine as Myanmar security forces launched a targeted campaign of widespread and systematic murder, rape and the burning of homes. The violence followed decades of state-sponsored discrimination, persecution and oppression against the Rohingya that amounts to apartheid.