A new report by Amnesty International has implicated Meta Platforms Inc., the parent company of Facebook, in exacerbating human rights abuses during the armed conflict in northern Ethiopia from November 2020 to November 2022.
The report, titled “A Death Sentence for My Father: Meta’s Contribution to Human Rights Abuses in Northern Ethiopia,” provides a distressing account of how Meta’s algorithmic systems and business practices may have contributed to the violence, particularly against people from Tigray.
According to the report, Meta failed to take sufficient measures to address dangerous rhetoric on its platforms, despite repeated warnings from Ethiopian civil society groups and human rights experts that such rhetoric risked escalating tensions and enabling violence during the conflict. These warnings were issued both before and after the outbreak of fighting in November 2020, highlighting the real danger that, without proper safeguards, Meta’s platforms could be exploited to spread misinformation and incite violence.
“Three years after its shocking failures in Myanmar, Meta has once again, through its content-shaping algorithms and data-driven business model, contributed to serious human rights abuses,” stated Agnès Callamard, Amnesty International Secretary General.
Drawing on on-the-ground research, technical analysis of Meta’s algorithms, and internal company documents exposed in the Facebook Papers leak, the investigation found that Facebook’s algorithms and business model, which prioritize maximizing user engagement above all else, disproportionately amplified and disseminated dehumanizing, factually inaccurate, and ethnically targeted content against people of Tigrayan origin.
One specific case highlighted was that of university professor Meareg Amare. In November 2021, Facebook posts targeted Meareg, revealing his name, photo, workplace, and home address alongside false accusations that he supported the Tigray People’s Liberation Front (TPLF). “I knew once his name was exposed like that, it was only a matter of time,” said Meareg’s son, Abrham, in an interview with Amnesty researchers. Days later, a group of men arrived at Amare’s home and killed him. Professor Meareg Amare’s story was featured in Addis Standard’s print publication in November 2022, followed by a $2bn lawsuit filed against Facebook’s parent company, Meta, in Nairobi, Kenya.
Amnesty spoke with a member of an Ethiopian civil society group in Meta’s trusted partner program, who claimed that the company was “extremely slow” in responding to alerts regarding dangerous speech. This indicated a lack of cultural understanding of how quickly rhetoric can incite violence.
The report concludes that, through both action and inaction, Meta actively contributed to serious human rights violations against the people of Tigray. “Meta ignored repeated warnings and failed to implement appropriate mitigation measures to address the dangers on its platforms,” said Callamard.
Amnesty International is now calling for urgent and comprehensive reforms, including emergency measures for Meta to reduce algorithmic amplification of content in crisis situations. Furthermore, it emphasizes the need for states to regulate big tech companies in order to protect human rights and ensure accountability for any human rights violations caused or enabled by these companies, whether through product design choices or a failure to implement proper safeguards.
Amnesty’s report also features the case of Freweyni Hetsay, who tragically lost her father and brother as a result of hate speech and incitement to violence spread through social media, a case Addis Standard reported a month ago.