Milton Wolf Seminar on Media and Diplomacy

“The Road to Hell is Paved with Good Intentions”: The Role of Facebook in Fuelling Ethnic Violence

By Moges Teshome

With the explosion of social media platforms came high expectations of positive social transformation across the board. As the name indicates, social media was intended to create social networks and bonds, ease the exchange of ideas, offer alternative sources of information, streamline globalization by virtually transcending national borders, and even bring about political change through social media revolutions. Indeed, given the shrinking political space and the unprecedented decline of democracy at the global level, social movements have become potent instruments of political contestation. At one point, there was even talk about whether the revolution would be tweeted or not.

But now we have moved beyond that and entered a new era, including the looming danger of AI taking over human agency. Facebook founder Mark Zuckerberg's (in)famous motto, “move fast and break things”, has in practice turned out to be “moving irresponsibly and breaking bad”. More specifically, over the last decade, social media platforms, particularly Facebook, have been used by various groups to stir up violence, resulting in ethnic cleansing campaigns. Notable cases in point are the events that have unfolded in Ethiopia and Myanmar.

So, what can and should be done to contain the dark side of Facebook without getting rid of it? The 2023 Milton Wolf Seminar on Media and Diplomacy, which focused on “Media at the Abyss: War, Deglobalization, and the Diplomatic Response”, touched upon some strategies for coping with the negative consequences of social media platforms. One participant described Facebook as “a platform whose business model is sowing hatred”, underscoring the role the platform plays in dividing people rather than connecting them, and the need to devise effective mechanisms to keep the “hate machine” at bay. At the very least, this human tragedy calls for effective regulation of, and accountability for, social media companies at both the national and international levels.

In this blog, I will examine the role Facebook has played in fuelling online hatred and facilitating ethnic violence by zooming in on the Ethiopian case, particularly the mob murder of Professor Maereg Amare that followed viral hate campaigns on Facebook.

From Myanmar to Ethiopia: a Conspiracy to Murder?

It has become widely acknowledged that social media platforms have contributed to the proliferation of hate speech and to real harm on the ground. Historically speaking, atrocity crimes have been preceded by coordinated hate speech campaigns, as the UN has noted. Hate speech against minority groups is as old as history itself; what platforms such as Facebook changed is the speed, ease of mobilization, and intensity of hate propaganda, reinforced by the unprecedented algorithmic powerhouse of Artificial Intelligence (AI). In 2021, Facebook whistleblower Frances Haugen gave scathing testimony about Facebook's moral bankruptcy and called for more regulation. As such, the debate about the platform's role, whether direct or indirect, in enabling online hate speech to spread like wildfire and in fuelling violence against targeted groups or individuals has been settled.

Between 2016 and 2017, Myanmar's military carried out widespread and systematic attacks, or “clearance operations”, against the Rohingya minority, during which the military junta allegedly committed, inter alia, mass murder, rape, pillaging, and the deportation of a population. An in-depth investigation by Amnesty International into the role of Meta (Facebook) in facilitating the genocide against the Rohingya Muslim minority revealed harrowing evidence. The final report concluded that “The mass dissemination of messages that advocated hatred inciting violence and discrimination against the Rohingya, as well as other dehumanizing and discriminatory anti-Rohingya content, poured fuel on the fire of long-standing discrimination and substantially increased the risk of an outbreak of mass violence.”

What is more concerning is Facebook's persistent pattern of behavior: its inaction in the face of imminent dangers to targeted ethnic and/or religious groups, and its lack of corporate accountability for its contributions to genocidal campaigns and ethnic cleansing. This is evident from the fact that, after six years of learning opportunities, Facebook has done little to manage the spread of hate speech, mobilization for violence, and social disorder in Ethiopia and beyond. As the saying goes, “the sky is the limit for corporate greed”, which partly explains Facebook's surveillance-based business model. The company profits by keeping users on the platform as long as possible through the display of emotionally charged and inflammatory content that, more often than not, disseminates hatred against specific groups and incites violence.

Indeed, the UN investigator of the Rohingya genocide said, and rightly so, that “Facebook turned into a beast”, evincing the platform's complicity in the crime. Crucially, in countries such as Myanmar and Ethiopia, where almost everything, from the sharing of information to the spread of hate and mobilization for violence, is done through Facebook, it is only logical to take the necessary precautions and commensurate measures. But for Facebook, the business of business appears to be business alone, which should not be the case.

In 2021, victims of the ethnic cleansing sued Facebook for $150bn for fuelling hate speech in Myanmar and enabling the ensuing genocide against the Rohingya. Although the genocide case before the International Court of Justice (The Gambia v. Myanmar) is a step in the right direction, Facebook has not been held accountable for promoting violence against the Rohingya and for its complicity in the crime.

In Ethiopia, too, Facebook has enabled social and political polarization and incitement to violence by amplifying hate speech, even to the extent of ignoring clear warnings and calls for immediate intervention from its own partners based in Kenya. This happened over the last two years, as the northern part of Ethiopia was ravaged by a brutal civil war. Owing to the ethnic nature of that war, Facebook and its peers should have put in place extraordinary measures to prevent, or at least mitigate, the damage caused by the unbridled hate propaganda on the platform. Instead, as Haugen stated, “in places like Ethiopia, Facebook [was] literally fanning ethnic violence.” And whenever Facebook did intervene, after excessive delays, its content moderators removed posts or suspended accounts indiscriminately, without reviewing the content of the messages. This, in turn, paved the way for digital warfare, whereby organized groups simply silence their opponents by exploiting Facebook's built-in reporting system.

When asked why it failed to invest in content moderation and in the detection of harmful content on its platform, Facebook's standard response was that “they are closely monitoring the situation on the ground and have set up a dedicated team.” The following case puts that claim to the test.

The Murder of Maereg Amare and the Cry for Justice

A victim of the “hate machine”, Maereg Amare Abraha, a Professor of Analytical Chemistry at Bahir Dar University in Ethiopia, was murdered in broad daylight in front of his house on 3 November 2021. In the lead-up to his murder by a vigilante group, a coordinated defamation and disinformation campaign against him had been rampant: a Facebook page with 50k followers posted a number of false accusations against him, which Facebook ignored despite warnings. According to an investigation by Insider, Facebook was complicit in the murder of Professor Maereg, because the company either ignored warnings or unreasonably delayed its response to allegations of incitement to violence. Without the Facebook posts, the mob could not have known Professor Maereg's whereabouts and personal details, let alone attacked him. Foxglove, a UK-based legal nonprofit, described the tragedy as “death by design”, a fitting description. As a report by Crisis Group noted, “Given Ethiopia’s large and active online community and contentious politics, it seems clear that Meta has not adequately invested in its detection and moderation infrastructure there.” That is to say, Facebook could have and should have helped prevent the murder had it shown reasonable diligence and acted responsibly.

Professor Maereg Amare: the victim of hate speech and mob justice

In pursuit of justice, the son of the deceased, Abraham Maereg, together with human rights expert Fisseha Tekle and the Kenya-based rights group Katiba Institute, filed a lawsuit before the Kenyan High Court against Meta, alleging that the company enabled the spread of hate and violence in Ethiopia and was complicit in the murder of the victim. Under the lawsuit, Meta faces a $2 billion civil claim. In addition, the lawsuit demands an official apology for enabling the murder, changes to Meta's algorithm to demote hate speech, and robust content moderation in multiple local languages.

On top of this, more than a dozen civil society organizations have joined the quest for justice and accountability, writing an open letter in solidarity with the victims and in support of the legal proceedings. They asserted that Meta, Facebook's parent company, is responsible for the proliferation of hate and violence in Ethiopia. The letter states in part: “By failing to invest in and deploy adequate safety improvements to your software or employ sufficient content moderators, Meta is fanning the flames of hatred, and contributing to thousands of deaths in Ethiopia.” More specifically, they demanded that Facebook “unplug the hate machine” without any further delay.

We shall see what the court in Kenya decides and whether justice will be served. But at a broader level, what does Facebook need to do in the near future to avert irreparable damage in Ethiopia and beyond?

Certainly, Facebook cannot stop hate speech altogether, nor is it supposed to address structural political problems in deeply divided countries such as Ethiopia, where political polarization, ethnic cleavages, and hatemongering have become the norm.

The least Facebook and its peers can do is desist from activities that amplify existing tensions and enable targeted attacks on marginalized groups. But the bigger and more imperative question remains: how can we regulate and hold Facebook accountable without unduly compromising the inherent virtues of social media, namely access to information and social networking? There is no easy answer, simply because it involves a genuine moral dilemma. At any rate, the following recommendations, among others, should be implemented:

  • Full implementation of the UN Guiding Principles on Business and Human Rights without delay;
  • More transparency: social media platforms should be required to share their content moderation policies and the workings of their algorithms with researchers and, when necessary, with the relevant authorities for regular inspection;
  • Devising a binding international code of conduct on social media operations, together with compliance procedures;
  • Establishing a task force in charge of preventing the spread of hate speech during national crises such as civil wars; and
  • Enforcing strict punitive measures where a platform is clearly implicated in a conspiracy to murder.