The Failures of Facebook

Facebook is in hot water. Again.

In 2018, Facebook made headlines for its platform’s role in stoking ethnic violence in Myanmar. Now, in 2020, they are facing international criticism for the same reason, this time in Ethiopia.

America has not been immune to the social destruction exacerbated by Facebook, but we also don’t have a political system like Ethiopia’s that’s explicitly divided along ethnic lines. Facebook’s presence in Ethiopia has only made the divisions that splinter its people more entrenched and volatile.

The company’s inadequate response to violence stoked on its platform in Ethiopia has led to ethnic persecution and death for hundreds of people.

Why look at this? What does this have to do with us?

Our every decision has an impact on the world around us, on the people around us, and even on people we will never meet. Someday, you might be part of a company or organization with a global presence, and your actions or inactions will have a massive impact.

And even if your employer doesn’t have a global impact, your decisions will still have force. They will still affect the people around you.

It’s critical to study the mistakes of others in order to learn how to make better decisions and ultimately how to be our highest and best selves. That is why we’re spending some time looking at Facebook’s failures.

Let’s take a look at Facebook’s impact in Ethiopia, what they’re doing to improve things, and what they failed to consider in their decision-making process that could have saved lives.

Facebook’s Impact in Ethiopia

So, what’s happening in Ethiopia and what role is Facebook playing?

Ethiopia is a diverse country with more than 75 ethnic groups and 83 different languages. The largest ethnic groups are the Oromo (about 35%) and the Amhara (about 27%). For years, the ruling elite intentionally emphasized and promoted ethnic identity, fueling division among Ethiopians along ethnic lines.

After years of government suppression of political opposition, Abiy Ahmed, the first Oromo prime minister, took office in April 2018. With his leadership came sweeping reforms, including releasing thousands of political prisoners, welcoming back opposition groups, and announcing his intention to review Ethiopia’s system of ethnic federalism.

But Abiy’s decision to open up political discourse, as some political commentators have noted, was like taking the lid off a pressure cooker. Violence has erupted across the country among different groups, and the fires of division are being stoked on social media, primarily Facebook.

Facebook in Ethiopia is almost synonymous with the internet and serves as a primary news source for its users. Although only about 6.1 million of Ethiopia’s roughly 109 million people had Facebook accounts in January 2020, social media’s influence reaches much further, since many Ethiopians get their news by word of mouth.

Growing opposition to Abiy’s government has been fueled by mis- and disinformation proliferating on Facebook. Distorted narratives that began in traditional media made their way onto social media. Hate speech, inflammatory content, and calls for violence against different ethnic groups have spread like wildfire on Facebook.

All of this online activity has supercharged real-world murder and destruction.

The already tumultuous political and cultural climate of Ethiopia has exploded into something monstrous thanks to Facebook. In just one instance, a Facebook post that went viral in October 2019 led to the deaths of 80 people. And that’s just the tip of the iceberg.

What is Facebook Doing About This?

Last year, Facebook promised to employ 100 people to moderate content for all African markets. But whether those positions have been filled, how many content moderators have been assigned to Ethiopia, and who is in charge of dealing with the issues impacting Ethiopian users remains a mystery. Facebook has not answered these questions.

Facebook has moderators in Kenya as well as in Ireland, and the outsourcing company whose services Facebook uses is hiring more moderators who speak Amharic, one of Ethiopia’s major languages.

The company is working on translating its Community Standards into Ethiopia’s other two main languages, Afaan Oromo and Tigrinya (right now, the Community Standards are available only in Amharic and English).

Is Facebook Doing Enough?

Is Facebook’s response to the situation in Ethiopia enough?

Many are saying no.

In an open letter to Facebook, activists, journalists, and human rights organizations have accused Facebook of failing to “prevent the escalation of incitement to violence on its services.”

Some of the short-term actions that are recommended in the letter include:

  • Make content reporting on Facebook services fully available in the Afaan Oromo and Tigrinya languages (it’s currently available only in Amharic and English)
  • In specific cases where there is a risk of human rights abuse, implement transparent and temporary changes to limit mass-sharing functionalities
  • Preserve abusive content that Facebook makes unavailable so it can be used as evidence by organizations seeking to hold perpetrators accountable

Some of the long-term efforts suggested are:

  • Add significant resources to rights-respecting content moderation efforts (i.e. put money into hiring more content moderators)
  • Enforce meaningful and robust transparency initiatives about policies, standards, and practices
  • Actively support and help to develop initiatives that promote human rights, tolerance, diversity, and equality for all people in Ethiopia
  • Conduct in-depth human rights impact assessments for products, policies, and operations based on national context before entering any new market

These are great suggestions, and Facebook would do well to heed them. Taking these actions would show that the company has learned from its mistakes. Other social media companies might also benefit from following these guidelines.

The Three Main Mistakes Facebook Made

Decision makers at Facebook failed to adequately consider three interrelated things:

  • The interests and needs of people beyond their target markets
  • How the political and cultural context of their market might affect the way their platform would be used
  • The risks and potential bad consequences of their decisions

Individuals at other companies make these mistakes as well, and we don’t want you to fall into the same traps. In our next post, we’re going to examine Facebook’s first mistake and how to think beyond the target market in your decision-making process.