What Can We Learn From Facebook’s Mistakes? (Part I)

Let’s quickly recap the three mistakes Facebook made when they expanded into new markets overseas. Facebook failed to consider:

  • The interests and needs of people beyond their target markets
  • How the political and cultural context of their market might affect the way their platform would be used
  • The risks and potential bad consequences of their decisions

These considerations are all interconnected. People are embedded in societies that have a particular political system and cultural norms and values, so any consideration of people outside a target market naturally leads to considerations of context. Risk assessment requires thinking about the interests and needs of people outside the target market: what will the impact be on society at large?

We’ll take a look at each of these considerations in turn: the first in this post, and the other two in a subsequent post. Let’s start with the decision that underlies everything else: the creation of a business model.

Building Ethical Business Models

What happens when a company prioritizes profits to the detriment of people?

We’ve seen it with Facebook. Ethnic violence in Ethiopia. Ethno-religious persecution in Myanmar. Damage to democracy in the United States. The deleterious effects of building a business model that exploits human weakness in order to make money are apparent.

There is a lot of emphasis placed on creating a profitable business model. After all, a business that can’t make a profit will not survive.

But hardly anyone points out that you need to build ethics into a business model.

If business people really want to make the world a better place, they need to ask themselves whether the way they pursue profit is ethical.

Facebook’s Business Model

The basis of Facebook’s business model is selling advertisers targeted access to users, based on the data those users generate (Facebook does not sell the data itself; it sells the targeting). The platform’s point of engagement is the News Feed, where algorithms determine the individualized content and advertisements each user sees. If a user clicks on an ad or a story, she will get more of the same thing the next time she’s on Facebook.

This creates an echo chamber effect, in which users are rarely exposed to alternative narratives. Facebook’s algorithms encourage people to engage with more of the same content, even when that content includes disinformation, extremist propaganda, and conspiracy theories.
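To make that feedback loop concrete, here is a minimal sketch in Python. It is hypothetical, not Facebook’s actual ranking system: the topic weights, the 1.5 boost factor, and the function names are all illustrative assumptions. It simply shows how rewarding every click with more of the same content narrows a feed over time.

    # Hypothetical sketch of an engagement-driven ranker (not Facebook's real system).
    # Each click boosts the weight of that story's topic, so the next ranking
    # favors more of the same.
    from collections import defaultdict
    import random

    def rank_feed(stories, topic_weights):
        # Order stories by the user's learned preference for their topic.
        return sorted(stories, key=lambda s: topic_weights[s["topic"]], reverse=True)

    def simulate_feedback_loop(stories, rounds=10, feed_size=3):
        topic_weights = defaultdict(lambda: 1.0)  # start with no preference
        for _ in range(rounds):
            feed = rank_feed(stories, topic_weights)[:feed_size]
            clicked = random.choice(feed)           # the user engages with one story
            topic_weights[clicked["topic"]] *= 1.5  # reinforce that topic (assumed boost)
        return dict(topic_weights)

    stories = [{"topic": t} for t in ["sports", "politics", "conspiracy", "local news"]]
    print(simulate_feedback_loop(stories))

Because each boost makes the boosted topic more likely to appear in the feed, and therefore more likely to be clicked again, whichever topic wins the first few rounds tends to crowd out the rest, regardless of whether its content is true.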

However much damage this model has done to society, Facebook continues to profit from it. Community Standards are in place, but they are not required reading for creating an account.

Facebook’s business model is a prime example of a focus on profits at the expense of people. Such an approach to business has a human cost: real-world violence, division, and confusion.

Reflect Before You Act

We want, of course, to make a profit and for our businesses to do well. But we need to keep in mind some assessment questions as we pursue profitability:

  • Are we upholding the dignity of the human person in our pursuit of profit? Or are we denigrating it?
  • Are we making a valuable contribution to humanity? Will our business contribute to a more harmonious society? Or are we playing on human weaknesses to make money?
  • What impact does our business model have on society, now and going forward? Does it do harm?

The key to creating a business model that is sustainable and contributes to the good of society (and, really, the key to making a good decision in general) is reflection, thinking before doing.

Looking Beyond the Target Market

Part of Facebook’s ethical failure is its lack of consideration for those outside its target market. The big question is: who isn’t in Facebook’s target market? It seems as though the target market is any human being over age 13 with an internet connection.

But this actually excludes a lot of people: 3.23 billion, to be exact. The most impoverished, many people living in rural communities, and many indigenous peoples around the world are excluded from the public discourse that takes place on social media while still bearing its negative consequences.

Only about 6% of Ethiopia’s population has internet access, which necessarily privileges the voices of a few and excludes those of the majority. Incitements to violence on Facebook contributed to riots, murders, and the persecution of Ethiopians who had no voice on the platform. Disinformation spread from Facebook to rural areas by word of mouth, and the people hearing it had no access to alternative sources of information.

The same thing happened in Myanmar, where much of the Rohingya population is impoverished and cannot access the internet. Thanks to Facebook’s algorithms, the voices of prejudice rang much louder than the voices of the marginalized and their supporters.

When expanding into new markets, Facebook only saw an opportunity to extend its reach. It didn’t consider the people who would not have access to its service and how they might be affected by discourse on the platform.

What these cases clearly show us is that markets are not segregated. They do not exist in a vacuum. Markets interact with other markets. The way that a product or service is used by one market has effects on people outside of that market.

The other thing to note is that markets are made up of human beings, not data points. Humans have emotions and biases that affect their judgment. Various types of education and training, both good and bad, lead human beings to make different kinds of decisions. All of this informs the way in which products and services are used.

A communication service like Facebook has enormous potential for abuse. People who know how to communicate respectfully and compassionately, and who can discern truth from falsehood, are few and far between. Far greater is the number of people who haven’t learned virtuous ways of communicating and who haven’t been trained to critically evaluate information. Look at any comment section on the internet for the evidence.

The takeaway lesson is that decision makers need to consider the interests and needs of those outside their target market. They also need to think about the ways their product or service will be used by the target market and how that use will affect people outside of it.

We’ll look at considerations of political and cultural context in the next post.

The Failures of Facebook

Facebook is in hot water. Again.

In 2018, Facebook made headlines for its platform’s role in stoking ethnic violence in Myanmar. In 2020, it is facing international criticism for the same reason, this time in Ethiopia.

America has not been immune to the social destruction exacerbated by Facebook, but we also don’t have a political system like Ethiopia’s, one explicitly organized along ethnic lines. Facebook’s presence in Ethiopia has only made the divisions that splinter its people more entrenched and volatile.

The company’s inadequate response to violence stoked on its platform in Ethiopia has led to ethnic persecution and death for hundreds of people.

Why look at this? What does this have to do with us?

Our every decision has an impact on the world around us: on the people around us, and even on people we will never meet. Someday, you might be part of a company or an organization with a global presence, and your actions or inactions will have a massive impact.

And even if your employer doesn’t have a global impact, your decisions will still have force. They will still affect the people around you.

It’s critical to study the mistakes of others in order to learn how to make better decisions and ultimately how to be our highest and best selves. That is why we’re spending some time looking at Facebook’s failures.

Let’s take a look at Facebook’s impact in Ethiopia, what they’re doing to improve things, and what they failed to consider in their decision-making process that could have saved lives.

Facebook’s Impact in Ethiopia

So, what’s happening in Ethiopia and what role is Facebook playing?

Ethiopia is a diverse country with more than 75 ethnic groups and 83 different languages. The largest ethnic groups are the Oromo (about 35%) and the Amhara (about 27%). For years, the ruling elite intentionally emphasized and promoted ethnic identity, fueling division among Ethiopians along ethnic lines.

After years of government suppression of political opposition, Abiy Ahmed, the first Oromo prime minister, took office in April 2018. With his leadership came sweeping reforms, including releasing thousands of political prisoners, welcoming back opposition groups, and announcing his intention to review Ethiopia’s system of ethnic federalism.

But Abiy’s decision to open up political discourse was, as some political commentators have noted, like taking the lid off a pressure cooker. Eruptions of violence among different groups have followed, and the fires of division are being stoked on social media, primarily Facebook.

In Ethiopia, Facebook is almost synonymous with the internet and is considered a primary news source by its users. Though only a little over 6.1 million of Ethiopia’s roughly 109 million people had Facebook accounts in January 2020, the platform’s influence reaches much further, because many Ethiopians get their news by word of mouth.

Growing opposition to Abiy’s government has been fueled by mis- and disinformation proliferating on Facebook. Distorted narratives that began in traditional media made their way onto social media, where hate speech, inflammatory content, and calls for violence against different ethnic groups have spread like wildfire.

All of this online activity has supercharged real-world murder and destruction.

The already tumultuous political and cultural climate of Ethiopia has exploded into something monstrous thanks to Facebook. In just one instance, a Facebook post that went viral in October 2019 led to the deaths of 80 people. And that’s just the tip of the iceberg.

What Is Facebook Doing About This?

Last year, Facebook promised to employ 100 people to moderate content for all of its African markets. But whether those positions have been filled, how many content moderators have been assigned to Ethiopia, and who is in charge of the issues affecting Ethiopian users remain a mystery: Facebook has not answered these questions.

Facebook has moderators in Kenya as well as in Ireland, and the outsourcing company whose services it uses is hiring more moderators who speak Amharic, one of Ethiopia’s major languages.

The company is also working on translating its Community Standards into Ethiopia’s other two main languages, Afaan Oromo and Tigrinya (right now, the Community Standards are available only in Amharic and English).

Is Facebook Doing Enough?

Is Facebook’s response to the situation in Ethiopia enough?

Many are saying no.

In an open letter to Facebook, activists, journalists, and human rights organizations have accused Facebook of failing to “prevent the escalation of incitement to violence on its services.”

Some of the short-term actions that are recommended in the letter include:

  • Making content reporting on Facebook’s services fully available in the Afaan Oromo and Tigrinya languages (it is currently available only in Amharic and English)
  • Implementing transparent and temporary changes to limit mass-sharing functionality in specific cases where there is a risk of human rights abuse
  • Preserving abusive content that Facebook makes unavailable so it can be used as evidence by organizations seeking to hold perpetrators accountable

Some of the long-term efforts suggested are:

  • Adding significant resources to rights-respecting content moderation efforts (i.e., putting money into hiring more content moderators)
  • Enforcing meaningful and robust transparency initiatives about policies, standards, and practices
  • Actively supporting and helping to develop initiatives that promote human rights, tolerance, diversity, and equality for all people in Ethiopia
  • Conducting in-depth human rights impact assessments, based on national context, for products, policies, and operations before entering any new market

These are great suggestions, and Facebook would do well to heed them. Taking these actions would show that the company has learned from its mistakes. Other social media companies might also benefit from following these guidelines.

The Three Main Mistakes Facebook Made

Decision makers at Facebook failed to adequately consider three things, which are all interrelated:

  • The interests and needs of people beyond their target markets
  • How the political and cultural context of their market might affect the way their platform would be used
  • The risks and potential bad consequences of their decisions

Individuals at other companies make these mistakes as well, and we don’t want you to fall into the same traps. In our next post, we’ll look at Facebook’s first mistake and how to think beyond the target market in your own decision-making process.