Let’s quickly recap the three mistakes Facebook made when they expanded into new markets overseas. Facebook failed to consider:
- The interests and needs of people beyond their target markets
- How the political and cultural context of their market might affect the way their platform would be used
- The risks and potential bad consequences of their decision
These considerations are all interconnected. People are embedded in societies that have a particular political system and cultural norms and values, so any consideration of people outside a target market naturally leads to considerations of context. Risk assessment requires thinking about the interests and needs of people outside the target market: what will the impact be on society at large?
We’ll take a look at each of these considerations in turn, the first in this post and the others in a subsequent post. Let’s start with the initial decision: creating a business model.
Building Ethical Business Models
What happens when a company prioritizes profits to the detriment of people?
We’ve seen it with Facebook. Ethnic violence in Ethiopia. Ethno-religious persecution in Myanmar. Damage to democracy in the United States. The deleterious effects of building a business model that exploits human weakness in order to make money are apparent.
There is a lot of emphasis placed on creating a profitable business model. After all, a business that can’t make a profit will not survive.
But hardly anyone points out that you need to build ethics into a business model.
If business people really want to make the world a better place, they need to ask themselves whether or not the way they are pursuing profit is ethical.
Facebook’s Business Model
The basis of Facebook’s business model is the sale of user data to advertisers. The point of engagement for the platform is the Newsfeed. Algorithms determine the individualized content and advertisements that users see. If a user clicks on an ad or on a story, she will get more of the same thing the next time she’s on Facebook.
This leads to an echo chamber effect where users are never exposed to alternative narratives. People are encouraged by Facebook’s algorithms to engage with more of the same content, even if that content includes disinformation, extremist propaganda, and conspiracy theories.
However much damage this model has done to society, Facebook continues to profit from it. Community Standards are in place, but they are not required reading for creating an account.
Facebook’s business model is a prime example of a focus on profits at the expense of people. Such an approach to business has a human cost: real-world violence, division, and confusion.
Reflect Before You Act
We want, of course, to make a profit and for our businesses to do well. But we need to keep in mind some assessment questions as we pursue profitability:
- Are we upholding the dignity of the human person in our pursuit of profit? Or are we denigrating it?
- Are we making a valuable contribution to humanity? Will our business contribute to a more harmonious society? Or are we playing on human weaknesses to make money?
- What impact does, or will, our business model have on society? Does it do harm?
The key to creating a business model that is sustainable and contributes to the good of society (and, really, the key to making a good decision in general) is reflection, thinking before doing.
Looking Beyond the Target Market
Part of Facebook’s ethical failure is its lack of consideration for those outside its target market. The big question is, who isn’t in Facebook’s target market? It seems like the target market is any human being over age 13 with an internet connection.
But this actually excludes a lot of people, 3.23 billion to be exact. The most impoverished, many of those living in rural communities, and many indigenous peoples around the world are excluded from the public discourse that takes place on social media while bearing the negative consequences of it.
Only 6% of the population in Ethiopia has internet access, which necessarily privileges the voices of a few and excludes the voices of the majority. The incitements to violence on Facebook contributed to riots, murders, and the persecution of Ethiopians who had no voice on the platform. Disinformation spread from Facebook to rural areas of Ethiopia by word of mouth, and the people hearing it had no access to alternative sources of information.
The same thing happened in Myanmar, where much of the Rohingya population is impoverished and cannot access the internet. Thanks to Facebook’s algorithms, the voices of prejudice rang much louder than the voices of the marginalized and their supporters.
When expanding into new markets, Facebook only saw an opportunity to extend its reach. It didn’t consider the people who would not have access to its service and how they might be affected by discourse on the platform.
What these cases clearly show us is that markets are not segregated. They do not exist in a vacuum. Markets interact with other markets. The way that a product or service is used by one market has effects on people outside of that market.
The other thing to note is that markets are made up of human beings, not data points. Humans have emotions and biases that affect their judgment. Various types of education and training, both good and bad, lead human beings to make different kinds of decisions. All of this informs the way in which products and services are used.
A communication service like Facebook has enormous potential for abuse. People who know how to communicate respectfully and compassionately, and who can discern truth from falsity, are few and far between. Those who haven’t learned virtuous ways of communicating, and who haven’t been trained to make distinctions in processing information, far outnumber them. Look at any comment section on the internet for the evidence.
The takeaway lesson is that decision makers need to consider the interests and needs of those outside of their target market. They also need to think about the ways that their product or service will be used by the target market and how that will affect people outside of that market.
We’ll look at considerations of political and cultural context in the next post.