In the Hutong
Surrounded by sefarim
While I was working last week on my business case against Facebook coming to China, the editors of the Financial Times decided to take a moral stand. In “Here be dragons,” the FT posited that in doing business in China, Facebook would be treading on morally dangerous ground.
Facebook may not have set itself some “don’t be evil” style mission, but its raison d’être is to encourage its users to share personal information about themselves. This is morally problematic when the representatives of an authoritarian government are peering over one’s shoulder.
The editors of the FT then call upon Facebook to “articulate a strategy that allows it convincingly to navigate the ethical shoals before venturing into China.”
While I appreciate the FT’s point, I think we are getting ahead of ourselves.
Before we can judge the morality of what Facebook is doing we must first ask ourselves on what basis we are making that judgment. This is neither trivial nor pedantic.
For better or worse, we live in a world of many moral codes. While there is widespread agreement about some general principles (murder is wrong, theft is wrong, etc.), outside of those principles there is much disagreement, and even within them there are vast variances in interpretation. Americans cannot agree on whether abortion is murder or a woman's right; there are many who feel that the slaughter of a cow is murder, or that reading a book in a bookstore without buying it is theft.
If there is great variation among moral codes in America, the question of selecting a moral yardstick becomes more complex when globalization is added to the picture. In most of the world, tolerance for different moral codes and belief systems has replaced the effort – via crusade, inquisition, holy war, and missionary colonialism – to convert everyone on the planet to a universal system of morality and beliefs. Globalization depends at its core on this toleration, a recognition that the world consists of people with different moral codes, and that there is more to be gained through accommodating those differences in the short term than trying to eliminate them.
This happy state of affairs ends when a company finds itself operating in a country where accepted principles of behavior vary radically from those at home. Under such circumstances, is a company obliged to operate under the moral code of its host country/culture, or its home country/culture? And why? If Facebook provides user information to the Chinese authorities, that may be considered immoral in the U.S., but it may be considered good citizenship in China. And if Facebook provides UK authorities with personal information on terrorist suspects, that may be considered good citizenship in Britain, but an abominable betrayal of trust elsewhere.
Your Morals…or Theirs?
The point of all this is that asking a company to behave morally is not a simple "on/off" proposition. A company seeking to operate morally in a multicultural and morally relativistic world treads treacherous ground. Does it choose a moral code by which to operate, potentially alienating individuals, groups, and even countries that may find that code offensive, and potentially closing off lucrative opportunities? Or does it take the path of least risk, attempting to dance between those codes, never taking a stand yet never giving offense?
Milton Friedman famously argued that a company's sole social responsibility is to increase its profits within the rules of the game. Friedman's point, while technically correct, comes off as anachronistic and unenlightened a decade into the 21st century. Consumers, governments, media and other audiences now expect companies to conduct themselves not only according to the law and the interests of their shareholders, but also in a moral manner over and above what the law stipulates, and in keeping with the moral codes of those audiences.
The Facebook Code
We are coming into a time, then, where companies can no longer afford to be morally ambiguous. A company that fails to articulate its own moral code of conduct will have one, two, or even several articulated for it, created by online publics using behavioral exegesis: “ah, see, their privacy controls are weak, so they put profits ahead of user interests.”
This is the situation Facebook finds itself in today. It has been much clearer and open about a code of conduct for its users than it has been for itself. We cannot count on Facebook to do the right thing in China, not because we think Facebook, its officers, and investors are necessarily bad, but because we have been given no reason to believe that they will be good when presented with the kinds of moral quandaries they can expect in China. All we can look at is whether they have transgressed somebody’s moral code in the past. Have they? Yup. Uh-oh.
Until and unless Facebook articulates and promulgates a global code of conduct that applies to itself and all of its employees, that sets forth non-negotiable principles that can cover a wide range of situations, that has the support of morally credible third parties, is subject to audit, and spells out meaningful penalties for failure to comply, Facebook’s motives and intent in morally challenging situations will be suspect. And we will, of course, suspect the worst.
Reconnoiter the Moral High Ground
None of this is meaningful unless Facebook understands the full scope of the potential moral quandaries it will face in China and addresses them right now. While it is drafting its moral code, and perhaps before, Facebook needs to conduct a due diligence process to assess not just the financial or regulatory risks implicit in doing business in China (though one would hope it is assessing those as well), but also the moral risks.
Those risks are not limited to what the government is likely to demand of Facebook, both initially and in the future, in return for the right to do business in China, and the potential consequences (both in China and to the business as a whole) of those compromises. They also include the moral hazards implicit in operating in a business environment with widespread corruption, and the ways in which the behavior, background, and associations of Facebook's potential partners might cast the company in a morally unfavorable light.
If you value your money, you assess financial risk. If you value your right to conduct business, you evaluate regulatory risk. And if you value what is right, you assess your moral risk. I have to believe Facebook values all of those things.
The People Factor
Indeed, Facebook's own people may believe in those things strongly enough that they will not allow the company to leap into China – that's how I read what Bill Bishop wrote in this Digicha post on Thursday. If the company does make the decision to come here, however, what will determine the propriety of Facebook's actions will not be codes of conduct or due diligence, but the behavior and scruples of the individuals in the Facebook China enterprise.
This is a significant challenge. It is difficult enough to identify, hire, and retain individuals with strong work habits and technical skills in China’s hyper-competitive and talent-constrained labor market. It is even more challenging to find people with the requisite talents and finely-tuned moral compasses.
Yet these are precisely the kinds of people Facebook will need most, and these will be the people whose decisions, more than those of Mark Zuckerberg or Sheryl Sandberg, will determine whether Facebook operates with ethical rectitude in China. This should guide not only Facebook's hiring decisions, but also its choices about whom to partner with, and how much to trust those partners with the human resources task.
Because in the end, a company's moral standing is no greater and no less than that of its most morally challenged employee. Facebook should get that: it has always been about the people.
One Final Note
I feel compelled at this point to write a brief postscript to this series of articles.
There was once an old captain who lived in a small but neat house near the mouth of the Congo River in what was then called Leopoldville. When he wasn't off on the business of The Company, he would spend his quiet days and evenings along the docks near the river, watching boats go upstream, watching fewer come back down, and seeing in the flotsam reminders of his own lessons on Africa's mightiest current.
One evening down at the docks, he spied a new, elegantly fitted-out, beautifully varnished yacht tied up alongside. He went into his usual watering hole to find a party of equally-well fitted-out Englishmen at the tables, clearly pausing for an evening of revelry before continuing upriver. He walked up to the Englishman at the head of the table and, introducing himself, asked if that was his vessel outside.
“Yes it is,” beamed the Briton. “She’s a beauty, isn’t she?”
“She is,” replied the captain. “And she’ll swamp at the bend below the first rapids. Her beam is too narrow, she draws too much water, and her bows are all wrong. With respect, sir, please find yourself another vessel before continuing upstream.”
There was a brief silence at the table, and then one of the men further down jumped up and berated the captain. "Now see here, sir: do you know to whom you speak? This man, Sir –––, is a Fellow of the Royal Geographic Society, and has been up every river in Europe, plus the Nile, the Amazon, and the Mississippi. We here are the most experienced river crew in the world. How dare you tell him his vessel is not ready."
The old captain smiled, tipped his hat to the table, apologized, and bade them a safe journey.
Not three days later, the captain spied a varnished plank of wood and a torn Union Jack drifting downstream in the current. The sight gave him no happiness.
Safe journey, Facebook, whatever you decide.
- Nine Things Facebook Must Do to Better its Chances in China (siliconhutong.com)
- Ethical Implications of Corporations (ethicalrealism.wordpress.com) – This is a superb post that I came across as I wrote this one, and it encouraged me to dig into the question of whether or not a company can, in fact, be a moral entity and not just a legal one. Stay tuned.