Monday, January 31, 2011

Of Egypt and Complex Systems

By Dr. Suhaib Riaz

In hindsight, perhaps the “Butterfly Effect” metaphor I used for the Tunisian story was more apt than I imagined. The effect has spread to other countries in the region, most prominently Egypt, and all stakeholders, including businesses, are scrambling to understand the implications, as reflected in recent stock drops and oil price increases.

A time like this is perhaps the best illustration of the unfortunate lack of “complex systems” thinking that characterizes most policy makers and strategists. There is much to quote here and then reflect upon in the context of the current political and social crises. For starters, let me quote MIT's John Sterman to give a hint of the problem:
“As the world changes ever faster, thoughtful leaders increasingly recognize that we are not only failing to solve the persistent problems we face, but are in fact causing them. All too often, well-intentioned efforts to solve pressing problems create unanticipated ‘side effects.’ Our decisions provoke reactions we did not foresee. Today’s solutions become tomorrow’s problems. The result is policy resistance, the tendency for interventions to be defeated by the response of the system to the intervention itself.”
In other words, as Sterman says later, there are no “side effects”: these are simply the result of our inability to see the complex nature, the non-linear dynamics, of the system. He lists several interesting examples of this problem, to which the case of Egypt and much of that region will surely be a perfect addition in the future.
“From California’s failed electricity reforms, to road building programs that create suburban sprawl and actually increase traffic congestion, to pathogens that evolve resistance to antibiotics, our best efforts to solve problems often make them worse.” 
For the policy makers across the world caught in the confusion of the Egypt crisis, is there any doubt that a lack of complex systems thinking led to the support for political elites with heavily concentrated power, such as Hosni Mubarak, who ruled for decades? Little thought was given to how such support created non-linear and unpredictable reactions and outcomes, including radicalisms of various sorts which come back to haunt us. Let me throw in another pertinent quote to help reflect further, from the late biologist Lewis Thomas (1974):
“When you are confronted by any complex social system, such as an urban center or a hamster, with things about it that you’re dissatisfied with and anxious to fix, you cannot just step in and set about fixing with much hope of helping. This realization is one of the sore discouragements of our century…You cannot meddle with one part of a complex system from the outside without the almost certain risk of setting off disastrous events that you hadn’t counted on in other, remote parts. If you want to fix something you are first obliged to understand…the whole system…Intervening is a way of causing trouble.” 
This problem of reductionist thinking, where our mental models consist of linear cause-effect events relating simple parts of a system, permeates management research to a large extent. Instead of understanding the whole, we increasingly focus on analyzing and understanding the parts (whether resources, as in the resource-based view, or transactions, as in transaction cost theory). But what if the properties of the whole differ from the sum of the properties of the parts, as is characteristic of complex systems?
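A classic illustration of a whole whose properties cannot be read off from its parts is Thomas Schelling's segregation model. The sketch below is my own illustrative addition, not anything from the post: a minimal one-dimensional variant in which agents have only a mild dissatisfaction rule (unhappy only when neither ring neighbor shares their type), yet the population as a whole drifts toward clustering that no individual agent intended.

```python
import random

def same_fraction(ring, i):
    # fraction of agent i's two ring neighbours that share its type
    n = len(ring)
    same = sum(1 for j in ((i - 1) % n, (i + 1) % n) if ring[j] == ring[i])
    return same / 2.0

def average_same_fraction(ring):
    # population-level clustering measure: mean same-type neighbour fraction
    return sum(same_fraction(ring, i) for i in range(len(ring))) / len(ring)

def run(n=100, steps=2000, seed=42):
    # half the agents are type +1, half type -1, shuffled onto a ring
    rng = random.Random(seed)
    ring = [1] * (n // 2) + [-1] * (n // 2)
    rng.shuffle(ring)
    before = average_same_fraction(ring)
    for _ in range(steps):
        i, j = rng.randrange(n), rng.randrange(n)
        # two dissatisfied agents of opposite types trade places
        if (ring[i] != ring[j]
                and same_fraction(ring, i) < 0.5
                and same_fraction(ring, j) < 0.5):
            ring[i], ring[j] = ring[j], ring[i]
    return before, average_same_fraction(ring), ring
```

Each swap is purely local and individually reasonable, yet every executed swap raises the global clustering measure, so the aggregate outcome (segregated blocks) is a property of the system, not of any single agent's intention.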

There are interesting possibilities for research here on business and strategy problems. One example, related to some of my work on the global financial crisis, is to see the issues in the banking and finance industry as part of a complex system, as in an article by Andrew Haldane and Robert May in the latest issue of Nature (yes, business research does get published there!).

It requires a whole new way of thinking, one which is difficult and often counter-intuitive for our mental models.

Thursday, January 20, 2011

Tunisia's Butterfly Effect

By Dr. Suhaib Riaz

A curious social phenomenon unfolded in Tunisia over the past few weeks. A country that was hardly in the news for political instability suddenly saw a mass street revolution resulting in the ouster of its political elite. From a social science perspective, there are two things worth noting here. The first is the obvious lack of predictability. The second is the sequence of events, which is now well documented: it all started with one unemployed street hawker’s frustration, which first caught the attention of a few other individuals and relatives, and soon spun out of control into the capital and across the nation.

If ever there was a demonstration of the “butterfly effect”, this certainly is one. Crudely put, the argument goes that a butterfly fluttering its wings in one part of the world can ultimately set off a storm in another part. It sounds incredible, until one encounters the results of Ed Lorenz, who discovered the phenomenon by accident. A minor decimal-place error, introduced when he used a printout to re-enter values into a very basic computer simulation of the earth’s weather, resulted in massive irregularities over time. Just as with the Tunisian revolution, the sequence of events showed an extreme sensitivity to initial conditions: one decimal-place rounding, or one individual, set off the whole sequence. Others might have ignored the error, but Lorenz saw in the resulting irregularities the possibilities of a new science. As James Gleick puts it: where chaos begins, classical science stops.
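Lorenz's accident is easy to reproduce in a few lines. The sketch below is an illustrative addition of my own (using the standard textbook parameters sigma = 10, rho = 28, beta = 8/3, and a simple Euler integration rather than anything Lorenz actually ran): two simulations whose starting points differ by one part in a hundred million end up macroscopically far apart.

```python
def lorenz_step(state, dt, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # one explicit-Euler step of the Lorenz equations
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

def evolve(state, steps=10_000, dt=0.005):
    # integrate forward for steps * dt units of model time
    for _ in range(steps):
        state = lorenz_step(state, dt)
    return state

def distance(a, b):
    # Euclidean distance between two states
    return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
```

Starting `evolve((1.0, 1.0, 1.0))` and `evolve((1.0, 1.0, 1.0 + 1e-8))` side by side, both trajectories stay bounded on the attractor, yet the tiny initial gap is amplified by many orders of magnitude: exactly the sensitivity to initial conditions that Lorenz's rounding error revealed.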

What are the implications for strategy research? We are forever caught in a business world where irregularities, and therefore lack of predictability, are the nature of the phenomena. Who could have possibly predicted where two college kids tinkering with computer code as a hobby would take the world (Microsoft); or how a dropout would see amazing success, get fired, come back, and take the company on another round of success (Apple)? For that matter, who can predict what the sick leave announced this week will lead to? The examples in the online world – Facebook, YouTube, etc. – are even more astounding. Yet irregularities and outliers are systematically ignored in most mainstream research. We seem to be in a strange dilemma: what is predictable is not interesting; what is interesting is not predictable, and hence not useful. Could the “new” science of chaos have something to add to our knowledge in the strategic management world, where products, leaders, firms, industries, and so on all seem to emerge in highly irregular patterns with little predictability in the sense of classical statistics? There are the odd works that touch upon this, but the area is waiting for more.

Which brings us back to Tunisia's Butterfly Effect. Now that it has happened, can we say something about other similar contexts? Already, there have been attempts to emulate the Tunisian example and set off a similar effect in other countries in the region, yet none has had the same impact. But then, if it were easily replicable (predictable), it wouldn’t be as interesting, would it?