Chaos Control

As Linux demonstrates, one way to ensure coherence in a complex world is to encourage variety.

Whenever we forge new strategies, devise new policies or create new services, we are dabbling in the world of complex adaptive systems. Authors Robert Axelrod and Michael D Cohen define a complex adaptive system as one in which a single action - such as putting up a Web site - can lead to unforeseen, even unpredictable consequences.

Their new book, Harnessing Complexity: Organisational Implications of a Scientific Frontier (The Free Press, 1999), aims to help readers understand how to design organisations and strategies in a complex environment. Borrowing ideas from scientific and biological research, in which three factors - variation, interaction and selection - shape complex environments, the authors have devised a framework that guides readers to pose new questions and ponder new possibilities when confronted with complexity.

Why do the authors feel now is a particularly appropriate time to apply complexity theory to business? It all has to do with the information revolution. Given the number of people and organisations that interact in a networked economy, business - and indeed daily life - is becoming ever less predictable. The pace and scope of adaptation are increasing, resulting in social and organisational upheaval. In the following excerpt, the authors use the development of Linux as an example of variation. The question they ponder: When can an organisation or population do better with more variety as opposed to uniformity? The answer revolves around the principle of exploration versus exploitation - a trade-off between creating untested strategies that may be superior to what already exists and copying proven ones.
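
The exploration-versus-exploitation trade-off can be made concrete with a toy decision problem. The sketch below is purely illustrative and not drawn from the book: the three candidate strategies, their payoff rates and the epsilon parameter are invented for the purpose. It compares an agent that always copies its best-known strategy with one that occasionally tries an untested alternative.

```python
import random

# Hypothetical strategies with unknown payoff rates (values assumed for illustration).
TRUE_PAYOFFS = [0.3, 0.5, 0.8]   # the best strategy (0.8) starts out untested

def run(epsilon, rounds=10_000):
    """With probability epsilon explore a random strategy; otherwise
    exploit the strategy with the best observed average payoff."""
    totals = [0.0] * len(TRUE_PAYOFFS)
    counts = [0] * len(TRUE_PAYOFFS)
    reward = 0.0
    for _ in range(rounds):
        if random.random() < epsilon or sum(counts) == 0:
            choice = random.randrange(len(TRUE_PAYOFFS))        # explore
        else:
            averages = [t / c if c else 0.0 for t, c in zip(totals, counts)]
            choice = averages.index(max(averages))              # exploit
        payoff = random.random() < TRUE_PAYOFFS[choice]         # noisy outcome
        totals[choice] += payoff
        counts[choice] += 1
        reward += payoff
    return reward / rounds

print("pure exploitation:", run(epsilon=0.0))   # often locks onto a mediocre strategy
print("some exploration: ", run(epsilon=0.1))   # usually discovers the better strategy
```

A small amount of exploration costs something on every round, yet over many rounds it tends to earn more than pure copying, because it occasionally stumbles on the superior, untested strategy.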

We can illustrate the conditions that favour encouraging variety by considering the striking case of the Linux computer operating system and the method used to organise the work of its developers. The method is known as open source software development. This form of software development has been thrown into the limelight by the spectacular growth of Linux, which has become, in certain key areas of application, a serious competitor to operating systems developed and sold by major corporations such as Microsoft, Sun Microsystems and IBM.

This is a very surprising turn of events since Linux is given away free by its developers. There is a natural presumption that free software cannot be as reliable as for-profit software. Yet it is precisely for situations demanding high reliability that Linux has found its strongest support.

The surprise deepens with the observation that Linux is not only free but also the handiwork of an enormous, worldwide cadre of unpaid volunteers. By some estimates, Linux is the result of contributions from many thousands of programmers. A computer operating system is one of the most intricate of human creations. This number of cooks would seem more than sufficient to spoil the soup. How could thousands of scattered volunteers make an operating system that is more reliable (and faster running) than those created by dozens, or hundreds, of highly talented programmers working full time for renowned corporations?

Considering the development of Linux as a Complex Adaptive System casts light on some important components of the explanation. We can begin by pointing out that Linux is not the only example of the open source approach to software development. There are many earlier examples, such as the scripting language Perl and the e-mail server Sendmail. The most widely deployed software for serving up requested World Wide Web pages, a system known as Apache, is also the product of volunteers working together in an open source framework. What all the examples have in common is the free availability of the source code, the human readable computer instructions that specify the program. That arrangement provides the generic label for this approach to team software creation: open source software development.

The free access to the source code of Linux means that any programmer with sufficient motivation can craft changes to the code, creating a new version of the program. This is not possible in traditional development with proprietary code. From a Complex Adaptive Systems point of view, the possibility for volunteers to create working variants increases massively the variety of the population of operating systems. In successful open source cases such as Linux, that variety has been harnessed to yield a very effective result, although many observers expected chaos to result from the rapid injection of many potentially incompatible variants.

Our framework points to several structural arrangements that work to make the added exploration beneficial, averting the prospect of death by eternal boiling. [Eternal boiling occurs when the level of mutation is so high that a system remains in a permanent state of disorder.] In our terms, when a programmer modifies the source code of Linux, this activity is an endogenously triggered recombination. The trigger is usually an observation of some particular kind of poor performance by the existing standard version of the operating system.

The affected user may make an electronic request for help from the large Linux community. Interested individuals respond by suggesting fixes. These small pieces of new code are recombined with the rest of the standard version to produce new variant versions. A period of testing and discussion of the performance of the variants follows. Eventually the best-performing variant is accepted by the small team of key Linux developers, who incorporate the new code into a subsequent standard version of Linux.
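
Read schematically, the cycle just described - a problem triggers variant versions, the variants are tested, and a small central team folds only the best one into the next standard release - is a selection loop. The sketch below is our own toy rendering of that loop, not a description of how Linux maintainers actually work; the quality score, the number of volunteers and the acceptance rule are all invented for illustration.

```python
import random

def quality(version):
    """Stand-in test harness: score a version (higher is better).
    In reality this is the community's period of testing and discussion."""
    return sum(version)

def propose_variants(standard, n_volunteers=20):
    """Each interested volunteer recombines a small patch with the standard version."""
    variants = []
    for _ in range(n_volunteers):
        variant = list(standard)
        spot = random.randrange(len(variant))
        variant[spot] += random.gauss(0, 1)        # a small, localised change
        variants.append(variant)
    return variants

def release_cycle(standard, generations=50):
    """The central maintainers accept a variant only if it beats the standard."""
    for _ in range(generations):
        best = max(propose_variants(standard), key=quality)
        if quality(best) > quality(standard):      # the single gatekeeping step
            standard = best                        # becomes the next official release
    return standard

initial = [0.0] * 10                               # a toy "operating system"
improved = release_cycle(initial)
print("quality before:", quality(initial), "after:", round(quality(improved), 2))
```

If the gatekeeping step were removed and every proposed variant adopted regardless of how it tested, the toy system would simply wander at random - the "eternal boiling" the authors warn about - which is why the acceptance decision matters as much as the volume of proposals.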

When open source development prospers, a central reason seems to be that, as Eric Raymond says, "given enough eyeballs, all bugs are shallow". For Linux, there are certainly enough volunteers. Equally important, the communication among volunteers about triggering problems and proposed alternatives is precise enough that multiple plausible variants are routinely generated as possible solutions to most problems that bother users. In addition, testing of alternatives is reliable enough that the code that wins out is generally very good code, with unwanted side effects being rare. Thus the variety made possible by the free availability is marshalled to produce a rapid rate of improvement in overall quality. By inquiring a bit further into how this is accomplished, we can uncover some clues about when an open source approach is likely to work well - and when it is not.
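
Raymond's aphorism can also be given a rough, back-of-the-envelope reading. Suppose, purely for illustration, that each reviewer overlooks a given defect 70 per cent of the time and that reviewers err independently (real reviewers do not, so treat the figures as a caricature). The chance that the defect slips past everyone then shrinks geometrically with the number of eyeballs:

```python
# Illustrative only: the 0.7 miss probability and the independence assumption are ours.
p_miss = 0.7
for reviewers in (1, 5, 20, 100):
    print(reviewers, "reviewers -> bug survives with probability", round(p_miss ** reviewers, 6))
# 1 -> 0.7, 5 -> about 0.17, 20 -> about 0.0008, 100 -> effectively zero
```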

A crucial fact is that there are two types of Linux versions: standard and variant. The few central managers of the Linux community, led by the originator of the operating system, Linus Torvalds, retain the right to label versions of the system as official releases. Each new official release creates another "standard Linux", and millions of digitally perfect copies are made of it.

This control over the definition of the next generation of the operating system is strikingly analogous to a biological mechanism seen in the emergence of multicellular organisms: sequestration of the germ line. This is a restriction of reproductive activity to a few specialised cells, while the vast majority of cells in the organism no longer participate in creating the next generation. In both cases, limiting "reproduction" to a tiny fraction of all the agents reduces the chaotic inconsistency that would follow if all variants had equal opportunity to shape the future.

In the Linux case, the centralised control of changes in the standard code makes higher levels of variety in the proposed changes sustainable, so that the "law of sufficient eyeballs" can come advantageously into play. In the biological case, the restriction functions to alter the evolutionary "incentives" of cells making up the organism. Over succeeding generations their strategies will be far more likely to be those that let them prosper as a "team" rather than those that benefit individual cells at the expense of the others. Analogous incentives are created for the programmers in the Linux case.

Numerous experiments are being undertaken in an effort to imitate the striking success that the open source approach achieved in the Linux case. As usual, we do not claim to be able to predict the success and failure of particular efforts. But Complex Adaptive Systems principles do suggest a number of key questions to ask when contemplating an open source software project. As we have seen, variety is the engine of rapid quality improvement in an open source initiative.

We can see that Linux has at least three, and perhaps all four, of the conditions favouring exploration.

1. Problems that are long-term or widespread.

In contrast to computer hardware and applications programs (such as Web browsers), operating systems are among the longest living elements of the computational world. Unix - of which Linux is a free version - dates back to 1969. It runs on mainframe and minicomputer architectures that long predate the microcomputers that now cover the earth. Thus, an improvement to an operating system is likely to bear fruit over a very long period (as time is measured in the strange universe of computing, with its Moore's Law of doubled computer power every 18 months). Another example of open source development, the Apache Web server, also seems to occupy a functional niche where improvements can be expected to have long service. In addition, the gains from any improvement in a standard version of an operating system can benefit thousands or even millions of users, providing widespread benefits.

2. Problems that provide fast, reliable feedback.

Linux exhibits this characteristic as well. In its typical role in server environments, its features are exercised at very high rates, and defects become evident quickly. Moreover, open source distribution means that every contributor of a proposed variant can make a completely functional new version that can be tested locally. This further increases the rate of feedback. And finally, the quality of proposed variants can be assessed with relatively high reliability. Speed of operation and resistance to crashes are highly valued criteria across the entire community of Linux developers. Disagreements do occur over how these should be measured, and other criteria are also important. But when compared with other software areas, such as user interface design, the appropriate performance metrics are relatively clear.

3. Problems with low risk of catastrophe from exploration.

Various parts of a software system have high levels of interdependence. When there is a premium on speed, as there is for many operating systems, there is a strong temptation to increase even further the interdependencies among modules. This can create a substantial risk of catastrophe. But Unix, from which Linux derives, has long been a partial exception to this tendency. In the Unix-Linux culture there is a well-developed philosophy of modular isolation. A key component, called the kernel, is optimised for speed. But the numerous other components are expected to honour a different set of constraints. There, interdependence among components is governed by strict principles of modularisation that severely limit side effects that any activity might have on other activities. Speed is also important outside the kernel but has to be found within the architectural constraints that give primacy to crash resistance, thus lowering the chances of catastrophic consequences from exploration.
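
The modularisation principle can be illustrated in the abstract. The sketch below is our own schematic, not actual Linux or Unix code: the point is simply that when an experimental component can reach shared state only through a narrow, checked interface, a defective variant is contained at the boundary rather than corrupting the rest of the system.

```python
class Kernel:
    """Speed-critical core. Other components may touch it only through a
    narrow interface, which checks its inputs at the module boundary."""
    def __init__(self):
        self._clock = 0               # private state, invisible to other modules

    def advance_clock(self, ticks):
        if not isinstance(ticks, int) or ticks < 0:
            raise ValueError("rejected at the module boundary")
        self._clock += ticks

    @property
    def clock(self):
        return self._clock


class ExperimentalScheduler:
    """A variant component under exploration. Even if it is buggy it cannot
    reach into the kernel's internals; a bad value is refused at the
    interface instead of silently corrupting shared state."""
    def __init__(self, kernel):
        self.kernel = kernel

    def run_buggy_step(self):
        try:
            self.kernel.advance_clock(-5)          # a defective proposal
        except ValueError:
            pass                                    # the failure stays local


kernel = Kernel()
ExperimentalScheduler(kernel).run_buggy_step()
print(kernel.clock)                                 # still 0: the defect did not spread
```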

4. Problems that have looming disasters.

This last factor favouring exploration is not a property of Linux open source software development but rather of the motivation of some of the developers. Among those who have made major contributions to Linux are many who feared the extinction of the Unix operating system family, in which they have invested their expertise. They also feared the rising hegemony of operating systems from Microsoft. For them, joining a relatively high-risk, exploration-maximising software project may have been an attractive alternative to domination by what they often call "The Beast from Redmond".

Taken together, our four conditions show Linux to be a development project for which it is highly promising to strongly encourage variety. It does not follow that open availability of source code is a form of magic that will cause all software projects to prosper as Linux has. Indeed, our analysis suggests that in order for the decentralised generation of proposals to be effective for Linux, several other conditions were important. In particular, Linux development benefited from the ability to identify specific problems, make accurate copies of the current system with only deliberately introduced changes, evaluate the effectiveness of proposed solutions, and centrally control the choice of which proposals are implemented as changes in the standard version. It remains to be seen just how widely applicable the decentralised generation of alternatives can be. But open source software development clearly demonstrates that even very large and highly structured systems, like Linux, can benefit from the encouragement of variety.

Robert Axelrod is a professor of political science and public policy at the University of Michigan. Michael D Cohen is a professor of information and public policy, also at Michigan.
