American Programmer, Volume X, No. 1; January 1997
Copyright 1997 Cutter Information Corp. All rights reserved. No part of this document may be
reproduced in any manner without express written permission from Cutter Information Corp.


by Jim Highsmith

Copyright 1997 by Jim Highsmith. All rights reserved.


Requirements definition needs to be viewed within the overall context of a development life cycle, and even more importantly, an overall development philosophy -- our mental model of how the world works. And that view is changing.

This changing view was illustrated by a Harvard Business Review article in which Brian Arthur culminated more than two decades of trying to convince mainstream economists that their world view, dominated by fundamental assumptions of decreasing returns, equilibrium, and deterministic dynamics, was no longer sufficient to understand reality. The new world is one of increasing returns, instability, and inability to determine cause and effect:

The two worlds . . . differ in behavior, style, and culture. They call for different management techniques, different strategies. . . . They call for different understanding [1].

The software development community has a similar dichotomy. One is represented by the more traditional deterministic development, derived from management practices rooted in nineteenth-century Newtonian physics of stability and predictability -- or in Arthur's terms, decreasing returns. As more industries move from decreasing to increasing return environments, traditional software management practices will be inadequate to meet the challenge. This article is about the second world -- unpredictable, nonlinear, and fast. These industries, such as the Internet, are like the start of a motocross race -- 50 high-powered motorcycles at the starting line and only room for three or four at the first turn 50 yards away. Traditional practices put your bike at the back of the pack, or out of the race altogether.

This article offers a different framework -- adaptive software development -- to address the issues of this second world. From a conceptual perspective, adaptive software development (ASD) is based on complex adaptive systems (CAS) theory, which Brian Arthur and his colleagues at the Santa Fe Institute have used to revolutionize the understanding of physics, biology, evolution, and economics. It is rooted in agents, self-organization, and emergent outcomes. From a practical perspective, the ASD framework is based on years of experience with traditional software development methodologies; consulting on, practicing, and writing about rapid application development (RAD) techniques; and working with high-technology software companies on managing their product development practices.

Recent articles and conference panels have illustrated these two worlds of software development, contrasting the Software Engineering Institute (SEI) method and the "Microsoft process" (there hasn't been a consensus on what to call this alternative). I would submit that the SEI approach is an example of the deterministic approach, and the Microsoft process is an example of an adaptive development approach.


Given the broad scope of CAS theory, this article can only highlight two concepts of special relevance.


A growing percentage of software product development projects are so complex that the outcomes are inherently unpredictable. And yet, successful products emerge from such environments all the time. So what is happening? One of the most difficult mental model changes in adaptive development is understanding that while the direct linkage between cause and effect is broken, there is an equally strong replacement.

The replacement is emergence, illustrated by a fascinating chapter in complex systems research. In the mid-1980s, Craig Reynolds created a computer simulation to capture the essence of the flocking behavior of birds. [Footnote 1 -- To view the "boid" simulation, look in the Yahoo Web search under Science: Artificial Life.] Each "boid" in the simulation followed three simple rules of behavior:

1. It tried to maintain a minimum distance from other objects in the environment, including other boids.

2. It tried to match velocities with boids in its neighborhood.

3. It tried to move toward the perceived center of mass of boids in its neighborhood [5].

Note the absence of any group rules of behavior. There are no algorithms defining the results expected from the group, only rules about the behavior of individual boids. Yet the behavior, an emergent behavior, is one of flocking like birds. Groups of boids flow over the screen landscape. Errant boids rush to catch up. The flock splits around obstacles and reforms on the other side.
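Reynolds's three rules translate almost directly into code. The following is a minimal sketch (my own illustration; the class, helper names, and parameter values are assumptions, not Reynolds's originals). Note that only per-boid behavior is coded -- there is no flock-level logic anywhere -- yet running it produces coherent group motion:

```python
import math
import random

class Boid:
    """A single agent with position and velocity; it knows no group rules."""
    def __init__(self, x, y, vx, vy):
        self.x, self.y, self.vx, self.vy = x, y, vx, vy

def neighbors(boid, flock, radius):
    """Boids within `radius` of `boid`, excluding itself."""
    return [b for b in flock if b is not boid
            and math.hypot(b.x - boid.x, b.y - boid.y) < radius]

def step(flock, radius=10.0, min_dist=2.0,
         sep_w=0.05, align_w=0.05, cohere_w=0.01):
    """Advance every boid one tick using only the three local rules."""
    updates = []
    for b in flock:
        near = neighbors(b, flock, radius)
        ax = ay = 0.0
        if near:
            # Rule 1: maintain a minimum distance from nearby boids.
            for n in near:
                d = math.hypot(n.x - b.x, n.y - b.y)
                if 0 < d < min_dist:
                    ax -= sep_w * (n.x - b.x) / d
                    ay -= sep_w * (n.y - b.y) / d
            # Rule 2: match the neighborhood's average velocity.
            ax += align_w * (sum(n.vx for n in near) / len(near) - b.vx)
            ay += align_w * (sum(n.vy for n in near) / len(near) - b.vy)
            # Rule 3: steer toward the neighborhood's center of mass.
            ax += cohere_w * (sum(n.x for n in near) / len(near) - b.x)
            ay += cohere_w * (sum(n.y for n in near) / len(near) - b.y)
        updates.append((b.vx + ax, b.vy + ay))
    # Apply all updates at once so every boid reacts to the same snapshot.
    for b, (vx, vy) in zip(flock, updates):
        b.vx, b.vy, b.x, b.y = vx, vy, b.x + vx, b.y + vy

random.seed(1)
flock = [Boid(random.uniform(0, 40), random.uniform(0, 40),
              random.uniform(-1, 1), random.uniform(-1, 1))
         for _ in range(25)]
for _ in range(100):
    step(flock)
```

The flocking is the emergent behavior: it lives in no single rule, only in the repeated local interactions.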

There are, then, two diametrically opposed ways of answering some of the most basic questions to be posed about organizational life. One way . . . leads to ways of managing and organizing that greatly constrain individual freedom. . . . The opposite view is based on the notion that a creative new order emerges unpredictably from spontaneous self-organization [4].

While emergence is only a part of complex adaptive systems, it may be the most important from a management perspective. The challenge is how to utilize this knowledge to enhance our ability to manage product development.


Within the software development context, complexity has more to do with the number of interacting agents and the speed with which those agents interact than with size and technological complexity. The following conception is derived from M. Mitchell Waldrop:

[Remember] that a great many independent agents are interacting with each other in a great many ways. Think of the quadrillions of chemically reacting proteins, lipids, and nucleic acids that make up a living cell [5].

For software products, the need for adaptive development arises when there are a great many independent agents (boids) -- developers, customers, vendors, competitors, stockholders -- interacting with each other, fast enough that linear cause-and-effect rules are no longer sufficient for success. Size and technological complexity are less important factors.


Just as physicists and economists have begun to deal with the unpredictability of their sciences, software developers are revising their management practices. There appear to be three broad stages in this progression, which are illustrated by focusing on the evolution of system development life cycles. The last, the adaptive life cycle, is just emerging.

Waterfall Life Cycle

For many years, the dominant software life cycle was the waterfall. It is characterized by linearity and predictability, with a modicum of feedback thrown in for good measure. The waterfall approach, illustrated in Figure 1, produced the bulk of legacy systems in today's organizations.

Figure 1: The waterfall life cycle.

Evolutionary or Spiral Life Cycle

Since the mid-1980s, the evolutionary life cycle has emerged, based on the pioneering work of Tom Gilb and Barry Boehm (see Figure 2). (For the purposes of this article, I won't differentiate between the two.) While the evolutionary model has moved into the mainstream, many practitioners have not changed their deterministic mind-set. Long-term predictability has been abandoned for short-term predictability, but it is predictability nonetheless. For example, Tom Gilb's recent work entails detailed component planning and great precision in specifying requirements. Some practices, such as RAD, utilize evolutionary life cycles in less deterministic ways.

Figure 2: The evolutionary life cycle.

The Adaptive Life Cycle

The adaptive model is built on a different world view. While cyclical like the evolutionary model, the phase names reflect the unpredictable realm of increasingly complex systems (see Figure 3). Adaptive development goes further than its evolutionary heritage in two key ways. First, it explicitly replaces determinism with emergence. Second, it goes beyond a change in life cycle to a deeper change in management style. The difference can be subtle. For example, as the environment changes, those using a deterministic model would look for a new set of cause-and-effect rules, while those using the adaptive model know there are no such rules to find.

Figure 3: The adaptive cycle.



In complex environments, planning is a paradox. According to CAS theory, outcomes are unpredictable. Yet wandering around, endlessly experimenting with what a product should look like, is not likely to lead to profitability either. "Planning," whether applied to overall product specifications or detailed project management tasks, is too deterministic a word. It carries too much historical baggage. "Speculate" is offered as a replacement.

To speculate is still to define a mission to the best of our ability. (I use "mission" as a summary term for objectives, vision, goals, and outline requirements.) Speculation simply admits the obvious: in some important dimensions of our mission statements, we are more than likely wrong. Whether we misread our customers' needs, or technology changes, or competitors come out with a better mousetrap, the probability of mistakes is high. So let's be honest, postulate a general idea of where we are going, and put mechanisms in place to adapt. In a complex environment, following a plan produces the product you intended -- just not the product you need.


Managing in a complex environment is scary as hell -- it is also a blast. If we can't predict (plan), then we can't control in the traditional management sense. If we can't control, then a significant set of current management practices is no longer operable, or more specifically, only operable for those parts of the development process that are predictable.

Collaboration, in this context, portrays a balance between managing the doing (the main thrust of traditional management) and creating and maintaining the collaborative environment needed for emergence. As projects become increasingly complex, the balance swings much more toward the latter. At times it seems almost mystical, but in field after field, from physics to cellular automata to some of my clients' projects, emergence has, well, . . . emerged [4, 5]. We have all experienced emergent results on some special project, but it somehow seemed nearly accidental, not something to count on in a crunch. CAS provides some comfort that it is not accidental.

For a project manager, maintaining this balance means two things. First, he or she has to decide which parts of the project are predictable. For example, we can predict that without appropriate configuration control procedures, a software project of any size can implode. For parts that are unpredictable, he or she has to establish an environment in which the wild and wonderful properties of emergence -- basically open, collaborative, messy, exciting, diverse, anxiety-ridden, and emotion-laden -- can exist.

Unfortunately, at least for some people, CAS postulates certain conditions for emergent behavior. The most important is that it happens at the edge of chaos. Whether in physics, biology, business, or human behavior, it appears that there are three broad categories of environments -- stable, unstable (or chaotic), and a transition zone labeled the "edge of chaos."

The edge of chaos is the constantly shifting battle zone between stagnation and anarchy, the one place where a complex system can be spontaneous, adaptive, and alive [5].

In human terms, being in chaos is analogous to being psychotic. So the trick is to lead a project team away from the familiar and the stable toward chaos, but not all the way. Success comes to those who can hold anxiety, who can attune themselves to paradox and uncertainty. Innovation, creativity, and emergent results are born in the transition zone at the edge of chaos.


Collaborative activities build products. Learning activities expose those products to a variety of stakeholders to ascertain value. Customer focus groups, technical reviews, beta testing, and postmortems are all practices that expose results to scrutiny.

Learning, according to the dictionary, is gaining mastery through experience. In an adaptive environment, learning challenges all stakeholders, including both developers and customers, to examine their assumptions and use the results of each development cycle to learn the direction of the next. The cycles need to be short, so teams can learn from small rather than large mistakes. They also need to be double-loop, so teams learn both about product changes and more fundamental changes in underlying assumptions about how the products are being developed.

Speculate -- Collaborate -- Learn

If you examine the speculate -- collaborate -- learn cycle, even briefly, it becomes obvious that the three stages overlap. It is difficult to collaborate without learning or to learn without collaborating. They are purposely messy, nonlinear, overlapping terms -- how could terms that describe an adaptive framework be otherwise?
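The overlapping cycle can be made concrete with a toy loop (entirely my own sketch, with hypothetical function names, not a prescription from this article). The point it illustrates is the double loop: each cycle revises the mission itself, not merely the task list beneath it.

```python
def adaptive_cycle(mission, build, review, adapt, cycles=4):
    """Toy model of speculate-collaborate-learn; all names are illustrative."""
    product = None
    for i in range(cycles):
        speculation = mission(i)               # speculate: a best guess, expected to be partly wrong
        product = build(speculation, product)  # collaborate: build on the prior cycle's result
        feedback = review(product)             # learn: expose the result to stakeholders
        mission = adapt(mission, feedback)     # double loop: revise the mission, not just the tasks
    return product

# A trivial run: each cycle appends its speculation. Here "adapt" leaves the
# mission unchanged; in practice it would rewrite the underlying assumptions.
result = adaptive_cycle(
    mission=lambda i: f"guess-{i}",
    build=lambda spec, prev: (prev or []) + [spec],
    review=len,
    adapt=lambda m, fb: m,
)
```

A deterministic life cycle would hold `mission` fixed and optimize only `build`; the adaptive version treats the mission as just another output of learning.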

For many project leaders and project teams, adaptive development is a terrifying prospect. First, we knock away the foundation pillar of cause and effect, so we can't say for sure what needs to be done next. Next, we hold the team to deliverable goals but admit we don't know exactly what those deliverables are. Then, when the team gets anxious about all the "seemingly inefficient" groping for solutions (a consequence of the needed diversity and multiple interactions), we have to say that high anxiety is part of the new game, and it won't go away. And finally, when a successful product emerges, to many it seems almost accidental. It is not a place for the timid.

Are there organizations practicing adaptive development? I would offer Microsoft as an example of a company that excels because its management style exemplifies adaptive development. For all the techniques and practices discussed in books like Microsoft Secrets [3], I believe Microsoft prospers because at the core of these practices lies the abandonment of determinism and an embrace of the messiness of speculation -- collaboration -- learning.


The lens of ASD offers a different perspective on software management practices. The following is a brief examination of two of these, quality and RAD, both of which have ramifications for gathering requirements.

Do It Wrong the First Time

Using our new lens, let us look at the current state of software quality management practices epitomized by the phrase "Do it right the first time." In a complex environment, "Do it right the first time" is a recipe for failure.

First, how can we predict what right means? In the early stages, if the delivery horizon isn't too far out, we may be able to speculate on the generally correct direction, but defining "right" borders on fantasy. Even if we could define right, doing it the first time makes no sense for anything but trivial products. "The first time" assumes we understand the cause and effect -- the specific algorithm for getting from our initial starting position to the final product -- and the needs of all stakeholders. It says we know it all.

Writers James Bach and Ed Yourdon have addressed this issue from the perspective of good enough software. Although Bach's ideas have raised the issue of quality's multidimensionality, his terminology set off a firestorm of reaction. "Good enough" seems to indicate a compromise -- settling for less than the best. It offends many developers whose value system tends toward the goal of perfection.

To me, "good enough" means something like "best value." In a complex environment, the combinations and permutations of value components -- scope (features, performance, defect levels), schedule, and resources -- is so vast, there can never be an optimum value. Good enough is not settling for average, it is delivering the best in a given competitive situation.

Again using Microsoft as an example, I would offer that it does not build good enough software, it builds the best software in the environment in which it competes. Conversely, if Microsoft succumbed to deterministic quality measures, it probably would not survive for long.

RAD Practices: A Step in the Right Direction

RAD practices generally involve some combination of the following [2]:

  • Evolutionary life cycle
  • Customer focus groups, JAD sessions, technical reviews
  • Timeboxed project management
  • Continuous software engineering
  • Dedicated teams with war rooms

Most RAD projects I've been associated with over the last five years have had an adaptive, emergent flavor. More recently, as my practice has focused on speculating and establishing collaborative environments, emergent characteristics have become even more pronounced.

RAD is anathema to many IT organizations. It works, but many would prefer it did not. Their defense is to label RAD as "hacking" or to relegate it to small (and by implication relatively unimportant) development projects. But Microsoft and others have produced incredibly large and complex software using techniques comparable to RAD. RAD is scary to traditional organizations because it raises questions about their fundamental world view.

RAD practices and the Microsoft process are both examples of adaptive development in action. Giving them a label (i.e., adaptive development) and realizing there is a growing body of scientific knowledge (i.e., CAS theory) that begins to help explain why they work should provide a basis for more extensive use of these practices.


Is Newtonian physics useless today? Certainly not. If the job is to design a building in San Francisco or a bridge across the Potomac, traditional physics works fine. But in quantum physics, the speed of the agents (e.g., quarks) surpasses the capabilities of Newton's theories.

Similarly, the deterministic and adaptive views of software management have complementary, not adversarial, positions. In fact, there is not a clean demarcation between two worlds but a spectrum where a blending is needed to succeed. The difficult part in utilizing both is a human one; namely, the challenge of shifting between the divergent fundamental assumptions that govern each view.

With the pace of competitive change, the reduction of product delivery cycles, and the explosion of new technology, more industries must compete in an environment of increasing returns -- where the rich get richer, and the poor scratch for survival.

Moreover, these technological webs can undergo bursts of evolutionary creativity and massive extinction events, just like biological ecosystems [5].

As businesses shift from resource-based products to knowledge-based products, software development will help determine which companies prosper in increasing return environments. That development will need to migrate toward an adaptive development-based viewpoint to meet the demands of this unstable, complex, messy new world.


References
1. Arthur, W. Brian. "Increasing Returns and the Two Worlds of Business." Harvard Business Review (July-August 1996).

2. Bayer, Sam, and Jim Highsmith. "RADical Software Development." American Programmer, Vol. 7, no. 6 (June 1994), pp. 35-42.

3. Cusumano, Michael A., and Richard W. Selby. Microsoft Secrets. New York: Free Press, 1995.

4. Stacey, Ralph D. Complexity and Creativity in Organizations. San Francisco: Berrett-Koehler, 1996.

5. Waldrop, M. Mitchell. Complexity: The Emerging Science at the Edge of Order and Chaos. New York: Simon & Schuster, 1992.

Jim Highsmith is a principal at Knowledge Structures, Inc. He has 25 years of experience as a consultant, software developer, and manager. Mr. Highsmith is the primary developer of RADical Software Development, which reflects his experience in information technology planning, software engineering, project management, and process improvement. His work has encompassed a broad range of clients, from PC software companies to large IT departments.

Prior to his association with Knowledge Structures, Mr. Highsmith held both technical and management positions with CASE, computer hardware, banking, and energy companies. He has a B.S. in electrical engineering and an M.S. in management.

Mr. Highsmith has spoken at technology conferences and user groups and has published nearly a dozen articles. This article is excerpted from his forthcoming book, which will be published by Dorset House in late 1997. Outside of work, he can usually be found in the mountains around Salt Lake City hiking, skiing, and rock climbing.

Mr. Highsmith can be reached at 1161 East 9th South, Salt Lake City, UT 84105 (801/581-9679; fax 801/581-9670; e-mail:

