The Science of Software Development

Back in 1987, before the Software Engineering Institute (SEI) had developed its capability maturity model (CMM) for software, Mark Paulk had a simple way of turning executives on to the wisdom of investing in software process improvement.

He'd start by asking: "Managers, do you think managing software projects is a good idea?" He never heard an executive disagree.

Then he'd ask: "Do you think it would be a good idea if you learned from the successes and problems that happened on other projects and applied that learning to your next project?" Sure they did.

So he'd continue: "Do you think it would make sense to measure your process and product and use that data to help drive the decisions you make? Do you think it's a good idea to continually measurably improve the way you build software?

"And by that time the religious fervour was kicking in and they'd say: ‘Yeah, verily I believe!'." Paulk -- a senior member of the technical staff at the SEI, Carnegie Mellon University in Pennsylvania -- told this to a recent Australian Information Industry Association (AIIA) forum.

Mysteriously enough, though, hardly any of the organisations those executives represented were transforming that evangelical zeal into practice. In 1989, when the SEI conducted its first survey of the industry, more than 90 per cent of software organisations had significant problems in fundamental project management disciplines.

For instance, requirements management -- the notion of having a common understanding with your customer of what the software should do -- was an Achilles' heel. Despite the widespread availability of techniques for dealing with changing requirements, including configuration management and incremental and evolutionary life cycles, projects were frequently coming adrift in the face of organisational pressure for ad hoc changes.

"The common practice was, as one bank reported, that a vice president would be walking down the hall and see a programmer and say: ‘George, I was in the shower this morning and I thought of this really neat functionality to add to the system'," Paulk says. "And George, being a responsive professional, would say: ‘I'll be happy to add a widget into the system'. Now the thing you have to realise is that there are only two ranks in the bank that I have been able to determine: clerk and vice president. So that meant there were a lot of opportunities for additional functionality to creep in on the side."

The result? Cycle times twice as long as necessary, simply because organisations were failing to build only the things they intended to build, Paulk says.

Not much had changed six years later when the Standish Group released its Chaos Study. It concluded that in 1995 US companies and government agencies would waste a staggering $US81 billion on cancelled software projects. The same organisations would spend $US59 billion more on software projects that would be completed but exceed their original time estimates. Overall, the study found the US was spending more than $US250 billion each year on IT application development, spread across roughly 175,000 projects. An astounding 31.1 per cent of projects were cancelled before they were ever completed, and 52.7 per cent cost 189 per cent of their original estimates.
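
Those headline figures are easier to grasp with a quick arithmetic check. Here is a minimal sketch in Python that uses only the numbers quoted above; the average project cost and waste share it prints are derived values, not figures from the study itself.

    # Back-of-the-envelope check of the 1995 Chaos Study figures quoted above.
    total_spend = 250e9          # annual US spend on IT application development (US$)
    projects = 175_000           # approximate number of projects per year
    cancelled_waste = 81e9       # spend on projects cancelled before completion
    overrun_spend = 59e9         # extra spend on projects delivered late or over budget

    avg_project_cost = total_spend / projects
    troubled_share = (cancelled_waste + overrun_spend) / total_spend

    print(f"Average project cost: ${avg_project_cost / 1e6:.2f} million")  # ~$1.43 million
    print(f"Spend wasted or consumed by overruns: {troubled_share:.0%}")   # ~56%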

The cause of such failures is clear: the fundamental inability of organisations to manage the software process.

Meet the Software CMM, developed as a common-sense application of process management and quality improvement concepts to software development and maintenance. Designed to apply judicious engineering and management practices to building organisational capability, the CMM seeks to provide a simple way to help organisations improve their software process.

Cast of Thousands

The SEI began developing its process maturity framework in 1986, with the help of the Mitre Corporation. By September 1987 the SEI had released a brief description of the framework and developed two methods for appraising software process maturity. It then called on literally thousands of volunteers to review the work, to ensure that organisations following it could repeatedly and consistently meet budget, schedule and functionality targets for software projects. Four years later the SEI had evolved the maturity framework into the CMM, based on knowledge acquired from software process assessments and extensive feedback from government and industry.

The SEI says the CMM is designed to help companies select process improvement strategies by first determining their current process maturity and then identifying the few issues most critical to software quality and process improvement. It presents a set of recommended practices in key areas that have been shown to enhance software process capability. It gives organisations guidance on how to master their process for developing and maintaining software and how to evolve towards a culture of software engineering and management excellence.

The essential notion is that focusing on a limited set of activities and working aggressively to achieve them helps an organisation steadily improve its organisation-wide software processes to enable continuous and lasting gains in their capability. Underpinning the CMM is an evolution from what Paulk calls "bottom-dwelling, mud-sucking" level-one organisations where software processes tend to be improvised and management is focused on fire-fighting, to mature level-five organisations which possess an organisation-wide ability for managing software development and maintenance processes.

Immature software organisations routinely exceed their schedules and budgets, based as these are on unrealistic estimates. Hard deadlines are only met at the cost of functionality and quality, and there's no objective basis for judging product quality or solving product or process problems. Review and testing are frequently curtailed or eliminated when projects fall behind schedule.

Contrast this with the profile of the mature software organisation, characterised by its organisation-wide ability to manage software development and maintenance processes. Mature organisations accurately communicate the software process to staff and ensure work activities are carried out to plan. They guarantee mandated processes are usable and consistent with the way the work actually gets done, updated when necessary, and refined through controlled pilot-tests and/or cost-benefit analyses. Project roles and responsibilities are clearly spelt out.

"A decade ago we had arguments about whether it was really possible to manage software projects. That they were really this artistic, innovative, creative endeavour. That you shouldn't try to constrain the technical folk too much, because they needed to solve the really large projects," Paulk says. "We came at it with a somewhat different philosophy from those who were in that particular camp.

"What we've done is we've put something together that made sense to a lot of folks -- common sense, as I said. But you do get the question: ‘Does this actually work in practice'?"

The answer to that would seem to be a resounding yes. A number of case studies have testified to reductions in cycle time and impressive increases in productivity, quality and predictability as maturity improves. For instance, a study by the (US) Air Force Institute of Technology four years ago, looking at projects in organisations at different maturity levels, found that level-one organisations tended to have significant cost overruns. There was a lot of variation among level-two organisations, but many were achieving cost under-runs. Level-three organisations tended to be right on target with cost projections, with the variation narrowing markedly. Paulk says level-five organisations typically report defect densities of the order of seven defects per million lines of code, meaning they're running about three orders of magnitude better than industry norms.
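
That defect-density comparison is easy to sanity-check. A minimal sketch follows, assuming an industry norm of roughly seven defects per thousand lines of code; that norm is what the "three orders of magnitude" remark implies rather than a figure stated in the article, so treat it as an assumption.

    # Sanity check of the level-five defect-density claim quoted above.
    level5_defects_per_line = 7 / 1_000_000   # ~7 defects per million lines of code
    industry_defects_per_line = 7 / 1_000     # assumed norm: ~7 per thousand lines

    improvement = industry_defects_per_line / level5_defects_per_line
    print(f"Improvement over the assumed norm: {improvement:,.0f}x")  # 1,000x, three orders of magnitude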

Boeing is an enthusiastic early adopter of the CMM, as John Vu testified in a keynote talk at SEPG '97, entitled Software Process Improvement Journey (From Level 1 to Level 5). Vu presented data showing Boeing enjoyed dramatic reductions in cycle time, significant increases in productivity, significant decreases in post-release defects, and more accurate software estimates as it moved up the CMM.

That's the kind of data that gets management's attention, Paulk says. The outcome was that Boeing told its 600 suppliers of software-intensive systems that unless they were serious about doing process improvement they obviously weren't seriously interested in remaining a supplier to Boeing. Motorola took a similar approach. The US government is also insisting suppliers show high levels of capability maturity.

Suddenly the SEI started getting calls from France and Germany from people asking: "How do you spell CMM again?".

Caveat

Paulk says there's an important caveat for companies looking to adopt the CMM. Process management principles help companies deal with the known: things like requirements, design, code and tests, using the lessons gained from previous projects. But executives also have a responsibility to be sensitive to the unknown: issues that have not yet been encountered and risks that have to be actively managed.

"In the quantitative sense, we can quantitatively manage the process that we have a good knowledge of, but there are things that are unknown that we have to manage also. So we need more of the risk management kind of ideas in that area," Paulk says.

"What it really comes down to is that you have to use common-sense when you're applying any of these models. And letting common-sense prevail is actually one of the scariest things that I would suggest that you have to do in process improvement, because it's all too uncommon.

"We see a lot of people who are very concerned about the idea of applying process discipline to their environment, because they know that there is no common-sense presence present. That means you are going to wind up with mindless bureaucracy stifling everything that you do. At least if you have common-sense without process discipline you can exist in a state of creative chaos. And as a technical person I can tell you right now, creative chaos is kind of fun. You can get away with murder in a creative-chaos kind of environment.

"If you don't have any common sense and you don't have any discipline, obviously you have mindless chaos; and let's all try and avoid those kinds of environments," Paulk says.

The Model of Success

The Software Engineering Institute (SEI) Capability Maturity Model (CMM) for Software describes the principles and practices underlying software process maturity. Its raison d'être is to help software organisations improve the maturity of their software process along an evolutionary path from ad hoc and chaotic to mature and disciplined.

The CMM defines five maturity levels:

Initial. The software process is characterised as ad hoc, and occasionally even chaotic. Few processes are defined, and success depends on individual effort and heroics.

Repeatable. Basic project management processes are established to track cost, schedule, and functionality. The necessary process discipline is in place to repeat earlier successes on projects with similar applications.

Defined. The software process for both management and engineering activities is documented, standardised, and integrated into a standard software process for the organisation. All projects use an approved, tailored version of the company's standard software process for developing and maintaining software.

Managed. Detailed measures of the software process and product quality are collected. Both the software process and products are quantitatively understood and controlled.

Optimising. Continuous process improvement is enabled by quantitative feedback from the process and from piloting innovative ideas and technologies.

The SEI claims predictability, effectiveness, and control of an organisation's software process will improve as the organisation progresses upwards through the five levels. It says the empirical evidence to date, while not rigorous, supports this belief.

Except for Level 1, each maturity level is decomposed into several key process areas that indicate the areas an organisation should focus on to improve its software process.

The key process areas at Level 2 focus on the software project's concerns related to establishing basic project management controls. They are requirements management, software project planning, software project tracking and oversight, software subcontract management, software quality assurance, and software configuration management.

The key process areas at Level 3 address both project and organisational issues, as the organisation establishes an infrastructure that institutionalises effective software engineering and management processes across all projects. These areas are organisation process focus, organisation process definition, training program, integrated software management, software product engineering, intergroup coordination, and peer reviews.

The key process areas at Level 4 focus on establishing a quantitative understanding of both the software process and the software work products being built. The areas are quantitative process management and software quality management.

The key process areas at Level 5 cover the issues that both the organisation and the projects must address to implement continual, measurable software process improvement. These areas are defect prevention, technology change management, and process change management.

Each key process area is described in terms of the key practices that contribute to satisfying its goals. The key practices describe the infrastructure and activities that contribute most to the effective implementation and institutionalisation of the key process area.
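
The level-to-key-process-area structure described above maps naturally onto a simple lookup table. The sketch below, in Python, records the key process areas exactly as listed in this summary; the gap-analysis helper and the example organisation are purely hypothetical illustrations, not part of the CMM itself.

    # Key process areas by maturity level, as summarised above (level 1 has none).
    CMM_KEY_PROCESS_AREAS = {
        2: ["requirements management", "software project planning",
            "software project tracking and oversight", "software subcontract management",
            "software quality assurance", "software configuration management"],
        3: ["organisation process focus", "organisation process definition",
            "training program", "integrated software management",
            "software product engineering", "intergroup coordination", "peer reviews"],
        4: ["quantitative process management", "software quality management"],
        5: ["defect prevention", "technology change management", "process change management"],
    }

    def gap_to_level(target_level: int, satisfied: set[str]) -> list[str]:
        """Key process areas still unsatisfied on the way to target_level (illustrative only)."""
        required = [kpa for level in range(2, target_level + 1)
                    for kpa in CMM_KEY_PROCESS_AREAS.get(level, [])]
        return [kpa for kpa in required if kpa not in satisfied]

    # Hypothetical organisation with two level-2 areas in place, aiming at level 3.
    print(gap_to_level(3, {"requirements management", "software project planning"}))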

Source: The Capability Maturity Model for Software, by Mark C Paulk, Bill Curtis and Mary Beth Chrissis, all of the SEI, and Charles V Weber, IBM Federal Systems Company

-- S Bushell

It's How You Play the Game

The Capability Maturity Model (CMM) can lead to dramatic improvements in software process capability. On the other hand, organisations should never get so caught up in the game of trying to achieve a particular CMM rating that they fail to respect either the intent of the CMM or the positive aspects of a well-planned process improvement program.

So, at least, says David Klein, a staff software engineer with US company Lockheed Martin, which has attained a CMM Level 5 rating, the highest rating for a company's software development capability. Level 5 is the optimising level, marked by continuous improvement of existing processes and excellence in the evaluation, adoption and integration of new processes and technologies.

However, in a paper he wrote on the CMM, Klein warns that it is as fallacious to automatically equate an organisation's CMM level with its ability to produce high-quality software as it is to equate grades with intelligence.

"In reality, organisations considered immature can and do produce high-quality software just as mature organisations can and do produce low-quality software," Klein writes. "Organisational maturity is no guarantee of success; it merely increases the likelihood of success."

Klein says the more process improvement programs (PIPs) in which he participates, the more he is convinced this misuse of the CMM is more about selling a capability than improving a capability. And he warns this can be a serious problem. "A software PIP does not come without a cost. From a software-engineering viewpoint, the cost of a well-managed PIP has a justifiable return on investment if the PIP increases the quality of the software product being produced. Conversely, if the goal of the PIP is merely to achieve a good software capability evaluation (SCE) rating, the cost of the program is unnecessary overhead and will detract from the overall quality of the software."

Klein concludes that unless an organisation believes in the merits of process improvement, the PIP activities become self-defeating. This in turn further erodes confidence in PIP activities, and may culminate in complete dismissal of further attempts to improve the organisation's business practices.

-- S Bushell

The Habits of Highly Effective Developers

• Prepare three separate time and cost estimates based on past experience, software functionality and a formal estimating technique, and compare actual results with predictions (see the sketch after this list).

• Adopt a standard notation scheme and methodology for design and coding.

• Automate control of the development process and link it to a project-management tool.

• Use joint application design for requirements analysis.

• Practice iterative development and time-boxing.

• Institute a formal change-request process to prevent scope creep.

• Establish centres of excellence -- encourage the development of specialists in each development procedure.

• Measure productivity and defect removal.

• Employ component-based development.

• Establish clean-room techniques for component building.

• Institute version control and configuration management.

• Design and test for usability.

• Practice code inspections and walk-throughs.

-- Mickey Williamson
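
The first habit above, comparing independent estimates with what actually happened, is the kind of discipline a few lines of code can keep honest. A minimal sketch in Python; the estimate and actual figures are made up purely for illustration.

    # Compare three independent effort estimates with the actual outcome,
    # as the first habit above suggests. All numbers are illustrative placeholders.
    estimates_person_days = {
        "past experience": 120,
        "software functionality (e.g. function points)": 150,
        "formal estimating technique": 140,
    }
    actual_person_days = 180

    for method, estimate in estimates_person_days.items():
        error_pct = (actual_person_days - estimate) / estimate * 100
        print(f"{method:45s} estimated {estimate:4d}, off by {error_pct:+.0f}%")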
