The IT profession is at a turning point. One group of IT practitioners already knows what needs to be done, but the traditionalists call it radical. And the traditionalists continue to apply the same old ways of doing things that result in the same old horrendously expensive multi-year projects, projects that produce systems barely better than what was there before, if they even work at all.
What differentiates these two groups is the manner in which they perceive and respond to complexity. A good example of this is the multi-year, multimillion-dollar system infrastructure upgrade projects that so many companies are forever battling their way through. How’s your three-year systems upgrade project coming along? How many companies are going into their fifth and sixth years on those three-year projects? How many companies have spent themselves into a hole on those projects and now don’t have money to respond to the unexpected developments occurring in their markets? The reputation of the whole IT profession is dragged down by the traditionalists.
People not skilled in the use of effective techniques for dealing with complexity look at the work involved on these massive projects and become totally overwhelmed. They fall back on the use of clumsy, slow-moving, bureaucratic ways of doing things. They mumble about project management and adopt cumbersome analysis and development procedures that attempt to address all needs all at once. Analysts analyze, programmers program, documents pile up, and the years go by. Progress is gained slowly and at great expense (if there is progress at all).
An Agile and Iterative Approach
Another approach would be to respond to complexity by making rigorous use of what I believe are the six core techniques in the IT profession: joint application design, process mapping, data modeling, system prototyping, object-oriented design and programming, and system testing and rollout.
We can use these techniques to break up big problems and reduce complexity into manageable, self-contained pieces. We can use these techniques to deliver value fast by making progress right away on the simpler pieces of a problem and building solutions to the more complex pieces iteratively over time.
A reader of this blog (who does not wish to be named but agreed to have his ideas shared here – we’ll call him Carl) contacted me a while back with a story about the hard times the IRS has gone through trying to upgrade their systems infrastructure (here's an insightful article on that by Roger Sessions). Developing a suite of systems to process the tax returns of individuals and corporations has to be one of the most complex jobs I can imagine. Business processes may be complex, but they can’t hold a candle to processes born of politics and government regulations. How could anyone ever make headway on a project like this?
Carl had this to say [italics are mine], “Here’s how I believe the IRS systems should have been done and the same is true of most large multi-business case applications.” He laid out an approach that I realized immediately was a great way to solve a problem like this. His approach displays elegant simplicity in the face of apparently overwhelming complexity. It is elegant in the way it uses just a handful of techniques to manage this complexity. He said, “Process each [taxpayer’s] account in its own thread – not whole files – it greatly reduces complexity and latency and increases scalability. Use pipe-and-filter with no intervening files instead of [the traditional approach involving] file-process-file-process-etc. This architecture was described by Mary Shaw at Carnegie Mellon years ago.”
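To make Carl’s idea concrete, here is a minimal sketch of the pipe-and-filter style he describes: each filter is a thread, the pipes are in-memory queues, and records flow straight through with no intermediate files. The stage names (`validate`, `compute_tax`), the flat 20 per cent rate, and the record layout are all hypothetical, invented purely for illustration; they are not from the article or the IRS systems.

```python
import threading
import queue

SENTINEL = object()  # marks the end of the record stream


def make_filter(stage_fn, inbox, outbox):
    """Build a filter: a thread that reads records from its inbound pipe,
    applies one transformation, and writes results to its outbound pipe."""
    def run():
        while True:
            record = inbox.get()
            if record is SENTINEL:
                outbox.put(SENTINEL)  # propagate shutdown downstream
                break
            outbox.put(stage_fn(record))
    return threading.Thread(target=run)


# Hypothetical stages -- stand-ins for real tax-processing steps.
def validate(account):
    account["valid"] = account["income"] >= 0
    return account


def compute_tax(account):
    # Illustrative flat 20% rate; not a real tax rule.
    account["tax"] = round(account["income"] * 0.2, 2) if account["valid"] else 0.0
    return account


def process_accounts(accounts):
    """Wire the filters together with in-memory pipes (queues):
    no file-process-file-process cycle, each record streams through."""
    q_in, q_mid, q_out = queue.Queue(), queue.Queue(), queue.Queue()
    stages = [make_filter(validate, q_in, q_mid),
              make_filter(compute_tax, q_mid, q_out)]
    for stage in stages:
        stage.start()
    for account in accounts:
        q_in.put(account)
    q_in.put(SENTINEL)

    results = []
    while True:
        record = q_out.get()
        if record is SENTINEL:
            break
        results.append(record)
    for stage in stages:
        stage.join()
    return results
```

Because each record moves through the pipeline independently, the first results emerge before the last record has even entered the first stage, which is where the latency and scalability gains Carl mentions come from; adding capacity means adding more pipelines or more filters, not redesigning a batch file flow.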