06 June, 2005 13:10
Legacy thinking is giving rise to a generation of applications that are obsolete before they're even out of development. It's time for IT shops to get serious about architecture and system design, and say goodbye to the "duct tape and spit" approach forever.
Every year, world-renowned debugging expert John Robbins, founder of US consultancy Wintellect, spends vast amounts of time consulting, training and debugging .Net applications of all kinds. One of the most common problems he and his team see on the debugging side of the business, which includes performance tuning, is developers, especially those in IT shops, stuck in the "VB form" mind-set. For years such developers have been building one-off Visual Basic 6.0 client applications, but they are now thrust into the wild world of big server programming. Because the rules and techniques for server applications are so radically different, this mind-set inevitably leads to application designs that are "buggy" and do not scale.
"Many times in our debugging business we've gone into a company to work on a problem - especially performance and scalability problems - and we quickly see the problem emanates from the original architecture," Robbins says. "In those cases, that indicates a fundamental thinking problem. In today's world of ASP, .Net and server-based applications, if the developers don't have experience in those server applications, they 'bring what they know' to the architecture, which is definitely legacy thinking.
"The legacy thinking is, unfortunately, prevalent in IT shops in our experience. The majority of those developers targeting Windows have spent the last 10 years focused on writing client applications. The productivity of those developers has been incredible and they have definitely contributed to the bottom line of their organizations. However, there's a huge difference between a client application and a server-based application, and that's the problem . . . We've seen some version 1.0 applications become legacy applications immediately upon release because they were architected with the 'duct tape and spit' approach."
What Robbins is seeing writ large is a problem Australian IT blogger Paul Reedman identified last year after some maddening experiences of his own.
"There is a problem within IT organizations which I believe is far more serious than legacy systems. I call this problem legacy thinking," Reedman, a 20-year veteran of the IT industry and member of the Queensland Executive Committee for the Australian Computer Society, wrote in his ITToolbox blog last year. "It's a thinking which has not been influenced by the new technologies (Java, .Net and SOA). It's a thinking which is trapped in technologies which were popular 10 to 15 years ago.
"This form of thinking is a problem because it can influence an enterprise IT architecture and a system design. Sometimes this thinking is a result of a lack of education, which means a solid dose of training can usually reform the thinker. In other cases the thinker appears impervious to any change and holds on to the ideas and thoughts which were fashionable a decade ago.
"Maybe I am being harsh, but I suspect that if you search your IT workplace you will find numerous examples of legacy thinking."
The blog entry, inspired by Reedman's own frustrations in getting some developers, particularly those from a Cobol background, to understand the technologies behind a new type of platform his organization was trying to implement, won fairly universal agreement from blog readers. Reedman says that although some of the developers he was working with at the time were able to make the leap and understand the new environments, and others could do so with a little training, plenty more continued to find the whole notion very difficult. The age of the programmers concerned was apparently not a factor.
"There were some [developers] who were able to make the leap and understand how componentization works, how Java works, and then some of them were trained, but others found it very difficult. It was very difficult to make them understand the new way of the service-orientated architecture, how services work et cetera.
"I remember there was a Meta Group research [Meta Group has been acquired by Gartner] where they found [with developers] a third could take on the new technology straight away, a third will take it on after some training, and a third won't take it on at all," Reedman says.
His own experience seems to reinforce that finding.
Building Legacy Systems Today
Capgemini Australia vice president technology Bradley Freeman says the real danger is that legacy thinking is still leading to monolithic developments, which will result in future legacy platforms certain to cause the same sorts of problems in 10 years' time as those organizations are experiencing now. He says organizations across all industry sectors, including financial institutions and big government agencies, are failing to take advantage of adaptive architectures and adaptive techniques, and are continuing to build another layer of legacy for the future rather than trying to produce something more nimble.
"A classic example is the continuing implementation of old-style ERP systems," he says. "We are seeing a huge wave of that out of Europe again. While the software is fresh it's still implemented in the old-style ERP ways. There are massive layers of software architecture in there. What we would like to see is recognition that an architecture is multiple layers, multiple pieces, so we need to look at the best way of filling each block rather than filling the whole thing in one go."
Analysts might be able to show moving to an SOA leads to lower costs of ownership long-term, Freeman says, but many organizations still find it difficult to come to terms with the new environment, which requires high levels of abstract thinking.
"I think it is just a fundamentally different way of thinking," Freeman says. "There has been such a huge leap in the quality of software tools the past few years it's probably gone ahead of some of the thinking. People rushed into Y2K, they rushed out of the dotcom . . . they did not have a chance to do a lot of thinking. Now we are seeing that again. A lot of those systems are up for refreshing now, and a new style of thinking is coming into play. There are a lot of companies playing with server architectures but it's more playing than delivering right now."
However, sometimes it is only when they stop playing that the real difficulties begin. A November 2004 Gartner research note forecasts service-oriented architecture will create demand for programmers who can design, assemble and configure component services, while demand for coders will soften.
Compounding the problems of balancing the maintenance of legacy systems with the need to avoid creating new ones is what Andrew McNeil, a senior product consultant at Cincom Systems of Australia, sees as a herd mentality.
"While legacy thinking may be a problem, 'herd thinking' may be an even greater contributor to problems in IT organizations," McNeil says. "This thinking leads to a silver bullet mentality, which causes technologies to be over-hyped then subsequently [creates] a backlash."
McNeil points out that service-oriented architecture is not even a technology per se, and does not necessarily require Java and .Net technologies, even though many people may not be able to disassociate the technology from the architecture.
"SOA has many merits, but those merits include the ability to integrate so-called 'legacy' systems. IBM recently extended Web services support for CICS, for example," he says.
"New technologies are often subject to hype beyond their material benefits, resulting in costly rewrites from one technology base to another without gaining any functional benefit. Java, a so-called new technology, is 10 years old; its syntax derives heavily from its direct ancestors C and C++, and it continues to borrow concepts from alternatives like Smalltalk or LISP. There is no question that it is a step forward from C++, but it is not the whole answer. Analysts such as Gartner report high failure rates using these technologies, yet ironically the same analysts continue to promote the practice."
Restricting such analyses to just Java and .Net, McNeil argues, ignores modern dynamic languages such as Ruby and Python, which itself represents legacy thinking.
"New technologies come to the fore due to a confluence of factors including marketing and random circumstance," he says. "The age of a technology does not automatically confine it to the recycle bin. Many technologies are introduced well ahead of their time. If we fail to build on these technologies, we may well continue to reinvent the wheel every few years."
In Two Minds
Jinfonet corporate vice president John Gularson, who specializes in business intelligence (BI) systems, says he typically encounters two different mind-sets when dealing with legacy BI systems. In many cases enterprises are employing a large-scale, stand-alone BI platform that offers a wide breadth of features, most of which are going unused by the majority of end users, while those that are used are frequently insufficiently functional because they were designed to be just a small part of a larger system.
Even though new technologies exist that are solely focused on the most common needs of end users, like Java-based embedded reporting solutions, managers and executives are sometimes reluctant to replace their stand-alone solutions, Gularson says, arguing "if it ain't broke, don't fix it".
However, there are also organizations, working off a platform that was developed in-house, that are too emotionally attached to the platform to let it go easily. "Developers and managers spent so much time and effort building the platform that they just can't let go . . . and as a result, growth sputters because evolving reporting requirements are not met efficiently or at all," Gularson says. "More than anything, it's a reluctance to admit that what was once conceptualized, designed and built isn't the best solution any more."
For instance, one government department Jinfonet began working with was running a self-built business intelligence platform more than a decade old. Management realized a new, streamlined solution was necessary to meet current reporting and presentation requirements, but the development team, many of whom had spent their entire careers building and maintaining the original system, opposed the idea. Once egos were assuaged and the new tool was implemented, the group immediately noticed that more projects were being successfully completed on time and that decisions were being made, Gularson says.
Andy Mulholland, group CTO, Capgemini, says fewer and fewer people will be locked into old mind-sets the more time passes, because the rising young managers and executives see the role of technology in a different way from their elders. "As those guys rise up through the ranks - and some are already running business units - their pressure on the way IT functions in the business becomes more acute. Their expectations are higher; their knowledge even of what they think IT should be is very different."
While no CIO can afford to wait for this new generation to take over, it will play a very important role, because if there is no support for change inside a business, the business dies, Mulholland says. If you have rising support for change coming from the young Turks - and these people are almost invariably in jobs that promote change in the way the company does business - change will come much more easily, he says.
"There is something that is going to break the cycle and that is called compliance. Put very simply, compliance breaks the mould because there is no such thing as a compliant application," Mulholland says. "Legacy thinking is fundamentally based around applications that are linked to a department's functional requirements. As soon as you start talking about compliance you are actually talking about tracing threads through the business functions that you need to have a view on.
"I actually think compliance is going to be the thing that provides the legitimate reason for changing the way we do things," he says.
Above all though, Capgemini's Freeman says, modernizing organizational thinking requires some rigour and a strong CIO or CEO prepared to understand the services they have to offer the business and to determine the best way of delivering those services, either within the existing infrastructure or a new infrastructure. But he concedes finding the time to take a pause and see what the new world will look like is difficult.
"The challenge we see across all industry sectors is that technology organizations are hard pushed to deliver their operational effectiveness right now. They do not often have a lot of thinking time," Freeman says.
For this reason, organizations would be well advised to give one or two people full responsibility for addressing those issues. Even technology strategists and technology architects tend to get buried in today's world instead of looking out two to three years, and it can require a leap of faith to say changing the organization's thinking will lead to better service and a lower-cost environment long term. It takes a bit of courage to do that, Freeman says, which, he believes, is probably why few Australian companies are heading in the right direction.
"The message we like to give is that small is good," Freeman says. "You can do things incrementally, you can publish services incrementally, and you can start getting that service layer visible to everyone. That is the way to start changing that legacy thinking. That does not require three years of study; it requires that you almost immediately start to understand that services are critical to making those changes visible and making those services visible as soon as possible.
"Our mantra is that an architecture is a bunch of pieces that need to come together. It does not need to be built in one piece: the more nimble you are in using the process tools and the development tools that are available now, the more likely you are to end up with a lower cost architecture and less of a new legacy that you are going to build for the future."
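Freeman's "publish services incrementally" advice can be sketched in a few lines of Python. The names here (the premium calculation, the request fields) are hypothetical, not from any Capgemini client: the idea is simply that an existing in-house routine gets wrapped behind a thin service facade with an explicit contract, so callers depend on the contract rather than the legacy internals, which can later be replaced piece by piece.

```python
# Hypothetical legacy routine -- stands in for existing in-house code.
def legacy_premium_calc(age, cover):
    return cover * 0.01 + max(0, age - 40) * 5.0

# Incremental service layer: a thin facade with an explicit
# request/response contract. Other teams call this, never
# legacy_premium_calc directly, so the implementation behind it can be
# swapped out without breaking callers.
def quote_service(request):
    premium = legacy_premium_calc(request["age"], request["cover"])
    return {"status": "ok", "premium": premium}
```

Publishing one such facade at a time makes the service layer visible early, without waiting for a three-year re-architecture of everything behind it.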
However, Freeman also believes a general educational program across the organization, designed to make everyone familiar with XML and other new technologies, is very useful. "They do not necessarily have to have an in-depth knowledge of them but should know a bit about them. It's better to educate much more broadly so that they really try to bring everybody up to the same sort of basic level," he says.
The Price of Failure
Wintellect's Robbins prefers to focus developers' minds by reminding them of the costs of failure to get it right. When working with clients looking to transition to .Net, Robbins says he insists again and again that this is their one chance to do it right - they just cannot afford to fail.
"Every company we go into complains about the poor architectures of their legacy applications. For a CIO, nothing is worse than trading one set of difficult-to-maintain and extend applications for a whole new set that's in a new technology."
Robbins says he has seen numerous server applications whose architecture could almost be an exact copy of a client application. In most client applications, he explains, you are working in a stateful environment where you can easily access and keep the state of a single user's interaction in the user interface (the selected items, text entered, and so on). When developers apply this legacy thinking to a server application, hauling that state around the network produces huge bottlenecks.
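A minimal sketch of the contrast Robbins describes, in Python for brevity (the class and function names are illustrative, not from any client's code): the first shape keeps per-user state inside the server process, the second passes everything in with each request.

```python
# Client-style (stateful) design: the object accumulates the user's
# selections between calls. On a server, one of these per user means
# session state pinned in memory or hauled across the network.
class Cart:
    def __init__(self):
        self.items = []                    # state lives in the process

    def add(self, price, qty):
        self.items.append((price, qty))

    def total(self):
        return sum(p * q for p, q in self.items)

# Server-style (stateless) design: each request carries all the data the
# handler needs, so any machine in a farm can serve it and nothing is
# kept between calls.
def total_order(items):
    return sum(price * qty for price, qty in items)
```

In the stateless shape, scaling out is a matter of adding machines; in the stateful shape, every request must find its way back to the one process holding that user's state.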
"In some cases where we have done performance tuning on server applications, there is no physical way we can get any acceptable performance out of the application other than re-architecting key portions of the application," Robbins says. "The costs to some of our clients have been huge because they have already spent months on development, and now have to start key areas from scratch. This adds a lot of time and, more importantly, makes the application much harder to extend in the future."
Robbins agrees such problems can be solved by education, but warns that imperative too often comes up against another legacy mind-set: the notion that a developer can learn the new way of development entirely on his or her own. The developer will often try to learn on the job, he says, while designing and developing the company's spiffy new server applications.
Small wonder many of those new applications written in new environments for the company are designed on a foundation of quicksand, Robbins says. The company has decided to get rid of the legacy applications that are costing them an arm and a leg to maintain, and trade up to a brand new application that will have many of the same problems, but on a worse scale.
"When we work with companies that are starting to make the move to .Net, I scream at them: 'This is your one and only chance to get your architecture right!' They are going to be living with these architectures for the next 15 to 20 years. Unfortunately, not all companies get it."
Most developers are well aware they lack the skills and desperately want training and help to maximize these new development environments. Front line managers also see the advantage of education. Unfortunately, Robbins says, upper management rarely does. For some reason, an unfortunately high number of upper managers take a very short-term view and feel developer education or mentoring is something they can live without.
"Upper management needs to understand that software development is the most rapidly changing piece of their business," he says. "If anything, the treadmill of change is speeding up with each passing year. Legacy thinking is going to cost the company greatly. What's even worse is that the company may even miss excellent opportunities because their developers didn't have the background to take advantage of them."
One of Robbins's clients, an insurance company, made a major commitment to .Net, but before the developers designed anything, they ran everyone through classes on what .Net was all about and how to take advantage of it. Robbins immediately knew they would be in fantastic shape for the future. By contrast, some of his other clients come to Robbins only after they have experienced serious problems (including, for some, losing one million dollars a day). In these cases, a small investment in good training would have paid for itself many times over, Robbins says, and the duct tape and spit might have been spared too.
Moving to the wild world of big server application development can be a huge cultural shift, so CIOs have to be sure that it serves the needs of their business.
Gartner estimates that the worldwide market for application development (AD) services will grow from $US82 billion in 2004 to nearly $US93 billion by 2007, and integration services will grow from $US59 billion to $US70 billion in the same time frame. Gartner predicts erosion in these markets as advances in application development tools, integration software and composite application suites - such as SOA - simplify some aspects of development and integration, but says a market of this magnitude will not disappear overnight, and definitely not before 2010.
Java has already become a premier technology for e-business application development, with Gartner surveys showing 80 percent of enterprises have included Java in their technology portfolios, and most are willing to consider using Java for mission-critical, enterprise-class and global-class AD. But Gartner reports a major shortage of Java developers to fulfil all of these mission-critical enterprise projects. In such a tight labour market, even if the enterprise can find the trained Java developers it needs, the price to hire them will be high.
On the other hand, migrating existing employees to Java and SOA can be both costly and chancy. Another Gartner research note says migration to Java is an expensive, lengthy and risky process, and finds it is between 2.0 times and 2.7 times more cost-effective to hire an accomplished Java developer than to migrate a Cobol, Visual Basic or C++ developer to Java.
Moving to Java, .Net and SOA can also be a huge cultural shift for those with a Cobol background, says Gartner research director Greta James. Even those eager to change are confronted with totally different paradigms. "For example, I have a friend that I used to work with years ago who became very senior. He retired but he has always loved programming so he decided to learn Java as a retirement project. He gave up because it was all too hard, but also part of the reason for giving up was that there was still huge demand for Cobol and so he is having a lovely time doing some Cobol programs."
James notes there will inevitably be some people who resist change, just as some will embrace it and others will fall in the middle. But she also points out that many organizations are keeping their legacy applications not necessarily because they have not opened their minds to the new thinking but simply because it does not make business sense to change.
"The crucial thing with architecture is not so much whether basic legacy technologies are kept or not, or that new technologies are used, but that it's driven by the needs of the business rather than being technology for the sake of technology. It's [a question of] understanding, talking to the business people about what their needs are, and then establishing an architecture that is responsive to those needs and supports those needs."