Every company has legacy systems stashed away somewhere in its business.
Perhaps it's an archaic financial application that the accounting department just can't seem to part with. Maybe it's the sales force's slow-performing customer database that belongs in a nursing home and not in your data center. Or maybe it's networking equipment that's been around since the late 1990s.
My question, then, is this: what are the performance, scalability, interoperability, or security signs that a system or application has outlived its usefulness and needs to be replaced? Do you anticipate replacing systems as these metrics show them groaning and creaking? Or are there other considerations that make replacing and upgrading a system a must-do-now, such as how important that system is to a particular department or to the business's strategy?
I often see company press releases that announce major purchases of new suites of software packages, and I always wonder: What was running before the new one and why did it need to be replaced?
For example, last year The Coca-Cola Company announced that it was moving to standardize on SAP's suite of applications (for supply chain and financials) for certain parts of its vast worldwide operations.
Wal-Mart also announced last year that it purchased SAP financials as well as some other retail-specific applications from Oracle and HP. A statement noted that the SAP "solution will replace some legacy systems while integrating with other internal Wal-Mart systems."
So as many of the "bleeding edge" systems from the mid-to-late 1990s start showing signs of wear and tear, just what are CIOs and IT managers looking for when it comes time to assess the overall value and usefulness of a system?
Is it purely performance-driven and based on specific needs of the business? Or do internal corporate politics (meaning, whoever screams the loudest) also play a role here?