Pay-as-you-go computing means a lot of things to a lot of people. Use our pass-along guide to explain to your CEO what it is - and is not.
It's an appealing vision: You pay for computing power and software only when you need it. No more money wasted on expensive servers that sit woefully underused most of the time, or software licences gathering dust on the shelf. Utility computing is IT efficiency on a grand scale never before experienced - at least in theory.
But what is utility computing, really? Better yet, what does it mean to the CEO? Vendor hype would have your CEO believe that the brave new utility world is just around the corner. And while some businesses are already taking their first steps toward utility computing (using existing tools and practices), this relatively new form of computing shouldn't be on every company's agenda.
In the interest of making the CIO's life a little easier, we provide this pass-along guide to help CEOs understand whether their company should be on the utility computing path.
What is Utility Computing?
Pinning down a definition of utility computing can be like tying a knot in a snake; just when you think you've done it, the bugger slithers away.
Nearly every major vendor, and a number of minor ones, is promoting its own vision and brand name for utility computing - Adaptive Computing, On Demand Computing, N1, and the list goes on. But beyond the branding, utility computing is about providing flexible computing resources when and where they're needed. These resources could come from a pool of computing power and software that lives inside a corporate data centre and is metered out and billed to departments as needed. External service providers also offer such resources, much as Salesforce.com currently sells subscription CRM services: You pay while you use them, or turn them off and don't pay another cent.
"You can provision less and still have the safety margin [to handle unexpected spikes in demand]," says Mike Prince, CIO at Burlington Coat Factory, which is currently moving to a utility computing model built around Oracle databases and clusters of Intel-based hardware. With its prior setup - four separate Unix servers - Burlington had multiple pools of computing resources, with every one of them being bigger than necessary. "None [of the servers] peak or load at the same time of day, day of the month or season of the year. You don't need a 20 percent sludge factor on every single system."
That sludge factor varies dramatically from company to company, but a recent report, "'Pay As You Go' IT Services", by analyst firm Saugatuck Technology, states that businesses often have 50 percent or more surplus IT capacity. Utility computing promises to either get rid of that expensive overhead completely or put the surplus to work.
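The arithmetic behind Prince's point is simple: when peaks don't coincide, one shared pool sized for the combined peak needs far less total capacity than four silos each sized for its own peak. A minimal sketch, using hypothetical demand figures (the workload names and numbers below are illustrative assumptions, not Burlington's actual data):

```python
# Illustrative sketch: why pooling cuts surplus capacity.
# Four hypothetical workloads whose peaks fall in different time slots.
peak_demand = {
    "web":       [30, 80, 40, 20],   # units of compute, by time slot
    "reporting": [10, 20, 30, 90],
    "orders":    [70, 30, 20, 25],
    "hr":        [15, 10, 60, 20],
}

HEADROOM = 1.20  # the 20 percent "sludge factor" quoted above

# Siloed model: each system is sized for its own peak, plus headroom.
siloed = sum(max(load) * HEADROOM for load in peak_demand.values())

# Utility model: one pool sized for the peak of the *combined* load.
combined = [sum(slot) for slot in zip(*peak_demand.values())]
pooled = max(combined) * HEADROOM

print(f"siloed capacity: {siloed:.0f}")   # 360 units
print(f"pooled capacity: {pooled:.0f}")   # 186 units
print(f"saving: {1 - pooled / siloed:.0%}")
```

With these made-up numbers, the pooled model needs roughly half the capacity of the siloed one - in line with the 50 percent surplus figure Saugatuck reports.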