You're Virtually There

A staple in the mainframe world for close to 30 years, virtualization is at last attracting interest from Australian government agencies.

In any organization, servers account for the bulk of capital cost - and they proliferate like rabbits. But what if there was a way to slash the number of servers while dramatically increasing the utilization of those that remain? Well, you can: welcome to the virtual world.

What most people - and that includes a good handful of CIOs - do not realize is that the average server works at only around 15 percent of its capacity. In other words, as much as 85 percent of the capacity of every server in an organization sits unused at any given time.

Now consider that in a typical government department there can be 600 to 800 servers, some of them running applications that are not compatible with other servers (which is partly why the number of servers grows . . .), and every time new applications come out or the department needs to do something different, the number of servers grows yet again. This happens every year across every medium-to-large organization and that in turn means there is significant capital cost involved in server deployment, not to mention inefficiencies brought about because of different software applications running on different servers.

This has been the scenario for years and it is a situation that has largely been accepted as a fait accompli by CIOs across both government and the private sector. But now a technology that has existed in the computer mainframe world for 30 years has made the jump from the big box to the small box and brought with it a brave new world. It is called virtualization.

Virtualization - potentially of both servers and data storage - promises not only better efficiency but also significant cost savings. And the other sweetener is that the ROI can be counted in weeks, or at most a few months, and that is significant in a world where ROI is sometimes never realized.

Really There, or Virtually There?

Virtualization essentially carves a server up internally into many smaller self-contained servers. These virtual machines let you run multiple operating systems on a single CPU, allowing an organization to consolidate its physical servers from a farm of typically hundreds of barely used machines down to as few as eight highly utilized ones.
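The arithmetic behind that kind of consolidation is simple enough to sketch. The numbers below (15 percent average utilization, a 70 percent target, replacement hosts 11 times the capacity of the old boxes) are illustrative assumptions for this sketch, not figures from any vendor, and real sizing must also account for memory, I/O and peak loads:

```python
import math

def hosts_needed(num_servers, avg_util, target_util, capacity_multiple=1.0):
    """Estimate physical hosts needed to absorb a fleet of lightly used servers.

    avg_util and target_util are fractions of one old server's capacity;
    capacity_multiple says how much bigger each new host is than an old box.
    Packs by average CPU load only - memory, I/O and peaks are ignored.
    """
    total_load = num_servers * avg_util          # aggregate work being done
    per_host = target_util * capacity_multiple   # what one new host can carry
    return math.ceil(total_load / per_host)

# 300 servers at 15% utilization onto identical hosts run at 70%:
print(hosts_needed(300, 0.15, 0.70))         # 65 - utilization alone helps
# Ratios like the article's 300-to-6 also assume far larger replacement hosts:
print(hosts_needed(300, 0.15, 0.70, 11.0))   # 6
```

The point of the sketch is that raising utilization alone gives roughly a 5:1 consolidation; the much more dramatic ratios quoted in the article also rely on replacing many small old machines with a handful of much larger ones.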

It might be expected that virtualization would raise reliability and security concerns, but proponents point out that virtual machine images can be easily copied, so if a server falls over its workload can be replicated and fired up on another box, removing the need for costly redundant servers. And because virtual machines run in total isolation from each other, a developer can test systems to crash point without disrupting any other applications, which also removes the need for dedicated test machines.

"Server virtualization is about separating the bars that you have on an operating system and the underlying hardware so that you can run multiple operating systems at the same time on the same server," explains hardware provider VMware's co-founder and CTO, Ed Bugnion. "Yes, this technology has been in existence for over three decades on mainframes but it was recently introduced for industry standard servers too," Bugnion says.

Bugnion points out that the technology is very easy to deploy because it does not ask CIOs to change their software strategy.

"Typically, deploying a new application today means getting a whole new set of server hardware. You may have a thousand servers in a data centre but they will almost certainly only be utilized for between five and 15 percent of the time. Yet in order to deploy a new application you will have to purchase three or five or even 10 more servers."

Bugnion says a classic scenario is an organization with, for example, 300 servers. When it deploys virtualization, it can condense those 300 down to as few as six physical servers.

"It's a dramatic increase in the density of your data centre and it leads to dramatic increases in flexibility. Now, with virtualization you can always provision a new machine on top of your existing infrastructure. You can still run the operating system and the applications of your choice," Bugnion says. "The technology is intrinsically compatible with the existing environment and not only that, it only takes a short number of weeks from the beginning of the IT project to get the system up and running."

VMware is acknowledged as a leader in this field and perhaps not too surprisingly the company has seen growth of 100 percent year-on-year for the past five years as more and more organizations embrace virtualization, driven by the huge savings that soon eventuate.

"The really big point is you only need five percent of the servers you originally needed, so that leads to a very dramatic reduction in the hardware acquisition costs, as well as the ongoing cost to run the servers," Bugnion says. "The conventional wisdom at the time when we started our company was that you could not build virtual machine systems for Intel systems, which most government departments are primarily running.

"People did not think you could build this technology but we cracked it and since 2001 when the first products were launched there has been very rapid adoption of the technology worldwide because of the benefits it provides."

Virtual Conversions

One of the government organizations here in Australia to have embraced virtualization is Brisbane City Council (BCC), whose IT delivery director, Bill Croucher, is a firm convert.

BCC is Australia's largest local government council, with 6500 staff serving close to a million people and with an annual budget of $1.5 billion. The council provides water, sewerage, public transport (both bus and ferry), service management and city administration. That requires numerous business applications, and so the BCC network connects to 120 offices and depots across greater Brisbane.

When Croucher was head-hunted from Cathay Pacific airlines two-and-a-half years ago he was given the task of establishing a storage and server management plan, "as well as a business continuity plan as none existed. At that time there were approximately 400 servers at BCC and as far as storage was concerned, yes there were some systems in place, but they were limited," Croucher says. "For example, while there were two data centres, one did not back up the other, so it was possible to lose lots of information very easily."

Croucher's initial focus was to get a handle on how the server fleet was growing. "There was an increase in servers of around 20 percent a year, which meant that within five years, council could be spending around $80 million a year just refreshing the server fleet. There were at least seven different operating systems and it was a case of whatever anyone wanted they could have - we were heading towards having one of everything!"
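Compounding at 20 percent a year, the fleet Croucher inherited would grow quickly. A quick sketch, starting from BCC's roughly 400 servers at the time (the $80 million figure also depends on per-server refresh costs the article does not quote):

```python
servers = 400.0  # approximate BCC fleet when Croucher arrived
for year in range(1, 6):
    servers *= 1.20  # 20 percent annual growth, compounded
    print(f"year {year}: ~{servers:.0f} servers")
# After five years the fleet would have grown roughly 2.5x, to about 995.
```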

Croucher also discovered there was not much of an asset management plan. BCC urgently needed to standardize and rationalize operating systems, maintenance procedures, server architecture and internal support tools, so it was a big job.

"One advantage we had was that we were able to deploy our own fibre infrastructure throughout Brisbane and because of the excellent bandwidth provided that meant we could begin to bring some of the remote servers back into data centres," Croucher says.

Virtually Unknown?

When Croucher first joined BCC server virtualization had not been considered, partly because it was a relatively new technology - at least as far as stand-alone servers were concerned - and partly because no one at BCC had known about its possibilities. As it happened, Croucher's first infrastructure job was to sort out the basic data storage situation, rather than the server environment.

"The data centres were both located in the Brisbane CBD, a couple of blocks from each other, and if a natural or unnatural circumstance arose, there was the potential to lose both facilities," Croucher says. "So, we put up a proposal to the CEO and Lord Mayor to build a new facility out of town, about eight kilometres away, not on a flood plain, not under any aircraft flight paths and built to Australian Standards (AS2834-1995). This facility will house around 50 percent of our production workload and provide the tape library backup of the other facility's production data."

Storage virtualization is also achievable, and though the principle is the same as for servers, it is currently far less proven and very seldom seen. But storage virtualization is, as John Wellar, director of Advanced Storage Systems at virtualization specialist Lan1, puts it, "a growing issue because if you don't have it, it's like having a five-tonne dump truck - if you are only carrying one tonne at a time, hey you are wasting it: you have space for four more tonnes".

Wellar says it is primarily the medium-sized organizations that are opting for virtualization - both server and storage versions.

"The little guys, they don't need it, but the medium-sized ones do. They don't have the staff, they are always being asked to do more and more for less and less, and government is trying to deliver its services more and more over the Internet, so their business continuity model is changing dramatically.

"In the old days, your council worked Monday to Friday, nine to five, so basic IT resources worked pretty well. It's all different now. For example, I went and registered my wife's car the other day. I just went onto the computer and did it all. What that means is that the service has got to be ready so that I can do that at 11 at night or even three in the morning, so this is why the drive for virtualization is well and truly on. Basically there is a constant push for greater use of resources and in particular an ultimate need to lessen the impact of expensive hardware," Wellar says.

This was indeed the driving force behind Croucher's strategy at BCC, but before he could begin to even think about storage virtualization he had the daunting task of cutting the numbers of council servers as well as slashing the ongoing capital cost of new and replacement servers - a task that some at BCC thought impossible given the council's ever increasing move to Internet-based solutions for ratepayers.

"We established a commitment that we would not only stop the proliferation of server growth of 20 percent a year but over the next two years we would get below a 50-server fleet," Croucher says. "Two-and-a-half years into the program we now have 230 servers, down from the original 400, and we should break the 200 mark as we enter 2005 and achieve the target 50 level by the end of the next financial year."

Saving Millions

All of this has been achieved by using server virtualization and the cost advantages are truly staggering.

"Huge savings are being achieved," Croucher says, "not just in capital avoidance but also in real estate, staffing and in software, not to mention improved stability and performance.

"Taken together the savings have been quite significant. The capital avoidance alone to date is $1,656,220. With the implementation of virtualization using VMware products within our Microsoft/Linux (Intel) fleet, we have the ability to provide 180 small virtual partitions running across eight physical servers. Before the advent and introduction of this capability, BCC would have acquired a physical server for each application requesting capacity, in this case 180 servers."

It was not all plain sailing though. Croucher and his team did find some software and application providers who said they just could not support BCC running on VMware.

"That's still the case," Croucher says. "The arrangement we enforced is that because we've been able to establish sufficient infrastructure to have test, development, pre-production and training, if we hit a situation where an application has a problem using VMware we immediately install it in a free-standing box - the so-called native mode - and if we still have a problem then the application company has to respond to it. So far we haven't had any problems with that method.

"Certainly we have found the system to be very reliable and our internal customers are very satisfied with it. Instead of them coming to us and saying, 'I need this application run up', and us having to acquire a new server and associated storage, we can just set up a new operating environment and hand over to the application developers in around 30 minutes, rather than weeks."

Croucher says that opting for something as radical as server virtualization does demand plenty of research before making a decision, and he not only presented in-depth reports and analysis to his CEO, he also ran his research past independent experts Gartner. "We also had discussions with Microsoft and Intel before putting this up to our CEO for approval."

The next step at BCC? Storage of course, and the eventual move towards virtualization there too.

"We need to get a better handle on our storage management," Croucher says, "and there are similar issues there as there are with server management - tools, archiving, alternate provisioning capabilities to offer slower/cheaper storage, et cetera. At the moment, there are still remote locations which have separate servers and separate storage. Data backups were not happening properly and if the system went down, as they will, and someone hadn't properly backed it up . . . well, those are the sorts of issues surrounding storage which are still happening and we need to sort through them just as we have done with the servers."

Wellar agrees and he says that within five years server and storage virtualization will be so common, so ubiquitous, that it will barely rate a mention. However, he does have some words of caution.

"There is a danger that we in the industry make it all sound like it's 'point-click and it happens', but it ain't like that. Nothing is ever seamless in IT. The only time it's even relatively simple is when you do all the hard planning up front," Wellar says.

"But still, the problem is that there are commercial reasons why it's difficult to get virtualization to work across systems, and by that I mean there is a vested interest from the hardware manufacturers to resist virtualization. There is also a lack of resources across some government arenas coupled with a lack of people skills to design and implement virtualization. All of these are potential problems that government departments or companies really need to be aware of."
