Liquid Cooling Returns to the Data Centre

Enterprises across the country are struggling to meet demands in data centres that were designed for a different era of computing.

Liquid cooling is making its way back to the data centre, mostly in the form of rack-level gear. Rapidly escalating power demands and rising utility rates are requiring users to take a more proactive approach to cooling.

The shift is already in motion. Servers are now shipping preconfigured with liquid-based cooling modules that sit directly on top of the hottest components in a system, cutting cooling demands in half. That in turn will lead to entire motherboards being "hosed down" directly with treated water or refrigerants, eliminating or dramatically reducing the need for the less efficient air conditioning units that populate most data centres today.

For US Internet, a provider of colocation data centre services, the use of rack-level cooling systems from Liebert has become critical. The "creeping" nature of increasingly dense and hot server equipment inside its Minneapolis data centre almost destroyed the future of the fast-growing company, said Travis Carter, co-founder and chief technology officer of US Internet. As its customers began to move increasingly higher-density gear into the data centre, temperatures slowly rose, reaching 100 degrees Fahrenheit (about 38 degrees Celsius), and equipment failures began to escalate.

"We didn't even know we had a problem at first," Carter said. "Quite frankly, it snuck up on us over a period of months [in 2005], and we found ourselves with no available space for traditional cooling. It became embarrassing. You can't bring a potential customer into a sauna and expect them to add their gear to the problem."

US Internet redesigned its data centre to incorporate Liebert XD cooling systems that pump refrigerant into cabinet racks, and it deployed new air conditioning units and environmental monitoring systems. The data centre is now maintained at 70 degrees Fahrenheit (21 degrees Celsius), and the customer base is back on the growth curve. The company plans to add a data centre later this year that will also incorporate the new cooling equipment.
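Rack-level environmental monitoring of the kind US Internet added typically amounts to polling each rack's inlet-temperature sensors and alerting before conditions drift toward failure. The following is a minimal sketch of that idea; the sensor-reading function, the rack identifiers and the alert threshold are illustrative assumptions, not details of US Internet's deployment.

    # Minimal sketch of rack-level temperature monitoring (illustrative only).
    ALERT_THRESHOLD_C = 27.0  # assumed inlet-temperature alert point, roughly 80 degrees Fahrenheit

    def read_inlet_temp_c(rack_id):
        """Placeholder for a real sensor query (SNMP, IPMI or a vendor API)."""
        raise NotImplementedError

    def check_racks(rack_ids):
        """Return (rack_id, temperature) pairs for racks above the alert threshold."""
        alerts = []
        for rack_id in rack_ids:
            temp = read_inlet_temp_c(rack_id)
            if temp > ALERT_THRESHOLD_C:
                alerts.append((rack_id, temp))
        return alerts

In practice checks like these run continuously and feed a central dashboard, which is how a slow, months-long temperature creep of the sort Carter describes gets caught early.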

"As far as I'm concerned, everything we invested in the cooling equipment has been revenue generation right to the bottom line," Carter said. "We would either be out of business today or operating at a much smaller base without that equipment."

Data centre energy efficiency is top of mind. Industry roundtables with the US Department of Energy are being held across the country as a potential crisis in the data centre has mushroomed over the past few years.

A study published by Gartner in November projects that by next year, half of all current data centres will have insufficient power and cooling capacity to meet the demands of high-density computing equipment.

Vendors have been scrambling to create a wide range of components, systems and software to try to keep a lid on the bubbling cauldron, and virtualization will buy many enterprises valuable years to develop next-generation cooling strategies. But those new products and technologies are primarily designed to buy time for data centre environments that are already overextended, and some can even add to total energy demands.

A survey of attendees at the most recent Gartner data centre conference in November 2006 showed that 80 percent of respondents said they currently have power and cooling issues on their computer room floors, said Michael Bell, a Gartner analyst. More than a third said they will have to invest in a new data centre within the next few years.

In the past few years, energy demands for the average data centre have grown from one to three kilowatts per rack to around six kilowatts per rack, with 10- to 12-kilowatt racks becoming increasingly common, Bell said. Racks drawing 20 to 30 kilowatts look increasingly likely early next decade.
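Those per-rack figures translate directly into heat that has to be removed, since essentially every watt a rack draws ends up as heat in the room. The arithmetic below is an illustrative sketch using the standard conversions of roughly 3.412 BTU per hour per watt and 12,000 BTU per hour per ton of cooling; the numbers are not Gartner's.

    # Rough cooling-load arithmetic: nearly all electrical power drawn by a rack
    # is dissipated as heat the cooling plant must remove (illustrative only).
    def cooling_load(rack_kw):
        watts = rack_kw * 1000
        btu_per_hour = watts * 3.412      # 1 W is roughly 3.412 BTU/h
        tons = btu_per_hour / 12000       # 1 ton of cooling = 12,000 BTU/h
        return btu_per_hour, tons

    for kw in (3, 6, 12, 30):             # per-rack densities cited above
        btu, tons = cooling_load(kw)
        print(f"{kw:>2} kW rack -> {btu:,.0f} BTU/h, about {tons:.1f} tons of cooling")

At 30 kilowatts, a single rack needs roughly eight and a half tons of cooling, which is why air on its own struggles at those densities.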

But not everyone is buying the need for water-cooled data centres; some are trying other options instead. Highmark, a health insurance provider, recently received certification from the US Green Building Council for meeting Leadership in Energy and Environmental Design (LEED) guidelines with its new 28,600-square-metre data centre. Now that it has adequate floor space, the company saves energy primarily through a more traditional cooling design that uses the latest in targeted airflow and rack-level environmental monitoring systems.

"Everyone knows that data centres are power hogs, and one of our corporate strategies is to reduce energy use," said Mark Wood, Highmark's director of data center infrastructure.

The LEED certification effort also included items such as bike racks, to encourage employees to leave their automobiles at home, and a 380,000-litre underground rainwater-collection system that supplies backup water and the facility's toilets.

But many businesses have simply run out of floor space and can no longer rely solely on "hot" and "cold" aisles to spread the heat load adequately. They face the prospect of expensive overhauls of existing infrastructure or the even more expensive construction of a new facility.
