Facebook and eBay shared some tips this week for improving efficiency and cutting energy bills in data centers.
Some of their tricks will be applicable only for very large data centers -- Facebook persuaded a server vendor to provide it with custom firmware to control fan speeds, for example -- but others are relevant for mere mortals who don't buy servers by the ton.
"What we did here isn't rocket science," said Jay Park, Facebook's director of data center engineering. "We applied very basic thermal theory." He was on a panel discussing "holistic approaches to reducing energy use" at the Silicon Valley Leadership Group's Data Center Efficiency Summit, which took place Wednesday at Brocade's headquarters in Silicon Valley.
Facebook was able to reduce its costs by $230,000 a year at a 56,000-square-foot data center in Santa Clara, California, largely through better air flow management, Park said. It also got a $294,000 rebate from its electric utility, Silicon Valley Power, to offset the cost of investments it made in energy efficiency.
The cooling systems that prevent IT gear from overheating are among the biggest costs for a data center, and managing the flow of warm and cold air is key to their efficiency. Facebook collected temperature and humidity readings around the Santa Clara facility and used a computational fluid dynamics program to study the movement of air. A CFD analysis can reveal areas of the data center that are too hot or too cold, and run "what if" scenarios to show what changes might be effective.
Facebook had warm air mixing with cold air above and around server aisles, so it did a cold aisle containment project -- closing the tops and ends of server aisles with fire-retardant plastic to prevent the cold air blowing up through perforated floor tiles from escaping.
The cold aisle containment made the cooling systems more efficient, allowing Facebook to shut down 15 of its computer room air handlers (CRAHs) and cut its energy draw by 114kW. It also let the company raise the air temperature at the CRAH inlets from 72 to 81 degrees Fahrenheit.
It then looked at its servers and decided the fans inside each machine were spinning faster than they needed to. Manufacturers set fan speeds to accommodate maximum server loads, but that's often faster than needed and wastes energy, Park said.
Because it's such a big customer, Facebook could work with its vendor to reduce the fan speed through a firmware update. That saved 3 watts per server, which adds up fast in a large data center.
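To see how quickly a 3-watt saving compounds, here's a back-of-the-envelope sketch; the server count and electricity rate are illustrative assumptions, not figures from the article.

```python
# Hedged sketch: annualized savings from a constant per-server power reduction.
# The fleet size (50,000 servers) and rate ($0.10/kWh) are assumptions for
# illustration only; the 3 W figure comes from Facebook's example.
def annual_savings(watts_saved_per_server, num_servers, dollars_per_kwh):
    """Return yearly dollar savings for a constant per-server power reduction."""
    kw_saved = watts_saved_per_server * num_servers / 1000.0
    hours_per_year = 24 * 365
    return kw_saved * hours_per_year * dollars_per_kwh

# 3 W saved on each of 50,000 always-on servers at $0.10/kWh:
print(round(annual_savings(3, 50_000, 0.10)))  # 131400, i.e. ~$131,000 a year
```

Even a few watts per machine, multiplied across tens of thousands of always-on servers, turns into six figures annually.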
Facebook has some advantages because of its size. It has long-term leases for most of its data centers and is often the sole occupant, so it can negotiate with owners and doesn't have to worry about other tenants' equipment.
"I realize not everyone can do all these things; a collocation provider can do maybe two or three of them," Park said. "The point I'm making is that you can do something to save energy."
EBay described similar efforts at an 80,000-square-foot facility in Phoenix. It installed blanking panels wherever it could -- they plug gaps in server racks so cold air can't escape -- and made sure the right perforated floor tiles were in place throughout the facility.
It too did a CFD analysis and a cold-aisle containment project. And it worked with the company that provides its uninterruptible power supplies to reduce excess capacity in those systems.
The data center has 86 30-ton CRAHs, and it was cost-effective to fit them with variable-speed fans, which can be turned up and down as needed. "We also worked with the utility provider to get a rebate, which made it worthwhile," said Rick Rehyner, senior manager for eBay's global data center operations.
Utilities offer rebates to companies that install energy-saving products because it helps them avoid having to build new power plants or buy power from neighboring utilities at higher rates.
The project reduced total power consumption at eBay's data center by 16 per cent and carbon emissions by about the same amount, Rehyner said. Over the 18-month life of the project its total IT power load increased 50 per cent (because more servers were added during that time) but the power load for the whole data center increased only 25 per cent, because it used less power for cooling.
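Those growth figures imply a drop in the facility's PUE (total facility power divided by IT power), since IT load grew faster than total load. A quick sketch of the arithmetic; the starting PUE of 1.8 is an illustrative assumption, not a figure from the article.

```python
# Hedged sketch of the eBay arithmetic: if IT load grows 50% but total facility
# power grows only 25%, PUE (total power / IT power) must fall.
# The starting PUE of 1.8 below is an assumption for illustration.
def new_pue(old_pue, it_growth, total_growth):
    """PUE after IT load grows by it_growth and total power by total_growth."""
    return old_pue * (1 + total_growth) / (1 + it_growth)

print(round(new_pue(1.8, 0.50, 0.25), 2))  # 1.5 -- i.e. 1.8 * 1.25 / 1.5
```

Whatever the starting point, the ratio falls by the same factor (1.25/1.5), meaning proportionally less power spent on cooling and other overhead per unit of useful IT work.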
The Data Center Efficiency Summit is in its third year and is a place for data center operators to share tips for reducing energy use. Other topics this year were data center consolidation and coming updates to the PUE efficiency metric and to the LEED certification program for green buildings.