Data Centers Get More Sophisticated About Air Flow, Higher Temperatures




Second of a three-part article on how to make critical facilities more energy efficient.


By David Lewellen  


While the concept of allowing data centers to operate at higher temperatures is easy to understand, implementation can be harder. “Believe it or not, some technology executives still think data centers need to be run like an ice box,” Cassiliano says.

The challenge for a colocation provider, Rinard says, is that if one customer wants the servers kept at a colder temperature, the request basically has to be honored.

In some settings, waste heat from the data center can be used to warm the rest of the building (or an adjacent building, in some cities). Enck says the same strategy can be used in some other settings, such as laboratories.

“We’re getting fairly sophisticated at getting the right amount of air at the right temperature,” Rinard says.

And air at the right temperature, or at a close enough temperature, can come from outside the building. “Fifteen years ago, outside air was a bad thing,” Rinard says, but technology has progressed for filtering out dust and dirt. And that approach uses less energy than re-circulating and re-cooling the air within the building. Outside air, in most places and most seasons, “keeps the environment clean as well as utilizing free cooling,” Rinard says, and “moves the heat to where it doesn’t matter.”
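As a rough illustration of how that decision gets made, the sketch below is a simplified airside-economizer check, not any particular operator's control sequence; the temperature thresholds are assumptions. A real control sequence would also look at humidity or enthalpy, but the temperature comparison is the core of the free-cooling decision.

```python
# Simplified airside-economizer decision; thresholds are illustrative assumptions,
# not any vendor's or operator's actual control logic.

def cooling_mode(outside_temp_f: float,
                 supply_setpoint_f: float = 75.0,
                 economizer_lockout_f: float = 80.0) -> str:
    """Pick a cooling mode for one control interval from outside-air temperature."""
    if outside_temp_f <= supply_setpoint_f:
        # Outside air is already at or below the target supply temperature:
        # filter it and send it straight to the cold aisle ("free cooling").
        return "full economizer"
    if outside_temp_f <= economizer_lockout_f:
        # Warm but still usable: blend outside air and trim it with partial
        # mechanical or evaporative cooling instead of recirculating everything.
        return "partial economizer"
    # Too hot outside: close the dampers and recirculate through the chillers.
    return "mechanical cooling"

if __name__ == "__main__":
    for t in (55, 72, 78, 95):
        print(f"{t} F outside -> {cooling_mode(t)}")
```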

That envelope is rapidly being pushed to new limits; even in the challenging environment of Arizona, which combines extreme heat and sandstorms, eBay is building data centers with no compressor-based cooling. Dale Sartor, a staff engineer at the Lawrence Berkeley National Laboratory, says that when a chip runs at 180 degrees, “if done well, even 120 degrees is cool.” Sartor explains that eBay’s state-of-the-art modular containers are built “with the entire cold aisle wall exposed to the outside. Air passes through louvers, filters, an evaporative air wash (for cooling and cleaning when needed), and then through the servers and out the hot aisle back to the outside.”
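For a sense of the numbers behind that design, the leaving-air temperature of a direct evaporative cooler can be estimated from the standard relation T_supply = T_dry_bulb - effectiveness * (T_dry_bulb - T_wet_bulb). The desert-afternoon conditions and 85 percent effectiveness in this sketch are assumptions, not eBay's design figures.

```python
# Rough estimate of what a direct evaporative air wash can deliver, using the
# standard relation T_supply = T_db - effectiveness * (T_db - T_wb).
# The Phoenix-like design conditions below are assumptions.

def evaporative_supply_temp(dry_bulb_f: float, wet_bulb_f: float,
                            effectiveness: float = 0.85) -> float:
    """Approximate leaving-air temperature of a direct evaporative cooler."""
    return dry_bulb_f - effectiveness * (dry_bulb_f - wet_bulb_f)

if __name__ == "__main__":
    # A hot, dry desert afternoon: 110 F dry bulb, 70 F wet bulb (assumed).
    supply = evaporative_supply_temp(110.0, 70.0)
    print(f"Estimated supply air: {supply:.0f} F")  # about 76 F
    # Well below the roughly 120 F that Sartor calls "cool" next to chip temperatures.
```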

Kelly Sullivan, vice president of global data center operations for CenturyLink, says that in addition to designing hot and cold aisles, some straightforward steps to save energy include installing LED lighting (often getting a local rebate in the process), installing variable-speed drives in HVAC upgrades, and turning off lighting, because “in our business, after normal business hours, not a lot of people come in.” (Customers who want to check their servers at odd hours should be able to override the controls, however.)

More advanced measures include replacing inefficient cooling plants with more efficient chiller units. Also, Sullivan says, older UPS units are very inefficient at low loads, so CenturyLink will shut down some of the older boxes to make sure that the remaining units run at no less than 50 percent of capacity.
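The arithmetic behind that 50 percent rule is straightforward to sketch. The module rating, IT load, and N+1 margin below are hypothetical; a real operation would also weigh maintenance windows and its contracted redundancy level.

```python
# Sketch of the load-consolidation arithmetic: keep only as many UPS modules
# online as needed so each carries at least 50 percent of its rating.
# Module rating, IT load, and redundancy margin are assumptions.
import math

def modules_to_keep_online(it_load_kw: float, module_rating_kw: float,
                           min_load_fraction: float = 0.5,
                           redundant_modules: int = 1) -> int:
    """Largest module count that still loads each unit to min_load_fraction,
    with a redundancy margin (N+1 by default)."""
    max_by_loading = math.floor(it_load_kw / (module_rating_kw * min_load_fraction))
    needed_for_load = math.ceil(it_load_kw / module_rating_kw)
    # Never keep more modules than the 50-percent rule allows, and never fewer
    # than the load itself requires.
    return max(needed_for_load, min(max_by_loading, needed_for_load + redundant_modules))

if __name__ == "__main__":
    # Hypothetical example: 900 kW of IT load on 500 kW modules.
    n = modules_to_keep_online(900, 500)
    print(f"Keep {n} modules online; each runs at {900 / (n * 500):.0%} of rating")
```

In practice the redundancy requirement, not efficiency, usually sets the floor on how many modules stay online.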

CenturyLink has 57 data centers across the globe, with 260 megawatts of power. Typically now, Sullivan says, a building is opened with two megawatts of capacity but can be expanded as necessary, one or two megawatts at a time.

For a new data center, one basic energy-saving strategy is simply location: a data center in Minnesota will have lower cooling costs than an equivalent data center in Arizona. “If the shell of the building stays cooler, everything stays cooler,” Sullivan says.

Choosing a location is easier for huge firms that can fill an entire data center with their own servers. However, companies in the colocation business “have to be where our customers want us to be,” Rinard says. Many companies that rent server space will send employees to check on them. Rinard says that someone is in Equinix facilities 24 hours a day, so they contain “a few meeting rooms and break rooms for the human beings.”

In some locations, nature helps with energy costs above and beyond a cool climate. Rinard says that Equinix’s Toronto data center pulls cold water from the bottom of Lake Ontario for cooling, so that the site doesn’t need a chiller at all; pumps and fans are sufficient.

Schlattman points out that another factor to consider in siting a new facility is how often the area experiences blackouts — and unfortunately, California, Texas and the East Coast all rate poorly in that respect. Even though data centers have generators and UPS, using them will hurt sustainability ratings. A compromise that Google and some other large companies are using, he says, is to build large data centers in states like North Dakota or Nebraska, which can offer few blackouts and plenty of free cooling, and smaller facilities near population centers.

The life cycle of a UPS system at a colocation provider is 12 to 15 years, Schlattman says, which means that now and in the near future, many data centers will need upgrades. New tenants, he says, tend to want their data stored at newer, more efficient sites, which is another reason why the industry will be seeing retrofits and consolidation.

Data closets
Of course, not every data center is a large building with new equipment. Many servers are housed in converted closets or spare rooms, which were not designed for energy efficiency. A report from the Lawrence Berkeley National Laboratory points out that in such cases, the data owner rarely pays the utility bill or sees a submetered bill, which reduces both the urgency to improve efficiency and the means to measure it. However, consolidation of servers and turning off unneeded machines can save energy; so can raising the setpoints for temperature and humidity in the small data room. Another technique is the use of blanking plates and other basic steps to regulate air flow.

Other steps that facility managers can take include setting up a schedule to replace equipment; getting training in energy efficiency for IT and facility operations staff; and installing a dedicated cooling system for the data room (preferably with variable-speed drives) instead of letting the entire building’s system run around the clock.

Sullivan says that although reliability and redundancy will always be first for data center customers, energy efficiency is definitely on their minds now. But are they willing to pay more for data stored in a green center? “Never,” he says flatly.

Even so, interest is growing, and the new LEED standards have been well received by the market. “The changes have really resonated with this industry,” Enck says. “They’re set up to benefit from LEED certification.”

David Lewellen is a freelance writer based in Glendale, Wis.
Email questions to edward.sullivan@tradepress.com.
 
 


Continue Reading: Data Centers

Reducing Data Center Energy Use Means Focusing on PUE, Air Flow, LEED

Data Centers Get More Sophisticated About Air Flow, Higher Temperatures

Renewable Energy Gaining Ground In Data Centers




Posted on 1/11/2017



