Energy Diet For Data Centers
Experts say it’s possible to slim down bloated energy bills while maintaining reliability
A data center is a bit like an incandescent light bulb. Both have improved and accelerated the development of modern civilization. Both also give off significant waste heat and use too much electricity by today’s standards.
Edison’s 120-year-old invention might be legislated out of existence in the next several years, replaced by fluorescent fixtures and light-emitting diodes. But data centers — a comparatively new phenomenon with an astonishing appetite for energy — aren’t going to be shut down anytime soon. They provide critical services to the modern economy.
Just how much power do they use? In 2005, according to a report by EPA’s ENERGY STAR program, data centers consumed an estimated 1.2 percent of the nation’s electricity production — roughly equivalent to the output of 14 coal-fired power plants. The report points to the shift from a paper-based economy to a digital one as the cause of exponentially growing power requirements. According to a study by Jonathan Koomey, a staff scientist at Lawrence Berkeley National Laboratory (LBNL), power consumption by data centers worldwide doubled between 2000 and 2005. LBNL’s data center energy management Web site notes that energy costs for these facilities can be up to 100 times those of typical commercial buildings.
ENERGY STAR’s report estimates that, given current trends, consumption will double again by 2011, and that peak load on the power grid could reach 12 gigawatts by then, requiring an additional 10 baseload power plants.
A Growing Information Economy
The information superhighway is a wide one, and growing larger all the time. Estimates indicate that Internet usage is increasing at approximately 10 percent per year worldwide. Much of this growth is attributable to an insatiable appetite for entertainment: music downloads, video on demand, online gaming, e-commerce, social networking and voice-over-Internet technologies. Businesses are driving growth, too. Banks and other businesses are switching to paperless information models, and Sarbanes-Oxley compliance has companies relying increasingly on digital information storage.
New efforts are underway to curtail some of that power consumption, and companies are increasingly experimenting with new technology that will help cut electric costs.
Two trends are driving this increased awareness, says William Tschudi, a researcher for LBNL’s environmental energy technologies division. “Cost is getting people’s attention,” he says. “Plus they’re having difficulty cooling some of the energy-intensive areas.”
Early practices to curtail consumption have netted efficiency improvements of up to 25 percent, according to ENERGY STAR. And by applying best practices to data centers (see “EPA Models Data Center Energy Use” below), ENERGY STAR estimates that in 2011 power use by data centers could be contained to pre-2006 levels, rather than doubling. Furthermore, when executed correctly, these efficiency gains are achievable without compromising data center performance.
“I like to say that there’s no silver bullet, but there are lots of silver BBs,” says Tschudi.
Even with small changes, data center operators can reduce power use up to 25 percent, he says. “When you start talking about more aggressive changes, you can realize 40 percent or more in savings.”
If gaining 25 percent is relatively easy, why haven’t more data centers incorporated such measures? For starters, data center managers don’t always have access to energy profiles. Often a data center is part of a larger facility, and managers can’t delineate how much energy the data center itself consumes.
“Average data center operators don’t know what they can be doing,” says Tschudi.
Understandably, data center managers normally care more about equipment reliability than electricity consumption. And even when operators are aware of the growing need for efficiency, they often lack the knowledge or equipment to monitor, track and control power use.
Trends in Efficiency
As always, things are changing in the IT industry, and quickly.
“Yesterday’s topics are things like harmonics, watts per square foot, and some cooling issues,” says Bob Cassiliano, chairman of the 7x24 Exchange. “Today’s topic is energy efficiency.”
Three energy strategies for data centers are using outside air for free cooling, server virtualization and automated cooling management, says Cassiliano.
Using outside air improves efficiency by reducing dependency on a building cooling system. The idea is gaining popularity. Because of the massive draw on the electrical grid, some utilities are now offering incentives to data center operators that curtail their energy use via free-air cooling.
California’s Pacific Gas and Electric is one utility offering rebates to customers who use outside air to cool their data centers. Use of outside air requires air economizers fitted with a sensor and filter. This allows data centers to bring in outside air and still meet the required parameters for temperature and humidity.
Benefits are gained mostly by centers in cooler regions. “You can get huge energy savings because you’re not relying on a chiller to provide cooling,” says Tschudi. The biggest savings potential comes during the nighttime hours.
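In code terms, the economizer decision reduces to a comparison against temperature and humidity limits. The following Python sketch is purely illustrative; the OutsideAir structure, the use_free_cooling function and the threshold values are assumptions invented for this example, not part of any vendor’s control software or of ASHRAE guidance.

```python
# Minimal sketch of air-side economizer decision logic.
# Thresholds are illustrative only; real setpoints come from ASHRAE
# guidance and the equipment manufacturer, not from this example.

from dataclasses import dataclass

@dataclass
class OutsideAir:
    temp_f: float          # dry-bulb temperature, degrees F
    rel_humidity: float    # relative humidity, percent

def use_free_cooling(air: OutsideAir,
                     supply_setpoint_f: float = 65.0,
                     rh_min: float = 20.0,
                     rh_max: float = 80.0) -> bool:
    """Return True when outside air can serve as supply air directly,
    letting the chiller idle; otherwise fall back to mechanical cooling."""
    cool_enough = air.temp_f <= supply_setpoint_f
    humidity_ok = rh_min <= air.rel_humidity <= rh_max
    return cool_enough and humidity_ok

# Example: a cool, dry night favors the economizer.
print(use_free_cooling(OutsideAir(temp_f=58.0, rel_humidity=45.0)))  # True
```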
PG&E also offers incentives to companies that employ server virtualization to reduce their energy use. Server virtualization (and another technique called dynamic provisioning) generally reduces energy use by consolidating the number of servers required within a data center.
According to Cassiliano, many data centers use a separate server for each application.
“So that server is only using about 15 percent of its total capacity,” he says. “If you put two or more applications on a server and do it smartly, you can run multiple applications from the same server and still only use about 70 percent of total capacity.”
Dynamic provisioning works similarly, bringing online only those servers needed to meet the network’s current activity, rather than keeping all of them running around the clock.
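The arithmetic behind consolidation is simple enough to sketch. The short Python example below, a hypothetical illustration only, estimates how many physical hosts remain after consolidating lightly loaded servers at the utilization figures Cassiliano cites; the function name and the assumption that load adds roughly linearly across applications are inventions for this example.

```python
import math

def hosts_after_virtualization(num_servers: int,
                               avg_utilization: float = 0.15,
                               target_utilization: float = 0.70) -> int:
    """Estimate how many physical hosts are needed once workloads are
    consolidated, assuming load adds roughly linearly across applications."""
    total_load = num_servers * avg_utilization          # work to be carried
    return math.ceil(total_load / target_utilization)   # hosts to carry it

# Example: 100 servers idling at 15 percent collapse onto about 22 hosts
# run at 70 percent -- a rough upper bound on the consolidation potential.
print(hosts_after_virtualization(100))  # 22
```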
Cassiliano cites one financial services company that relied aggressively on server virtualization to cut its server use by 30 percent.
Finally, automated cooling management works by giving facility executives direct control over cooling capacity.
“They can apply the cooling wherever they need it,” says Cassiliano.
It helps when facility executives have computational fluid dynamic models that allow them to know when and where hot spots develop within their aisles.
“Often significant energy is wasted by maldistributed cooling air,” says Don Beaty, chairman of ASHRAE’s TC9.9 committee. “The goal is to deliver the cool air where the cooling is required.”
Under certain circumstances, a local cooling solution such as an enclosed rack can solve the cooling need and reduce energy cost, Beaty says.
Power Supplies: A Drag on Efficiency
Using a DC power distribution system is another way to boost efficiency, according to Tschudi. Data centers must convert high-voltage utility power into a form usable by the circuitry of servers, routers, hubs, switchgear and data-storage equipment. But Tschudi says that 25 to 35 percent of all the energy consumed is wasted as heat by a server’s power supply.
“If you make the conversions more efficient, the servers require less cooling,” he says.
The conversions themselves waste electricity, too, over and above the heat lost in server power supplies.
“Every time you convert power to a different voltage or change from AC to DC or DC to AC, there is some loss of energy,” says Beaty.
According to the EPA, a direct DC power system can achieve efficiencies of 80 to 90 percent. Aside from efficiency gains, reliability is also boosted, because there are fewer points of failure than in the AC-DC-AC conversions of a UPS system. Making the switch, however, would require re-tooling data center design and operation.
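The effect of chained conversions is easy to see by multiplying the efficiency of each stage in the power path. The sketch below is a hypothetical comparison; the individual stage efficiencies are illustrative assumptions chosen to land near the figures cited above, not measurements from the EPA or LBNL.

```python
from functools import reduce

def chain_efficiency(stages):
    """Overall efficiency of a power path is the product of its stages."""
    return reduce(lambda acc, eff: acc * eff, stages, 1.0)

# Conventional path: utility AC -> UPS rectifier (AC-DC) -> UPS inverter
# (DC-AC) -> server power supply (AC-DC). Stage values are illustrative.
ac_path = chain_efficiency([0.96, 0.96, 0.75])

# Direct DC path: one rectification step feeding DC straight to the rack.
dc_path = chain_efficiency([0.96, 0.92])

print(f"AC-DC-AC path: {ac_path:.0%}")   # roughly 69%
print(f"Direct DC path: {dc_path:.0%}")  # roughly 88%
```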
The trend doesn’t stop at the microprocessor level. Manufacturers are also examining how changes to cooling fans and power supplies can boost efficiency, particularly for volume servers, which make up a large percentage of servers in non-enterprise-class data centers. Some studies indicate that modifications to fans and power supplies can reduce server energy use by up to 70 percent. Memory technology is also going under the knife; recent advances might net 20 percent efficiency gains.
An Efficient Path
One change that lowers energy costs without any investment is raising the temperature in the server rooms.
“It doesn’t need to be a meatlocker in there,” says Tschudi. “It can be 80 degrees in there, if need be.”
Tschudi says that many data center managers operate their facilities with environmental conditions that are more stringent than actually needed. ASHRAE has also been working with server manufacturers to determine new temperature setpoints for data centers.
“If you raise the temperature by 20 degrees, you’ve gotten some of the low-hanging fruit,” Tschudi says. “That doesn’t cost anything.”
Because data centers are mission-critical facilities, managers are often reluctant to pursue efficiency gains if they believe the measures will degrade reliability or performance.
But recent developments by manufacturers who supply data center equipment indicate the two aren’t always at odds. Devices that use dual-core microprocessors are one such example. Via frequency- and voltage-scaling technology, new servers can reduce energy use by up to 20 percent without adversely affecting reliability or performance.
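The savings from frequency and voltage scaling follow from the standard approximation for dynamic CMOS power, P ≈ C · V² · f: lowering voltage and frequency together cuts power more than proportionally. The sketch below illustrates that relationship with made-up scaling factors; it is not a model of any particular server.

```python
def dynamic_power_ratio(v_scale: float, f_scale: float) -> float:
    """Relative dynamic power after scaling supply voltage and clock
    frequency, using the standard CMOS approximation P ~ C * V^2 * f."""
    return (v_scale ** 2) * f_scale

# Example: trimming voltage 5 percent and frequency 10 percent during
# light load leaves roughly 81 percent of the original dynamic power,
# in the neighborhood of the 20 percent savings cited above.
print(f"{dynamic_power_ratio(0.95, 0.90):.0%}")  # about 81%
```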
High initial costs are another hurdle data center managers need to overcome, even when the project payback is calculated in months, not years. Vendors know this, making some reluctant to recommend cost-effective data center improvements. But Tschudi and Beaty see changes ahead.
“The design of data centers should be carefully developed,” Beaty says. “There’s a growing awareness, and the impact on energy cost, capital cost and reliability can be very significant.”
Tschudi agrees. “There is beginning to be some uptake on implementing efficiency measures,” he says. “The magnitude of some electric bills is driving this.”
EPA Models Data Center Energy Use
The 7x24 Exchange and other groups have been working with the Department of Energy and ENERGY STAR to foster improved efficiency within mission-critical facilities. To better understand the efficiency opportunities that could accelerate adoption of energy-efficient technologies beyond current trends, the EPA developed scenarios that explore data center energy use through 2011:
- The improved operation scenario includes energy-efficiency improvements beyond current trends that are essentially operational in nature and require little or no capital investment. This scenario represents the low-hanging fruit that can be harvested simply by operating the existing capital stock more efficiently. This means implementing strategies such as adjusting temperature setpoints in data centers. Potential energy savings are 30 percent.
- The best practice scenario represents the efficiency gains that can be obtained through wider adoption of the practices and technologies used in the most energy-efficient facilities in operation today. Potential energy savings could go as high as 70 percent.
- The state-of-the-art scenario identifies the maximum energy-efficiency savings that could be achieved using available technologies. This scenario assumes that servers and data centers will be operated at maximum possible energy efficiency using only the most efficient technologies and best management practices available. Potential energy savings could reach 80 percent.
Both best practice and state-of-the-art models rely upon 98 percent efficient transformers and UPS systems that are 90 percent or more efficient. They also rely upon variable-speed fans and pumps, redundant air-handling units and, in the state-of-the-art model, a combined heat and power system for lower energy costs.
But the biggest difference between the two lies in server virtualization and storage devices. In the best practice model, ENERGY STAR calls for “moderate” server virtualization and reduction of applicable storage devices, whereas the state-of-the-art model calls for aggressive server virtualization and aggressive reduction of storage devices.
About 7x24 Exchange
The 7x24 Exchange is a forum dedicated to improving the reliability of mission-critical facilities.
“The organization works with industry, researchers and government to arrive at solutions while providing a forum for education and debate,” says Bob Cassiliano, chairman.
Energy efficiency has been top of mind for many attendees at the organization’s conferences, Cassiliano says.
“We intend, at our spring conference, to have a panel of experts so that they can give their observations and open up an intelligent debate on the subject of data center energy efficiency,” he says.
More information about 7x24 Exchange and its spring conference, June 1-4 in Boca Raton, Fla., is available from the organization.
Loren Snyder, a contributing editor for Building Operating Management, is a writer who specializes in facility issues. He was formerly managing editor of Building Operating Management.