CFD Modeling Can Help Maximize Data Center Capacity





By Maryellen Lo Bosco  
Other parts of this article: Pt. 2: CFD's 3-D Detail Can Help Avoid Stranded Capacity; Pt. 3: CFD Poses Issues of In-House Training, Software Purchases; Pt. 4: Dealing with Hot Spots in Data Centers


By analyzing airflow, computational fluid dynamics (CFD) tools can help facility managers maximize data center capacity. While data center infrastructure management (DCIM) has good value, supplementing it with CFD modeling provides a higher level of monitoring in the data center and more predictive ability.

CFD modeling is emerging as a new tool that data center facility managers can use to fight hot spots. Most data centers, especially older ones, have cooling problems and hot spots, explains David Cappuccio, managing vice president and chief of research for data centers, Gartner. In the past, the problem was addressed by moving equipment around as trouble spots appeared; the current trend is to look at the entire floor space to achieve the most efficient airflow possible, which is where CFD modeling comes in.

"By running these models you can see how air is being distributed around the data center," says Mark Evanko, principal, BRUNS-PAK. CFD modeling focuses on airflow, static pressure, and temperature, and can predict where hot spots will show up, so CFD modeling can be used to determine the best place to put new equipment. "Before CFD modeling, you might be placing 10 different air-conditioning units throughout the data center, but when you deploy the CFD model you might just need six. It's also a tool for optimizing planning going forward," Evanko says.

The need for modeling arises from the way IT equipment is installed in the data center. Data centers are designed and built around a projection of the computing capacity needed to support the business for many years, explains Christopher Wade, national program director, critical environment operations, Newmark Grubb Knight Frank. In reality, that projection is just a "best guess," because the equipment that goes into a data center changes over time as organizations install whatever systems the business needs. IT systems are upgraded or changed as often as every six months, and those changes have to be addressed on the data center floor. Amid this rapid evolution, it is hard to predict how much power, space, and cooling will be needed.

Using Space Efficiently

Predictive analytics used in DCIM are intended to make more efficient use of the data center floor: "more compute per kilowatt and more compute per square foot," Cappuccio says. "You can get higher density in the racks and not have to build the next data center for years to come."

While DCIM does a good job of keeping track of temperatures, power, and space, it cannot tell the facility manager about airflow or predict the impact of future system installations on the data center. Wade says it is now necessary to have a higher level of monitoring in the data center, and supplementing DCIM with CFD provides engineering simulation tools that can predict the impact of installing equipment in any location on the floor.

"Adding CFD to DCIM allows you to predict the nature of real computer deployments you have going out to the data center," Wade says. The modeling tool will immediately let the data center manager know whether the placement of the new equipment will create a hot spot or some other problem.

"The data center is now in a similar place that individual computers were a few decades ago," says Jonathan Koomey, research fellow at the Steyer-Taylor Center for Energy Policy and Finance at Stanford University and an expert on data center energy use and management. "The software and hardware are now good enough so that we can do computer models of the whole data center." It is possible to take into consideration every individual piece of equipment — for example, a server with a specific type of fan that blows in a certain way — and incorporate it into the data center model in a way that provides insight, Koomey says.

The industry is moving away from reconfiguring data centers by "rules of thumb" and toward 3-D models that use temperature, airflow, and equipment location to predict hot spots, not only in existing data centers but also in the design of new ones.





Posted on 1/21/2015



