Has the Mission of Your Organization's Data Center Changed?
If the organization's strategic mission objectives for data have changed, it's time to reassess data center functionality and technology as well.
To understand what is needed from the legacy data center, conduct a fresh assessment of the enterprise’s strategic mission objectives for data, says Charles Manula, integrated facilities management lead for government at JLL. “Make sure you audit everything,” he says, as IT departments may be moving some operations to edge technologies or into the cloud.
For example, an enterprise’s legacy data center may have been initially designed as a Tier 4 facility. However, if the IT department now plans to move some data processing to the cloud, the legacy data center may operate better at Tier 2.
“Facilities and technology teams also should work together to establish the optimum temperature setpoint for the site,” says Mike Doolan, chief reliability officer for data center solutions at CBRE. “While IT assets are able to tolerate increasingly high inlet temperatures, this can impact the power drawn by internal fans within the IT assets. A collaborative effort is required to tune for minimal power consumption across the whole data center.”
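The tradeoff Doolan describes can be roughed out with a simple model. The Python sketch below uses made-up coefficients rather than measured site data: it sums an assumed chiller-power curve that falls as the supply-air setpoint rises with an assumed server-fan curve that climbs steeply at higher inlet temperatures, then reports the setpoint with the lowest combined draw.

```python
# Illustrative sketch: find a supply-air setpoint that minimizes combined
# cooling-plant and server-fan power. The curves and coefficients below are
# hypothetical placeholders -- real tuning needs measured site data.

def chiller_power_kw(setpoint_c: float) -> float:
    """Assumed: chiller power drops roughly 3% per degree C of setpoint increase."""
    baseline_kw = 400.0          # plant draw at an 18 C setpoint (assumed)
    return baseline_kw * (1 - 0.03 * (setpoint_c - 18.0))

def server_fan_power_kw(setpoint_c: float) -> float:
    """Assumed: fan power rises steeply once inlet temps pass ~24 C
    (fan laws: power scales roughly with the cube of fan speed)."""
    base_kw = 60.0               # aggregate fan draw at low inlet temps (assumed)
    speed_ratio = 1.0 + max(0.0, setpoint_c - 24.0) * 0.06
    return base_kw * speed_ratio ** 3

candidates = [18 + 0.5 * i for i in range(17)]   # 18.0 C .. 26.0 C
for t in candidates:
    total = chiller_power_kw(t) + server_fan_power_kw(t)
    print(f"setpoint {t:4.1f} C -> total {total:6.1f} kW")

best = min(candidates, key=lambda t: chiller_power_kw(t) + server_fan_power_kw(t))
print(f"lowest combined draw at ~{best} C (for these assumed curves)")
```

In practice the curves would come from trend data rather than assumptions, and the resulting setpoint would be validated by the facilities and technology teams together, as Doolan suggests.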
Low-load conditions
As more organizations move data to the cloud or into micro data centers, another concern for legacy data centers is low-load conditions. “Chiller/DX equipment typically likes to see some minimum load for stable operation,” points out Tim Kittila, director of data center practice at Parallel Technologies.
Reducing servers, storage, and network gear creates vacant space that makes it difficult to direct airflow into organized hot/cold aisles, according to Kittila. Typically, equipment is removed from interspersed sections, “leaving holes in random places,” he explains.
Electrically, low-load conditions affect generators too: when a diesel generator runs well below its rated load, unburned fuel can accumulate in the exhaust system, a problem known as wet-stacking. What’s more, excess capacity and space in legacy data centers often lead to inefficiencies in uninterruptible power supplies (UPS), generators, and cooling systems that were initially designed for higher heat loads, explains Doolan. That results in proportionally higher maintenance costs for the remaining infrastructure.
“Enterprise technology teams should work with their facility partners to identify opportunities to consolidate workloads into more well-defined areas of the white space,” says Doolan. Consolidation may allow some plant assets to be shut down, so those still operating run at more efficient loads.
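A back-of-the-envelope comparison shows why consolidation pays off. The Python sketch below uses assumed UPS efficiency figures, module ratings, and a rule-of-thumb generator threshold (none of them from CBRE or Parallel Technologies) to compare losses when a reduced IT load is spread thinly across several UPS modules versus consolidated onto one, and to flag a generator load fraction low enough to risk wet-stacking.

```python
# Illustrative sketch comparing UPS losses at low vs. consolidated load.
# Efficiency figures and thresholds are assumptions, not measured values.

def ups_efficiency(load_fraction: float) -> float:
    """Assumed efficiency curve: poor below ~30% load, flattening near full load."""
    if load_fraction < 0.10:
        return 0.80
    if load_fraction < 0.30:
        return 0.88
    if load_fraction < 0.50:
        return 0.93
    return 0.95

def ups_losses_kw(it_load_kw: float, modules: int, module_rating_kw: float) -> float:
    """Total UPS input minus IT load, with the load shared evenly across modules."""
    per_module_load = it_load_kw / modules
    eff = ups_efficiency(per_module_load / module_rating_kw)
    return it_load_kw / eff - it_load_kw

IT_LOAD_KW = 180.0       # remaining IT load after cloud migration (assumed)
MODULE_KW = 300.0        # rating of each UPS module (assumed)

spread = ups_losses_kw(IT_LOAD_KW, modules=4, module_rating_kw=MODULE_KW)
consolidated = ups_losses_kw(IT_LOAD_KW, modules=1, module_rating_kw=MODULE_KW)
print(f"losses across 4 lightly loaded modules: {spread:5.1f} kW")
print(f"losses on 1 consolidated module:        {consolidated:5.1f} kW")

# Generator check: many diesel gensets need very roughly 30%+ of rated load
# to avoid wet-stacking (rule-of-thumb threshold, assumed here).
GEN_RATING_KW = 1000.0
load_fraction = IT_LOAD_KW / GEN_RATING_KW
if load_fraction < 0.30:
    print(f"generator at {load_fraction:.0%} of rating -- wet-stacking risk")
```

Real decisions would rest on manufacturer efficiency curves and the site's actual load profile, but even rough numbers make the case for concentrating the remaining load.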
Hot aisles/cold aisles
Building management systems provide good data on the performance of the central plant. However, they may miss hot spots created by a high concentration of new servers or poor airflow.
A very low-tech solution can help address such situations, Doolan says. “Basic walking rounds by staff can help subjectively identify these [hot spots], and site teams can respond by actively reviewing the placement of floor grills.” In addition, grills with dampers can be adjusted to direct cool air where needed.
Whenever the layout of IT cabinets is rearranged, facility managers need to make sure hot and cold aisles remain effective. “This might mean blocking empty cabs and racks, or adding containment systems,” says Kittila. “Blanking panels for cabinets/racks are available in various forms and sizes, as well as mock full-rack fill-ins, which block airflow from row to row.”
Many data centers use containment systems to keep hot air separate from cold air. In those cases, facility managers moving a cold-aisle containment system have to be careful to ensure that the heat is directed efficiently back to the cooling equipment, says Kittila. “Close-coupled cooling may be required in specific situations, such as rear-door heat exchangers, in-row cooling, and overhead cooling systems. These systems are perfect for directing conditioned air close to the source.”
To manage resource use, Doolan recommends considering deployment of a data center infrastructure management (DCIM) system, which can help place IT assets most efficiently from power, cooling, and connectivity perspectives.
Low-cost Building Internet of Things temperature and humidity sensors can be located throughout the site “to more quickly and accurately identify hot and cold spots where intervention is needed,” says Doolan.
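As a rough illustration of how such sensor readings might be screened, the Python sketch below uses hypothetical sensor locations and temperatures, with thresholds loosely based on the commonly cited recommended inlet range of about 18 to 27 degrees Celsius, to flag racks where intervention may be needed.

```python
# Illustrative sketch: flag hot and cold spots from building-IoT sensor readings.
# Sensor names, readings, and thresholds are assumptions for demonstration.

COLD_AISLE_MIN_C = 18.0   # lower bound of assumed acceptable inlet range
COLD_AISLE_MAX_C = 27.0   # upper bound (roughly the commonly cited recommended envelope)

# Latest cold-aisle inlet temperatures keyed by rack location (made-up data).
readings_c = {
    "row-A/rack-03": 22.4,
    "row-A/rack-07": 28.9,   # likely hot spot
    "row-B/rack-02": 16.8,   # overcooled -- candidate for damper adjustment
    "row-C/rack-11": 25.1,
}

def classify(temp_c: float) -> str:
    if temp_c > COLD_AISLE_MAX_C:
        return "HOT SPOT"
    if temp_c < COLD_AISLE_MIN_C:
        return "OVERCOOLED"
    return "ok"

for location, temp_c in sorted(readings_c.items()):
    status = classify(temp_c)
    if status != "ok":
        print(f"{location}: {temp_c:.1f} C -> {status}, review floor grill/damper placement")
```

Output from a screen like this could feed the walking rounds Doolan describes, pointing site teams to the rows worth checking first.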