
Tips For Doing a Data Center Upgrade



If the upgrade route is chosen, an organization may have the opportunity to move to a smaller physical footprint.


By David Lewellen  


Data center upgrades also have to keep the system running for end users, which is especially challenging if the data center wasn’t designed for maintenance, says Battish. 

“It’s not easy,” Battish says. “It does require a lot of phasing.” 

If a business keeps its mission-critical data onsite, it may still have the opportunity to move to a smaller physical footprint, with the same power and cooling load, even though it is handling larger volumes of data. 

Gilmer points to one facility that had an inefficient setup of 74 racks using 210 kilowatts — after renovations, the company was able to get the same storage on one rack using 9 kilowatts. “That’s why facilities people have to work with IT people,” he says. “Your phone has more memory and more capacity than a server did five years ago.” 
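A rough sketch of the arithmetic behind that consolidation, using only the figures Gilmer cites, shows how dramatic the change is on both a per-rack and a total-power basis:

# Rough arithmetic for the consolidation example above (figures from the article).
old_racks, old_kw = 74, 210   # inefficient setup: 74 racks drawing 210 kW
new_racks, new_kw = 1, 9      # after renovation: same storage on 1 rack at 9 kW

old_kw_per_rack = old_kw / old_racks             # ~2.8 kW per rack before
total_power_saved = old_kw - new_kw              # 201 kW no longer drawn
percent_reduction = 100 * (1 - new_kw / old_kw)  # ~96% less power for the same storage

print(f"{old_kw_per_rack:.1f} kW/rack before, {new_kw} kW total after "
      f"({percent_reduction:.0f}% reduction, {total_power_saved} kW freed up)")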

Due diligence is crucial in deciding how an organization should proceed. “So many variables come into play,” Battish says. 

During renovation, power density, in watts per square foot, is important to keep in mind, according to Kosik. A smaller space handling the same or greater power load will have much different cooling requirements. Installing a separate, standalone cooling system for the high-density cabinets handles that concentrated load better, but that approach costs more and will also limit redesign options down the road. A water-cooled system, Kosik says, offers more flexibility, because a water pipe is easier to reroute than a refrigerant pipe.  
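To make the density point concrete, here is a minimal sketch of the watts-per-square-foot and cooling-load math. The floor areas and IT load below are illustrative assumptions, not figures from the article, and the cooling conversion is the standard one ton of refrigeration to roughly 3.517 kW of heat rejection:

# Hypothetical illustration: the same IT load moved into a smaller room.
it_load_kw = 210.0        # assumed IT load (kW); essentially all of it becomes heat
old_area_sqft = 3000.0    # assumed original white-space floor area
new_area_sqft = 900.0     # assumed smaller footprint after renovation

old_density = it_load_kw * 1000 / old_area_sqft   # ~70 W/sq ft
new_density = it_load_kw * 1000 / new_area_sqft   # ~233 W/sq ft

cooling_tons = it_load_kw / 3.517                 # ~60 tons of refrigeration either way
print(f"Density rises from {old_density:.0f} to {new_density:.0f} W/sq ft; "
      f"the room still needs about {cooling_tons:.0f} tons of cooling.")

The total heat to remove does not change, but concentrating it in a smaller space changes how the cooling has to be delivered.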

“Energy is a big, big deal because of the amount of power data centers use,” Kosik says, and manufacturers are working to develop products specifically for that market. “If you can increase efficiency by a few percentage points, that translates to big money over the years.” 
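A back-of-the-envelope sketch of why a few percentage points matter, using purely hypothetical numbers (a 1 MW IT load, a modest PUE improvement, and an assumed electricity rate):

# Hypothetical example: value of a small efficiency (PUE) improvement.
it_load_kw = 1000.0                  # assumed IT load
pue_before, pue_after = 1.60, 1.55   # assumed facility efficiency before/after
rate_per_kwh = 0.10                  # assumed electricity rate, $/kWh
hours_per_year = 8760

kwh_saved = it_load_kw * (pue_before - pue_after) * hours_per_year
dollars_saved = kwh_saved * rate_per_kwh
print(f"{kwh_saved:,.0f} kWh and ${dollars_saved:,.0f} saved per year")  # ~438,000 kWh, ~$43,800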

As in most other fields, developments at the very high end of the market tend to trickle down over time — and Kosik is watching water cooling for supercomputers. Because water carries heat far more efficiently than air, it works better in dense installations. Water cooling “is probably where we’re going to be in five to ten years,” he says. 
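A quick sketch of why water wins in dense installations: per unit volume and per degree of temperature rise, water carries on the order of 3,500 times more heat than air. The property values below are standard room-condition approximations, not figures from the article:

# Approximate fluid properties at room conditions.
water_density, water_cp = 1000.0, 4186.0   # kg/m^3, J/(kg*K)
air_density, air_cp = 1.2, 1005.0          # kg/m^3, J/(kg*K)

# Heat carried per cubic meter of fluid per degree of temperature rise, J/(m^3*K).
water_vol_heat = water_density * water_cp   # ~4.19e6
air_vol_heat = air_density * air_cp         # ~1.21e3

print(f"Water moves ~{water_vol_heat / air_vol_heat:.0f}x more heat per unit volume than air")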

Stay ahead of problems

Most data center equipment is designed with a 10- to 15-year lifespan in mind, Battish says, but sometimes manufacturers stop supporting it sooner, or maintenance costs rise at an unaffordable rate. “Preemptively address that, not under failure conditions,” he says. Battish also recommends checking in with the electric utility about any upgrades or changes in power capacity. 

Even if a data center is 15 years old, facility managers should know exactly what they have on hand in terms of equipment, power, cooling, and the capacity and life expectancy of those systems. For instance, Battish says, a chiller may lose capacity as it ages and require retrocommissioning. Battish divides problems into those that should be fixed within a year (anything involving safety or code issues, or equipment near the end of its life) and those with a horizon of three to five years.

David Lewellen is a freelance writer who covers facility issues. 

Email comments and questions to edward.sullivan@tradepress.com.



