


To Keep Up With Demand For Data Centers, Google Focuses On Policies, Procedures





By Casey Laughman, Managing Editor  


When Kava joined Google in 2008, the company's transition away from co-located and leased data center space was not only underway, but ramping up at a breakneck pace. At the time, the company's only wholly owned data centers were in Oregon and Georgia. As Google has brought more data centers online, Kava's team has grown to 10 times the size it was when he started. That growth has brought a number of challenges along the way. Not only were there design and construction processes to work through, but the new facilities meant new people had to be hired to staff and maintain them. So Kava decided that one of his primary areas of focus had to be making sure there was a plan and a procedure in place for every data center the company built.

"One of the things that I was pretty passionate about right from the beginning is that we'll never have time to go back and do it again," says Kava, "so you have to spend the time right up front focusing on the processes and the documentation that make your job extensible, because as you grow 2 times, 4 times, 10 times, if you don't have those solid foundations in place, it just becomes chaos to manage."

As Google has built data center after data center over the last few years, Kava's focus on having a strong foundation for construction and operations has paid off. Because plans are in place for the basics, Kava's team has the flexibility to try different things based on geographic or available-workforce considerations. As he puts it, roughly 80 percent of the procedures are the same everywhere (e.g., cold aisles kept at 80 degrees Fahrenheit, a standard color scheme for the different water pipes), while the remaining 20 percent or so are unique to each location. With 80 percent done by the book, so to speak, there's more time to devote to finding a more creative or efficient solution for the other 20 percent.

As an example, consider Google's data center in Hamina, Finland. In 2009, the company bought an old paper mill there and set about converting it into a data center. For Kava, part of the appeal of turning this particular building into a data center was that, in its heyday, the mill supplied paper to newspapers, making it a fundamental piece of delivering information to people.

"Now, people get information broadly online, and that facility is still doing that in a wholly different iteration today," Kava says.

But that's not the most interesting aspect of this particular facility. To start, this was the first time that Google had built a data center with a multi-story server floor. The building has very high ceilings, so Google constructed a steel mezzanine layer to add another floor to the space.

What really sets the facility apart is the way it is cooled. While most data centers are built to take advantage of whatever cooling benefits the local climate may offer, the Hamina location takes that a step further by using seawater for cooling. Seawater from the Gulf of Finland is drawn in through a tunnel that was already in place, used to cool the facility via direct heat exchange, then returned to the Gulf. To minimize the environmental impact, the warm outlet water is first sent to a tempering building, where it is mixed with fresh seawater to bring it back down to a temperature similar to that of the inlet water.
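In essence, that tempering step is a flow-weighted mixing calculation: the temperature of the blended discharge is the average of the warm outlet water and the fresh seawater, weighted by their flow rates. The short Python sketch below illustrates the energy balance; the function name and all of the numbers are hypothetical values chosen for illustration, not figures published by Google.

# Illustrative sketch only: a simple mixing energy balance for the
# tempering step described above. All values are assumed, not Google's.

def tempered_discharge_temp(warm_flow, warm_temp, fresh_flow, fresh_temp):
    """Temperature of two seawater streams after mixing.

    Assumes both streams have the same density and specific heat, so the
    mixed temperature is simply a flow-weighted average.
    """
    return (warm_flow * warm_temp + fresh_flow * fresh_temp) / (warm_flow + fresh_flow)

# Hypothetical example: 1 m^3/s of warm outlet water at 32 C blended with
# 3 m^3/s of fresh seawater at 10 C gives a discharge of about 15.5 C.
print(tempered_discharge_temp(1.0, 32.0, 3.0, 10.0))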

The Hamina data center illustrates one of Kava's other guiding philosophies: While there may be certain things that only directly apply to certain locations, that doesn't mean that lessons learned can't be used in other places.

"While we can't use seawater cooling in Oklahoma, for instance, there were a lot of lessons learned about managing the scaling in the heat exchangers from seawater and how to manage the filtration needed for a raw water cooling system like that that have made their way into other data center designs," he says.

Google can — and does — take advantage of local cooling resources to maximize energy efficiency. In Georgia and Singapore, it's recycled wastewater; in Belgium, it's water from an industrial canal; and in South Carolina, the company is experimenting with a rainwater retention pond as a potential cooling source.





Posted on 5/5/2014



