Fallout
The August 2003 blackout darkened buildings from New York to Detroit and Toronto. Now, managers talk about the way departments and equipment responded and the lessons they learned.
Managers were reminded again on Aug. 14 exactly how precarious electric power supplies can be, and how difficult it can be to keep backup power flowing to all areas of facilities in an emergency. The power blackout that hit parts of the Northeast United States and southern Canada cut power in some regions for a matter of seconds and in other regions for several days.
The blackout also pointed out shortcomings in the backup power systems of some commercial and institutional facilities, as well as flaws in some facilities’ emergency-response plans. While the event certainly created havoc and uncertainty in affected areas, its aftermath provided valuable lessons for maintenance and engineering managers seeking to improve the performance of both equipment and personnel.
Equipment Issues
The changing nature of facilities, as well as the equipment and technology installed in them, caused some problems. Howard “Skip” Perry, assistant director of construction, preventive maintenance and facility services at The Toledo Hospital in Ohio, says his department discovered such a problem: Centrifuges in one of the hospital’s buildings were not connected to emergency power.
“People add equipment without telling you,” he says.
At a large medical center in Manhattan, the power went out at about 3 p.m. on Aug. 14 and returned at about 10 a.m. the next day. In between, the center’s facility operations department discovered a number of important things about its facilities and its preparation, says the department’s manager, who did not want his organization identified.
For example, while the department generally has a handle on all of the equipment that is operating in its nine buildings and 1.5 million square feet, the list occasionally goes out of date, sometimes because of unreported changes in the system.
“We got a good feel for equipment that is on emergency power and equipment that is not,” the manager says. “The problem we have is that when we do our monthly tests and we transfer all switches and operate all engines, we still have utility power available. So you don’t get a good feel for what is on emergency (power) and what is not.”
For example, the medical center’s 10th floor corridor and nurses’ station did not have emergency lighting during the power outage.
“We were kind of shocked at that,” he says, adding that all patient rooms had emergency lighting and receptacles and that the nurses’ station and corridors had emergency receptacles. Lighting was put onto temporary power within two hours, he says.
The center’s trash compactors also were not on emergency power. And custodial crews needed more emergency power receptacles to operate equipment such as wet vacuums.
The center also had to handle these equipment issues during the blackout:
- One automatic transfer switch (ATS) serving the center’s blood bank failed. Technicians traced the problem to dust from construction that affected the ATS.
- The center lost the use of one cooling tower that served an area housing telecommunications equipment, a food-service area and a pharmacy. Technicians responded by using city water in lieu of tower water, he says.
At Cornell University in Ithaca, N.Y., facilities never lost power, says Jim Gibbs, director of maintenance management. But problems elsewhere in the area hampered the university’s ability to produce chilled water.
The limited cooling capability forced the university to shut down 17 facilities, including dormitories and athletic facilities, until power problems were rectified.
“It could have been much worse,” Gibbs says, adding that had the weather been hotter or had more students been on campus, the department would have faced tougher decisions about closing facilities or limiting cooling capacity.
Communications Concerns
The blackout also exposed holes in some maintenance departments’ communications strategies — holes that could prove costly in a more protracted or dire emergency.
Power was out for about 45 minutes at The Toledo Hospital, Perry says. But that was long enough to alert Perry and co-workers that critical information was not available to workers responding to the emergency.
When the power failed, Perry went to the power plant for the 1.3-million-square-foot facility to address issues that might arise. One task was to determine if the hospital’s emergency generators had enough diesel fuel oil to operate for an undetermined period of time. But the outage occurred at a shift change, the power plant manager was off, and Perry says he soon discovered that key pieces of information — including providers of fuel oil and contact information for key power plant personnel — were not written down.
“We hadn’t made it easy for people to find that information,” Perry says. “We didn’t actually have a plan for how to reach people.”
At the Manhattan medical center, advanced communication technology was of little use. Cell phones did not work, and walkie-talkies suffered from dead spots in parts of the facilities, the center’s facilities operations manager says.
To maintain communication and respond to calls for help, medical center dispatchers and secretaries stayed in the control room, kept a written log of problems as they occurred and relayed messages through the facilities using radios.
In some cases, departments were tripped up by seemingly minute details. For example, low batteries in some two-way radios hampered communication at Cornell, Gibbs says. At the Manhattan medical center, workers were hampered by a shortage of flashlights and by a lack of photoluminescent strips to mark pathways in unlit areas.
Making Changes
Managers say that while they are generally pleased with the responses of their departments, a top priority in the aftermath of the outage is to review equipment performance and the response of personnel and to find areas for improvement. For example, the manager at the Manhattan medical center says that to address the problem with dead spots affecting walkie-talkies, his organization is investing in about $110 worth of repeaters to be installed throughout its buildings. And Perry at The Toledo Hospital says his department will put together a more formal and detailed list of key contacts to consult in an emergency.
And if they needed a reminder, managers and their staffs now know that their plans will be tested again, most likely when they least expect it.
Planning for Power Problems
Facilities employ a range of different strategies to prepare for electric power outages. James McClure of Estes McClure & Associates Inc., an engineering and consulting firm specializing in education facilities, offers these examples of the way two organizations have prepared for power problems:
Lewisville (Texas) Independent School District. The district installed a 65 kW diesel-powered emergency generator at a new 100,000-square-foot elementary school to provide backup power for emergency lighting, HVAC units and a cooler/freezer in the kitchen. The generator will not be available for peak shaving. The district expects to see maintenance savings, since periodic testing and replacement of battery packs in emergency fixtures will no longer be required.
911 communications center. The new center in Tyler, Texas, is staffed 24 hours a day, seven days a week. A utility outage would leave critical equipment without power, and an uninterruptible power supply would provide power for only a short time. To prepare for the possibility of extended service interruptions, city officials had an emergency generator installed. To keep the center fully operational during an outage, the generator can power the entire building.
The city selected a diesel-powered 250 kW, 120/208-volt, three-phase model. An optional 700-gallon tank provides up to 36 hours of standby power at full capacity.
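As a rough check of those figures — the arithmetic is ours, not the city's or a manufacturer's — the tank size and rated run time imply a full-load fuel consumption of about

\[
\frac{700\ \text{gal}}{36\ \text{h}} \approx 19.4\ \text{gal/h}
\]

That rate is in line with the full-load consumption typically quoted for diesel generators in the 250 kW class, so a 36-hour run time on a 700-gallon tank is a plausible planning figure for continuous operation at full capacity.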