Overload?



As problems mount and blackouts loom, experts wrangle over how to fix the grid


By David Kozlowski  


The blackout in August threw the electric transmission system into the spotlight and brought into focus ongoing debates about its reliability.

That outage put the grid on the agenda not only of Congressmen and media pundits, but also of facility executives. For decades, buildings have been designed on the assumption that electric power would continue to be reliable — an assumption that is increasingly being called into question.

For facility executives, then, it’s crucial to understand the complex debate now under way about the future of power transmission.

Numbers tell part of the story. Most days, the 158,000 miles of high-voltage transmission wires carry electricity from power plants to outlets in cubicles, classrooms and surgical suites without many problems. Minor outages do occur, but large regional ones are rare. Yet when something big does go wrong with this 50-year-old system, the problem usually comes down to one of two facts: First, no one is in charge of transmission nationwide; second, some of the power lines cannot carry the amount of electricity they need to.

The first issue is about control and is in part a political one. The second issue comes down to cost.

Those two worries — control and cost — have troubled power industry experts for well over a decade because they raise questions about the reliability of the network. The trend toward deregulation of the electric industry and the growing demand for power only heighten these concerns.

But those fears remained largely the province of speculation until Aug. 14, when a massive blackout provided graphic evidence of the vulnerability of the grid.

Falling Behind

The problems of this overtaxed system had been spelled out most recently in a U.S. Department of Energy report in May 2002. Construction of high-voltage transmission lines is expected to add only 6 percent in actual miles of new lines during the next 10 years, according to the “National Transmission Grid Study.” Yet the grid is still expected to handle a 20 percent increase in electric demand and generation during that time.

What’s more, the grid is already far behind where it should be. From 1992 to 2002, a period of significant investment in generation, load grew by 35 percent, but investment in the grid grew by only 18 percent. Since 1975, investment in the grid has declined at an average rate of $117 million per year. Grid capacity increased 15 percent from 1988 to 1998, but is projected to increase only 3.5 percent from 1999 to 2009.

The system was designed to serve a local load but must now serve a different one, says Luther Dow, director of power delivery and markets for the Electric Power Research Institute (EPRI). “There are more transactions being conducted, more people using the system, more utilities and more demand.”

Many of those in favor of improving reliability want quick action to upgrade congested areas and a concerted effort to build a more efficient national-regional system. They say the grid, or at least part of it, has been pushed to its technological limit. Their analysis assumes deregulation will continue to expand, which would require a grid that responds more reliably to the volatile demand and load requirements of an open market. Building that grid will involve considerable cost.

But so did the interstate highway system, they say. And the argument for a national grid is at least as strong as one for a national highway system.

“The bottom line is that transmission will need to be upgraded through more wires and improved technology,” says Victor Bush, national technology leader with URS Consulting.

Building high-voltage lines is expensive business. Every mile of new high-voltage wire costs $1 million to $2 million. According to Eliot Roseman, principal with ICF Consulting, new transmission could cost $30 billion to $60 billion this decade, and possibly as much as $100 billion. Pacific Northwest National Laboratory estimates that, to keep up with expected growth through 2020, as much as $450 billion might need to be invested.

Some of this needs to be done quickly. For instance, the Midwest ISO recently noted that $1.7 billion in upgrades is needed by 2007.

But what does a $100 billion investment mean? According to EPRI, if the cost were spread equally across the country, ratepayers would pay an additional $100 per year on electric bills. However, it is far from certain that such costs could be spread equally. And a number of states would object vehemently to such a plan.
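To see roughly where a figure like EPRI’s $100-per-year estimate comes from, here is a back-of-the-envelope sketch. The customer count and recovery period below are illustrative assumptions, not figures from the article.

```python
# Back-of-the-envelope sketch of the ratepayer-impact estimate.
# Assumptions (illustrative only): roughly 100 million electric customer
# accounts nationwide, with the cost recovered over 10 years.

TOTAL_INVESTMENT = 100e9      # $100 billion in grid upgrades
CUSTOMER_ACCOUNTS = 100e6     # assumed number of ratepayers
RECOVERY_YEARS = 10           # assumed recovery period

annual_cost_per_ratepayer = TOTAL_INVESTMENT / (CUSTOMER_ACCOUNTS * RECOVERY_YEARS)
print(f"Added cost per ratepayer: ${annual_cost_per_ratepayer:.0f} per year")
# -> Added cost per ratepayer: $100 per year
```

Change either assumption and the per-bill impact shifts accordingly, which is one reason spreading the cost evenly is so contentious.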

A Regional-National Grid

Cost is only one obstacle to improving the reliability of the grid. Another hurdle is disagreement over the best way to manage the grid. Proponents of a better national transmission system say the grid needs better control and management. These experts agree with a federal plan to create regional transmission organizations (RTOs).

RTOs mark a radical departure from the status quo. Much of the grid today is still managed and maintained by vertically integrated utilities with state oversight. These utilities generate and transmit their own power over their own lines in a well-defined area. But there is a problem with this model: Many utilities haven’t kept pace with needed grid upgrades. This is a downside of deregulation as it has been structured. Many utilities see little financial incentive to invest in infrastructure that others will be using and that they might be selling.

One problem concerns the flow of power. Under deregulation, power moving from utility A to utility C freely crosses utility B’s lines in what is called a loop flow. Intervening utilities usually don’t expect this flow, and if it arrives at the wrong time — during peak periods, for instance — congestion can occur. These congestion points are generally well known, but fixing them is a problem because the utilities that own the congested lines receive no compensation for the loop flow.

To avoid problems like that, the Federal Energy Regulatory Commission (FERC) wants utilities to join RTOs. These independent organizations are needed, say experts, to make a market-based approach to electric generation and transmission work. This regional-control approach will better facilitate the management of the grid, including upgrades. This approach would also standardize rules and equalize access.

But to be effective, RTOs must have the authority to make decisions, says Ken Silverstein, director of energy industry analysis with Utilipoint International.

“They must be able to manage power flows as well as have the ability to order generation companies to alter their production,” Silverstein says. “Those entities must have the right to assess meaningful penalties. A properly structured system would not just boost reliability, but it could also work to attract much-needed investment in infrastructure.”

FERC’s plan is to form a number of RTOs with some FERC oversight. It’s not clear whether FERC would have enforcement power.

In the eyes of many experts, the mold for RTOs has already been cast: the PJM Interconnection, which serves parts of Pennsylvania, Ohio, Maryland and New Jersey.

PJM not only controls the generation and transmission of power but also decides where and when transmission lines get built, and it levies fines for rule violations. Jackson Mueller, an energy consultant, calls PJM one of the “stiffest” of all systems to date.

“They have good control mechanisms in place with an adequately sized grid and sufficient generation,” Mueller says. It’s also worth noting that the blackout in August stopped at PJM’s doorstep.

In an example of its stewardship of the grid, PJM has recognized its own bottlenecks and authorized $725 million to expand and upgrade its transmission system, Silverstein says. That’s the kind of forward thinking the rest of the grid needs, he says.

Size Matters

All this reorganization and reregulating on a national scale makes Tyson Slocum, research director for Public Citizen, a little nervous. He recognizes that ensuring reliability will cost money, but he is wary of dumping billions of dollars into an old system and then handing the management of it over to private entities, such as PJM, largely to serve further deregulation, which he contends has not proven its worth. Instead, Slocum and others say there should be a greater focus on local control, energy efficiency programs and distributed generation.

“I don’t believe there is undercapacity in the transmission now,” Slocum says. “The blackout was not an issue of capacity; it was a power management issue. Deregulation is partially at fault here. We’ve put a cost-based, market-approach system of generation on top of a system that is designed to meet local loads.”

He asks which is more worthwhile: changing the grid to meet deregulation’s needs at a cost of billions, or keeping the existing system and improving local generation and power management at a cost that is probably lower.

Other experts agree that grid capacity is not an issue now. They say that there is increasing evidence that much of the grid, even in high-demand areas, is underutilized, and problems occur because operators simply don’t know how much capacity is available. Even statistics showing increased congestion might be the result of reporting criteria rather than actual congestion.

All this suggests to some experts that a massive change to the system is unwarranted. As a result, this group, including some large utilities, says most grid control should remain with the states, under state and utility oversight. To solve problems with interstate grid connections and transmission development, states should voluntarily form partnerships, not RTOs, in which the states themselves, not FERC, maintain control. Some state regulators have started calling for greater state coordination and control on a regional basis.

“Otherwise we are talking about expensive grid investment without any guarantees or controls that reliability issues are addressed,” Slocum says. “Where the process is transparent, there can be good control.”

It’s a time issue as well. Proponents of a state approach say the country doesn’t have time to redesign the grid to meet deregulation’s needs. They point to the United Kingdom, which has been deregulated for more than a decade. It has restructured its grid and is still struggling with reliability.

Moreover, certain areas of the country are quite satisfied with a regulated market. Experts in these areas see little advantage in spending billions of dollars to improve and expand a grid that for them is working just fine.

“I think a lot of people in the Southeast and elsewhere see the initial experiments [in deregulation] have not been that successful,” says Jonathan Sangster, senior managing director for CB Richard Ellis Consulting. He gives some credence to the idea that there may not be a single model for electric generation and transmission.

“There are capacity, grid and regulatory problems with transmission in the Northeast, and in the Northwest and Southeast there aren’t these problems,” Sangster says. “The Midwest has its own problems, mostly with generation and environmental concerns.”

Still, there’s a good chance some form of FERC’s plan will happen. “But there is a lot of resistance to it,” he says.

Some say this wrangling over national or regional solutions misses the point; what’s needed is to get power sources as close as possible to the demand. Kyle Datta, a managing director of Rocky Mountain Institute, does believe some transmission capacity problems are leading to breakdowns of the grid. But to him that’s a reason to shrink the grid by separating it into smaller grids.

Datta and others say a better idea would be to redesign the grid to better utilize distributed generation and take greater advantage of energy efficiency programs. Power sources would be linked into power parks or microgrids that could be isolated from the main system and circumvent many key congestion points. The grid should exist, Datta says, but it should complement these smaller grids.

A Muddled Plan

Mike McGrath, executive director of retail energy services with the Edison Electric Institute, says FERC’s effort to bring some standardization to the retail market has raised a number of issues. He suggests there may be a flaw in FERC’s standard market model: It assumes there is only one way to solve grid problems.

“The problem might have to do with FERC’s idea of a one-size-fits-all market,” he says. “The market doesn’t work that way.”

For one thing, a considerable amount of power is produced by utilities that can ignore FERC.

Public power producers, such as the Bonneville Power Administration and the Los Angeles Department of Water and Power, don’t come under FERC’s jurisdiction. Proponents of FERC’s plan say it would be severely weakened if more than 2,000 utilities were exempt.

But the concern over control of the grid might be overshadowing the development and implementation of new technologies to better manage high-voltage power transmission. For instance, EPRI is developing new software and sensors to collect more accurate data more quickly. New communication systems that link utilities over fiber optic cable would save precious fractions of a second in decision-making time. Systems are also being developed to stop cascading failures before they grow into a major blackout. Because these technologies fall outside the debate over control, some experts fear that much-needed funding for them might be overlooked. Even with funding, it might take five years before some of these major grid management technologies could be implemented.

Current strategies that could help would also need financial assistance. Existing reciprocating engine generators face siting concerns, while cleaner new technologies, such as fuel cells, aren’t yet economically feasible. Expanding energy efficiency incentive programs comes with a considerable cost as well.

As power demand increases faster than the supply can keep up, it’s becoming clearer that solutions might need to be expedited.

What is needed is a Marshall Plan of sorts that addresses all the concerns about transmission, say some industry experts. They point out that national standards exist for generation reserve capacity and ask why no such standards exist for grid reliability.

Just what such a Marshall Plan would look like is still being debated by industry experts and politicians, and some fundamental questions remain to be answered. Is deregulation good or bad? Should the grid look more like the national interstate system or a state highway system? Is it more reliable to have a large pool of generators to choose from, or locally controlled and managed power sources?

One thing is for sure, experts on both sides say: The need to address these issues is growing.

Portions of the grid are overtaxed. Electric demand is increasing. Investment in transmission is falling behind demand. And the whole system is aging. Together, those trends make future blackouts increasingly likely.


Nothing new: ‘Boring stuff’ one way to make more energy available

There’s nothing sexy about making energy go further. Tuning up HVAC systems, installing high-performance windows and switching to efficient lighting technologies is as exciting as it gets.

“It’s the boring stuff that everybody’s been talking about for 20 years,” says Neal Elliot, program director with the American Council for an Energy Efficient Economy (ACEEE).

Facility executives thinking of what they can do to stave off another power outage like the one that hit the Midwest, Northeast and Canada in August need to look no further than their own buildings. Reducing the amount of power buildings draw can go a long way toward improving power conditions on the nation’s electricity transmission lines.

Whether it’s retrofitting existing T-12 fluorescent lamps with more efficient T-8 lamps or upgrading the building envelope with low-e glazed windows, demand-side energy management is a key part of ensuring the nation’s electrical grid performs as expected.

“When you start to unload the grid, you help give it some resiliency and help your neighbor,” Elliot says.

Department of Energy officials and others on an international panel investigating the cause of the August blackout testified to Congress that the outage wiped out nearly a quarter of the nation’s high-voltage transmission lines. The panel is still determining the exact cause and sequence of events that led to the outage. However, one factor already being discussed is that there wasn’t enough power being produced in the right areas to meet the electrical demand on that day.

Mark Hopkins, acting co-president of the Alliance to Save Energy, says he doubts demand-side management efforts, even if they were in widespread use, could have averted the August blackout entirely. However, such efforts do help during peak-demand periods, when the electrical transmission system is taxed to its breaking point.

That’s why certain states and municipalities with high electrical demand and marginal supply have implemented incentives to encourage energy efficiency among facility executives. The New York State Energy Research and Development Authority, for example, has been encouraging building owners to install high-efficiency air conditioners by offering a $75 incentive through its Keep Cool campaign, Hopkins says.

On a federal level, Congress is debating an energy bill that awards a $2.25-per-square-foot tax deduction for energy efficiency improvements. If the measure is approved, a building would have to use 50 percent less energy than prescribed in the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) Standard 90.1 to qualify for the deduction.
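As a rough illustration of what the proposed deduction could mean for a single building, here is a minimal sketch; the floor area is a hypothetical example, not a figure from the bill or the article.

```python
# Rough illustration of the proposed $2.25-per-square-foot deduction.
# The building size below is hypothetical; eligibility would require using
# 50 percent less energy than ASHRAE Standard 90.1 prescribes.

DEDUCTION_PER_SQ_FT = 2.25    # dollars per square foot, as proposed
floor_area_sq_ft = 200_000    # hypothetical office building

deduction = DEDUCTION_PER_SQ_FT * floor_area_sq_ft
print(f"Potential tax deduction: ${deduction:,.0f}")
# -> Potential tax deduction: $450,000
```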

Hopkins says such incentives are important because they help offset the premium sometimes charged for purchasing energy-efficient versions of products.

“Even a modest financial incentive can make the difference in a buying decision,” he says.

Lower operating costs, of course, are another reason facility executives choose energy-efficient products.

One such incentive taken out of the energy bill being considered by Congress is a national systems benefit charge. States and certain utilities have used systems benefit charges to fund incentive programs that encourage the use of energy-efficient building equipment. One version of the national charge would have allowed the federal government to dole out matching funds to states engaged in incentive programs.

The theory, Hopkins says, is that a state would miss the so-called free federal money unless that state had an active energy-efficiency incentive program.

Elliot of ACEEE agrees, saying incentives often make it easier to decide whether to retrofit facilities with energy-efficient equipment.

“The money is the lubricant that actually gets the projects done,” he says.

— Mike Lobash, executive editor


 

Power aplenty — for now

Energy experts may disagree about what the future of electric transmission should be, but one thing they do agree on is that there is enough power being generated, even if it can’t always get to its destination.

There is a national average of about 34 percent excess capacity in the system. And the North American Electric Reliability Council (NERC) forecasts that generation capacity margins will increase, reversing years in which declines were common. Much of the new power will come from independent power producers.

Capacity in the system was up 7.6 percent in 2002 from the previous year, according to Edison Electric Institute’s latest figures, and generation was up 2.1 percent. All of that kept pace with the slow growth of the economy.

“The capacity is there in the market,” says Gary Graham, vice president of energy advisory services with Jones Lang LaSalle.

“There are tight markets, like New York and California, but even there new plants were built to handle some of the load.”

The capacity is there — for now. Even though projections indicate that demand growth should fall slightly below supply growth for the coming years, there are a number of unknowns, such as how many 1960s- and ’70s-era coal plants will need to be replaced, and how many nuclear plants will need to be recommissioned or decommissioned, especially in the tight market of the Northeast.

But demand projections also suggest some new generation might be necessary toward the end of the decade, says Eliot Roseman, principal, ICF Consulting.

And according to Mike McGrath, executive director of retail energy services with Edison Electric Institute, Wall Street is beginning to look more kindly toward generation projects after a serious snub by investors following the California debacle.

— David Kozlowski, senior editor


 

Producing power near facilities improves reliability for the grid and for buildings

When facility executives move their organizations into a new building, they typically don’t worry about who is going to supply water, sewer service and electrical power.

At least for now.

“That may not be the case in the near future,” says John Jimison, executive director and general counsel of the United States Combined Heat and Power Association.

Jimison and other backers of on-site power systems say the technologies that allow organizations to produce their own power are revolutionizing the way facility executives shop for, purchase and use electricity. In addition to making more economic sense for some, producing power at or near a facility often alleviates some of the problems that contribute to blackouts like the one earlier this year.

“A fundamental advantage of distributed generation is that it’s not as vulnerable to the problems that have shut down grids,” Jimison says.

On-site power, also known as distributed generation, allows facility executives to produce power to run all or part of their facilities. A third term – cogeneration or combined heat and power – describes a facility that uses generators on site to produce both electrical power and heat. The latter is typically used for domestic hot water or in industrial applications.

Although the terms differ and there are nuances to each system, the premise is the same: A facility is using power produced very near to where it will be used instead of drawing it from the utility. That last point is the one with ramifications that can reduce the chances of future blackouts.

Although the final cause of the Aug. 14 blackout that hit the Northeast, Midwest and Canada is still being investigated, early reports indicate that an overloaded transmission system — the grid — contributed to the outage.

Any power produced on site reduces the amount of electricity that must be transmitted through high-voltage lines. That, in turn, reduces the chances that the transmission system will fail because it’s being overtaxed.

Leo LeBlanc, regional development director with DG Energy Solutions and president of the Electrical Generating Systems Association (EGSA), an on-site power industry trade group, says grid reliability might be boosted significantly if facilities in any one area of the grid combine to produce 10 megawatts of power.

Typically, distributed generation systems range from 800 kilowatts for commercial building applications to several megawatts for industrial applications.
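A quick sketch shows what it might take to reach LeBlanc’s 10-megawatt threshold with systems in the size ranges the article mentions; the particular mix of installations below is assumed for illustration only.

```python
# How many typical on-site systems might it take to aggregate 10 MW in one
# area of the grid? Sizes are drawn from the ranges cited in the article
# (800 kW commercial, several MW industrial); the mix is illustrative.

import math

TARGET_MW = 10.0
COMMERCIAL_KW = 800           # typical commercial-building system

commercial_only = math.ceil(TARGET_MW * 1000 / COMMERCIAL_KW)
print(f"Commercial systems alone: {commercial_only} facilities")
# -> Commercial systems alone: 13 facilities

# With two assumed 3 MW industrial installations in the mix:
industrial_mw = 2 * 3.0
remaining_kw = (TARGET_MW - industrial_mw) * 1000
print(f"Plus 2 industrial plants: {math.ceil(remaining_kw / COMMERCIAL_KW)} commercial facilities")
# -> Plus 2 industrial plants: 5 commercial facilities
```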

While on-site power systems benefit the grid and users connected to it, that alone is seldom a motivating factor for individual facilities to install power systems. Most do it because they can save money by producing power on site rather than buying it through the local utility.

There are some roadblocks, however, for facility executives pursuing on-site power options. For one, the price structure utilities use can often make a project cost-prohibitive. Facilities that use on-site power but rely on a utility for backup power pay what are known as standby charges. Those charges are often high enough to wipe out the economic benefit of on-site power.

Still, in areas with high energy prices, on-site power is often justified. Jimison of USCHPA says industry statistics indicate that building an on-site power system costs about $1,200 per kilowatt of capacity. While a kilowatt of capacity costs around $800 when a utility builds a generating station, the overall cost rises to about $2,000 per kilowatt by the time the utility adds the required reserve capacity and makes the necessary changes to the transmission and distribution system.
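The comparison Jimison draws can be put in simple per-kilowatt terms. The sketch below restates the figures cited above for a hypothetical 1,000 kW (1 MW) load; the load size is an assumption for illustration.

```python
# Comparing installed cost per kilowatt of capacity, using the figures
# cited in the article, for a hypothetical 1,000 kW (1 MW) load.

ONSITE_COST_PER_KW = 1_200        # on-site power system
UTILITY_PLANT_PER_KW = 800        # utility generating station alone
UTILITY_ALL_IN_PER_KW = 2_000     # plant plus reserve capacity and T&D upgrades

capacity_kw = 1_000               # hypothetical facility load

for label, cost in [("On-site system", ONSITE_COST_PER_KW),
                    ("Utility plant only", UTILITY_PLANT_PER_KW),
                    ("Utility all-in", UTILITY_ALL_IN_PER_KW)]:
    print(f"{label}: ${cost * capacity_kw:,.0f}")
# -> On-site system: $1,200,000
# -> Utility plant only: $800,000
# -> Utility all-in: $2,000,000
```

On those numbers, the on-site option costs more than the bare utility plant but considerably less than the utility’s all-in figure, which is the gap Jimison points to.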

The money needed to improve the transmission system and expand generation facilities is what’s bolstering the prospects for on-site power.

Steve Stoyanac, president-elect of EGSA and vice president of OEM sales for Silex, says the time and money it takes to site transmission lines and to build generation plants to meet growing energy demands is making on-site power more attractive.

“Distributed generation is going to continue to grow,” he says. “The increased demand for electricity is not going away, and the capital for utilities to build generation, transmission and distribution systems is not there.”

— Mike Lobash, executive editor



