Learning from the Cascading Power Failures that Brought Down the Power Grid in India

August 25, 2012 · 14 comments


Government studies show that cascading blackouts, like the one San Diego experienced in 2011, are still possible due to fundamental flaws in the nation’s power grid.

By Deb Severson

States in red were affected by the blackout. Graphic courtesy of Wikipedia

On Tuesday, July 31, 2012, India’s interconnected power grids failed, leaving 600 million people — half the population — without electricity. Cascading failures occur in systems of interconnected parts. They usually begin when one part of the system fails; nearby parts must then compensate for the failed component, which can overload them and cause them to fail in turn, triggering a vicious cycle as successive parts go down.

When Federal Energy Regulatory Commission (FERC) chairman Jon Wellinghoff was asked if power problems in the Northeast could spread all the way to California, he told ABC News, “It’s very, very unlikely that ultimately would happen.” Part of why the risk is low, the ABC story conveyed, is that our nation’s grid is divided in the middle. Unfortunately, this division still leaves hundreds of millions at risk for cascading failures, and increased investment in transmission is NOT the answer.

A comprehensive, three-year DoD and Federal Emergency Management Agency study, Brittle Power: Energy Strategy for National Security, concluded that relying on massive, complex, and interconnected infrastructures to transmit and deliver power from centralized generation creates unavoidable (and costly) vulnerabilities. Cascading failures in “brittle” interconnected electrical systems can be induced by severe weather, human error, sabotage, or even the interference of small animals or vegetation. Conversely, resilient, decentralized systems are less likely to collapse in the face of natural or manmade disruptions and will limit damage when disruptions occur. During India’s Black Tuesday, many of India’s corporations and industrial groups were spared much of the blackout disruption because they generate their own power, an arrangement often referred to as “distributed (local) generation.”

A 2007 US Department of Energy (DOE) study, The Potential Benefits of Distributed Generation and Rate-Related Issues that May Impede Their Expansion, provides compelling evidence of how a distributed network of smaller sources was able to adjust to problems without causing major consequences. Decentralized (local) systems also reduced the magnitude and duration of failures. The DoD study on “brittle power” further concluded that centralized energy facilities create tempting targets for terrorists because the terrorists would need to attack only a few, poorly guarded facilities to cause large, catastrophic power outages. In a complementary study, Energy Security in a Dangerous World, the International Energy Agency reaches the same conclusion.

Examples from nearly every area of critical infrastructure defined by the Department of Homeland Security confirm that distributed (local) generation is a viable means for reducing vulnerability to terrorism and improving the resilience of electrical infrastructure. This is based on actual cases where local generation continued to provide power to critical facilities during times of large-scale power disruptions and outages, and is documented in the 2007 DOE study.

Richard Clarke, a former national security adviser and ABC News consultant, also conveyed, during ABC’s Nightly News coverage of Black Tuesday, that today’s biggest grid threat is cyber-attack. “The U.S. power grid is extremely vulnerable to cyber-attack,” Clarke said. “The government is aware of that. Recently the government held a White House-level cyber exercise in which the scenario was a cyberterrorist attack that took down the power grid.”

A July 17, 2012 webinar, New Approaches to Grid Security, featured three of the electric industry’s most knowledgeable and influential leaders in grid reliability — Gerry Cauley, president and CEO of the North American Electric Reliability Corp. (NERC); Terry Jarrett, chair of the National Association of Regulatory Utility Commissioners’ committee on critical infrastructure; and Terry Boston, president and CEO of PJM Interconnection, the regional transmission organization serving all or parts of Delaware, Illinois, Indiana, Kentucky, Maryland, Michigan, New Jersey, North Carolina, Ohio, Pennsylvania, Tennessee, Virginia, West Virginia and the District of Columbia.

Each of these experts stressed the importance of resilience – the ability of the electrical infrastructure to respond quickly to cyber, terrorist, and weather-induced attacks. Cauley also said the interconnected grid throughout the US and Canada creates challenges in protecting the grid from cyber-attack due to issues related to information sharing and analysis.

Dr. Benjamin Sovacool, a highly regarded researcher and industry advisor with specialized expertise in energy policy and energy security, particularly the costs and benefits of electric power systems, provided testimony regarding a 500 kV transmission line in Virginia. In his testimony, Dr. Sovacool demonstrated how long-distance transmission reduces the stability of the electric grid. He concluded, “The proposed project should be rejected on grounds that it fails to provide substantial reliability and security as defined by (the utility) Dominion itself. Instead, the proposed line increases the vulnerability of Virginia and the National Capital Area to critical national security threats.”

Dominion — one of the nation’s largest producers and transporters of energy, which serves electrical needs for customers in Virginia, West Virginia, Ohio and North Carolina — received permission to construct this project connecting Meadow Brook to Loudoun. The $243 million, 65-mile overhead transmission line was placed in service in mid-2011. On June 29, 2012, more than a million customers in DC, Virginia and Maryland suffered mass power outages due to “derecho” storms. The area’s electrical grid was slow to recover.

The August 14, 2003 cascading blackout of the East Coast was, at the time, the second most widespread blackout in history, affecting some 50 million people in eight U.S. states and Ontario, Canada, with an estimated cost of $6 billion. The blackout was triggered by a power surge that affected the transmission grid. The tools in place to analyze the “what ifs” didn’t do their job, and the appropriate people didn’t realize the problem was theirs until their own lights went out. By then, an uncontrollable cascading failure was under way and it was too late for human intervention to save the grid.

After running thousands of simulations, researchers at the University at Albany and the National Renewable Energy Laboratory determined that had distributed solar photovoltaic (PV) facilities been operating on August 14, the 2003 blackout most likely would have been avoided. The researchers noted that the conditions driving peak demand that day—hot temperatures and greater air-conditioning loads—are also ideal for solar PV generation. The ability to offset spikes in demand through distributed generation and load management – the ability of utilities to shed electric demand for a short, manageable time — remains increasingly relevant today for ensuring reliability.
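In rough terms, the peak-shaving effect the researchers describe can be sketched with a few invented numbers (this is an illustration, not data from the study): on a hot afternoon, distributed PV output rises with the same sun that drives the air-conditioning load, so the net load the grid must serve peaks well below raw demand.

```python
# Illustrative numbers only: an afternoon demand peak driven by air
# conditioning, and a distributed-PV profile that peaks at the same time
# because the same sun drives both. Net load is what the grid must serve.

demand = [60, 70, 90, 100, 95, 75]   # MW by hour, midday (made up)
pv     = [ 5, 15, 25,  30, 20, 10]   # MW of distributed solar (made up)

net = [d - p for d, p in zip(demand, pv)]
print(max(demand), "->", max(net))   # peak falls from 100 MW to 75 MW
```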

In 2005, Congress put in place financial incentives to improve existing transmission to prevent future blackouts. Since then, most of the funding has been diverted to new transmission and, in 2011, the US experienced another significant blackout. In reviewing the cause of the September 8, 2011 blackout (ed. note: in San Diego and beyond), which affected up to 5 million people in California, Arizona and Mexico, NERC and FERC found “common underlying causes” with the 2003 East Coast blackout. The power loss should have been limited to the Yuma, Arizona, area but, according to the power company, Arizona Public Service, “…an employee was just switching out a piece of equipment that was problematic.”

This clearly indicates that transmission incentives have not produced the corresponding increase in reliability that Congress intended in order to prevent future blackouts. Despite this, more and more interconnected transmission is being built: $170 billion in new transmission is planned nationally between now and 2020. In Wisconsin alone, eight new high-voltage lines are planned that will greatly increase interconnection from Maine to the Dakotas.

CapX2020, a high-voltage transmission line that is currently being contested, is planned to cross the Mississippi River at Alma, Wisconsin and end near La Crosse. CapX2020 was approved by the Wisconsin Public Service Commission (PSC) on May 30, 2012 despite testimony that it would create grid instability and congestion if a line to the east did not follow.

The next segment of this line, known as Badger Coulee, would connect La Crosse to Madison. At Madison, a substation would connect to grids to the south and east of Wisconsin. Unfortunately, this type of transmission expansion and “fix” is planned all over the country, making further interconnection of the nation’s grid imminent and immense.

According to Dr. Sovacool, “If our experience with nearly 100 years of centralized generation and transmission has taught us anything, it is (that) if you build it, it will be filled. Building longer and larger transmission systems to solve the system reliability problems inherent in transmission congestion is much like addressing the problems of addiction with more drugs.”

While the utilities behind CapX2020 and the Wisconsin PSC contend the line is needed for “regional reliability,” those leading the court challenge against the line believe this is a manipulation of the term. The challengers contend it is economically driven and defined by the ability to more reliably move electricity around to sell to other markets.

It is time for utilities to embrace another approach and fully acknowledge inherent issues in monitoring and controlling an interconnected grid. As Dr. Sovacool warned in his 2007 testimony, “Long distance transmission is not reliable. Given the complexity of the modern electrical system, and what we know about the role reactive power has played in recent outages, relying on longer transmission lines to wheel greater amounts of real power is perhaps the worst strategy for ensuring system reliability. Large and long transmission lines lose reactive power the larger and longer they become, requiring the system to compensate in ways that increase the likelihood of systems collapse.”

Dr. Sovacool also shared how catastrophic failures are most often the result of human error, inadequate diagnostics, and/or voltage instability due to insufficient reactive power — the power that keeps voltage and current in phase. In electric transmission, if a fault occurs and takes a line out, the change in current flow is compensated by other transmission lines. If they lack the capacity to carry the excess current, overload protection in those lines will also trip, causing a cascading failure as the excess current is switched onto neighboring circuits running at or near their capacity.
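The overload-and-trip dynamic described above can be sketched in a few lines of Python. This is a deliberately naive illustration with invented line names, flows, and capacities, not a power-flow model: when a line trips, its flow is split among the survivors, and any survivor pushed past its limit trips in turn.

```python
# Toy illustration of a cascading failure (NOT a power-flow model; the
# line names, flows, and capacities are invented). When a line trips,
# its flow is shared equally among the surviving lines; any survivor
# pushed past its capacity trips too, and the process repeats.

def cascade(flows, capacities, first_failure):
    """Return the set of lines tripped after `first_failure` occurs."""
    tripped = {first_failure}
    changed = True
    while changed:
        changed = False
        survivors = [line for line in flows if line not in tripped]
        if not survivors:
            break
        # Flow orphaned by every tripped line, shared equally.
        share = sum(flows[line] for line in tripped) / len(survivors)
        for line in survivors:
            if flows[line] + share > capacities[line]:
                tripped.add(line)
                changed = True
    return tripped

flows = {"A": 80, "B": 70, "C": 60, "D": 30}          # MW carried (made up)
capacities = {"A": 100, "B": 90, "C": 100, "D": 100}  # MW limits (made up)

# Losing line A overloads B; losing B overloads C and D: total collapse.
print(sorted(cascade(flows, capacities, "A")))   # ['A', 'B', 'C', 'D']

# With ample headroom on every line, the same failure stays contained.
print(sorted(cascade(flows, {l: 200 for l in flows}, "A")))   # ['A']
```

The point of the toy is the same one the article makes: whether a single failure stays local or takes down the whole system depends entirely on how much spare capacity neighboring lines carry.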

The role of a system operator is to monitor power supply and demand to keep them balanced and within safe operational limits. If a system failure occurs, operators must get power from generators or other regions, or reduce their load, to bring the system into balance and, hopefully, avoid a cascading system failure. Computer systems, with backups and alarms, are used to monitor the system. If the computer systems fail, system operators monitor the grid manually. Local operators are coordinated by regional centers, but the operating principle is the same for the larger network. Applying lessons learned in the 2003 and 2011 US blackouts and India’s 2012 “Black Tuesday” — the largest blackout in human history — is highly relevant and insightful.
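The operator's balancing act amounts to a simple priority rule, which can be sketched as follows (the function and its numbers are illustrative, not actual operating procedure): cover any shortfall first from imports or reserves, and shed load only as a last resort.

```python
# Schematic sketch of the balancing decision (illustrative numbers only,
# not real operating practice): compare supply to demand, cover a
# shortfall first from neighboring regions, then shed load if needed.

def balance(demand_mw, generation_mw, import_limit_mw):
    """Return the imports and load shedding (in MW) needed to rebalance."""
    shortfall = demand_mw - generation_mw
    if shortfall <= 0:
        return {"imports": 0, "shed": 0}           # supply covers demand
    imports = min(shortfall, import_limit_mw)      # buy from neighbors first
    shed = shortfall - imports                     # drop load as last resort
    return {"imports": imports, "shed": shed}

print(balance(1000, 950, 100))   # {'imports': 50, 'shed': 0}
print(balance(1000, 800, 100))   # {'imports': 100, 'shed': 100}
```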

Human and equipment error will continue to happen, and interconnected systems are vulnerable to cascading failures. Perhaps, in the US, we are not encouraging investment in the right kind of improvements to let us avoid similar distress. As previously conveyed, many of India’s major corporations and industrial groups were spared much of the blackout disruption because they generate their own power. Perhaps it is time to invest in an efficient, diverse, dispersed electricity system, where major failures—whether by accident or malice—become improbable by design rather than inevitable.

The number of citizens, municipalities, states and associations asking that viable, cost-effective and sustainable alternatives to high-voltage transmission be evaluated is growing. Unfortunately, so is the number of transmission lines being built and planned. Due to this pressure, FERC has agreed to evaluate whether its own incentives are leading to over-application for transmission projects and the profits that follow. Citizens are also suggesting that a conflict of interest may exist in the planning groups of the Midwest regional grid operator (MISO), which has for-profit transmission builders at the helm of its Planning Advisory Committee and Planning Subcommittee. The dauntingly high operating margins of transmission-only companies provide further insight into why some feel it is greed, not need, that is driving this push for transmission.

In December 2011, Financial News Network reported that ITC Holdings, the largest for-profit transmission-only company, has the highest operating margins in the electric industry (51.7%). ATC, which wants to build the high-voltage line connecting La Crosse to Madison, has an even higher margin of 54.9% when dividing 2010 operating income by revenue. In Wisconsin, a legal battle has ensued between Xcel and ATC regarding ownership of the Badger Coulee line and the subsequent profits.
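For readers wondering where a figure like 54.9% comes from: an operating margin is simply operating income divided by revenue. The inputs below are round illustrative numbers chosen to reproduce the quoted figure, not ATC's actual 2010 financials.

```python
# Operating margin = operating income / revenue. The inputs are
# illustrative round numbers that reproduce the 54.9% figure quoted in
# the article; they are not taken from any company's real filings.

def operating_margin(operating_income, revenue):
    return operating_income / revenue

print(f"{operating_margin(54.9, 100.0):.1%}")   # -> 54.9%
```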

The public is also not accepting ATC’s claim that looking into non-transmission alternatives is “beyond the scope of the (transmission) project.” Instead, ratepayers are asking the Wisconsin PSC for a cost/benefit analysis of feasible alternatives, including how they impact grid reliability and the sustainability of our natural resources. Those asking for alternatives point to the job creation, savings, and environmental benefits of energy efficiency and home-grown energy over transmission. The commitment and tenacity to address these issues is evidenced by groups including Citizens Energy Task Force (CETF)/No CapX2020, Piedmont Environmental Council and Stop PATH WV, which are willing to lead the charge all the way to the courts.

Incidents such as the blackout in India, the terrorist attacks of September 11, 2001, the Northeast Blackout of August 2003 and Hurricane Katrina are wake-up calls. They remind us how dependent we are on electricity and how fragile a brittle grid can be. We are at a fork in the road.

Before we invest hundreds of billions of dollars in interconnected transmission, we must heed the experts and activists who are working to replace the outdated paradigms of infrastructure protection and infrastructure centralization with infrastructure resilience.

The time to change our course is now, before what happened in India happens to half the American public.

Deb Severson is an activist in Wisconsin working with individuals and organizations trying to facilitate greater focus on distributed generation and energy efficiency. Inspired by the events in India and having read John Lawrence’s recent article, Power to the People or the Corporations, she submitted this article.

John Lawrence August 25, 2012 at 1:27 pm

I’m very happy to see Deb’s article published since she emailed me in response to my prior article entitled, “Power to the People or Power to the Corporations?”, and attached the above article which supports the main thrust of my writing. So a friendly contact has been established in an area which is of great importance to the nation’s future.

A smart grid infrastructure with distributed generation needs to be built in order to promote the stability and flexibility of electric power. As Deb points out, the present grid is vulnerable to systemic collapse from extreme weather events, terrorist activity or something as simple as a technician in Yuma changing out a piece of equipment.

Distributed renewable energy sources feeding into a smart grid can replace fossil fuel energy generation and reduce the amount of CO2 going into the atmosphere, which is heating the planet and driving the extreme weather that causes billions of dollars of destruction every year. In addition, undergrounding wires, as Germany has done, is a rational response to global warming, which has caused the downing of power lines and consequent power outages that are becoming increasingly common.

Abraham Lincoln’s government built the transcontinental railroad in a public/private partnership. Ditto for Dwight Eisenhower and the Interstate highway system. The same thing needs to be done on a national scale with smart electrical grid infrastructure. Private corporations alone will not do this. Government has to be involved.

I’m looking forward to a network of knowledgeable activists working along the same lines on this subject.

Chris August 25, 2012 at 8:10 pm

These transmission lines are obviously needed; without them, overloads of existing lines will probably occur. What do you expect the transmission companies to do, just let the grid collapse because they hadn’t upgraded it to handle the extra demand? You also say that major failures are most often due to human error; this problem can be addressed via smart grid technology, which automates line switching and loading and would increase reliability substantially. Distributed generation will lower demand on average, and may defer transmission projects or, in rare cases, prevent transmission augmentation. During large-scale power failures, distributed generation will not work: the units shut down due to anti-islanding modules, which are required for the safety of utility workers. Also, the derecho storms that hit the region of the new transmission line significantly affected distribution equipment, not the transmission grid.

John Lawrence August 26, 2012 at 9:24 am

Chris, are you affiliated with the power company? It seems to be your job to deflate any article that goes against their interests.

Chris August 27, 2012 at 5:49 am

I’m not actually, I just think I’ve stated some valid points that you haven’t mentioned in your articles. Your articles seem to go completely against the grid; there should be a balance between centralised and decentralised generation. Distributed generation is a partial solution to our energy problems and in some cases may cause bigger problems than the ones it intends to solve. In some places power prices are also rising because of net-metering payback as well as infrastructure modifications due to distributed generation.

Keryn August 26, 2012 at 5:40 pm

Great article, Deb! I’m glad to see it getting more press!

I’m not sure I like the comparison to the interstate highway system. That’s an analogy the for-profit transmission owners have been using for several years. A larger “national” grid will be funded by consumers (compared to taxpayers funding the highway system), but unlike the publicly-owned highway system, the transmission line (“highway”) ends up being owned by a for-profit corporation. Tolls from our highways aren’t going into some fat-cat CEO’s pockets, and highways are not owned by for-profit corporations. Transmission itself is now a profit center for utilities, and will enable more long-distance energy trading. A “national” grid that enables more centralized generation and unlimited transfer of electrons from coast to coast is wasteful and unneeded. We’re moving away from this horse-and-buggy technology, but the utilities refuse to adapt. They are digging their hands even deeper into our pockets to finance some $300B of new transmission, on which they can earn double-digit returns for the next 70 years, and to promote the development of renewables at utility scale in order to keep us all dependent on them as long as possible.

Downburst wind from the “derecho” also toppled one 500kV transmission tower in the “region” of the new transmission line. That one tower failure caused the adjacent towers to also come down (the “zipper effect”). Steel lattice transmission towers have a design defect that can cause them to fail in the second and third tiers from the bottom in a downburst situation. This design defect has been known to engineers for more than 16 years, but no fix has been made. Transmission is only as “reliable” as any system that isn’t properly maintained and upgraded. Building and interconnecting new transmission lines does nothing to make existing parts of the grid more reliable. Also, the derecho caused massive failure of the lower-voltage transmission system in West Virginia. It wasn’t simply a failure of the distribution system.

Chris August 27, 2012 at 6:14 pm

Reliability depends on how well utilities plan their networks; where I live we’ve had one outage in five years and our rates are average. How do transmission lines allow companies to earn double digits? Only when state interconnectors are made; state economies trade electricity and also use the interconnectors as backup supply. The electrical system in the USA sounds very bad to me! And you obviously have little knowledge of electrical systems; interconnectors saved our state from mass power outages during record demand in a heat wave a few years back. New lines where I live are only built for reliability purposes; actually, the only new lines built in the last 5-10 years were to connect new wind/solar.

Steve Kramer August 27, 2012 at 8:09 am

You’re right that the centralized grid causes a number of problems and you’re right that the government needs to be involved in the solution. What you don’t address (or perhaps acknowledge) is that government intervention is responsible, in part, for getting us here in the first place.

The government pretty much mandated the creation of ISOs and RTOs in the East, which contributed to reduced costs of getting power across long distances, but also contributed to more centralization. There’s less of that in the West, but systems are still interconnected.

State governments’ mandates to incorporate more renewable energy are also partly responsible for the need for more transmission. Wind energy is currently the most popular (and viable) type of renewable energy, but the wind resources are usually located far from the places where the energy will be used.

And then, there are states’ rights. The federal government tried to take the larger view through the Energy Policy Act of 2005 but was shot down in court.

So yeah, the government needs to be involved. But it’s going to take a lot more than turning to Uncle Sam and saying, “You do it.” It’s going to take a major change in legislation to (let’s be real, here) take power (no pun intended) away from the states and give it to Washington, and I don’t think Congress has the political will to make that happen.

John Lawrence August 27, 2012 at 11:36 am

This from “Community Power: Decentralized Renewable Energy in California,” by Al Weinrub, Dec. 2010. I suggest you read the whole paper.

Decentralized generation means that local residences, businesses, and communities become electric power producers. Homeowners and small businesses produce the power they need for their own consumption. Rather than paying ever-increasing energy bills to finance remote transmission lines and central-station power, consumers become the direct beneficiaries of the power they produce. In five to fifteen years, through savings on energy bills, they pay off the cost of a solar PV system that will thereafter provide very low cost power for a total of 25 to 40 years (the expected lifetime of a solar PV system). The choice is pretty clear: pay for the ongoing cost of remote central-station renewable power or pocket the savings of locally-generated renewable power.

Businesses with large rooftops or parking lots can become small power companies that feed electricity into the grid. Their profits on electricity sales are a steady source of revenue that can supplement their business. Community cooperatives can pool the rooftop area of their neighborhoods to form, for example, an East Oakland Power Company, which could use the revenues generated from selling electricity to provide dividends to the members of the cooperative, or to fund local development projects, or to bankroll new clean-energy businesses in the community.

John August 28, 2012 at 11:43 pm

Those are a few benefits of distributed generation, but going in this direction over the long term may cause more problems than it intends to solve. For example, the more DG is installed, the more upgrades the low-voltage distribution grid requires (distribution transformers, for instance). Most solar inverters also rely on the grid to regulate voltage and frequency, which is another reason the upgrades are required. These upgrades are predicted to cost Europe billions of dollars in the future, on top of the upgrades required for increasing demand and the net-metering returns. This will cause power prices to rise significantly for the many who can’t afford solar systems, as well as at times when solar isn’t working, such as night or stormy days. Nearly all countries have their winter peak demand at times when solar is working at 0-10%. This has caused significant problems in Germany, whose grid has become very unstable trying to rely on DG. And for what? To prevent a few hours a year of outages? Or to save some money while causing others to suffer?

John Lawrence August 30, 2012 at 8:48 pm

Power outages in the US are a significant problem. Just ask all the people who are suffering now on the Gulf Coast from Hurricane Isaac. In Germany power outages are unheard of due to undergrounding of utilities.

John August 30, 2012 at 8:57 pm

Maybe we should be investing in undergrounding power lines instead. Distributed generation will only assist in reducing peak demand; it will not prevent power outages, as the generation units are required to shut down during an outage. Having any form of distributed generation will be useless to all those affected by the hurricane. Also, a hurricane will certainly cause outages in most countries; it isn’t just the USA.

Neil August 30, 2012 at 9:13 pm

I’ve had experience in the electrical industry. A high penetration of DG on a local low-voltage distribution grid will cause problems, so upgrades to the local grid are required, which can be very costly. The expense of distributed generation shifts the high costs of maintaining the HV transmission grid to maintaining the low-voltage network. Traditionally the LV network required very little investment or upgrades. But keep in mind that at times when most distributed generation sources aren’t working (cloudy days, night, windless days) the transmission grid will need to cater for that demand, so investments will still be needed within the transmission network. A household will draw a majority of its energy from the grid, especially during winter. DG is a good clean source of energy and a short-term solution, but high growth rates will certainly cause more problems in the future and may be more costly than maintaining our current grid.

John Lawrence August 31, 2012 at 3:32 pm

A new smart grid is part of the infrastructure so badly needed in the US. This will allow electricity generated by a multiplicity of means and from various areas geographically to work together harmoniously.

Comments on this entry are closed.
