
The archaic nature of ‘baseload’ power

The old grid, beholden to massive, polluting baseload power plants, is being replaced by a nimbler, high-tech 21st century system oriented toward variable renewable energy.

There is no shortage of skeptics out there, even some among environmentalists and clean energy advocates, who are unconvinced that renewable energy can ever be the dominant – perhaps even sole – source of electricity generation.

The reasons for this skepticism vary. Some, for example, argue that the land needs for sufficient generation of wind and solar power are too great. This turns out to be an incredibly lame argument, but that’s the subject of a different article.

More frequent are the arguments that ‘baseload’ power – large power plants that tend to run 24/7 – is necessary to ensure reliable electricity, and that the variable nature of some renewables – solar and wind – means they can’t provide that reliability.

Then there’s the notion that the electrical grid can only accommodate a certain level of renewables, around 30-40%. Above that, the grid pretty much breaks down. These arguments are actually related, and they are solved in the same way.

More recently, an argument has been circulating among energy nerds – especially pro-nuclear energy nerds – that the integration of renewables into the grid reaches a peak for economic reasons: that renewables are limited by their cost. Not by their high cost, but by their low cost, or as one writer put it: “solar and wind eat their own lunch.”

But that merely shows that not only must the technical nature of the grid change – and it can – but so must its economic nature – and it can too.

The good old days … too bad they were killing us

The electric grid in use today was mostly designed in the 20th century. Large baseload nuclear and fossil fuel plants were built, usually far from the largest electricity consumers (cities and large industry), with their power carried over huge (and not particularly efficient) transmission lines.

Those baseload plants had, and have, high capacity factors and run pretty much all the time, although nuclear reactors have to be shut for refueling for a few weeks every 12-18 months. Utilities try to arrange those shutdowns to occur during periods of low demand.

During peak power needs – hot summer days in most of the country – smaller gas plants and in the old days even oil plants would be fired up to supplement the baseload levels. And it all worked pretty well given the technology available at the time.

But, as we all now know all too clearly, that system had a price – a price not reflected in the cost of electricity. That system was and is killing us. Those large nuclear and fossil fuel plants are spewing out carbon dioxide and radioactivity and creating large quantities of dirty and deadly waste products that society doesn’t know what to do with.

Had the cost of those effects – which do have a price, a steep one – been incorporated into the price we and our parents paid for electricity, we probably would have moved to a clean energy system much faster. As it is, we no longer have much of a choice.

Variable power sources more reliable, resilient than ‘baseload’

Fortunately, as is being proven daily in Europe, a grid based on smaller, distributed variable power sources can be just as reliable as, and even more resilient and secure than, a grid reliant on baseload power.

Variable does not mean unreliable: as long as it can be projected sufficiently far in advance what the wind will do – and thus how much wind power will be available and where – and the same for the sun, then a variable grid can be highly reliable. And those projections can be, and in fact are, made reliably.

The ability to integrate a moderately large amount (say 30-35% or so) of renewables into a baseload-dominated grid is a given. It is happening daily. Not so much in the US, although even here states like Iowa are getting more than 20% of their power from renewables, and the percentage of renewables is set to rise rapidly – both on its own for sound economic reasons and due to the encouragement renewables receive in the Clean Power Plan.

But at some point above 35-40% renewables or so, a conflict arises. If more renewables are to be brought into the grid, the large baseload plants have to begin closing – even if they theoretically remain useful.

That’s because the kind of grid that works for the variable renewables – a fast, nimble grid where power from different sources scattered in different locations can be ramped up and down quickly depending on where it is being generated and where it is needed – doesn’t work well for baseload plants, especially nuclear reactors, which cannot ramp up and down quickly.

Those kinds of plants were designed to run 24/7 and that’s what they do – they’re not designed to fit in with a grid that doesn’t want them to run 24/7, that instead wants them to run when their power is needed. And the higher the penetration of renewables, the less the baseload plants’ power is needed.

The new kid on the block: energy storage

Add in energy storage, the new kid on the block, and polluting power plants running 24/7 become an anachronism. When the variable sources aren’t generating what is needed, just release the stored, and cheaper, electricity they generated earlier during periods of low demand.
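The logic of that storage-plus-renewables dispatch can be sketched in a few lines of code. This is purely illustrative – the hourly numbers and the battery size below are hypothetical, not drawn from any real grid – but it shows the principle: surpluses charge the battery, deficits draw it down, and no plant has to run 24/7.

```python
# Illustrative dispatch sketch (all numbers hypothetical): surplus
# renewable generation charges a battery; the battery discharges when
# renewables fall short, so no 24/7 baseload plant is needed.

def dispatch(demand, renewables, capacity):
    """Return (unserved_energy, final_charge) after one pass of
    charge-when-surplus / discharge-when-deficit dispatch."""
    charge = 0.0
    unserved = 0.0
    for d, r in zip(demand, renewables):
        if r >= d:
            # Surplus hour: store what the battery can hold, curtail the rest.
            charge = min(capacity, charge + (r - d))
        else:
            # Deficit hour: draw down storage first; anything left is unserved.
            draw = min(charge, d - r)
            charge -= draw
            unserved += d - r - draw
    return unserved, charge

# A toy day: windy/sunny midday, calm evening (MWh per hour, hypothetical).
demand     = [50, 50, 60, 70, 70, 60]
renewables = [80, 90, 70, 40, 30, 50]
unserved, _ = dispatch(demand, renewables, capacity=100)
print(unserved)  # 0.0 -- storage covers every deficit in this toy example
```

With enough storage, every deficit hour is covered by earlier surplus; shrink the battery and some demand goes unserved, which is exactly the gap a peaker (or, in the old model, a baseload plant) would have filled.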

The polluting baseload plants then make no sense at all. Why throw carbon dioxide into the air and tritium into the water, and generate lethal radioactive waste, just to keep dirty and usually more expensive power plants operating for those few hours in the week when they might be useful? With storage, they’re not needed, or even particularly useful, at all.

What’s stopping us, or slowing us anyway, is not the technology for the new grid – that exists. It’s the rules. And the political will to transform the grid to accommodate the transformative technologies that have been developed over the past two decades.

If we’re going to move into the 21st century, and with nearly 15% of the century already gone we’re a good ways into it, then we’d better get moving quickly. The old rules need to be changed; David Roberts, formerly of Grist, has compiled a useful list of some of those needed changes.

The problem – the powerful incumbents holding onto their profits

One problem, obviously, is that utilities don’t want to close their old baseload power plants if they are still useful at generating electricity. They want to put off that retirement date as long as possible. As long as a plant’s operating and maintenance costs are not so high that it loses money, the longer it runs the more profit it returns. And utilities are about making money, not transforming the grid.

In the US, at least, we’re not at the point where profitable baseload power plants have to be forced closed for the greater good – renewables don’t yet make up enough of our power to require that step. But parts of Europe are quickly getting there, and we in the US will get there in many places faster than most people now think – surely within the next decade.

Germany is already showing that a grid with a high penetration of renewables can be reliable, and that forcing reactors to close can not only be publicly acceptable, it can attain wide public support.

The larger problem in Germany these days is not the amount of renewables in place, it’s that there is so much renewable generation that the grid needs to be strengthened to better distribute that electricity across the country and for export to nations like Poland and Austria – which badly want that cheap, clean power.

Public opinion polls suggest that in the US, a similarly high penetration of renewables will be most welcome, even if anti-nuclear sentiment is not at German levels.

The real problem with renewables – they are ‘too cheap’

Perhaps forcing reactors to close won’t be necessary; enough are already unprofitable, and more are likely to become so in the coming years, that perhaps they will simply shut down, be replaced by renewables, and it will all happen quietly and happily.

More likely, though, as nuclear utilities contemplate 80-year operating licenses and squeezing every last watt of power out of their reactors regardless of age or safety condition, this could become the nuclear issue of the next decade for the public, state regulators and policymakers alike: should existing reactors stay open while they’re still viable, or be forced aside to welcome larger amounts of cleaner, safer and usually cheaper renewables?

From our perspective, the answer is obviously yes, they should shut down to make way for the more modern system. But that’s an answer that will take a lot of preparation and groundwork beginning now, because the nuclear utilities will fight that hard.

That’s a somewhat different issue than the one that confronts us today, which is whether uneconomic reactors should stay open or move aside for renewables. The nuclear utilities want the ground rules changed to force ratepayers to keep those uneconomic reactors open regardless of their cost.

That’s an easy argument to make: of course the rules shouldn’t be changed to favor the higher-priced, dirtier power source. And it appears that argument is on the verge of victory in Illinois – the most nuclear state in the US. If that argument does end up carrying the day there, it can carry the day everywhere.

As for the notion that solar and wind are too cheap, that just shows the absurd nature of the economics of electricity and the failure to consider external costs – the environmental damage they cause and the full lifecycle costs of their existence – in the economic equation.

There is more to life than the dollar, though you wouldn’t know it by how many traditional markets work, and, in fact, we have reached the point that unless ‘more to life’ is adequately factored into prices, there may not be any life at all.

The concept being bandied about by these pro-nukers is that if there is ‘too much’ solar and wind in the system, its price will eventually become zero – essentially free. And at that price – or no price if you will – the system breaks down and there will be no more investment in solar and wind. Who would want to invest in it if you have to give it away?

The cheap power ‘problem’ can be solved – if you want to solve it

There are ways around the problem even under the existing system, from feed-in tariffs to Power Purchase Agreements. And the ‘problem’ itself still has at its foundation the baseload concept of electricity generation and distribution. Absent those baseload plants, which only inhibit renewable generation anyway, there cannot be ‘too much’ renewables in the system.

But including the real costs of nuclear and fossil fuel use would be the best step. Because once added in, those costs make that kind of generation too expensive to use no matter what the competition. And if the only choice is low-cost to zero-cost renewables, well, certainly consumers wouldn’t mind.
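The arithmetic behind internalizing those external costs is simple enough to show. Every figure in this sketch is hypothetical – it is not sourced pricing data – but it illustrates how adding a charge per tonne of CO2 to a fossil plant’s market price can flip which generator is “cheaper”:

```python
# Hedged illustration (every number here is hypothetical, not sourced):
# internalizing a carbon charge into a plant's per-MWh cost.

def full_cost(market_price, emissions_per_mwh, carbon_price):
    """Cost per MWh once external carbon costs are added in.

    market_price      -- $/MWh the generator charges today
    emissions_per_mwh -- tonnes of CO2 emitted per MWh
    carbon_price      -- $/tonne charged for those emissions
    """
    return market_price + emissions_per_mwh * carbon_price

# Hypothetical figures: coal at $40/MWh emitting 1.0 tCO2/MWh,
# wind at $50/MWh emitting nothing, carbon priced at $40/tCO2.
coal = full_cost(40.0, 1.0, 40.0)   # 40 + 1.0 * 40 = 80.0
wind = full_cost(50.0, 0.0, 40.0)   # 50 + 0.0 * 40 = 50.0
print(coal > wind)  # True -- the "cheap" fossil plant is no longer cheap
```

The same structure applies to any external cost one chooses to price in – waste handling, health damages, decommissioning – which is the article’s point: once those are counted, the competition is over.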

In the real world, rather than abstract economic modeling scenarios, electricity is a necessity and it will be provided. But in the real world, in the new world of the 21st century electricity grid, it may well be that electricity itself will not be as profitable to generators as it was in the 20th century.

Energy efficiency is reducing demand and that, despite a growing population and even with economic growth, is a trend that will continue and probably accelerate (Maryland, for example, has set a new policy of reducing demand by 2% every year). Renewables act to drive down electricity prices.
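A 2% annual reduction compounds rather than subtracting linearly; the 2% figure is the article’s, but this little calculation of where that trajectory leads is my own illustration:

```python
# Simple compounding arithmetic: demand remaining after N years of
# 2%-per-year reductions (the 2% is from the article; the rest is
# illustrative).

def demand_remaining(years, annual_cut=0.02):
    """Fraction of original demand left after `years` of annual cuts."""
    return (1 - annual_cut) ** years

print(round(demand_remaining(10), 3))  # 0.817 -- about an 18% cut in a decade
```

So a decade of Maryland’s policy would leave demand at roughly 82% of today’s level, a meaningful shrinkage of the market the old baseload plants were built to serve.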

Certainly the idea that individual utilities, or even a consortium dominated by a single utility (a la Vogtle or Summer), will ever again build mega-billion dollar power plants of any kind just in order to sell electricity is a relic of the 20th century playing out today as farce.

It won’t be playing out much longer. Utilities, like Virginia’s Dominion, that may think that obsolete model still applies, will regret it.

Not too cheap to meter – but too cheap to worry about

Electricity may never be free, or too cheap to meter, but it may well become one of life’s little bargains. Long distance in my lifetime has gone from an expensive, rarely used luxury; to an inexpensive, frequently dialed option; to a free add-on to both my landline and iPhone plans.

For my millennial kids, the concept of a ‘long-distance’ call is meaningless: they’ve never made one and never will. But they do still use phones, and all the services modern phone plans offer.

The costs of electricity are going to come down too – technology and renewables are already starting to see to that – but someone, whether it be the traditional utilities or someone smarter, is going to come along and figure out how to make money by providing electricity add-ons and services, even if the electricity itself is free or nearly so.

Totally free electricity may be too much to hope for – there is a grid to pay for and maintain, after all, and there will be for the foreseeable future. But the money to be made will be in the add-on services, not the basic electricity.

The solar rooftop people have pretty much already figured this out for their slice of the business – whether by lease or purchase, you pay primarily for the equipment, installation and maintenance, not so much for the electricity.

But since rooftop solar doesn’t work for everyone or everywhere, there is a market ready for something new and safe and clean that won’t destroy the planet we live on. I’d say that’s a pretty damn large market looking for the electricity equivalent of long distance in the iPhone era. With a market like that, someone is going to deliver, even if the electricity itself is little more than a low-cost add-on to other services people want.

That won’t happen tomorrow, of course. As Barry Cinnamon of The Energy Show podcast put it, “But this change in our energy sources will take many years, just as the complete transition from ‘horse and buggy’ transportation to gas-powered cars took 50 years.

“As with other large-scale technological changes, customer economics will force the current incumbent energy providers to change (unlikely), or go out of business (more likely). It’s a virtuous cycle as more customers are satisfied with renewable power generation, and more people are employed in these industries.”

Bye-bye nuclear – no place for you in this new power market

For nuclear power, though – even for ‘small modular reactors’ (which actually are not so small; most are much larger than early US commercial reactors like Big Rock Point and Yankee Rowe, and some are as large as Fukushima Daiichi Unit 1) – and for fossil fuels as well, the transformation means extinction.

By definition, SMRs are also baseload power plants; despite being smaller than today’s behemoth reactors, they are designed to run 24/7 and like their larger brethren, cannot power up and down quickly.

Before they even exist, they are obsolete. Their polluting ‘baseload’ means of providing their product (electricity) will be unneeded and functionally and economically irrelevant – unable to compete with those offering electricity as part of a set of services, rather than as an end in itself.

Michael Mariotte is Executive Director at Nuclear Information and Resource Service (NIRS).

This article originally ran on Green World, a news and information service of NIRS.

Fortunately, as is being proven daily in Europe, a grid based on smaller, distributed variable power sources can be just as reliable, and even more resilient and secure, than a grid reliant on baseload power.

Variable does not mean unreliable: as long as it can be reliably projected with sufficient advance time what the wind will do and thus how much wind power will be available where, and the same for the sun, then a variable grid can be highly reliable. And those can be and are, in fact, reliably projected.

The ability to integrate a moderately large amount (say 30-35% or so) of renewables into a baseload-dominated grid is a given. It is happening daily. Not so much in the US, although even here states like Iowa are getting more than 20% of their power from renewables, and the percentage of renewables is set to rise rapidly-both on their own for sound economic reasons and due to encouragement of them in the Clean Power Plan.

But at some point above 35-40% renewables or so, a conflict arises. If more renewables are to be brought into the grid, the large baseload plants have to begin closing – even if they theoretically remain useful.

That’s because the kind of grid that works for the variable renewables – a fast, nimble grid where power from different sources scattered in different locations can be ramped up and down quickly depending on where it is being generated and where it is needed – doesn’t work well for baseload plants, especially nuclear reactors, which cannot ramp up and down quickly.

Those kinds of plants were designed to run 24/7 and that’s what they do – they’re not designed to fit in with a grid that doesn’t want them to run 24/7, that instead wants them to run when their power is needed. And the higher the penetration of renewables, the less the baseload plants’ power is needed.

The new kid on the block: energy storage

Add in energy storage, the new kid on the block, and polluting power plants running 24/7 become an anachronism. When the variable sources aren’t generating what is needed, just release the stored, and cheaper, electricity they generated earlier during periods of low demand.

The polluting baseload plants then make no sense at all. Why throw carbon dioxide into the air and tritium into the water and generate lethal radioactive waste just to keep dirty and usually more expensive power plants operating just for those few hours in the week when they might be useful? With storage, they’re not needed, or even particularly useful, at all.

What’s stopping us, or slowing us anyway, is not the technology for the new grid – that exists. It’s the rules. And the political will to transform the grid to accommodate the transformative technologies that have been developed over the past two decades.

If we’re going to move into the 21st century, and with nearly 15% of the century already gone we’re a good ways into it, then we’d better get moving quickly. The old rules need to be changed; David Roberts, formerly of Grist, has compiled a useful list of some of those needed changes.

The problem – the powerful incumbents holding onto their profits

One problem, obviously, is that utilities don’t want to close their old baseload power plants if they are still useful at generating electricity. They want to put off that retirement date as long as possible. Assuming its operating and maintenance costs are not so high that it loses money, the longer a power plant runs the more profit it returns. And utilities are about making money, not transforming the grid.

In the US, at least, we’re not at the point where profitable baseload power plants have to be forced closed for the greater good-renewables don’t yet make up enough of our power to require that step. But parts of Europe are quickly getting there, and we in the US will get there in many places faster than most people now think – surely within the next decade.

Germany is already showing that a grid with a high penetration of renewables can be reliable, and that forcing reactors to close can not only be publicly acceptable, it can attain wide public support.

The larger problem in Germany these days is not the amount of renewables in place, it’s that there is so much renewable generation that the grid needs to be strengthened to better distribute that electricity across the country and for export to nations like Poland and Austria – which badly want that cheap, clean power.

Public opinion polls suggest that in the US, a similarly high penetration of renewables will be most welcome, even if anti-nuclear sentiment is not at German levels.

The real problem with renewables – they are ‘too cheap’

Perhaps forcing reactors to close won’t be necessary; enough are already unprofitable, and more are likely to become so in coming years that perhaps they will simply shut down, be replaced by renewables and it will all happen quietly and happily.

More likely though, as nuclear utilities contemplate 80-year operating licenses and squeezing every last watt of power out of them regardless of their age or safety condition, that could become the nuclear issue of the next decade for the public, state regulators and policymakers and the like: should existing reactors stay open when they’re still viable or be forced aside to welcome larger amounts of cleaner, safer and usually cheaper renewables?

From our perspective, the answer is obviously yes, they should shut down to make way for the more modern system. But that’s an answer that will take a lot of preparation and groundwork beginning now, because the nuclear utilities will fight that hard.

That’s a somewhat different issue than the one that confronts us today, which is should uneconomic reactors stay open or move aside for renewables? The nuclear utilities want the ground rules changed to force ratepayers to keep those uneconomic reactors open regardless of their cost.

That’s an easy argument to make: of course the rules shouldn’t be changed to favor the higher-priced, dirtier power source. And it appears that argument is on the verge of victory in Illinois – the most nuclear state in the US. If that argument does end up carrying the day there, it can everywhere.

As for the notion that solar and wind are too cheap, that just shows the absurd nature of the economics of electricity and the failure to consider external costs – the environmental damage they cause and the full lifecycle costs of their existence – in the economic equation.

There is more to life than the dollar, though you wouldn’t know it by how many traditional markets work, and, in fact, we have reached the point that unless ‘more to life’ is adequately factored into prices, there may not be any life at all.

The concept being bandied about by these pro-nukers is that if there is ‘too much’ solar and wind in the system, its price will eventually become zero – essentially free. And at that price – or no price if you will – the system breaks down and there will be no more investment in solar and wind. Who would want to invest in it if you have to give it away?

The cheap power ‘problem’ can be solved – if you want to solve it

There are ways around the problem even under the existing system, from feed-in tariffs to Power Purchase Agreements. And the ‘problem’ itself still has at its foundation the baseload concept of electricity generation and distribution. Absent those baseload plants, which only inhibit renewable generation anyway, there cannot be ‘too much’ renewables in the system.

But including the real costs of nuclear and fossil fuel use would be the best step. Because once added in, those costs make that kind of generation too expensive to use no matter what the competition. And if the only choice is low-cost to zero-cost renewables, well, certainly consumers wouldn’t mind.

In the real world, rather than abstract economic modeling scenarios, electricity is a necessity and it will be provided. But in the real world, in the new world of the 21st century electricity grid, it may well be that electricity itself will not be as profitable to generators as it was in the 20th century.

Energy efficiency is reducing demand and that, despite a growing population and even with economic growth, is a trend that will continue and probably accelerate (Maryland, for example, has set a new policy of reducing demand by 2% every year). Renewables act to drive down electricity prices.

Certainly the idea that individual utilities, or even a consortium dominated by a single utility (a la Vogtle or Summer) will ever again build mega-billion dollar power plants of any kind just in order to sell electricity, is a relic of the 20th century playing out today as farce.

It won’t be playing out much longer. Utilities, like Virginia’s Dominion, that may think that obsolete model still applies, will regret it.

Not too cheap to meter – but too cheap to worry about

Electricity may never be free, or too cheap to meter, but it may well become one of life’s little bargains. Long distance in my lifetime has gone from an expensive luxury item rarely used; to an inexpensive, frequently-dialed option; to a free add-on to both my landline and iphone plans.

For my millennial kids, the concept of a ‘long-distance’ call is meaningless: they’ve never made one and never will. But they do still use phones, and all the services modern phone plans offer.

The costs of electricity are going to come down too – technology and renewables are already starting to see to that – but someone, whether it be the traditional utilities or someone smarter is going to come along and figure out how to make money by providing electricity add-ons and services, even if the electricity itself is free or nearly so.

Totally free electricity may be too much to hope for, there is a grid to pay for and maintain after all, and there will be for the foreseeable future. But the money to be made will be in the add-on services, not the basic electricity.

The solar rooftop people have pretty much already figured this out for their slice of the business – whether by lease or purchase, you pay primarily for the equipment, installation and maintenance, not so much for the electricity.

But since rooftop solar doesn’t work for everyone nor everywhere, there is a market ready for something new and safe and clean and that won’t destroy the planet we live on. I’d say that’s a pretty damn large market looking for the electricity equivalent of long-distance in the iphone era. With a market like that, someone is going to deliver, even if the electricity itself is little more than a low-cost add-on to other services people want.

That won’t happen tomorrow, of course. As Barry Cinnamon of The Energy Show podcast put it, “But this change in our energy sources will take many years, just as the complete transition from ‘horse and buggy’ transportation to gas-powered cars took 50 years.

“As with other large-scale technological changes, customer economics will force the current incumbent energy providers to change (unlikely), or go out of business (more likely). It’s a virtuous cycle as more customers are satisfied with renewable power generation, and more people are employed in these industries.”

Bye-bye nuclear – no place for you in this new power market

For nuclear power though – even for ‘small modular reactors’ (which actually are not so small, most are much larger than the early US commercial reactors like Big Rock Point and Yankee Rowe and some are as large as Fukushima Daiichi Unit-1) – and fossil fuels as well, the transformation means extinction.

By definition, SMRs are also baseload power plants; despite being smaller than today’s behemoth reactors, they are designed to run 24/7 and like their larger brethren, cannot power up and down quickly.

Before they even exist, they are obsolete. Their polluting ‘baseload’ means of providing their product (electricity) will be unneeded and functionally and economically irrelevant – unable to compete with those offering electricity as part of a set of services, rather than as an end in itself.

 


 

Michael Mariotte is Executive Director at Nuclear Information and Resource Service (NIRS).

This article originally ran on Green World, a news and information service of NIRS.

 

The archaic nature of ‘baseload’ power

The old grid, beholden to massive, polluting baseload power plants, is being replaced by a nimbler, high-tech 21st century system oriented toward variable renewable energy.

There are no shortage of skeptics out there, even some among environmentalists and clean energy advocates, who are unconvinced that renewable energy can ever be the dominant-perhaps even sole-source of electricity generation.

The reasons for this skepticism vary. Some, for example, argue that the land needs for sufficient generation of wind and solar power are too great. This turns out to be an incredibly lame argument, but that’s the subject of a different article.

More frequent are the arguments that ‘baseload’ power-large power plants that tend to run 24/7-are necessary to ensure reliable electricity and that the variable nature of some renewables-solar and wind-can’t provide that reliability.

Then there’s the notion that the electrical grid can only accommodate a certain level of renewables, around 30-40%. Above that and the grid pretty much breaks down. These arguments are actually related and solved in the same way.

More recently, an argument has been circling among energy nerds-especially pro-nuclear energy nerds-that the integration of renewables into the grid reaches a peak for economic reasons: that renewables are limited by their cost. Not by their high cost, but by their low cost, or as one writer put it: “solar and wind eat their own lunch.”

But that merely shows that not only must the technical nature of the grid change, and it can; but so must its economic nature, and it can too.

The good old days … too bad they were killing us

The electric grid in use today was mostly designed in the 20th century. Large baseload nuclear and fossil fuel plants were built, usually far from the largest electricity consumers (cities and large industry), and transported by huge (and not particular efficient) power lines.

Those baseload plants had, and have, high capacity factors and run pretty much all the time, although nuclear reactors have to be shut for refueling for a few weeks every 12-18 months. Utilities try to arrange those shutdowns to occur during periods of low demand.

During peak power needs – hot summer days in most of the country – smaller gas plants and in the old days even oil plants would be fired up to supplement the baseload levels. And it all worked pretty well given the technology available at the time.

But, as we all now know all too clearly, that system had a price – a price not reflected in the cost of electricity. That system was and is killing us. Those large nuclear and fossil fuel plants are spewing out carbon dioxide and radioactivity and creating large quantities of dirty and deadly waste products that society doesn’t know what to do with.

Had the cost of those effects – which do have a price, a steep one – been incorporated into the price we and our parents paid for electricity, we probably would have moved to a clean energy system much faster. As it is, we no longer have much of a choice.

Variable power sources more reliable, resilient than ‘baseload’

Fortunately, as is being proven daily in Europe, a grid based on smaller, distributed variable power sources can be just as reliable, and even more resilient and secure, than a grid reliant on baseload power.

Variable does not mean unreliable: as long as it can be reliably projected with sufficient advance time what the wind will do and thus how much wind power will be available where, and the same for the sun, then a variable grid can be highly reliable. And those can be and are, in fact, reliably projected.

The ability to integrate a moderately large amount (say 30-35% or so) of renewables into a baseload-dominated grid is a given. It is happening daily. Not so much in the US, although even here states like Iowa are getting more than 20% of their power from renewables, and the percentage of renewables is set to rise rapidly-both on their own for sound economic reasons and due to encouragement of them in the Clean Power Plan.

But at some point above 35-40% renewables or so, a conflict arises. If more renewables are to be brought into the grid, the large baseload plants have to begin closing – even if they theoretically remain useful.

That’s because the kind of grid that works for the variable renewables – a fast, nimble grid where power from different sources scattered in different locations can be ramped up and down quickly depending on where it is being generated and where it is needed – doesn’t work well for baseload plants, especially nuclear reactors, which cannot ramp up and down quickly.

Those kinds of plants were designed to run 24/7 and that’s what they do – they’re not designed to fit in with a grid that doesn’t want them to run 24/7, that instead wants them to run when their power is needed. And the higher the penetration of renewables, the less the baseload plants’ power is needed.

The new kid on the block: energy storage

Add in energy storage, the new kid on the block, and polluting power plants running 24/7 become an anachronism. When the variable sources aren’t generating what is needed, just release the stored, and cheaper, electricity they generated earlier during periods of low demand.

The polluting baseload plants then make no sense at all. Why throw carbon dioxide into the air and tritium into the water and generate lethal radioactive waste just to keep dirty and usually more expensive power plants operating just for those few hours in the week when they might be useful? With storage, they’re not needed, or even particularly useful, at all.

What’s stopping us, or slowing us anyway, is not the technology for the new grid – that exists. It’s the rules. And the political will to transform the grid to accommodate the transformative technologies that have been developed over the past two decades.

If we’re going to move into the 21st century, and with nearly 15% of the century already gone we’re a good ways into it, then we’d better get moving quickly. The old rules need to be changed; David Roberts, formerly of Grist, has compiled a useful list of some of those needed changes.

The problem – the powerful incumbents holding onto their profits

One problem, obviously, is that utilities don’t want to close their old baseload power plants if they are still useful at generating electricity. They want to put off that retirement date as long as possible. Assuming its operating and maintenance costs are not so high that it loses money, the longer a power plant runs the more profit it returns. And utilities are about making money, not transforming the grid.

In the US, at least, we’re not at the point where profitable baseload power plants have to be forced closed for the greater good-renewables don’t yet make up enough of our power to require that step. But parts of Europe are quickly getting there, and we in the US will get there in many places faster than most people now think – surely within the next decade.

Germany is already showing that a grid with a high penetration of renewables can be reliable, and that forcing reactors to close can not only be publicly acceptable, it can attain wide public support.

The larger problem in Germany these days is not the amount of renewables in place, it’s that there is so much renewable generation that the grid needs to be strengthened to better distribute that electricity across the country and for export to nations like Poland and Austria – which badly want that cheap, clean power.

Public opinion polls suggest that in the US, a similarly high penetration of renewables will be most welcome, even if anti-nuclear sentiment is not at German levels.

The real problem with renewables – they are ‘too cheap’

Perhaps forcing reactors to close won’t be necessary; enough are already unprofitable, and more are likely to become so in coming years that perhaps they will simply shut down, be replaced by renewables and it will all happen quietly and happily.

More likely though, as nuclear utilities contemplate 80-year operating licenses and squeezing every last watt of power out of them regardless of their age or safety condition, that could become the nuclear issue of the next decade for the public, state regulators and policymakers and the like: should existing reactors stay open when they’re still viable or be forced aside to welcome larger amounts of cleaner, safer and usually cheaper renewables?

From our perspective, the answer is obviously yes, they should shut down to make way for the more modern system. But that’s an answer that will take a lot of preparation and groundwork beginning now, because the nuclear utilities will fight that hard.

That’s a somewhat different issue than the one that confronts us today, which is should uneconomic reactors stay open or move aside for renewables? The nuclear utilities want the ground rules changed to force ratepayers to keep those uneconomic reactors open regardless of their cost.

That’s an easy argument to make: of course the rules shouldn’t be changed to favor the higher-priced, dirtier power source. And it appears that argument is on the verge of victory in Illinois – the most nuclear state in the US. If that argument does end up carrying the day there, it can everywhere.

As for the notion that solar and wind are too cheap, that just shows the absurd nature of the economics of electricity and the failure to consider external costs – the environmental damage they cause and the full lifecycle costs of their existence – in the economic equation.

There is more to life than the dollar, though you wouldn’t know it by how many traditional markets work, and, in fact, we have reached the point that unless ‘more to life’ is adequately factored into prices, there may not be any life at all.

The concept being bandied about by these pro-nukers is that if there is ‘too much’ solar and wind in the system, its price will eventually become zero – essentially free. And at that price – or no price if you will – the system breaks down and there will be no more investment in solar and wind. Who would want to invest in it if you have to give it away?

The cheap power ‘problem’ can be solved – if you want to solve it

There are ways around the problem even under the existing system, from feed-in tariffs to Power Purchase Agreements. And the ‘problem’ itself still has at its foundation the baseload concept of electricity generation and distribution. Absent those baseload plants, which only inhibit renewable generation anyway, there cannot be ‘too much’ renewables in the system.

But including the real costs of nuclear and fossil fuel use would be the best step. Because once added in, those costs make that kind of generation too expensive to use no matter what the competition. And if the only choice is low-cost to zero-cost renewables, well, certainly consumers wouldn’t mind.

In the real world, rather than abstract economic modeling scenarios, electricity is a necessity and it will be provided. But in the real world, in the new world of the 21st century electricity grid, it may well be that electricity itself will not be as profitable to generators as it was in the 20th century.

Energy efficiency is reducing demand and that, despite a growing population and even with economic growth, is a trend that will continue and probably accelerate (Maryland, for example, has set a new policy of reducing demand by 2% every year). Renewables act to drive down electricity prices.

Certainly the idea that individual utilities, or even a consortium dominated by a single utility (a la Vogtle or Summer) will ever again build mega-billion dollar power plants of any kind just in order to sell electricity, is a relic of the 20th century playing out today as farce.

It won’t be playing out much longer. Utilities, like Virginia’s Dominion, that may think that obsolete model still applies, will regret it.

Not too cheap to meter – but too cheap to worry about

Electricity may never be free, or too cheap to meter, but it may well become one of life’s little bargains. Long distance in my lifetime has gone from an expensive luxury item rarely used; to an inexpensive, frequently-dialed option; to a free add-on to both my landline and iphone plans.

For my millennial kids, the concept of a ‘long-distance’ call is meaningless: they’ve never made one and never will. But they do still use phones, and all the services modern phone plans offer.

The costs of electricity are going to come down too – technology and renewables are already starting to see to that – but someone, whether it be the traditional utilities or someone smarter is going to come along and figure out how to make money by providing electricity add-ons and services, even if the electricity itself is free or nearly so.

Totally free electricity may be too much to hope for, there is a grid to pay for and maintain after all, and there will be for the foreseeable future. But the money to be made will be in the add-on services, not the basic electricity.

The solar rooftop people have pretty much already figured this out for their slice of the business – whether by lease or purchase, you pay primarily for the equipment, installation and maintenance, not so much for the electricity.

But since rooftop solar doesn’t work for everyone nor everywhere, there is a market ready for something new and safe and clean and that won’t destroy the planet we live on. I’d say that’s a pretty damn large market looking for the electricity equivalent of long-distance in the iphone era. With a market like that, someone is going to deliver, even if the electricity itself is little more than a low-cost add-on to other services people want.

That won’t happen tomorrow, of course. As Barry Cinnamon of The Energy Show podcast put it, “But this change in our energy sources will take many years, just as the complete transition from ‘horse and buggy’ transportation to gas-powered cars took 50 years.

“As with other large-scale technological changes, customer economics will force the current incumbent energy providers to change (unlikely), or go out of business (more likely). It’s a virtuous cycle as more customers are satisfied with renewable power generation, and more people are employed in these industries.”

Bye-bye nuclear – no place for you in this new power market

For nuclear power though – even for ‘small modular reactors’ (which actually are not so small, most are much larger than the early US commercial reactors like Big Rock Point and Yankee Rowe and some are as large as Fukushima Daiichi Unit-1) – and fossil fuels as well, the transformation means extinction.

By definition, SMRs are also baseload power plants; despite being smaller than today’s behemoth reactors, they are designed to run 24/7 and like their larger brethren, cannot power up and down quickly.

Before they even exist, they are obsolete. Their polluting ‘baseload’ means of providing their product (electricity) will be unneeded and functionally and economically irrelevant – unable to compete with those offering electricity as part of a set of services, rather than as an end in itself.

 


 

Michael Mariotte is Executive Director at Nuclear Information and Resource Service (NIRS).

This article originally ran on Green World, a news and information service of NIRS.

 

The archaic nature of ‘baseload’ power

The old grid, beholden to massive, polluting baseload power plants, is being replaced by a nimbler, high-tech 21st century system oriented toward variable renewable energy.

There are no shortage of skeptics out there, even some among environmentalists and clean energy advocates, who are unconvinced that renewable energy can ever be the dominant-perhaps even sole-source of electricity generation.

The reasons for this skepticism vary. Some, for example, argue that the land needs for sufficient generation of wind and solar power are too great. This turns out to be an incredibly lame argument, but that’s the subject of a different article.

More frequent are the arguments that ‘baseload’ power-large power plants that tend to run 24/7-are necessary to ensure reliable electricity and that the variable nature of some renewables-solar and wind-can’t provide that reliability.

Then there’s the notion that the electrical grid can only accommodate a certain level of renewables, around 30-40%. Above that and the grid pretty much breaks down. These arguments are actually related and solved in the same way.

More recently, an argument has been circling among energy nerds-especially pro-nuclear energy nerds-that the integration of renewables into the grid reaches a peak for economic reasons: that renewables are limited by their cost. Not by their high cost, but by their low cost, or as one writer put it: “solar and wind eat their own lunch.”

But that merely shows that not only must the technical nature of the grid change, and it can; but so must its economic nature, and it can too.

The good old days … too bad they were killing us

The electric grid in use today was mostly designed in the 20th century. Large baseload nuclear and fossil fuel plants were built, usually far from the largest electricity consumers (cities and large industry), and transported by huge (and not particular efficient) power lines.

Those baseload plants had, and have, high capacity factors and run pretty much all the time, although nuclear reactors have to be shut for refueling for a few weeks every 12-18 months. Utilities try to arrange those shutdowns to occur during periods of low demand.

During peak power needs – hot summer days in most of the country – smaller gas plants and in the old days even oil plants would be fired up to supplement the baseload levels. And it all worked pretty well given the technology available at the time.

But, as we all now know all too clearly, that system had a price – a price not reflected in the cost of electricity. That system was and is killing us. Those large nuclear and fossil fuel plants are spewing out carbon dioxide and radioactivity and creating large quantities of dirty and deadly waste products that society doesn’t know what to do with.

Had the cost of those effects – which do have a price, a steep one – been incorporated into the price we and our parents paid for electricity, we probably would have moved to a clean energy system much faster. As it is, we no longer have much of a choice.

Variable power sources more reliable, resilient than ‘baseload’

Fortunately, as is being proven daily in Europe, a grid based on smaller, distributed variable power sources can be just as reliable, and even more resilient and secure, than a grid reliant on baseload power.

Variable does not mean unreliable: as long as it can be reliably projected, sufficiently far in advance, what the wind will do and thus how much wind power will be available where (and the same for the sun), then a variable grid can be highly reliable. And those projections can be, and are, made reliably.

The ability to integrate a moderately large amount (say 30-35% or so) of renewables into a baseload-dominated grid is a given. It is happening daily. Not so much in the US, although even here states like Iowa are getting more than 20% of their power from renewables, and the percentage of renewables is set to rise rapidly, both on their own for sound economic reasons and due to the encouragement they receive in the Clean Power Plan.

But at some point above 35-40% renewables or so, a conflict arises. If more renewables are to be brought into the grid, the large baseload plants have to begin closing – even if they theoretically remain useful.

That’s because the kind of grid that works for the variable renewables – a fast, nimble grid where power from different sources scattered in different locations can be ramped up and down quickly depending on where it is being generated and where it is needed – doesn’t work well for baseload plants, especially nuclear reactors, which cannot ramp up and down quickly.

Those kinds of plants were designed to run 24/7 and that’s what they do – they’re not designed to fit in with a grid that doesn’t want them to run 24/7, that instead wants them to run when their power is needed. And the higher the penetration of renewables, the less the baseload plants’ power is needed.

The new kid on the block: energy storage

Add in energy storage, the new kid on the block, and polluting power plants running 24/7 become an anachronism. When the variable sources aren’t generating what is needed, just release the stored, and cheaper, electricity they generated earlier during periods of low demand.

The polluting baseload plants then make no sense at all. Why throw carbon dioxide into the air and tritium into the water, and generate lethal radioactive waste, just to keep dirty and usually more expensive power plants operating for those few hours in the week when they might be useful? With storage, they’re not needed, or even particularly useful, at all.

What’s stopping us, or slowing us anyway, is not the technology for the new grid – that exists. It’s the rules. And the political will to transform the grid to accommodate the transformative technologies that have been developed over the past two decades.

If we’re going to move into the 21st century, and with nearly 15% of the century already gone we’re a good ways into it, then we’d better get moving quickly. The old rules need to be changed; David Roberts, formerly of Grist, has compiled a useful list of some of those needed changes.

The problem – the powerful incumbents holding onto their profits

One problem, obviously, is that utilities don’t want to close their old baseload power plants if they are still useful at generating electricity. They want to put off that retirement date as long as possible. Assuming its operating and maintenance costs are not so high that it loses money, the longer a power plant runs the more profit it returns. And utilities are about making money, not transforming the grid.

In the US, at least, we’re not at the point where profitable baseload power plants have to be forced closed for the greater good-renewables don’t yet make up enough of our power to require that step. But parts of Europe are quickly getting there, and we in the US will get there in many places faster than most people now think – surely within the next decade.

Germany is already showing that a grid with a high penetration of renewables can be reliable, and that forcing reactors to close can not only be publicly acceptable, it can attain wide public support.

The larger problem in Germany these days is not the amount of renewables in place, it’s that there is so much renewable generation that the grid needs to be strengthened to better distribute that electricity across the country and for export to nations like Poland and Austria – which badly want that cheap, clean power.

Public opinion polls suggest that in the US, a similarly high penetration of renewables will be most welcome, even if anti-nuclear sentiment is not at German levels.

The real problem with renewables – they are ‘too cheap’

Perhaps forcing reactors to close won’t be necessary; enough are already unprofitable, and more are likely to become so in coming years, that perhaps they will simply shut down, be replaced by renewables, and it will all happen quietly and happily.

More likely, though, as nuclear utilities contemplate 80-year operating licenses and squeezing every last watt of power out of their reactors regardless of age or safety condition, this could become the nuclear issue of the next decade for the public, state regulators, policymakers and the like: should existing reactors stay open while they’re still viable, or be forced aside to welcome larger amounts of cleaner, safer and usually cheaper renewables?

From our perspective, the answer is obviously yes, they should shut down to make way for the more modern system. But that’s an answer that will take a lot of preparation and groundwork beginning now, because the nuclear utilities will fight that hard.

That’s a somewhat different issue than the one that confronts us today, which is whether uneconomic reactors should stay open or move aside for renewables. The nuclear utilities want the ground rules changed to force ratepayers to keep those uneconomic reactors open regardless of their cost.

That’s an easy argument to make: of course the rules shouldn’t be changed to favor the higher-priced, dirtier power source. And it appears that argument is on the verge of victory in Illinois – the most nuclear state in the US. If that argument does end up carrying the day there, it can carry the day everywhere.

As for the notion that solar and wind are too cheap, that just shows the absurd nature of the economics of electricity and the failure to consider external costs – the environmental damage they cause and the full lifecycle costs of their existence – in the economic equation.

There is more to life than the dollar, though you wouldn’t know it by how many traditional markets work, and, in fact, we have reached the point that unless ‘more to life’ is adequately factored into prices, there may not be any life at all.

The concept being bandied about by these pro-nukers is that if there is ‘too much’ solar and wind in the system, their price will eventually fall to zero – essentially free. And at that price – or no price if you will – the system breaks down and there will be no more investment in solar and wind. Who would want to invest in power you have to give away?

The cheap power ‘problem’ can be solved – if you want to solve it

There are ways around the problem even under the existing system, from feed-in tariffs to Power Purchase Agreements. And the ‘problem’ itself still has at its foundation the baseload concept of electricity generation and distribution. Absent those baseload plants, which only inhibit renewable generation anyway, there cannot be ‘too much’ renewable generation in the system.

But including the real costs of nuclear and fossil fuel use would be the best step. Because once added in, those costs make that kind of generation too expensive to use no matter what the competition. And if the only choice is low-cost to zero-cost renewables, well, certainly consumers wouldn’t mind.

In the real world, rather than abstract economic modeling scenarios, electricity is a necessity and it will be provided. But in the real world, in the new world of the 21st century electricity grid, it may well be that electricity itself will not be as profitable to generators as it was in the 20th century.

Energy efficiency is reducing demand and that, despite a growing population and even with economic growth, is a trend that will continue and probably accelerate (Maryland, for example, has set a new policy of reducing demand by 2% every year). Renewables act to drive down electricity prices.

Certainly the idea that individual utilities, or even a consortium dominated by a single utility (a la Vogtle or Summer), will ever again build mega-billion-dollar power plants of any kind just in order to sell electricity is a relic of the 20th century playing out today as farce.

It won’t be playing out much longer. Utilities, like Virginia’s Dominion, that may think that obsolete model still applies, will regret it.

Not too cheap to meter – but too cheap to worry about

Electricity may never be free, or too cheap to meter, but it may well become one of life’s little bargains. Long distance in my lifetime has gone from an expensive, rarely used luxury; to an inexpensive, frequently dialed option; to a free add-on to both my landline and iPhone plans.

For my millennial kids, the concept of a ‘long-distance’ call is meaningless: they’ve never made one and never will. But they do still use phones, and all the services modern phone plans offer.

The costs of electricity are going to come down too – technology and renewables are already starting to see to that – but someone, whether it be the traditional utilities or someone smarter, is going to come along and figure out how to make money by providing electricity add-ons and services, even if the electricity itself is free or nearly so.

Totally free electricity may be too much to hope for; there is a grid to pay for and maintain, after all, and there will be for the foreseeable future. But the money to be made will be in the add-on services, not the basic electricity.

The solar rooftop people have pretty much already figured this out for their slice of the business – whether by lease or purchase, you pay primarily for the equipment, installation and maintenance, not so much for the electricity.

But since rooftop solar doesn’t work for everyone or everywhere, there is a market ready for something new and safe and clean and that won’t destroy the planet we live on. I’d say that’s a pretty damn large market looking for the electricity equivalent of long distance in the iPhone era. With a market like that, someone is going to deliver, even if the electricity itself is little more than a low-cost add-on to other services people want.

That won’t happen tomorrow, of course. As Barry Cinnamon of The Energy Show podcast put it, “But this change in our energy sources will take many years, just as the complete transition from ‘horse and buggy’ transportation to gas-powered cars took 50 years.

“As with other large-scale technological changes, customer economics will force the current incumbent energy providers to change (unlikely), or go out of business (more likely). It’s a virtuous cycle as more customers are satisfied with renewable power generation, and more people are employed in these industries.”

Bye-bye nuclear – no place for you in this new power market

For nuclear power though – even for ‘small modular reactors’ (which actually are not so small; most are much larger than early US commercial reactors like Big Rock Point and Yankee Rowe, and some are as large as Fukushima Daiichi Unit-1) – and for fossil fuels as well, the transformation means extinction.

By definition, SMRs are also baseload power plants; despite being smaller than today’s behemoth reactors, they are designed to run 24/7 and like their larger brethren, cannot power up and down quickly.

Before they even exist, they are obsolete. Their polluting ‘baseload’ means of providing their product (electricity) will be unneeded and functionally and economically irrelevant – unable to compete with those offering electricity as part of a set of services, rather than as an end in itself.


Michael Mariotte is Executive Director at Nuclear Information and Resource Service (NIRS).

This article originally ran on Green World, a news and information service of NIRS.


Nuclear power is expensive everywhere!

The British attitude to the notion that nuclear power is not cheap after all is a bit like that of a child who first hears that Father Christmas does not, after all, exist.

Disbelief, and in this case a belief that if only Father Christmas is nationalised, then it will still be true.

Psychologists call this cognitive dissonance: if a fact is uncomfortable to you, you conclude that the fact must be wrong.

The belief that somehow nuclear power will be cheaper if only it is done differently here has been stoked by a recent IEA report, which says that British nuclear power, in the shape of the proposed contract for Hinkley C, is the most expensive in the world.

In fact, the IEA report relies heavily on a limited number of cost projections, and in the world of nuclear power, cost projection is a fantasy genre in itself.

However, what is more apparent is not so much that nuclear power is more expensive in the UK as that it is only in the UK that anything vaguely approaching an estimate of nuclear costs on the same basis as other energy technologies has been attempted. This is because of the need to make nuclear at least look as if it fits the contours of what passes for a competitive electricity generation market in the UK.

Of course nuclear power looks expensive if you do it that way, because it is! (Assuming, that is, you want to treat nuclear power on the same costing basis as other energy sources.) Even this (Hinkley C) comparison is somewhat biased towards nuclear, because other technologies don’t get 35-year contracts, and they don’t get a Government offer to underwrite 60% of the projected construction costs.

So the Government’s declared price for Hinkley C is, in reality, an underestimate of nuclear power costs compared to other energy sources. This is even more the case since the plant hasn’t even started construction yet.

Plucked out of thin air while wearing rose tinted spectacles?

I’m scratching my head as to how the IEA could have come to the conclusion it did. The French EPR is now several years behind schedule in its construction and is said to now cost three times more than its original estimate (already!). As for the Finnish EPR, well, that cost just goes off the scale.

Even if you look at other reactors being built in the West, the high costs are also much in evidence. The AP1000 reactors being built in the USA are as expensive as the EPR. And the claims circulating about how the proposed Hitachi project will go better than Hinkley C are more exercises in fantasy.

There are a lot of stories about how much cheaper nuclear power is in China. Well, I don’t know whether anybody has noticed this, but easily analysable information on the costs of nuclear construction in China is about as scarce as it was from the old British CEGB on a bad day – and it was, apart from the occasional leak, usually a very bad day.

But recent commentary on the Chinese nuclear programme is not especially encouraging. Take, for example, this article from the Global Times, whose news items seem to run in parallel with the priorities of the Chinese Government:

“Since 2004, China has been approving projects using advanced nuclear power reactors, including US-based Westinghouse’s AP1000 and France-based Areva’s EPR (Evolutionary Power Reactor), many of which are now under construction. Dubbed generation III reactors, they are designed to withstand the crisis that damaged the Japanese nuclear plant.

“Construction of these projects has not been smooth. Sanmen Nuclear Power Station in Zhejiang Province was expected to be the first nuclear power plant in the world that uses AP1000 technology. The first of the two reactors was scheduled to finish construction and start operation in November 2013, but construction is now over 18 months behind schedule. The plant won’t start operation until 2016 at the earliest, an official from China’s State Nuclear Power Technology, the company building the power plant, said in January.”

Of course the Chinese Government has some grandiose plans for nuclear power construction. So did the UK Government back in 2010!

Nationalisation won’t make nuclear power cheaper – but it will help with fiddling the figures

So where is the British flight into ‘nuclear power is cheap somewhere’ fantasy leading us? Well, to nationalisation, of course, a charge led by the IPPR, a centre-left think tank, which, I suppose, is a more plausible vehicle for this than, say, a right-leaning think tank such as Policy Exchange (who, incidentally, have recently discovered that onshore wind power is reasonably cheap after all and should be offered some contracts).

Nationalisation won’t make nuclear power any cheaper. The claims that somehow the ‘cheaper’ money from the state will make the technology less expensive ignore some relevant facts. First, the Hinkley C deal already has access to state backed finance through the agency of the state owned (French and Chinese) companies that are building the plant and the state guaranteed loan offered by the UK Treasury.

Setting up a state body to compete with others in the electricity market will also generate a further (interesting) EU state aid application. But really, the talk of cheap state money is not the key point that the nuclear people are aiming at.

What the pro-nuclear lobby now wants is for limitless sums of money to be siphoned off from public spending on education, health and whatever else and spent on building nuclear power stations. The money will be notionally borrowed, a contract will be concocted claiming that the electricity consumer will pay the money back at a later date, and the balance will be paid for by, well, fewer schools, hospitals and so on.

Meanwhile a story will be manufactured about how all of this is cheap. Cognitive dissonance will prevent people asking why, if it is OK to fund nuclear power this way, it isn’t OK to fund offshore wind farms and other things the same way.

Of course, even this isn’t ‘the crack’ as a Liverpudlian friend of mine used to put it. The ‘crack’ is that the state will end up giving the whole project a blank cheque so that the disastrous construction process can be bankrolled entirely from a bottomless pit of state finances.

Talk of allegedly ‘cheap’ money from the Government is just a cover for what the nuclear people are really aiming for: a cost-plus, spend-however-much-you-want contract, a blank cheque. The road towards this will no doubt be littered with pretend signposts (there’ll be a tender process, and so on), but at the end of the day nothing will happen until the blank cheque has been sent.

I have been pleasantly surprised to see that the UK Government has not signed such a thing. The Treasury has not been pushed into accepting this (one thing George Osborne and I agree about). Yet.


David Toke is Reader in Energy Politics at the Department of Politics and International Relations, University of Aberdeen. 

This article was originally published on David Toke’s Green Energy Blog.


Hinkley C nuclear plant postponed indefinitely

The French state-owned power company EDF has revealed that its Hinkley C nuclear power plant planned for Somerset, England, will not be ready in 2023 as planned.

The company has given no completion date but promises a “revised timetable” when it gives the project its final approval – in other words, it is now indefinitely postponed.

Its chief executive, Jean-Bernard Lévy, insists that he has “full confidence in the success of the Hinkley Point project”; however, the facts suggest the opposite may be the case.

The delay will have serious effects on the UK’s plans to ‘keep the lights on’ in 2023 and beyond, as a number of old coal power stations will be forced to close by that date under EU air quality rules.

A spokesman for the Department of Energy and Climate Change (DECC) also sounded a cautious note over the project’s future: “The UK Government and EDF are continuing to work together to finalise the project. The deal must represent value for money and is subject to approval by ministers.”

DECC’s comment comes at a time when the project is coming under growing criticism for its very high cost. It is due to receive some £95 (at current prices) per MWh generated for 35 years after it comes into operation, representing a subsidy over current wholesale prices of about £55 per MWh.
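To get a rough sense of what that £55/MWh subsidy implies over the life of the contract, here is a back-of-envelope sketch. The plant’s 3.2 GW capacity is well known, but the 90% capacity factor and the £40/MWh wholesale price are assumptions for illustration, not figures from this article:

```python
# Back-of-envelope estimate of the Hinkley C subsidy.
# Assumed for illustration: 3.2 GW capacity, 90% capacity factor,
# wholesale price of ~GBP 40/MWh (implied by the ~GBP 55/MWh subsidy figure).
capacity_mw = 3200
capacity_factor = 0.9
hours_per_year = 8760

annual_mwh = capacity_mw * hours_per_year * capacity_factor

strike_price = 95        # GBP/MWh, the contract price
wholesale_price = 40     # GBP/MWh, assumed
subsidy_per_mwh = strike_price - wholesale_price

annual_subsidy = annual_mwh * subsidy_per_mwh
lifetime_subsidy = annual_subsidy * 35   # over the 35-year contract

print(f"Annual output:   {annual_mwh / 1e6:.1f} TWh")
print(f"Annual subsidy:  GBP {annual_subsidy / 1e9:.1f} billion")
print(f"35-year subsidy: GBP {lifetime_subsidy / 1e9:.0f} billion")
```

On these assumptions the subsidy alone comes to roughly £1.4 billion a year, or close to £50 billion over the 35-year contract – an indication of why the project’s total cost to energy users has drawn independent estimates in the tens of billions.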

Why does Hinkley C cost three times more than IEA nuclear cost estimates?

A new report by the International Energy Agency projected nuclear power costs of just $50 per MWh (about £33) for projects beginning now. This raises the question: why is Hinkley C costing almost three times as much?

This is the exact question that The Ecologist put to EDF’s media office for Hinkley earlier today. We were promised an explanation but none has been forthcoming.

However DECC’s public insistence that the project “must represent value for money” and the warning that it is “subject to approval by ministers” is surely indicative of a growing scepticism at the heart of government over the project.

An obvious driver for this cautionary tone is the Treasury, long believed to have concerns over the project’s staggeringly high cost to UK energy users – independently estimated at some €105 billion (£75 billion).

In July Lord Howell, former energy minister and father in law to George Osborne, the Chancellor, made the following startling contribution to the Energy Bill in the Lords, which many observers believe to represent the Chancellor’s views:

“By far the biggest obligation, or future burden, on consumers and households is the Hinkley Point C nuclear project. I am very pro nuclear and pro its low-carbon contribution but this must be one of the worst deals ever for British households and British industry.

“Furthermore, the component suppliers to EDF are in trouble, costs keep rising, no reactor of this kind has ever been completed successfully, those that are being built are years behind and workers at the site have been laid off, so personally I would shed no tears at all if the elephantine Hinkley Point C project were abandoned in favour of smaller and possibly cheaper nuclear plants a bit later on.”

The Flamanville disaster rolls on

EDF’s revelation of the Hinkley C delays came in the course of a press conference about its Flamanville reactor in France, which is being built to the same EPR design as Hinkley C.

The Flamanville reactor was ordered in 2006 for a price of €3.3 billion and was meant to be generating power in 2012. According to EDF, it is now scheduled for completion in the 4th quarter of 2018 – over six years late and more than three times over budget: “Following assessment of all the industrial and financial parameters, project costs have been revised to €10.5 billion.”
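The scale of the overrun is easy to verify from EDF’s own published figures; a minimal check, using only the numbers quoted above:

```python
# Flamanville EPR cost and schedule overrun, using only the figures quoted above.
original_cost_bn = 3.3      # EUR billion, at the 2006 order
revised_cost_bn = 10.5      # EUR billion, per EDF's 2015 revision

cost_ratio = revised_cost_bn / original_cost_bn
print(f"Revised cost is {cost_ratio:.1f}x the original estimate")

planned_completion = 2012
revised_completion = 2018   # Q4 2018, per EDF's "new roadmap"
delay_years = revised_completion - planned_completion
print(f"Delay: roughly {delay_years} years")
```

The revised budget works out to a little over three times the original estimate, and the slip from 2012 to late 2018 to roughly six years, matching the figures in EDF’s statement.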

However Jean-Bernard Lévy, EDF Chairman and Managing Director, insisted: “I have reviewed the Flamanville EPR project in detail, and I am absolutely confident that it will be a success. It is a priority for EDF and of critical importance for the French nuclear industry and its success internationally.”

He added that “all of the experience gained at Flamanville will be invaluable for other EPR projects, such as Hinkley Point.”

And he announced a “new roadmap” for Flamanville which “aims to optimise the management of the project” and features “three key milestones”: primary circuit mechanical erection to be finalised in Q1 2016; electromechanical erection to be completed in Q1 2017; fuel loading and reactor start-up in Q4 2018.

But what about the dodgy reactor vessel?

But there was a gaping hole in the EDF statement: it made no mention at all of by far the greatest problem afflicting the Flamanville project. Last April the French Nuclear Safety Authority (ASN) announced that the reactor vessel, already installed at the heart of the power station, has metallurgical weaknesses which may render it unusable.

Forged by Areva’s Creusot Forge subsidiary, tests revealed areas with high carbon concentration resulting in “lower than expected mechanical toughness values”. Pierre-Franck Chevet, head of the ASN, said: “It is a serious fault, even a very serious fault, because it involves a crucial part of the nuclear reactor.”

The results of further tests are expected in October 2015. If the reactor vessel does indeed need replacement, Chevet said, “either EDF abandons the project or it takes out the vessel and starts building a new one … this would be a very heavy operation in terms of cost and delay.”

The Ecologist telephoned EDF’s corporate press office in Paris today asking for clarification of the new schedule for Flamanville, and whether it takes into account the possible need to replace the reactor vessel. Despite a follow-up enquiry by email, EDF has provided no response.

In the event that the reactor fails the safety tests, Areva will be liable for all the knock-on costs to EDF, which could amount to as much as €5 billion: for the cost of a new vessel, for the cost of all the work already completed that will need to be broken apart and re-built, and for the delays.

This would push Areva – already close to insolvency – into bankruptcy – and potentially leave Hinkley C with no reactor vessel. It would also leave EDF carrying most or indeed all of the cost – leaving it desperately short of working capital and unable to take on the Hinkley C project.

As such one interpretation of EDF’s ‘good news’ announcements over Flamanville and Hinkley C is that they are, in fact, the complete opposite of what they purport to be. Hinkley C has never looked less likely to be completed.

Meanwhile the UK government has a problem. With no nuclear power station coming on stream at Hinkley C, and old coal fired and nuclear stations due to close, the obvious way to fill the gap is with renewables like solar and wind. But the entire sector has been devastated by a series of deep spending cuts.

Has Amber Rudd got a cunning plan? We can only hope so.


Oliver Tickell edits The Ecologist.


We scientists welcome Scotland’s GM-free status

Dear Mr Lochhead,

Firstly, as members of the science community, we wish to congratulate you on your recent announcement of a GM-free status for Scotland.

You have received a letter from Sense About Science urging you to abandon Scotland’s ban on genetically modified (GM) crops and asking for a meeting to persuade you in that direction. We, the undersigned, urge you to maintain Scotland’s admirable integrity on this issue and not to fall victim to the unfounded claims made for GM crops.

Each of us has been involved with the process of genetic modification and/or with the debate around it. Over many years, the following facts have become clear to us.

Who supports GM crops?

It is a documented fact [1] that professionals with a career or financial interest in a controversial product are much more likely to endorse it than are those who have no such interest. This is very much the case with GM.

In this instance, however, the stakes for a multi-billion dollar industry, and for the organisations and scientists funded by it, are so high that their advocacy extends far beyond mere endorsement.

‘Unfavourable’ research results in company-sponsored studies are routinely suppressed; and we see, repeatedly, vitriolic attacks and defamation of independent scientists who report evidence of harm resulting from GM crops, as in laboratory animal-feeding trials. [2]

The GM industry and disinformation

In spite of the huge body of evidence to the contrary that continues to accumulate, the GM industry wishes the public and politicians to believe that GM crops are friendly to the environment, safe for humans and animals to consume and, indeed, an essential tool of modern agriculture.

A number of industry-backed ‘front’ organisations have been created that employ ‘stealth PR techniques’ to promote the interests of clients, and Sense About Science is one of these. [3]

GM pesticide use and yields

GM crops are designed for intensive, chemical agriculture, which has already led to the degradation of our soils and the micro-organisms that promote soil fertility. Food crops now contain only a fraction of the mineral content they had a mere 50 years ago.

The amount of pesticides applied to GM crops is increasing year-on-year. [4] The one class of GM crops that actually has decreased amounts of applied insecticide, the Bt crops, is engineered to produce its own insecticide within the plants’ cells – and these toxins cannot be washed off but must be eaten.

The total amount of insecticide occurring in and on Bt crops is estimated to be greater than if an applied insecticide had been used. [5] As weeds and insects become resistant to the controlling chemicals, additional and more highly toxic pesticides are being added.

Yields of GM crops are similar to, or lower than, yields of conventional crops [6]; there is no gene for higher yield in any GM crop.

If a bigger harvest occurs, it reflects decreased losses to weeds or insects rather than an intrinsically higher yield. Some newer GM crops that are claimed to give intrinsically higher yields or other desirable traits were actually developed by traditional breeding to achieve those traits, and were subsequently made GM (and patentable) by introducing the usual herbicide-tolerance or insect-resistance genes. [7]

Safety of GM crops for environment and health

In the United States, where GM crops are most widely grown, an increasing number of weeds have become resistant to Roundup, the glyphosate-based herbicide that is used on most herbicide-tolerant GM crops.

On some farms, old-fashioned weeding by hand has become necessary, and there are even cases where farms have had to be abandoned because the weed problem became intractable. [8] Some varieties of weeds are now resistant to several herbicides.

GM seed developers perform their own animal-feeding nutritional and safety studies. Usually these are commercially confidential and unavailable for public scrutiny. Those that are published are short-term and dismiss as not ‘biologically meaningful’ the statistically significant differences they acknowledge to occur between the animals fed a GM variety and the control animals. [9]

Independent scientists almost always find harm to health (immune system, kidneys, liver, reproductive fertility, etc.) when animals are given GM feed. [10] Many farmers have also reported health problems from GM feed, and these disappear when a non-GM diet is re-introduced. [11]

In March of this year, the widely used herbicide glyphosate was declared to be ‘probably carcinogenic’ by the experts in the International Agency for Research on Cancer (IARC), which operates under the auspices of the World Health Organisation. [12] Only peer-reviewed publications were considered, not unpublished company-sponsored studies.

Monsanto Company, which owns the huge majority of patents on GM crops dependent on glyphosate, has described this report as “junk science” and is now trying to discredit it and to have it retracted. [13]

Genetics remains an imperfectly understood science, and genetic engineering often induces unexpected and even harmful changes that may not be recognised during testing.

More information

An excellent online publication that provides summaries and examples of all aspects of GM processes and products can be found online. Two of the three authors are geneticists who either are, or have been, engaged in genetic engineering.

From our perspective as independent scientists, a Scottish ban on GMOs is entirely justified on scientific grounds alone. We hope that you and your advisors will not be beguiled by the unjustified claims and promises of an increasingly desperate GM industry.

Scotland must maintain its high-quality crops and foods, without contamination by genetically engineered varieties. Instead of pursuing a route that is already causing serious environmental and health problems, Scotland should become a leader in the science of agroecology, which has already proved its efficacy and sustainability. [14]

Together with non-invasive modern innovations like Marker Assisted Selection to speed traditional breeding, agroecology has promise to replace our present conventional and GM agriculture and to produce healthy soils leading to healthy plants, animals and food.

Yours sincerely,

The undersigned

 


 

Scientists

Prof. Carlo Leifert (PhD, Dipl. Ing. agr.), School of Agriculture, Food and Rural Development at Newcastle University; Director, Stockbridge Technology Centre

Dr Michael Antoniou, King’s College London, Head of Gene Expression and Therapy Group

Prof Susan Bardocz, PhD, formerly at University of Debrecen, Hungary; formerly at the Rowett Research Institute, Aberdeen

Dr. E. Ann Clark, Associate Professor (retired), Plant Agriculture, University of Guelph, Ontario, Canada

Prof. Joe Cummins Emeritus Professor of Genetics Western University, London Ontario, Canada Fellow of Science In Society, London, UK; Many reports and articles opposing Genetic Engineering beginning around 1980

Dr S. W. B. Ewen, M.B.Ch.B., Ph.D.,F.R.C.Path, retired, Histopathologist at Aberdeen Royal Infirmary

Dr John Fagan, Director, Earth Open Source

Dr Angelika Hilbeck, Chair, European Network of Scientists for Social and Environmental Responsibility (ENSSER), Germany

Dr Mae-Wan Ho, Institute of Science in Society, Roster of Experts on Cartagena Protocol for Biosafety

Dr David Hookes, PhD in molecular biology, Honorary Senior Research Fellow in Department of Computer Science, Liverpool University

Prof Malcolm Hooper, Professor Emeritus of Medicinal Chemistry at the University of Sunderland

Dr. Don M. Huber, Professor Emeritus Purdue University; 55 years research, microbial ecologist/plant pathologist

Dr Jonathan Latham, Executive Director, The Bioscience Resource Project

Dr Ulrich Loening, Reader in Zoology and Director of the Centre for Human Ecology (retired), University of Edinburgh

Dr Alexander Kenneth Lough, PhD, DSc, FRSC, FRSE, Formerly at the Rowett Research Institute.

Dr Eva Novotny, University of Cambridge (retired); formerly Co-ordinator for GM Issues at Scientists for Global Responsibility

Dr Arpad Pusztai, FRSE, formerly at the Rowett Research Institute, Aberdeen

Dr Fakhar Qureshi (PhD), formerly at the BBSRC, Institute for Animal Health, Compton, Newbury

Prof Peter Saunders, Co-director, Institute of Science in Society; Emeritus Professor of Mathematics, King’s College London.

Dr Eva Sirinathsinghji, Institute of Science in Society, London, UK

Dr William W.M. Steiner, Ph.D. (Genetics). former Dean of the College of Agriculture, Forestry and Natural Resource Management, University of Hawaii, Hilo; former Director of the USGS Pacific Island Ecosystems Research Center, Honolulu, Hawaii; former researcher at USDA Biological Control of Insects Research Laboratory, Columbia, Missouri

Professor Terje Traavik, PhD, Professor of Virology and Professor Emeritus of Gene Ecology, UiT The Arctic University of Norway; Founder and Former Scientific Director of the National Competence Institute GenÖk- Centre of Biosafety; Former member of the governmental Norwegian Biotechnology Advisory Board.

Dr Hector Valenzuela, Full Professor and Vegetable Crops Extension Specialist, College of Tropical Agriculture and Human Resources, University of Hawaii at Manoa

Prof Brian Wynne, Professor Emeritus of Science Studies at Lancaster University; Former member of science committees of the Royal Society and European Environment Agency; Former special advisor to the House of Lords Select Committee on Science and Technology

Non-scientists with relevant qualifications

Dr Myrto Ashe MD, M.P.H., Functional Medicine, San Rafael, California.

Prof Philip L. Bereano, Professor Emeritus of Technology and Public Policy, University of Washington; Negotiator for the Cartagena Biosafety Protocol, the Supplemental Protocol on Liability and Redress; and Codex Alimentarius Guidelines on Food from Modern Biotechnology.

Helen Browning OBE; Director, Soil Association, UK; Former member of the UK Government’s Agriculture and Environment Biotechnology Commission.

Dr Brian Higginson MB, BS, LRCP, MRCS, DO, DRCOG, DAB, General Practitioner (retired) in UK National Health Service.

Dr Rosemary Mason, MB, ChB, FRCA; Former Consultant Anaesthetist at West Glamorgan Health Authority, UK.

Dr Michelle Perro MD, Institute for Health and Healing, Greenbrae, California.

Note: Each signatory has signed as an individual, not on behalf of any organisation.

References

1. ‘Association of financial or professional conflict of interest to research outcomes on health risks or nutritional assessment studies of genetically modified products‘, Johan Diels, Mario Cunha, Célia Manaia, Bernardo Sabugosa-Madeira, Margarida Silva, Food Policy, Volume 36, Issue 2, April 2011, Pages 197-203.

2. (a) (i) Re: Prof. Gilles-Eric Séralini et al.: Letters to the Editor, Food and Chemical Toxicology, 2014. (The reference also contains supporting letters).
(ii) ‘Smelling a corporate rat‘, Jonathan Matthews, 12 December 2012, published by Spin Watch on the web site Scribd.
(b) Re: Dr Ignacio Chapela: (i) Don’t Worry, It’s Safe to Eat, Andrew Rowell, 2003, Earthscan Publications Limited, London and Stirling, Virginia, pp. 152-153.
(ii) ‘Has GM corn “invaded” Mexico?‘, Charles C. Mann, March 2002, Science, Volune 295, Number 5560, Pages 1617-1619.
(c) Re: Dr Arpad Pusztai: (i) ‘The sinister sacking of the world’s leading GM expert and the trail that leads to Tony Blair and the White House‘, Andrew Rowell, 7 July 2003, The Daily Mail [UK], reproduced by GM Watch.
(ii) Seeds of Deception,. Jeffrey Smith, 2003, Yes! Books, Fairfield, Iowa, Pages 15-23.
(iii) Don’t Worry, It’s Safe to Eat, Andrew Rowell, 2003, Earthscan Publications Limited, London and Stirling, Virginia; Page 78 ff.

3. ‘Spinning Food‘, Kari Hamerschlag, Anna Lappé and Stacy Malkan, Friends of the Earth Report, June 2015.

4. (a) ‘Impacts of Genetically Engineered Crops on Pesticide Use in the United States: The First Thirteen Years‘, Cherles Benbrook, November 2009, The Organic Center, Critical Issue Report.
(b) ‘Impacts of generically engineered crops on pesticide use in the U.S. – the first sixteen years‘, Charles M. Benbrook, 2012, Environmental Sciences Europe, Volume 24, Page 24-36.

5. ‘Impacts of generically engineered crops on pesticide use in the U.S. – the first sixteen years‘, Charles M. Benbrook, 2012, Environmental Sciences Europe, Volume 24, Page 24-36.

6. ‘Failure to Yield‘, Doug Gurian-Sherman, April 2009, Union of Concerned Scientists.

7. E.g., ‘Syngenta’s Agrisure Artesian drought-tolerant maize‘.

8. ‘Impacts of Genetically Engineered Crops on Pesticide Use in the United States: The First Thirteen Years‘, Cherles Benbrook, November 2009, The Organic Center, Critical Issue Report.

9. ‘Results of a 13 week safety assurance study with rats fed grain from glyphosate tolerant corn‘, B. Hammond, R. Dudek, J. Lemen and M. Nemeth, 2004, Food and Chemical Toxicology, Volume 42, Pages 1003-1014.

10. ‘Ban GMOs Now‘, Mae-Wan Ho and Eva Sirinathsinghji, May 2013, Institute of Science in Society.

11. ‘Deformities, sickness & livestock deaths: the real cost of glyphosate & GM animal feed?‘, Andrew Wasley, 28 November 2013, The Ecologist.

12. ‘Glyphosate‘, IARC Monograph 112, July 2015:

13. (a) ‘IARC’s Report on Glyphosate‘, Monsanto Company, July 2015:
(b) ‘Monsanto says panel to review WHO finding on cancer link to herbicide‘, Carey Gillam, July 2015, Reuters, Yahoo News.

14. (a) ‘Food Futures Now‘, Mae-wan.Ho, Sam Burcher, Lim Li Ching and others, Institute of Science in Society and Third World Network Report, March 2008:
(b) ‘The Farming Systems Trial, Celebrating 30 Years‘, Rodale Institute, 2012:

Note: This list is representative, not comprehensive.

 

Emissions cuts pledges too weak to achieve 2C ‘safety limit’

With less than three months to go before the start of the UN climate change conference in Paris, the world is a long way short of promising cuts in greenhouse gas emissions big enough to deliver a good chance of climate safety.

The UN Framework Convention on Climate Change (UNFCCC) has asked world governments to submit plans – known as Intended Nationally Determined Contributions (INDCs) – detailing the emissions cuts they will agree to make.

By 1 September, 29 governments had submitted their INDCs to the UN, among them the EU, whose single submission covers all its member states. These INDCs collectively cover 56 countries, 43% of global population and 65% of global greenhouse gas emissions.

But a new study reveals that the climate targets so far submitted will lead to global emissions far higher than the levels needed to hold warming to below 2C – the internationally agreed safety limit.

Four research institutes – Climate Analytics, Ecofys, NewClimate Institute and the Potsdam Institute for Climate Impact Research (PIK) – have joined up to form Climate Action Tracker (CAT). They have just released their analysis at climate talks under way in Bonn, Germany, at the start of the penultimate week of negotiations before Paris, and it makes for dismal reading.

Emissions set to rise far above the 2C pathway

With the INDCs submitted to date, the CAT report projects that total global emissions are on track to reach 53-57 GtCO2e (gigatonnes of carbon dioxide equivalent) in 2025 and 55-59 GtCO2e in 2030, levels it describes as “far above the least-cost global pathways consistent with limiting warming below 2°C.”

CAT has assessed 15 of the INDCs, covering 64.5% of global emissions. Of these it judges seven ‘inadequate’, six ‘medium’ and only two ‘sufficient’ for reaching the goal of limiting the rise in average global temperatures to within 2C of pre-industrial levels, in order to avert serious climate change.

The CAT analysis shows that to hold global warming below 2C, governments need to significantly strengthen their INDCs and collectively reduce global emissions: “Additional reductions in the order of 12-15 GtCO2e by 2025 and of 17-21 GtCO2e by 2030 are needed for global emissions to be consistent with a 2°C pathway.”
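Putting those two sets of figures together is simple back-of-envelope arithmetic (the subtraction below is illustrative, not taken from the CAT report itself): subtracting the reductions CAT says are still needed from the projected INDC totals gives the implied 2C-consistent emission levels.

```python
# Hypothetical helper for the arithmetic implied by the figures quoted above.
def implied_2c_range(projected, needed_cut):
    """Subtract a (low, high) reduction range from a (low, high) projection.

    Smallest result: low projection minus largest cut;
    largest result: high projection minus smallest cut.
    """
    proj_lo, proj_hi = projected
    cut_lo, cut_hi = needed_cut
    return (proj_lo - cut_hi, proj_hi - cut_lo)

# Figures quoted above, in GtCO2e.
print(implied_2c_range((53, 57), (12, 15)))  # 2025 -> (38, 45)
print(implied_2c_range((55, 59), (17, 21)))  # 2030 -> (34, 42)
```

So the pledges on the table leave global emissions well above the roughly 34-45 GtCO2e range that the quoted reductions imply for a least-cost 2C pathway.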

If the current 2030 INDCs are locked in, CAT says that holding warming below 2C would become almost impracticable, as CO2 emission reduction rates would need to exceed 5% a year after 2030, and would make holding warming below 1.5C virtually impossible. Many climate scientists say the 2C safety limit is too high, and argue for a 1.5C maximum instead.
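A rough sketch shows why sustained cuts above 5% a year are so demanding: annual percentage reductions compound geometrically, so even "only" 5% a year shrinks emissions by nearly two-thirds over two decades. The figures below are illustrative, not drawn from the CAT analysis.

```python
# Illustrative compounding: fraction of emissions remaining after
# cutting `annual_cut` (as a fraction) every year for `years` years.
def remaining_fraction(annual_cut, years):
    return (1 - annual_cut) ** years

# Cutting 5% a year from 2030 to 2050:
frac = remaining_fraction(0.05, 20)
print(f"{frac:.2f}")  # ~0.36, i.e. roughly a 64% cut in two decades
```

That is the scale of effort that locking in weak 2030 targets would force onto the following decades.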

Bill Hare, a physicist who is co-founder and CEO of Climate Analytics, says: “It is clear that if the Paris meeting locks in present climate commitments for 2030, holding warming below 2C could essentially become infeasible, and 1.5C beyond reach. Given the present level of pledged climate action, commitments should only be made until 2025. The INDCs therefore need to be considerably strengthened for the period 2020-2025.”

Only two countries are on target: Ethiopia and Morocco

The seven countries whose INDCs are described as inadequate by CAT are Australia, Canada, Japan, New Zealand, Singapore, South Korea and Russia. It says their proposals are not considered to be a fair contribution to limiting warming to 2C – from almost any perspective.

China, the EU, Mexico, Norway, Switzerland and the US are judged ‘medium’, which the CAT says means they are “within the upper and least ambitious end of what could be considered as fair, and if all countries put forward a similar level of ambition warming would exceed 2°C”.

“One would have expected all the new government climate targets combined to put the world on a lower emissions pathway, but they haven’t”, says Louise Jeffery, a PIK researcher on climate impacts and vulnerabilities. “One contributing factor is the fact that Russia, Canada, and New Zealand’s INDCs are inconsistent with their stated long-term (2050) goals.”

The INDCs of two African countries, Ethiopia and Morocco, are the only ones assessed by the CAT as being in line with the ambition to limit temperature rise to 2C.

Countries need to step up their targets – and their policies!

In most cases, CAT also found that the policies governments have in place now would not reduce emissions enough even to match their INDCs for 2025. The exceptions are China and the EU, which would need only minimal extra policies to meet their targets, and could even exceed them.

The report explains: “Some countries propose INDCs close to the current trajectory giving confidence that they are met (e.g. EU and China). Others have put forward a target that would be a significant change in trend, but these are not yet supported by any significant existing legislation, e.g. Australia and Canada, raising questions about the likely implementation.

“Yet others are showing progress in policy implementation, continuously moving their future trajectories downwards, but policies are not yet sufficient to meet their (still inadequate) INDCs (e.g. USA). The gap between pledges and policies increases through time, highlighting the need for long-term policy action.”

Niklas Höhne, a founding partner of NewClimate Institute, says: “Most governments that have already submitted an INDC need to review their targets in light of the global goal and, in most cases, will need to strengthen them.”

INDCs are yet to come from 140 countries. The ten highest emitters yet to submit INDCs are India, Brazil, Iran, Indonesia, Saudi Arabia, South Africa, Thailand, Turkey, Ukraine, and Pakistan.


The report: How close are INDCs to 2 and 1.5C pathways?

Alex Kirby writes for Climate News Network.

 

We scientists welcome Scotland’s GM-free status

Dear Mr Lochhead,

Firstly, as members of the science community, we wish to congratulate you on your recent announcement of a GM-free status for Scotland.

You have received a letter from Sense About Science urging you to abandon Scotland’s ban on genetically modified (GM) crops and asking for a meeting to persuade you in that direction. We, the undersigned, urge you to maintain Scotland’s admirable integrity on this issue and not to fall victim to the unfounded claims made for GM crops.

Each of us has been involved with the process of genetic modification and/or with the debate around it. Over many years, the following facts have become clear to us.

Who supports GM crops?

It is a documented fact [1] that professionals with a career or financial interest in a controversial product are much more likely to endorse it than are those who have no such interest. This is very much the case with GM.

In this instance, however, the stakes for a multi-billion-dollar industry, and for the organisations and scientists it funds, are so high that the response extends far beyond mere endorsement.

‘Unfavourable’ research results in company-sponsored studies are routinely suppressed; and we see, repeatedly, vitriolic attacks and defamation of independent scientists who report evidence of harm resulting from GM crops, as in laboratory animal-feeding trials. [2]

The GM industry and disinformation

In spite of the huge body of evidence to the contrary that continues to accumulate, the GM industry wishes the public and politicians to believe that GM crops are friendly to the environment, safe for humans and animals to consume – and, indeed, an essential tool of modern agriculture.

A number of industry-backed ‘front’ organisations have been created that employ ‘stealth PR techniques’ to promote the interests of clients, and Sense About Science is one of these. [3]

GM pesticide use and yields

GM crops are designed for intensive, chemical agriculture, which has already led to the degradation of our soils and the micro-organisms that promote soil fertility. Food crops now contain only a fraction of the mineral content they had a mere 50 years ago.

The amount of pesticide applied to GM crops is increasing year on year. [4] A class of GM crops that actually has decreased amounts of applied insecticide (the Bt crops) is engineered to produce its own insecticide within the plant’s cells – and these toxins cannot be washed off but must be eaten.

The total amount of insecticide occurring in and on Bt crops is estimated to be greater than if an applied insecticide had been used. [5] As weeds and insects become resistant to the controlling chemicals, additional and more highly toxic pesticides are being added.

Yields of GM crops are similar to, or lower than, yields of conventional crops [6]; there is no gene for higher yield in any GM crop.

If a bigger harvest occurs, it reflects a decreased loss to weeds or insects rather than an intrinsically higher yield. Some newer GM crops claimed to give intrinsically higher yields or other desirable traits were actually developed by traditional breeding to achieve the trait, and were subsequently made GM (and patentable) by introducing the usual herbicide-tolerance or insect-resistance genes. [7]

Safety of GM crops for environment and health

In the United States, where GM crops are most widely grown, an increasing number of weeds have become resistant to Roundup, the glyphosate-based herbicide used on most herbicide-tolerant GM crops.

On some farms, old-fashioned weeding by hand has become necessary, and there are even cases where farms have had to be abandoned because the weed problem became intractable. [8] Some varieties of weeds are now resistant to several herbicides.

GM seed developers perform their own animal-feeding nutritional and safety studies. Usually these are commercially confidential and unavailable for public scrutiny. Those that are published are short-term and dismiss as not ‘biologically meaningful’ the statistically significant differences they acknowledge to occur between the animals fed a GM variety and the control animals. [9]

Independent scientists almost always find harm to health (immune system, kidneys, liver, reproductive fertility, etc.) when animals are given GM feed. [10] Many farmers have also reported health problems from GM feed, and these disappear when a non-GM diet is re-introduced. [11]

In March of this year, the widely used herbicide glyphosate was declared ‘probably carcinogenic’ by experts at the International Agency for Research on Cancer (IARC), which operates under the auspices of the World Health Organisation. [12] Only peer-reviewed publications were considered, not unpublished company-sponsored studies.

Monsanto Company, which holds the large majority of patents on glyphosate-dependent GM crops, has described this report as “junk science” and is now trying to discredit it and have it retracted. [13]

Genetics remains an imperfectly understood science, and genetic engineering often induces unexpected and even harmful changes that may not be recognised during testing.

More information

An excellent publication providing summaries and examples of all aspects of GM processes and products is available online. Two of its three authors are geneticists who are, or have been, engaged in genetic engineering.

From our perspective as independent scientists, a Scottish ban on GMOs is entirely justified on scientific grounds alone. We hope that you and your advisors will not be beguiled by the unjustified claims and promises of an increasingly desperate GM industry.

Scotland must maintain its high-quality crops and foods, without contamination by genetically engineered varieties. Instead of pursuing a route that is already causing serious environmental and health problems, Scotland should become a leader in the science of agroecology, which has already proved its efficacy and sustainability. [14]

Together with non-invasive modern innovations like Marker Assisted Selection to speed traditional breeding, agroecology promises to replace our present conventional and GM agriculture and to produce healthy soils, leading to healthy plants, animals and food.

Yours sincerely,

The undersigned


Scientists

Prof. Carlo Leifert (PhD, Dipl. Ing. agr.), School of Agriculture, Food and Rural Development at Newcastle University; Director, Stockbridge Technology Centre

Dr Michael Antoniou, King’s College London, Head of Gene Expression and Therapy Group

Prof Susan Bardocz, PhD, formerly at University of Debrecen, Hungary; formerly at the Rowett Research Institute, Aberdeen

Dr. E. Ann Clark, Associate Professor (retired), Plant Agriculture, University of Guelph, Ontario, Canada

Prof. Joe Cummins, Emeritus Professor of Genetics, Western University, London, Ontario, Canada; Fellow of Science in Society, London, UK; author of many reports and articles opposing genetic engineering since around 1980

Dr S. W. B. Ewen, M.B.Ch.B., Ph.D., F.R.C.Path., Histopathologist (retired) at Aberdeen Royal Infirmary

Dr John Fagan, Director, Earth Open Source

Dr Angelika Hilbeck, Chair, European Network of Scientists for Social and Environmental Responsibility (ENSSER), Germany

Dr Mae-Wan Ho, Institute of Science in Society, Roster of Experts on Cartagena Protocol for Biosafety

Dr David Hookes, PhD in molecular biology, Honorary Senior Research Fellow in Department of Computer Science, Liverpool University

Prof Malcolm Hooper, Professor Emeritus of Medicinal Chemistry at the University of Sunderland

Dr. Don M. Huber, Professor Emeritus, Purdue University; 55 years’ research as a microbial ecologist/plant pathologist

Dr Jonathan Latham, Executive Director, The Bioscience Resource Project

Dr Ulrich Loening, Reader in Zoology and Director of the Centre for Human Ecology (retired), University of Edinburgh

Dr Alexander Kenneth Lough, PhD, DSc, FRSC, FRSE, Formerly at the Rowett Research Institute.

Dr Eva Novotny, University of Cambridge (retired); formerly Co-ordinator for GM Issues at Scientists for Global Responsibility

Dr Arpad Pusztai, FRSE, formerly at the Rowett Research Institute, Aberdeen

Dr Fakhar Qureshi (PhD), formerly at the BBSRC, Institute for Animal Health, Compton, Newbury

Prof Peter Saunders, Co-director, Institute of Science in Society; Emeritus Professor of Mathematics, King’s College London.

Dr Eva Sirinathsinghji, Institute of Science in Society, London, UK

Dr William W.M. Steiner, Ph.D. (Genetics); former Dean of the College of Agriculture, Forestry and Natural Resource Management, University of Hawaii, Hilo; former Director of the USGS Pacific Island Ecosystems Research Center, Honolulu, Hawaii; former researcher at USDA Biological Control of Insects Research Laboratory, Columbia, Missouri

Professor Terje Traavik, PhD, Professor of Virology and Professor Emeritus of Gene Ecology, UiT The Arctic University of Norway; Founder and former Scientific Director of the National Competence Institute GenØk – Centre for Biosafety; former member of the governmental Norwegian Biotechnology Advisory Board.

Dr Hector Valenzuela, Full Professor and Vegetable Crops Extension Specialist, College of Tropical Agriculture and Human Resources, University of Hawaii at Manoa

Prof Brian Wynne, Professor Emeritus of Science Studies at Lancaster University; Former member of science committees of the Royal Society and European Environment Agency; Former special advisor to the House of Lords Select Committee on Science and Technology

Non-scientists with relevant qualifications

Dr Myrto Ashe MD, M.P.H., Functional Medicine, San Rafael, California.

Prof Philip L. Bereano, Professor Emeritus of Technology and Public Policy, University of Washington; Negotiator for the Cartagena Biosafety Protocol, the Supplemental Protocol on Liability and Redress; and Codex Alimentarius Guidelines on Food from Modern Biotechnology.

Helen Browning OBE; Director, Soil Association, UK; Former member of the UK Government’s Agriculture and Environment Biotechnology Commission.

Dr Brian Higginson MB, BS, LRCP, MRCS, DO, DRCOG, DAB, General Practitioner (retired) in UK National Health Service.

Dr Rosemary Mason, MB, ChB, FRCA; Former Consultant Anaesthetist at West Glamorgan Health Authority, UK.

Dr Michelle Perro MD, Institute for Health and Healing, Greenbrae, California.

Note: Each signatory has signed as an individual, not on behalf of any organisation.

References

1. ‘Association of financial or professional conflict of interest to research outcomes on health risks or nutritional assessment studies of genetically modified products‘, Johan Diels, Mario Cunha, Célia Manaia, Bernardo Sabugosa-Madeira, Margarida Silva, Food Policy, Volume 36, Issue 2, April 2011, Pages 197-203.

2. (a) (i) Re: Prof. Gilles-Eric Séralini et al.: Letters to the Editor, Food and Chemical Toxicology, 2014. (The reference also contains supporting letters).
(ii) ‘Smelling a corporate rat‘, Jonathan Matthews, 12 December 2012, published by Spin Watch on the web site Scribd.
(b) Re: Dr Ignacio Chapela: (i) Don’t Worry, It’s Safe to Eat, Andrew Rowell, 2003, Earthscan Publications Limited, London and Stirling, Virginia, pp. 152-153.
(ii) ‘Has GM corn “invaded” Mexico?’, Charles C. Mann, March 2002, Science, Volume 295, Number 5560, Pages 1617-1619.
(c) Re: Dr Arpad Pusztai: (i) ‘The sinister sacking of the world’s leading GM expert and the trail that leads to Tony Blair and the White House‘, Andrew Rowell, 7 July 2003, The Daily Mail [UK], reproduced by GM Watch.
(ii) Seeds of Deception, Jeffrey Smith, 2003, Yes! Books, Fairfield, Iowa, Pages 15-23.
(iii) Don’t Worry, It’s Safe to Eat, Andrew Rowell, 2003, Earthscan Publications Limited, London and Stirling, Virginia; Page 78 ff.

3. ‘Spinning Food‘, Kari Hamerschlag, Anna Lappé and Stacy Malkan, Friends of the Earth Report, June 2015.

4. (a) ‘Impacts of Genetically Engineered Crops on Pesticide Use in the United States: The First Thirteen Years’, Charles Benbrook, November 2009, The Organic Center, Critical Issue Report.
(b) ‘Impacts of genetically engineered crops on pesticide use in the U.S. – the first sixteen years’, Charles M. Benbrook, 2012, Environmental Sciences Europe, Volume 24, Pages 24-36.

5. ‘Impacts of genetically engineered crops on pesticide use in the U.S. – the first sixteen years’, Charles M. Benbrook, 2012, Environmental Sciences Europe, Volume 24, Pages 24-36.

6. ‘Failure to Yield‘, Doug Gurian-Sherman, April 2009, Union of Concerned Scientists.

7. E.g., ‘Syngenta’s Agrisure Artesian drought-tolerant maize‘.

8. ‘Impacts of Genetically Engineered Crops on Pesticide Use in the United States: The First Thirteen Years’, Charles Benbrook, November 2009, The Organic Center, Critical Issue Report.

9. ‘Results of a 13 week safety assurance study with rats fed grain from glyphosate tolerant corn‘, B. Hammond, R. Dudek, J. Lemen and M. Nemeth, 2004, Food and Chemical Toxicology, Volume 42, Pages 1003-1014.

10. ‘Ban GMOs Now‘, Mae-Wan Ho and Eva Sirinathsinghji, May 2013, Institute of Science in Society.

11. ‘Deformities, sickness & livestock deaths: the real cost of glyphosate & GM animal feed?‘, Andrew Wasley, 28 November 2013, The Ecologist.

12. ‘Glyphosate’, IARC Monograph 112, July 2015.

13. (a) ‘IARC’s Report on Glyphosate’, Monsanto Company, July 2015.
(b) ‘Monsanto says panel to review WHO finding on cancer link to herbicide‘, Carey Gillam, July 2015, Reuters, Yahoo News.

14. (a) ‘Food Futures Now’, Mae-Wan Ho, Sam Burcher, Lim Li Ching and others, Institute of Science in Society and Third World Network Report, March 2008.
(b) ‘The Farming Systems Trial, Celebrating 30 Years’, Rodale Institute, 2012.

Note: This list is representative, not comprehensive.