Monthly Archives: September 2015

Bonn climate talks end with no draft text for Paris

The latest round of climate talks in the German city of Bonn has ended with a failure to deliver common ground for the negotiations at the UN climate summit in Paris at the end of this year.

The Paris talks, involving all UN member states, are meant to deliver an agreement on which to base a new world climate treaty to replace the expired Kyoto Protocol. But experts now fear that there is not enough time left to see any major breakthrough.

Jan Kowalzig, climate change policy adviser at Oxfam, described last week’s negotiations in Bonn as “unbearably tardy”. He said:

“If the negotiators keep up that slow pace, the ministers at the UN summit will get an unfinished paper that they will have to resolve with no time for reflection. The outcome will then most likely be an extremely weak new treaty that will not save the world from climate change.”

A bunch of ideas and lots of unresolved obstacles

In Bonn, the Ad Hoc Working Group on the Durban Platform for Enhanced Action (ADP) – the body set up by the UN Climate Change Convention to devise a successor to the Kyoto Protocol – was asked to produce a paper for the Paris summit.

But after a week of negotiations, they ended up with just a bunch of ideas and lots of unresolved obstacles.

“We cannot go on working on that basis”, says Sarah Blau, who led the EU’s delegation in Bonn. “We would love to start working on a new treaty, but all options have to be on the table. We have not reached that stage yet.”

Two major hurdles remain as the Paris deadline nears: climate finance, and emissions cuts. Back in 2010, the world agreed on building up a Green Climate Fund to help developing nations to tackle the impacts of climate change.

The developed nations promised to provide the fund with US$100 billion by 2020. But so far, there’s only around US$10 billion in the pot. So who will contribute how much? And by when? The diplomats in Bonn were unable to say.

On the emissions cuts, it is becoming increasingly obvious that the existing pledges are far from enough to keep the world below the 2C level – the internationally-agreed ‘safety limit’ to try to prevent runaway climate change.

So the developing nations are demanding regular updates and adjustments to the agreed emissions cuts every five years, to check whether the world is still on the right track. The EU disapproves of this, saying updates every 10 years are sufficient.

As the EU wants to achieve its planned 40% CO2 reduction by 2030, it would not take its next step until 2030. “We feel confident that our 40% CO2 target by 2030 is one of the most ambitious goals, and we do not see any need for more regular adjustments”, Blau says.

Long-term target

Greenpeace says the EU’s 10-year strategy could render the 2C limit meaningless. According to Martin Kaiser, head of the Greenpeace climate policy unit,

“It would be a catastrophe if the new treaty froze the existing reduction targets and pledges. We do need more regular adjustments that respect the latest climate science outcomes and the development of renewable energies.”

The only progress in Bonn was the wider acceptance among UN member states of the need to write a long-term target into a new global climate treaty.

But it remains unresolved whether that should be a zero CO2 emissions target, a 100% renewable energy target, or just a repetition of the existing 2C limit – which many climate scientists think should in any case be reduced to 1.5C.

At the end of September, heads of state are due to meet in New York at the UN general assembly. In mid-October, there will be another preparatory meeting in Bonn, hoping finally to produce an agreed paper for Paris.

“We are definitely running out of time”, warns Christoph Bals, policy director at the NGO Germanwatch.

“What we truly need now are clear signals from the ministers and heads of state ahead of Paris. Otherwise, the next UN climate summit is most likely to fail.”


Henner Weithöner is a Berlin-based freelance journalist specialising in renewable energy and climate change. He originally wrote this article for Climate News Network. LinkedIn: de.linkedin.com/pub/henner-weithöner/48/5/151/; Twitter: @weithoener


Osborne’s nuclear fantasies – can you hear me Major Tom?

There is an increasing air of unreality about the gap between what is going on in the UK Government on energy policy and what is visible to the rest of the real world.

Nowhere is this clearer than in its continued bullishness over the proposed Hinkley Point nuclear power station.

On the one hand, earlier today the Financial Times joined environmental campaigners in saying that the project should be abandoned.

That’s become almost normal, given that the energy minister who last gave the go-ahead to new nuclear power in the UK (David Howell, now Lord Howell) has withdrawn his support.

But the reason the Financial Times offers for its view is interesting, as they argue that “the cost of alternative low carbon sources, such as solar, and better battery technology, is falling fast.”

Readers will be familiar with this line of argument as the green movement and academics have been saying this for some years. It feels a bit different when Europe’s premier financial newspaper says the same.

Other media outlets such as The Times are also critical of the Hinkley project, as Chatham House calculates subsidies at £40 billion over the lifetime of the project. Others put them as high as €108 billion (£78.6 billion).

Support for the nuclear industry even seems to be ebbing in its French heartland – a new law, adopted in July 2015, requires the share of nuclear power in France’s energy mix to fall from 75% today to 50% by 2025, with the gap being filled mainly by wind and solar.

Meanwhile in Parliament …

Away from all this, the UK Government set its course a decade ago. And the fact that the world has changed completely doesn’t seem to have any bearing on its thinking.

This was beautifully illustrated by the UK’s Chancellor George Osborne, who in stark contrast has claimed that the country’s first new nuclear power station is the cheapest form of low-carbon generation available – cheaper even than onshore wind.

He made the astonishing claim as he appeared in front of the House of Lords Economic Affairs Committee in their ‘annual evidence session’ on Tuesday to be grilled by Lord Turnbull, who was Permanent Secretary to the Treasury from 1998 to 2002.

“Shouldn’t we really go back to the drawing board, rather than plumping for what I think will be a kind of bottomless pit and a big white elephant?” Turnbull asked.

To which Osborne replied that the agreed ‘strike price’ of £92.50 per MWh (in 2013 money) is still “subject to final negotiation”, adding: “It is still substantially cheaper than other low-carbon technology like offshore wind or onshore wind.”

Video: George Osborne’s evidence to the House of Lords Economic Affairs Committee. Hinkley C Evidence begins around 16.34.

He also suggested that the UK taxpayer doesn’t really bear any of the risk should the reactor design used by the French state-owned builder, Areva, turn out to be a dud. “I’m not bearing the construction risk or the design risk”, he insisted.

Both are fairly clear claims, but – as the Financial Times analysis reveals for at least the first – they aren’t true.

The UK’s £17 billion guarantee for Hinkley construction finance

It’s important to be clear that we are not talking about the relative costs of nuclear and renewables in various economic models – but the actual costs of particular projects.

Hinkley has been christened by the FT the “biggest and most controversial infrastructure project in Europe”. It has also been named the most expensive object ever built – at least on planet Earth, since the International Space Station appears to be more costly.

It is therefore rather disturbing to see Osborne coming out with seemingly complete nonsense during parliamentary evidence. If the conception of the world inside the Treasury is so distant from reality, we all have good reason to be worried.

First, there was his suggestion that UK taxpayers “don’t bear the risk – it’s for EDF and its shareholders”, and later that the UK is “not bearing construction risk or design risk”.

It’s true that we are not bearing all of the risk – some does lie with EDF through its subsidiary New Nuclear Power (NNP). But the UK is substantially underwriting the project through the government’s Infrastructure Guarantee Scheme, which may be offered, for example, to Chinese state investors who don’t want to take a risk.

The National Audit Office said earlier this year that UK infrastructure guarantees are “up to £17 billion for Hinkley Point C nuclear power plant” out of a total construction cost reported by the EU Commission at £24.5 billion.

Under some circumstances if the Hinkley project is started but abandoned, or doesn’t work when completed, then UK citizens will be required to come up with that £17 billion, just over two thirds of the projected cost. That, after all, is the point of a guarantee.

Which is the cheapest of them all?

Secondly, the Chancellor said Hinkley was “substantially cheaper than any other low carbon technology”, going on to clarify that it was cheaper than both offshore and onshore wind.

He may be 50% right – about offshore wind. Its costs by the mid-2020s are hard to gauge and may well be higher. Still, bear in mind that EDF said last week that Hinkley C won’t be finished in 2023 as originally planned, and gave no new completion date.

So by the time Hinkley C begins putting power into the grid, if indeed it ever does, offshore wind could very well be cheaper. And it’s a racing certainty that it will become a lot cheaper over Hinkley C’s 35-year index-linked subsidy lifetime.

But Osborne really is quite wrong when it comes to onshore. IRENA, the International Renewable Energy Agency, has already done a UK-specific calculation on this (see Fig 2.10 p42) which finds the cost of onshore wind to be far lower (see chart, above right).

The chart shows onshore wind coming in at well under £80 per MWh even after all the costs of managing intermittency are included. The caption also notes that the integration costs are “estimated conservatively”, based on a 30-40% wind power penetration of national electricity supply. “For lower shares integration costs would be much less.”

But leaving models aside, any casual inspection of what the UK Government is actually doing would show his statement to be untrue. Onshore wind and ground-based solar projects have bid for and been awarded UK Government contracts at a price of around £80 per MWh, but the ‘strike price’ for Hinkley is £92.50 per MWh.

Which planet is the Treasury on?

As Osborne told the Lords Committee, the final price is “still subject to final negotiation” – but given the problems the EPR design for Hinkley has been encountering in construction in France and Finland, it seems very unlikely the costs will be going down.

Estimated costs at France’s Flamanville EPR have more than trebled from €3.3 billion to €10.5 billion, even as the construction time has doubled from six to 12 years. That’s so long as the metallurgically-flawed reactor vessel doesn’t need replacement (we should know that next month). If it does, all bets are off.

Essentially the Chancellor would appear to be either misleading a Parliamentary Committee, or has a very poor grasp of the facts and figures in relation to the single biggest contract his Government is likely to sign.

Given that the negotiation is in secret, it would appear that UK energy consumers will be on the hook for an £80 billion contract – and only afterwards will we find out what the Government has assumed before signing it.

On the basis of the Chancellor’s statements, it looks like their assumptions are delusional.


Dr Doug Parr is Scientific Director of Greenpeace UK.

This article is based on one originally published on Greenpeace EnergyDesk.


Is radiation good for you? The US Nuclear Regulatory Commission could decide it is

The US Nuclear Regulatory Commission is considering a move to eliminate the ‘Linear No-Threshold’ (LNT) basis of radiation protection that the US has used for decades and replace it with the ‘radiation hormesis’ theory – which holds that low doses of radioactivity are good for people.

The change is being pushed by “a group of pro-nuclear fanatics – there is really no other way to describe them”, charges the Nuclear Information and Resource Service (NIRS) based near Washington DC.

“If implemented, the hormesis model would result in needless death and misery”, says Michael Mariotte, NIRS president. The current US requirement that nuclear plant operators reduce exposures to the public to “as low as reasonably achievable” would be “tossed out the window.

“Emergency planning zones would be significantly reduced or abolished entirely. Instead of being forced to spend money to limit radiation releases, nuclear utilities could pocket greater profits. In addition, adoption of the radiation model by the NRC would throw the entire government’s radiation protection rules into disarray, since other agencies, like the EPA, also rely on the LNT model.”

“If anything”, says Mariotte, “the NRC radiation standards need to be strengthened.”

The NRC has set a deadline of 19th November for people to comment on the proposed change. If it agrees to the switch, “This would be the most significant and alarming change to US federal policy on nuclear radiation”, reports Nuclear-News.

“The Nuclear Regulatory Commission may decide that exposure to ionizing radiation is beneficial – from nuclear bombs, nuclear power plants, depleted uranium, x-rays and Fukushima”, notes Nuclear-News.

“No protective measures or public safety warnings would be considered necessary. Clean-up measures could be sharply reduced … In a sense, this would legalize what the government is already doing – failing to protect the public and promoting nuclear radiation.”

If only radiation wasn’t dangerous, nuclear power would be so easy …

In the wake of the Manhattan Project – the US crash program during World War II to build atomic bombs – and the spin-offs of that program, led by nuclear power plants, there was a belief, for a time, that there was a certain ‘threshold’ below which radioactivity wasn’t dangerous.

But as the years went by it became clear there was no threshold – that any amount of radiation could injure and kill, that there was no ‘safe’ dose. Low levels of radioactivity didn’t cause people to immediately sicken or die. But, it was found, after a ‘latency’ or ‘incubation’ period of several years, the exposure could then result in illness and death.

Thus, starting in the 1950s, the ‘Linear No-Threshold’ standard was adopted by the governments of the US and other countries and international agencies.

It holds that radioactivity causes health damage – in particular cancer – directly proportional to dose, and that there is no ‘threshold’. Moreover, because the effects of radiation are cumulative, the sum of several small exposures are considered to have the same effect as one larger exposure, something called ‘response linearity’.

The LNT standard has presented a major problem for those involved in developing nuclear technology, notably at the national nuclear laboratories established for the Manhattan Project – Los Alamos, Oak Ridge and Argonne – and those later set up as the Manhattan Project was turned into the US Atomic Energy Commission.

Dr. Alvin Weinberg, director of Oak Ridge National Laboratory, declared in New Scientist magazine in 1972: “If a cure for cancer is found the problem of radiation standards disappear.”

‘We need more, not less radiation’

Meanwhile, other nuclear proponents began pushing a theory they named ‘radiation hormesis’ that claimed that the LNT standard was incorrect and that a little amount of radioactivity was good for people.

A leader in the US advocating hormesis has been Dr. T. D. Luckey. A biochemistry professor at the University of Missouri-Columbia and visiting scientist at Argonne National Laboratory, he authored the books Hormesis with Ionizing Radiation and Radiation Hormesis, as well as numerous articles.

In one ‘Radiation Hormesis Overview‘, he contends: “We need more, not less, exposure to ionizing radiation. The evidence that ionizing radiation is an essential agent has been reviewed … There is proven benefit.” Radioactivity “activates the immune system”, he continues, adding:

“The trillions of dollars estimated for worldwide nuclear waste management can be reduced to billions to provide safe, low-dose irradiation to improve our health. The direction is obvious; the first step remains to be taken … Evidence of health benefits and longer average life-span following low-dose irradiation should replace fear.”

A 2011 story in the St. Louis Post Dispatch quoted Dr. Luckey as saying “if we get more radiation, we’d live a more healthful life.” It also noted that he kept on a shelf in his bedroom a rock “the size of a small bowling ball, dotted with flecks of uranium, spilling invisible rays.”

It reported that recently Dr. Luckey “noticed a small red splotch on his lower back. It looked like a mild sunburn, the first sign of too much radiation. So he pushed the rock back on the shelf, a few inches farther away, just to be safe.”

At Brookhaven National Laboratory (BNL), set up by the US Atomic Energy Commission in 1947 to develop civilian uses of nuclear technology and conduct research in atomic science, a highly active proponent of hormesis has been Dr. Ludwig E. Feinendegen. Holding posts as a professor in his native Germany and a BNL scientist, he authored numerous papers advocating hormesis.

In a 2005 article published in the British Journal of Radiology he wrote of “beneficial low level radiation effects” and asserted that the “LNT hypothesis for cancer risk is scientifically unfounded and appears to be invalid in favor of a threshold or hormesis.”

Three petitions submitted to the NRC

The three petitions to the NRC asking it to scuttle the LNT standard and replace it with the hormesis theory were submitted by Dr. Mohan Doss, on behalf of the organization Scientists for Accurate Radiation Information; Dr. Carol Marcus of the UCLA medical school; and Mark Miller, a health physicist at Sandia National Laboratories.

The Nuclear Information and Resource Service points out that the US Environmental Protection Agency (EPA) is fully supportive of LNT. The agency’s reasons for accepting LNT – and the history of the standard – were spelled out in 2009 by Dr. Jerome Puskin, chief of its Radiation Protection Division.

The EPA, Dr. Puskin states, “is responsible for protecting the public from environmental exposures to radiation. To meet this objective the agency sets regulatory limits on radionuclide concentrations in air, water, and soil.”

The agency bases its protective exposure limits on “scientific advisory bodies, including the US National Academy of Sciences, the International Commission on Radiological Protection, the United Nations Scientific Committee on the Effects of Ionizing Radiation, and the National Council on Radiation Protection and Measurements, with additional input from its own independent review.”

The LNT standard “has been repeatedly endorsed” by all of these bodies, he writes, and “It is difficult to imagine any relaxation in this approach unless there is convincing evidence that LNT greatly overestimates risk at the low doses of interest.” And “no such change can be expected” in view of the determination of the National Academies of Sciences’ BEIR VII committee. (BEIR stands for Biological Effects of Ionizing Radiation.)

BEIR VII found that “the balance of evidence from epidemiologic, animal and mechanistic studies tend to favor a simple proportionate relationship at low doses between radiation dose and cancer risk.”

As chair of the BEIR VII committee, Dr. Richard Monson, associate dean of the Harvard School of Public Health, said in 2005 on issuance of its report: “The scientific research base shows that there is no threshold of exposure below which low levels of ionizing radiation can be demonstrated to be harmless or beneficial.”

‘Tell NRC: A little radiation is BAD for you’

A European expert on radioactivity, Dr. Ian Fairlie, who as an official in the British government worked on radiation risks and has been a consultant on radiation matters to the European Parliament and other government entities, has presented detailed comments to the NRC on the petitions that it drop LNT and adopt the hormesis theory.

Dr. Fairlie says “the scientific evidence for the LNT is plentiful, powerful and persuasive.” He summarizes many studies done in Europe and the United States including BEIR VII. As to the petitions to the NRC, “my conclusion is that they do not merit serious consideration.” They “appear to be based on preconceptions or even ideology, rather than the scientific evidence which points in the opposite direction.”

An additional issue is that fetuses and children “are the most vulnerable” to radiation, and women “more vulnerable than men”, states an online petition opposing the change. It was put together by the organization Beyond Nuclear, also based near Washington DC.

Headed “Protect children from radiation exposure”, it advises: “Tell NRC: A little radiation is BAD for you. It can give you cancer and other diseases … NRC should NOT adopt a ‘little radiation is good for you’ model. Instead, they should fully protect the most vulnerable which they are failing to do now.”

How might the commissioners of the NRC decide the issue? Like the Atomic Energy Commission which it grew out of, the NRC is an unabashed booster of nuclear technology and long devoted to drastically downplaying the dangers of radioactivity.

A strong public stand – many negative comments – over their deciding that ‘radioactivity is good for you’ could make all the difference.


Karl Grossman is professor of journalism at the State University of New York / College at Old Westbury, the author of ‘Cover Up: What You Are Not Supposed to Know About Nuclear Power’, and host of the nationally-aired TV program ‘EnviroCloseup’.

Petition: ‘Protect children from radiation exposure!’ (Change.org)

Comment online: The NRC has set a deadline of 19th November for people to comment on the proposed change. The public can send comments to the US Government’s regulations website.

Comment by regular mail to: Secretary, US Nuclear Regulatory Commission, Washington, DC 20555-0001, Attention: Rulemakings and Adjudications Staff. The Docket ID to be noted on any letter is NRC-2015-0057.


Study: more testing essential to defeat bovine TB

A new study by scientists from Queen Mary University shows that the most effective way to eliminate TB from the UK’s cattle herds is to test them more often.

The main reason is that current TB tests fail to detect many cattle that are incubating the disease. Infected cattle that are not identified will then go on to infect other members of the herd.

“The main conclusion of the analysis conducted here is that more frequent testing is leading to lower TB infections in cattle both in terms of TB prevalence as well as TB incidence”, the paper concludes.

Lead author Dr Aristides Moustakas said: “It is clear that the Welsh policy of frequent testing up to every six months and the Scottish policy of risk-based surveillance are producing reductions in both the incidence and prevalence of TB in cattle.”

And his co-author, Professor Matthew Evans, concurred: “It is clear that testing cattle frequently is the most effective way of reducing Bovine TB. Farmers and policymakers should not ignore this evidence, which is based on the government’s data.”

The paper does not address the effectiveness of badger culling, as currently pursued in three English counties, as a means of controlling bovine TB. But the implication is clear – that it is at best a strategy of secondary importance.

Success in Scotland and Wales, failure in England

In the study, the scientists compared the success at tackling TB of England, Wales and Scotland, since “regional differences in TB detections may provide insights of different policies against eradicating the disease.”

Scotland has had a risk-based surveillance testing policy under which high-risk herds are tested frequently, and in September 2009 it was officially declared TB free.

Wales has had an annual (or more frequent) testing policy since January 2010, and in some Welsh counties cattle are tested every six months. Under this regime bovine TB in cattle has undergone a sharp decline: “Both the number of new herd incidents and number of herds not TB free are declining in Wales”, states the report.

But in England herds are still tested only every four years, except in some high TB prevalence areas where annual testing is applied. The sometimes long period between tests in England gives infected cattle that were missed abundant opportunity to infect other herd members.

As the authors point out, the “overall increase pattern of both TB incidence and prevalence in cattle in GB is thus driven by the English regions. Scotland and Wales both have a declining number of new herd incidents as well as herds not TB free and thus the current programme applied, all else being equal, appears to be leading to eradication or control of the disease …

“The main result derived here from statistical analysis of publicly available data from the British Government show that increased cattle testing leads to TB control or possibly eradication as exemplified by the results in Wales. This conclusion fully supports the outputs of a computational model suggesting that all eradication scenarios included cattle testing frequencies of annual or even more frequent testing.”

More TB infections detected in winter

The study also examines the fact that most ‘new herd’ infections are detected in winter, and that most ‘new cattle’ infections in non TB-free herds are detected in late winter.

One reason for the winter detection spike is that that’s when most TB tests are carried out – suggesting that infections are being missed at other times of year: “This implies that the more one tests, the more infected cattle is likely to detect and thus should test more often.”

Another explanation is that the TB spreads more easily from animal to animal in winter housing quarters when cattle are kept in close proximity.

And a further important point emerges: “new herd incidents as well as total tests on herds are lowest during summer months, when cattle are out in the field, the period that interactions with badgers are maximised. If badgers are the agent of cattle infection it is logical to test cattle during summer months.”

Indeed it appears rather extraordinary that, given many farmers’ insistence that badgers are the main route by which cattle are infected with TB, those same farmers are waiting until winter to test their cattle, giving additional time for infections to take root and spread to other animals.

The importance of public data

The study also emphasises the necessity of making all data on TB infections, badger culling and related matters public in order to facilitate independent scientific research:

“We would like to highlight the importance of public data. In order to predict and mitigate disease spread informed decisions are needed. These decisions need to be taken based on data analysis and predictive models calibrated with data. Our view is that making regional TB data available so that potential differences and underlying management decisions is a very good step forward.

“We argue that making publicly available the data regarding badger culling experiments as agents of TB infecting cattle will greatly facilitate their analysis and to informed decisions regarding TB control in GB.”

Policy makers must listen!

Professor Alastair Macmillan, veterinary advisor at Humane Society International, commented: “This new paper provides extremely strong evidence of what many experts in veterinary disease control have known for many years – that it is crucial to test cattle as frequently as possible in order to control bovine TB.

“The Queen Mary researchers have shown without doubt that killing badgers will have little effect, whilst employing the policies of Wales and Scotland, where badgers are not culled, will continue to have a dramatic impact on reducing TB in cattle.

“Frequent cattle testing is particularly important as the sensitivity of currently available diagnostic tests is not very high, meaning that cattle incubating TB are not detected and are allowed to remain in the herd to infect others over the following months. These cattle are by far the most common reason why cattle herds suffer repeated TB breakdowns, not badgers.

“The government must heed this evidence and stop wasting time and resources on killing badgers to no effect. All efforts must instead be focused on far more frequent cattle testing and strict cattle movement control. How much more research and scientific evidence does this government need before it listens to the rational facts?”

Oliver Tickell edits The Ecologist.

The paper: ‘Regional and temporal characteristics of Bovine Tuberculosis of cattle in Great Britain’ by Aristides Moustakas and Matthew R Evans is published in Stochastic Environmental Research and Risk Assessment.


Whitewashed – the short and miserable life of game birds

Defra, England’s rural affairs department, has just published its long-awaited report on cage-based breeding for pheasants and partridges reared for the shooting industry.

The biased and inadequate study, which cost the taxpayer £500,000, put commercial interests above animal welfare from the start, thus undermining its very purpose.

The title for this project states that its purpose was to: “provide scientific evidence on whether cage-based breeding for pheasants and partridges can fully meet birds’ needs, and if not to identify best practice for improving the breeding environment for gamebirds.” However, the project as described in the report failed to investigate this question.

Rather than compare the welfare of birds kept in cages to those in free-range systems, as is necessary to answer the question posed in the project title, Defra simply examined the impact of various industry-favoured “enrichments” on birds confined in cages within a very limited size range that would be “feasible for commercial implementation”.

Thus commercial interests were put above animal welfare from the start, undermining the very purpose of this half-million-pound project.

Free range? Not even considered

The failure to conduct a comparative study between the welfare of birds kept in free-range systems and those confined in cages is inadequately mitigated by a desk-based study investigating the ecology and behaviour of wild pheasants and partridges in their native home range.

These findings are then used to design assorted ‘enrichments’ intended to mimic natural resources or permit natural behaviours. The fundamental flaw with this approach is that birds and people have profoundly different sensory perception.

The human eye can never know whether a piece of Astroturf resembles grass to a bird with much superior vision, nor whether a small piece of doweling could ever fulfil the same behavioural needs as a tree branch. The ability of these artificial substitutes to “fully meet birds’ needs” can only be measured in a comparative study including birds with access to the real thing.

Added to this shortcoming is the fact that the ‘enrichments’ were not even chosen or designed to best meet the welfare needs of the birds, but rather to “maximise their subsequent ease of use by the industry.”

The clear and overwhelming industry bias in the design of this project is wholly unsurprising considering the clear and overwhelming industry bias of the stakeholder group. Only one animal welfare organisation was represented, compared to five game industry bodies.

We would be interested to hear from Defra as to how this skewed representation was considered appropriate for a project that claimed to be assessing the birds’ needs, and why other animal welfare organisations with an established interest in this issue – such as the League and Animal Aid – were excluded from the stakeholder group.

Despite industry bias, serious welfare problems still emerged

While the prioritising of industry interests and input virtually guaranteed this project would produce results that suited the industry, there are still many findings that are highly uncomfortable for the industry and which the final report attempts to downplay, including:

1. Birds in cages suffer substantial foot damage: 23% of caged partridges and 25% of caged pheasants endured painful foot problems such as lesions, swelling and bruising, yet the report considers the welfare impact to be “small” as the majority of birds did not experience these problems.

This conclusion conveniently ignores the fact that millions of birds are confined in breeding cages every year, meaning the number of individuals suffering from painful foot damage is likely to be upwards of half a million. This is not a small welfare impact.

2. Confinement causes aggressive behaviour. Feather damage caused by pecking, which the study identifies as the primary cause of early mortality, was recorded in 39% of caged partridges and 70% of caged pheasants.

The staggeringly high figure for pheasants does not take into account that all the birds in this study were bitted – had small plastic devices pushed through their nostrils to help lessen the impact of aggressive pecking – so the injuries caused in these stressful conditions would be even worse if the birds were not deliberately mutilated before caging.

3. Caged birds want to escape. Observers recorded multiple ‘jump escapes’ in both pheasants and partridges. As the name suggests, this behaviour is a futile attempt to fly away which is hindered by the mesh ceiling on the cages.

During the very limited observation periods, this behaviour was observed 28 times in pheasants confined in floor pens compared to 60 times in caged pheasants, and in partridges 42 times in enriched cages compared to 126 times in barren cages. This is a clear sign that these birds were stressed.
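The half-million figure in point 1 is simple arithmetic, sketched below. The annual caged-bird total is an assumed round number, since the report supplies only injury rates, not flock totals:

```python
# Illustrative arithmetic only: the report gives injury *rates*, not
# population totals, so the annual caged-bird figure is an assumption.
caged_pheasants_per_year = 2_000_000   # hypothetical industry-wide total
foot_damage_rate = 0.25                # 25% of caged pheasants in the study

affected = int(caged_pheasants_per_year * foot_damage_rate)
print(affected)  # 500000
```

At two million caged pheasants a year – a conservative reading of “millions of birds” – a 25% injury rate already means half a million suffering individuals.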

The data missing from the report also suggests a deliberate attempt to downplay the negative welfare impacts of caging. Neither mortality figures for any phase of the project nor the results of the autopsies – which the report claims were carried out on all birds found dead – are reported.

Additionally, no objective measures of stress, such as cortisol levels – a standard measure in animal welfare science – were employed; only subjective behavioural measures were used.

Even this highly subjective behavioural analysis is undermined by the omission of an ethogram providing definitions of the recorded behaviours – also standard practice in animal behaviour research.

For example, we would be interested to know why preening was considered an indicator of good welfare when many studies show it is often a coping mechanism animals employ in a stressful situation which they are unable to escape.

Taxpayers let down by industry bias

In short, we believe the taxpayer has been woefully let down by this project. It simply did not and could not examine whether cages meet the welfare needs of pheasants and partridges used for breeding. It was designed by the game bird industry to provide results which would justify the continued confinement of millions of birds.

Although the industry bodies claim that is what the results show, anyone reading beyond the report summary will be struck by the level of suffering observed in both barren and ‘enriched’ cages.

The League believes that this study confirms what the British Association for Shooting and Conservation said in December 2010 when urging MPs to sign an Early Day Motion calling for an outright ban on breeding cages:

“The available space in such cages is so limited, that the welfare of the birds is seriously compromised … the system does not conform, whether enriched or not, to the five freedoms which are at the basis of the UK’s welfare law.”

We will continue to expose the suffering of birds confined in these cages and campaign against their use.


The Defra report: ‘Study to determine whether cage-based breeding can meet the needs of game birds, and if not, to identify best practice’ – AW1303.

Dr Toni Shephard BSc MSc PhD is Head of Policy and Research at the League Against Cruel Sports. A lifelong animal welfare advocate, she combines this passion with her expertise in ecology and animal behaviour to ensure the League’s policies and campaigns are science-based and compelling. She has particular admiration for adaptable and successful – yet much maligned and persecuted – species such as foxes, magpies and rats. Her hope is that better education and understanding of these animals and the important ecological roles they play will lead to more tolerant and compassionate attitudes towards them.

More about the gamebird shooting industry and the way the birds suffer at the League Against Cruel Sports website.


GMOs and the puppetmasters of academia – what the New York Times left out

“Reading the emails make(s) me want to throw up” tweeted the Food Babe after reading a lengthy series of them posted online by the NY Times on 5th September.

The emails in question result from a Freedom of Information Act (FOIA) request and are posted in the side bars of a front-page article by Times reporter Eric Lipton (‘Food Industry Enlisted Academics in GMO Lobbying War, Emails Show’). See also this account on The Ecologist.

The article is highly disturbing, but, as the Food Babe implied, the Times buried the real story. The real scoop was not the perfidy and deceit of a handful of individual professors.

Buried in the emails is proof positive of active collusion between the agribusiness and chemical industries, numerous and often prominent academics, PR companies, and key administrators of land grant universities for the purpose of promoting GMOs and pesticides.

In particular, nowhere does the Times note that one of the chief colluders was none other than the President of the American Association for the Advancement of Science (AAAS).

All this is omitted entirely, or buried in hard-to-notice side bars, which are anyway unavailable to print readers. So, here is the article Eric Lipton should have written.

First, the Lipton story

The Lipton article seems, at first sight, to be impressive reporting. Lipton describes how Kevin Folta, Chair of the Dept. of Horticulture at the University of Florida, secretly took expenses and $25,000 of unrestricted money from Monsanto to promote GMO crops.

On behalf of the biotech industry, or via the PR firm Ketchum, Folta wrote on websites and attended public events, trainings, lobbying efforts and special missions.

Parts of this were already known, but Lipton digs up further damning evidence and quotes from Folta. They include an email to Monsanto that solidly contradicts Folta’s previous denials of a relationship with Monsanto and the biotech industry:

“I am grateful for this opportunity and promise a solid return on the investment”, Folta wrote after receiving the $25,000 check, thereby showing both a clear understanding of his role and the purpose of the money.

The article goes on to similarly expose Bruce Chassy (Prof Emeritus, University of Illinois) and David Shaw (Mississippi State University). It also discusses, presumably for ‘balance’, agronomist and GMO critic Charles Benbrook, then at Washington State University, who unlike the others openly acknowledged his funding.

What Lipton missed

But readers of the emails can find facts that are much more damaging to perceptions of academic independence than those contained in the main article.

For one thing, the money Folta received is insignificant beside the tens of millions his university was taking from Syngenta (>$10 million), Monsanto (>$1 million), Pioneer (>$10 million), and BASF (>$1 million) – money that it is hard to believe did not have a role in protecting Kevin Folta as he roamed zealously (and often offensively) over the internet, via his Twitter account, blog, podcast and OpEds, squelching dissent and ridiculing GMO critics wherever he went.

Also missing from the main Times article is a sense of the extensive and intricate networking of a small army of academics furthering the interests of Monsanto and other parts of the chemical, agribusiness and biotech industries.

Folta rarely acted alone. His networks are filled with economists, molecular biologists, plant pathologists, development specialists, and agronomists, many of them much more celebrated than Kevin Folta, but all of them in a knowing loop with industry and the PR firms.

Their job was acknowledged openly in emails: “We are all bad-ass shills for the truth. It’s a pleasure shilling with you.” Or, as Folta himself put it: “I’m glad to sign on to whatever you like, or write whatever you like.”

More generally, the group’s role was to initiate academic publications and other articles and to firefight legislative, media and scientific threats to the GMO and pesticide industries, all the while keeping their industry links hidden.

Naming the names

The academics identified by these emails as cooperating with industry and PR firms include:

Profs. Bruce Chassy (University of Illinois) and Alan McHughen (University of California, Riverside) who worked together to destroy the credibility of Russian scientist and GMO critic Irina Ermakova.

They persuaded the journal Nature Biotechnology to interview Ermakova about her research and describe it. This interview was followed by a detailed critique of her research (about which none of the authors were expert). Ermakova was neither told of the critique nor given a chance to answer it. This whole elaborate subterfuge required her to be sent a dummy proof of the article she thought she was publishing in the journal.

Prof. Calestous Juma (Harvard University), longtime advocate of GMOs for Africa.

Prof. Wayne Parrott (University of Georgia), a serial intervener in academic GMO debates.

Prof. Roger Beachy (Danforth Center, formerly USAID). Beachy is the principal living exponent of a classic biotech strategy: to respond rapidly to a report or publication critical of some aspect of the technology with a multi-author ‘rebuttal’.

Thus the inaugural report of the Bioscience Resource Project on the genome damage caused by genetic engineering (A. K. Wilson, J. R. Latham and R. A. Steinbrecher 2006) was met, even before formal publication, with both barrels from 23 professors, including Roger Beachy (Altpeter et al 2005).

Prof. Ron Herring (Cornell) who has helped to promote GMOs in India and fought to defuse the farmer suicide debate in India.

Prof. C S Prakash (Tuskegee University) is the convener of the influential listserv AgBioWorld. AgBioWorld was the all-important conduit for a petition signed by 3,000 scientists calling for the retraction of a 2001 scientific paper showing GMO contamination of Mexican corn (Quist and Chapela 2001).

As detailed in an article called ‘The Fake Persuaders‘, the scientists who initiated the petition, and made inaccurate and inflammatory statements about the authors, were not real people. However, their emails could be traced back to servers belonging to Monsanto or Bivings, a PR company that was working with Monsanto at the time.

Prof. Nina Fedoroff (Penn State) is the most prominent of all of the scientists looped into all of the Times emails. Nina Fedoroff was the 2011-2012 President of the American Association for the Advancement of Science. The AAAS is the foremost scientific body in the US.

During her Presidency, Fedoroff, who is also a contributor to the NY Times, used her position to coordinate and sign a letter on behalf of 60 prominent scientists. This letter was sent to the EPA as part of an effort to defeat a pesticide regulatory effort.

The real coordinator was Monsanto, but Fedoroff participated in phone conferences and email exchanges with them (including with the prominent lobbyist Stanley Abramson) and got credit in the emails for “moving the ball far down the field”. Yet Nina Fedoroff is not once named in the main article, and nowhere at all is her position noted.

So the story that academia’s most vocal GMO defenders, and some of its most prominent scientists, are copied into these emails is missing. The focus on individuals like Folta occludes a demonstration, for the first time ever, of long-suspected and intricate coordination and cooperation among them.

Also looped in to various of the emails are supposedly independent individuals and organisations who speak in favour of biotechnology, self-reportedly out of personal passion. These include Dr Steve Savage, Karl Haro von Mogel of Biofortified, Mischa Popoff (of the Heartland Institute) and Jon Entine (then affiliated with George Mason University and now head of the Genetic Literacy Project and a Forbes Magazine columnist). All are revealed, by the emails but not the article, as biotech insiders.

Other professors cc’d into emails include Peter Davies (Cornell), Carl Pray (Rutgers), Tony Shelton (Cornell), Peter Phillips (University of Saskatchewan), Prabhu Pingali (Cornell), Elizabeth Earle (Cornell), Peter Hobbs (Cornell), Janice Thies (Cornell), Ann Grodzins Gold (Syracuse) and Martina Newell-McGloughlin (UC Davis).

Cooperation among academics is not a crime. But these emails show, as in the EPA letter example, that a company (usually Monsanto, but also Dow and Syngenta) and a PR firm – often several of them, plus sometimes the biotech lobbyists BIO or CropLife America – were invariably looped into these emails, and further, that initiatives usually began with one of these non-academic entities and were shepherded by them.

Only rarely do the emails even suggest that the various academics were out in front, though that was always the impression the end result was intended to give.

Connivance of top university staff

But perhaps the biggest of all revelations within these emails is the connivance of senior university administrators, especially at Cornell University. The NY Times article focuses on the misdeeds of Mississippi State University Vice President David Shaw.

But looped into one email string, along with the PR firm Ketchum and Jon Entine, are various Cornell email addresses and names. These are ignored by Lipton, but the email addresses belong to very senior members of the Cornell administration. They include Ronnie Coffman (Director of Cornell’s College of Agriculture and Life Science) and Sarah Evanega Davidson (now director of the Gates-funded Cornell Alliance for Science).

The Alliance for Science is a PR project and international training center for academics and others who want to work with the biotech industry to promote GMOs. It is funded ($5.6 million) by the Gates Foundation.

Its upcoming program of speakers at Cornell for September includes Tamar Haspel (Washington Post reporter), Amy Harmon (New York Times reporter) and Prof. Dan Kahan (Yale Law School). These speakers are the exact ones mentioned in a proposal worked out between Kevin Folta and Monsanto in a series of email exchanges intended to enhance biotech outreach.

These email exchanges also propose setting up ‘Ask Me Anything’ events to be held at universities around the country with Kevin Folta as one of the panelists. On Sept 10th the Cornell Alliance for Science is hosting an event in downtown Ithaca (home town of Cornell). It is called ‘Ask Me Anything About GMOs’ and Kevin Folta is a panelist.

Somehow or other Davidson’s Cornell Alliance for Science read Monsanto’s lips, perfectly.

Your right to know

Let me speculate at what is really going on behind the scenes of Lipton’s article. Earlier this year, a newly-formed US group called US Right to Know (USRTK) set in motion Freedom of Information Act (FOIA) requests directed at 14 (now 43) prominent public university scientists it suspected of working with (and being paid by) the biotech industry and/or its PR intermediaries.

Now, if these 43 academics had nothing to hide, this request would not have attracted much attention and hardly any emails would have been forthcoming. However, the USRTK FOIA requests triggered a huge outcry in various quarters about the “harassment” of public scientists.

The outcry has led to OpEds in the LA Times and the controversial removal of scientific blog posts defending USRTK, and much else besides, as reputedly tens of thousands of emails (from these FOIA requests) have landed on the desks of USRTK.

What would a good PR company recommend to its clients in such a situation? In order to preempt the likely upcoming firestorm, it might recommend that various media outlets run ahead of USRTK to publish a version of events in which academic small-fry like Kevin Folta, Bruce Chassy and David Shaw (of Mississippi State) are the villains.

Making them the fall guys lets others off the hook: high-profile scientists like Nina Fedoroff and Roger Beachy; the pro-biotech academic community in general; and prestigious Ivy League institutions like Cornell University.

These much bigger fish are who the NY Times should have harpooned. Since they did not, or perhaps would not, let us hope that USRTK will make better use of those emails, ideally by posting all of them online.


Dr Jonathan R. Latham is editor of Independent Science News, where this article was originally published.


Monsanto’s scientist shill exposed

The New York Times has published a fascinating article on the scandal surrounding Monsanto and Kevin Folta, the chairman of the horticultural sciences department at the University of Florida.

Monsanto executives recruited Dr. Folta, a molecular biologist, in the spring of 2013 after they read a blog post he had written defending the biotech industry.

According to the NYT, Monsanto and its industry partners have “passed out an undisclosed amount in special grants to scientists”, including Folta, “to help with ‘biotechnology outreach’ and to travel around the country to defend genetically modified foods.”

Folta was revealed by Freedom of Information requests to have accepted $25,000 from Monsanto, even though he had repeatedly denied having any Monsanto funding.

A damning string of emails, released as a result of the Freedom of Information requests, has been posted online by the New York Times, with a commentary by the NYT editors. Many of the emails are between Kevin Folta and Monsanto or other industry and PR players.

Monsanto’s actions in its alliance with the Grocery Manufacturers Association and the Biotechnology Industry Organization are revealed in thousands of emails requested by the nonprofit campaign group US Right to Know, which is funded by the organic foods sector.

‘I’m ready to write whatever you like’

The emails show Folta as an eager partner in a cosy relationship with Monsanto. In November 2013 Folta sent an email to employees of the PR firm Ketchum, which runs the pro-GMO website GMO Answers for its client, the Council for Biotechnology Information.

Regarding an upcoming meeting with the rest of the GMO Answers team, Folta wrote: “Tell them I’m a friend of Ketchum”. In 2014 Folta wrote to a Monsanto manager: “I’m glad to sign on to whatever you like, or write whatever you like.”

After Monsanto agreed to Folta’s funding bid for $25,000 for a pro-GMO communications programme, Folta wrote to a Monsanto executive, “I’m grateful for this opportunity and promise a solid return on the investment.”

Another Monsanto executive called the Folta deal “a great 3rd-party approach to developing the advocacy that we’re looking to develop.” The ‘third party PR technique’ is when industry places its messages in the mouths of supposedly independent third parties, such as scientists and doctors, because the public are more likely to trust them.

Folta claimed to be open about funding

Folta has repeatedly claimed that he was open about his funding arrangements. For example, early this year he wrote, “The bottom line is that my university operates under the Sunshine Law. Emails are public information, just like my funding, my salary, my cholesterol levels, and everything else about me.”

And in response to online speculation from critics about his funding sources, he wrote: “Hey guys, you know you could just reach out and ask … always glad to talk about such things. My research has been funded 100% by public sources, except for a small amount we get for strawberry research, mostly molecular marker development that helps our breeding program pyramid flavor-related genes via traditional breeding. No Monsanto.”

But while the $25,000 Folta got from Monsanto was for outreach and not research, he was anything but open about it. On page 104 of the newly released emails, you can see Folta apparently trying to hide Monsanto’s $25,000 grant so that it is not “publicly noted”.

GMO Answers

Among his outreach work for the GMO industry, Folta answered questions on GMOs for pro-GMO website GMO Answers. Ketchum, the PR agency that runs the site, provided canned answers for Folta to repeat for the reading public.

Folta had previously said of Ketchum’s pre-prepared points in an article published in Nature, “I don’t know if I used them, modified them or what …”.

But the email string published by the NYT remedies Folta’s memory failure. The NYT’s editors note: “Dr. Folta was encouraged to make any changes he wanted, but he largely stuck with the script.” Two examples, in which Folta regurgitated Ketchum’s responses, are provided.

Finally, it should be noted that while the NYT tries to draw an equivalence between Folta taking money from Monsanto and Dr Charles Benbrook being funded by the organic industry, the two are not comparable.

Benbrook never denied being funded by, or having a relationship with, the organic industry. But Folta repeatedly denied his Monsanto links.

According to the NYT, Folta is “among the most aggressive and prolific biotech proponents, although until his emails were released last month, he had not publicly acknowledged the extent of his ties to Monsanto.”


Claire Robinson is an editor at GMWatch.

This article was originally published by GMWatch and contains some additional reporting by The Ecologist.

More: ‘GMOs and the puppetmasters of academia – what the New York Times left out’ by Dr Jonathan Latham.


Jeremy Corbyn’s innovative energy policies are no 1980s throwback

Have you heard the one about Jeremy Corbyn’s plans to renationalise the energy system?

In an interview with Greenpeace, the Labour MP and leadership candidate said: “I would personally wish that the Big Six were under public ownership, or public control in some form.”

It would be easy to take this quote out of context, add up the market value of the Big Six and suggest the Corbyn campaign wants to spend £124 billion renationalising the utilities. In the next breath however Corbyn added:

“But I don’t want to take into public ownership every last local facility because it’s just not efficient and it wouldn’t be a very good way of running things.”

So what does the Corbyn camp suggest instead? The only hard evidence is in his Protecting Our Planet manifesto, which sets out ten energy pledges and details some key policies.

It’s no aggressive nationalisation plan. What it is, rather, is a manifesto for a more decentralised and democratically accountable system, inspired more by present-day Germany than 1980s Britain.

So does Corbyn’s energy policy look like a throwback or a revolution? There are four reasons to suspect the latter.

Introducing genuine competition

‘Competition’ in the UK energy market has left consumers bamboozled and overcharged. Our choices are like a shopping mall food court: you can have anything you like, as long as it’s fast food. The energy market is similar: most suppliers operate the same big utility model with the same options, so you can have anything you like, so long as it’s a national tariff from a large private utility.

Corbyn’s manifesto cites Germany, which allows consumers the option to buy energy from municipal utilities or co-operatives. Some new consumer options are being seen in the UK. Smarter ways of buying green energy are appearing, and Nottingham City Council has set up its own energy company with a name that sends a clear message: Robin Hood Energy.

But while it’s easy enough to build a wind turbine these days – or even a whole wind farm – it’s significantly harder for innovative new businesses to actually join the market. Corbyn’s manifesto commitment to growing municipal and co-operative models would mean consumers face more meaningful choices.

Help for smaller energy startups

The manifesto pledges to create a “route-map into tomorrow’s ‘smart energy’ systems” to “use smart technologies to run localised storage, balancing and distribution mechanisms” and allow customers the “right to have first use of the energy they generate themselves”. But why isn’t this happening already?

It’s useful to think of our electricity system like a big swimming pool. Everyone’s electricity has to go into this big pool and a vast amount of market regulation is needed to make sure the pool stays ‘balanced’ at the right level, with all the buying and trading and using of power going on underneath the surface.

This means small scale solutions to generating and using power locally are extremely difficult to set up, as they all incur the costs of trading in the big pool. To stretch the metaphor, this means little fish have to swim in a big pond.

Such a setup creates barriers to innovation and is holding back new technologies. There is no technical reason why you shouldn’t be able to choose to buy energy from local sources these days – what stands in the way is the requirement that everyone has to swim in the big pool first. By creating local energy markets, smaller but still viable businesses can flourish.

Cheap access to green investment

The manifesto commits to pursue energy investment through a National Investment Bank. While this model has seen success in Germany, what is less well understood is how important citizen banks have been in deploying this investment.

The UK doesn’t have a citizen banking sector like Germany’s. This means you can only invest in renewables by either buying shares in a green energy company or investing in a co-operative. However, new models are emerging. Abundance Generation, an online crowdfunding platform, allows investors to participate in renewable energy schemes for as little as £5, and a German-style local bank is being developed in Hampshire.

While Corbyn’s manifesto sees the benefit of establishing a state investment bank to invest in the energy transition, it will be important to deliver this investment through the right institutions at the right level so citizen investment can complement state finance.

Democratising the energy sector

Throughout, the manifesto argues for more citizen influence over the energy system – and not just through supposed consumer ‘choice’.

It is not only the German system that can be drawn on to change this. Energy decision-making can be brought closer to citizens by, for instance, looking at public value energy governance which draws on Danish and North American examples, direct action to take back ownership of key infrastructure, or reframing energy as a public good.

It is clear from the manifesto that the energy policies of the Corbyn camp are anything but a throwback to monolithic state utilities. There is potential for more competition through more diverse energy business models, a clear willingness to make space for smart energy innovation, a call for different approaches to energy system finance, and a platform for more plural approaches to energy governance.

Whether or not the reader agrees with these proposals, it should be clear that they are not old solutions to old problems, but provocative responses to increasingly urgent challenges.


Stephen Hall is Research Fellow in energy economics and policy at the University of Leeds.

This article was originally published on The Conversation. Read the original article.


The archaic nature of ‘baseload’ power

The old grid, beholden to massive, polluting baseload power plants, is being replaced by a nimbler, high-tech 21st century system oriented toward variable renewable energy.

There is no shortage of skeptics out there, even some among environmentalists and clean energy advocates, who are unconvinced that renewable energy can ever be the dominant – perhaps even the sole – source of electricity generation.

The reasons for this skepticism vary. Some, for example, argue that the land needs for sufficient generation of wind and solar power are too great. This turns out to be an incredibly lame argument, but that’s the subject of a different article.

More frequent are the arguments that ‘baseload’ power – large power plants that tend to run 24/7 – is necessary to ensure reliable electricity, and that the variable nature of some renewables – solar and wind – can’t provide that reliability.

Then there’s the notion that the electrical grid can only accommodate a certain level of renewables, around 30-40%. Above that, the grid pretty much breaks down. These arguments are actually related and solved in the same way.

More recently, an argument has been circulating among energy nerds – especially pro-nuclear energy nerds – that the integration of renewables into the grid reaches a peak for economic reasons: that renewables are limited by their cost. Not by their high cost, but by their low cost, or as one writer put it: “solar and wind eat their own lunch.”

But that merely shows that not only must the technical nature of the grid change – and it can – but so must its economic nature, and it can too.

The good old days … too bad they were killing us

The electric grid in use today was mostly designed in the 20th century. Large baseload nuclear and fossil fuel plants were built, usually far from the largest electricity consumers (cities and large industry), and their power was transported over huge (and not particularly efficient) power lines.

Those baseload plants had, and have, high capacity factors and run pretty much all the time, although nuclear reactors have to be shut for refueling for a few weeks every 12-18 months. Utilities try to arrange those shutdowns to occur during periods of low demand.

During peak power needs – hot summer days in most of the country – smaller gas plants and in the old days even oil plants would be fired up to supplement the baseload levels. And it all worked pretty well given the technology available at the time.

But, as we all now know all too clearly, that system had a price – a price not reflected in the cost of electricity. That system was and is killing us. Those large nuclear and fossil fuel plants are spewing out carbon dioxide and radioactivity and creating large quantities of dirty and deadly waste products that society doesn’t know what to do with.

Had the cost of those effects – which do have a price, a steep one – been incorporated into the price we and our parents paid for electricity, we probably would have moved to a clean energy system much faster. As it is, we no longer have much of a choice.

Variable power sources more reliable, resilient than ‘baseload’

Fortunately, as is being proven daily in Europe, a grid based on smaller, distributed variable power sources can be just as reliable, and even more resilient and secure, than a grid reliant on baseload power.

Variable does not mean unreliable: as long as wind and solar output can be reliably forecast far enough in advance, location by location, a variable grid can be highly reliable. And those forecasts can be, and in fact are, made reliably.

The ability to integrate a moderately large amount (say 30-35% or so) of renewables into a baseload-dominated grid is a given. It is happening daily. Not so much in the US, although even here states like Iowa are getting more than 20% of their power from renewables, and the percentage of renewables is set to rise rapidly – both on its own economic merits and with the encouragement of the Clean Power Plan.

But at some point above 35-40% renewables or so, a conflict arises. If more renewables are to be brought into the grid, the large baseload plants have to begin closing – even if they theoretically remain useful.

That’s because the kind of grid that works for the variable renewables – a fast, nimble grid where power from different sources scattered in different locations can be ramped up and down quickly depending on where it is being generated and where it is needed – doesn’t work well for baseload plants, especially nuclear reactors, which cannot ramp up and down quickly.

Those kinds of plants were designed to run 24/7 and that’s what they do – they’re not designed to fit in with a grid that doesn’t want them to run 24/7, that instead wants them to run when their power is needed. And the higher the penetration of renewables, the less the baseload plants’ power is needed.

The new kid on the block: energy storage

Add in energy storage, the new kid on the block, and polluting power plants running 24/7 become an anachronism. When the variable sources aren’t generating what is needed, just release the stored, and cheaper, electricity they generated earlier during periods of low demand.
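The storage logic described above – charge when the variable sources produce more than is needed, discharge when they fall short – can be sketched in a few lines. This is a hypothetical toy model, not a real dispatch algorithm, and every number in it is illustrative:

```python
# Toy battery dispatch: charge on surplus renewable generation,
# discharge on deficit. All figures are illustrative (MW / MWh).
def dispatch(generation, demand, capacity=50.0):
    """Return the hourly battery state of charge and any unserved demand."""
    soc = 0.0           # state of charge, MWh
    unserved = 0.0      # demand the battery could not cover
    history = []
    for gen, load in zip(generation, demand):
        surplus = gen - load
        if surplus >= 0:
            soc = min(capacity, soc + surplus)      # charge, capped at capacity
        else:
            discharge = min(soc, -surplus)          # discharge what we have
            soc -= discharge
            unserved += -surplus - discharge
        history.append(soc)
    return history, unserved

# A windy night followed by a calm morning peak:
soc, short = dispatch([80, 90, 60, 20, 10], [40, 40, 50, 60, 70])
print(soc, short)
```

Even this crude model shows the shape of the argument: the early surplus covers most of the later deficit, and what remains unserved is a sizing question (bigger battery, more renewables), not a reason to keep a plant running 24/7.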

The polluting baseload plants then make no sense at all. Why throw carbon dioxide into the air and tritium into the water, and generate lethal radioactive waste, just to keep dirty and usually more expensive power plants operating for those few hours in the week when they might be useful? With storage, they’re not needed, or even particularly useful, at all.

What’s stopping us, or slowing us anyway, is not the technology for the new grid – that exists. It’s the rules. And the political will to transform the grid to accommodate the transformative technologies that have been developed over the past two decades.

If we’re going to move into the 21st century, and with nearly 15% of the century already gone we’re a good ways into it, then we’d better get moving quickly. The old rules need to be changed; David Roberts, formerly of Grist, has compiled a useful list of some of those needed changes.

The problem – the powerful incumbents holding onto their profits

One problem, obviously, is that utilities don’t want to close their old baseload power plants if they are still useful at generating electricity. They want to put off that retirement date as long as possible. Assuming its operating and maintenance costs are not so high that it loses money, the longer a power plant runs the more profit it returns. And utilities are about making money, not transforming the grid.

In the US, at least, we’re not at the point where profitable baseload power plants have to be forced closed for the greater good – renewables don’t yet make up enough of our power to require that step. But parts of Europe are quickly getting there, and we in the US will get there in many places faster than most people now think – surely within the next decade.

Germany is already showing that a grid with a high penetration of renewables can be reliable, and that forcing reactors to close can not only be publicly acceptable, it can attain wide public support.

The larger problem in Germany these days is not the amount of renewables in place, it’s that there is so much renewable generation that the grid needs to be strengthened to better distribute that electricity across the country and for export to nations like Poland and Austria – which badly want that cheap, clean power.

Public opinion polls suggest that in the US, a similarly high penetration of renewables will be most welcome, even if anti-nuclear sentiment is not at German levels.

The real problem with renewables – they are ‘too cheap’

Perhaps forcing reactors to close won’t be necessary: enough are already unprofitable, and more are likely to become so in coming years, that they may simply shut down on their own and be replaced by renewables – and it will all happen quietly and happily.

More likely, though, as nuclear utilities contemplate 80-year operating licenses and squeezing every last watt out of their reactors regardless of age or safety condition, this could become the nuclear issue of the next decade for the public, state regulators and policymakers alike: should existing reactors stay open while they remain viable, or be forced aside to make way for larger amounts of cleaner, safer and usually cheaper renewables?

From our perspective, the answer is obviously yes, they should shut down to make way for the more modern system. But that’s an answer that will take a lot of preparation and groundwork beginning now, because the nuclear utilities will fight that hard.

That’s a somewhat different issue than the one that confronts us today, which is should uneconomic reactors stay open or move aside for renewables? The nuclear utilities want the ground rules changed to force ratepayers to keep those uneconomic reactors open regardless of their cost.

That’s an easy argument to make: of course the rules shouldn’t be changed to favor the higher-priced, dirtier power source. And it appears that argument is on the verge of victory in Illinois – the most nuclear state in the US. If it carries the day there, it can carry the day everywhere.

As for the notion that solar and wind are too cheap, that just shows the absurd nature of the economics of electricity and the failure to consider external costs – the environmental damage nuclear and fossil fuels cause and the full lifecycle costs of their existence – in the economic equation.

There is more to life than the dollar – though you wouldn’t know it from how many traditional markets work – and, in fact, we have reached the point where unless ‘more to life’ is adequately factored into prices, there may not be any life at all.

The concept being bandied about by these pro-nukers is that if there is ‘too much’ solar and wind in the system, the price of their electricity will eventually fall to zero – essentially free. And at that price – or no price, if you will – the system breaks down and there will be no more investment in solar and wind. Who would want to invest in something you have to give away?
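The zero-price argument rests on merit-order dispatch: plants are called on cheapest-first, and the market clearing price is set by the marginal cost of the last plant needed. When zero-marginal-cost wind and solar alone can cover demand, the price collapses to zero. A toy illustration – the plants, capacities and costs below are all hypothetical:

```python
# Toy merit-order market: plants are dispatched cheapest-first, and the
# clearing price is the marginal cost of the last plant needed.
def clearing_price(offers, demand):
    """offers: list of (marginal_cost $/MWh, capacity MW); returns $/MWh."""
    price = 0
    served = 0
    for cost, cap in sorted(offers):          # cheapest offers first
        if served >= demand:
            break
        served += min(cap, demand - served)
        price = cost                          # marginal plant sets the price
    return price

# wind, solar, gas, peaker as (marginal cost $/MWh, capacity MW):
offers = [(0, 60), (0, 40), (30, 50), (60, 30)]
print(clearing_price(offers, 120))   # gas is marginal: price is 30
print(clearing_price(offers, 90))    # renewables alone cover demand: price is 0
```

The same mechanism that makes renewables undercut everything else is the one said to make them ‘eat their own lunch’ – which is exactly why the market rules, not the technology, are the thing in question.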

The cheap power ‘problem’ can be solved – if you want to solve it

There are ways around the problem even under the existing system, from feed-in tariffs to Power Purchase Agreements. And the ‘problem’ itself still has at its foundation the baseload concept of electricity generation and distribution. Absent those baseload plants, which only inhibit renewable generation anyway, there cannot be ‘too much’ renewable generation in the system.

But including the real costs of nuclear and fossil fuel use would be the best step. Because once added in, those costs make that kind of generation too expensive to use no matter what the competition. And if the only choice is low-cost to zero-cost renewables, well, certainly consumers wouldn’t mind.

In the real world, rather than abstract economic modeling scenarios, electricity is a necessity and it will be provided. But in the real world, in the new world of the 21st century electricity grid, it may well be that electricity itself will not be as profitable to generators as it was in the 20th century.

Energy efficiency is reducing demand, and despite a growing population and continued economic growth, that trend will continue and probably accelerate (Maryland, for example, has set a new policy of reducing demand by 2% every year). Renewables, meanwhile, act to drive down electricity prices.
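Compounded over time, a 2% annual target adds up faster than it might sound: remaining demand after n years is (1 − 0.02)ⁿ of today’s. A quick check – the target is Maryland’s, from the text; the arithmetic is mine:

```python
# Compounding an annual 2% demand reduction (illustrative arithmetic).
def remaining_demand(years, annual_cut=0.02):
    """Fraction of today's demand left after `years` of compounding cuts."""
    return (1 - annual_cut) ** years

# Ten years of 2% cuts removes about 18% of demand
# (slightly less than a naive 10 x 2% = 20%, because each cut
# applies to an already-reduced base):
print(round(1 - remaining_demand(10), 3))   # 0.183, i.e. an ~18% cut
```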

Certainly the idea that individual utilities – or even a consortium dominated by a single utility (à la Vogtle or Summer) – will ever again build mega-billion-dollar power plants of any kind just to sell electricity is a relic of the 20th century playing out today as farce.

It won’t be playing out much longer. Utilities like Virginia’s Dominion that still think that obsolete model applies will regret it.

Not too cheap to meter – but too cheap to worry about

Electricity may never be free, or too cheap to meter, but it may well become one of life’s little bargains. Long distance in my lifetime has gone from an expensive luxury rarely used, to an inexpensive, frequently dialed option, to a free add-on to both my landline and iPhone plans.

For my millennial kids, the concept of a ‘long-distance’ call is meaningless: they’ve never made one and never will. But they do still use phones, and all the services modern phone plans offer.

The costs of electricity are going to come down too – technology and renewables are already starting to see to that – but someone, whether it be the traditional utilities or someone smarter, is going to come along and figure out how to make money by providing electricity add-ons and services, even if the electricity itself is free or nearly so.

Totally free electricity may be too much to hope for – there is a grid to pay for and maintain, after all, and there will be for the foreseeable future. But the money to be made will be in the add-on services, not the basic electricity.

The solar rooftop people have pretty much already figured this out for their slice of the business – whether by lease or purchase, you pay primarily for the equipment, installation and maintenance, not so much for the electricity.

But since rooftop solar doesn’t work for everyone or everywhere, there is a market ready for something new and safe and clean that won’t destroy the planet we live on. I’d say that’s a pretty damn large market looking for the electricity equivalent of long distance in the iPhone era. With a market like that, someone is going to deliver, even if the electricity itself is little more than a low-cost add-on to other services people want.

That won’t happen tomorrow, of course. As Barry Cinnamon of The Energy Show podcast put it, “But this change in our energy sources will take many years, just as the complete transition from ‘horse and buggy’ transportation to gas-powered cars took 50 years.

“As with other large-scale technological changes, customer economics will force the current incumbent energy providers to change (unlikely), or go out of business (more likely). It’s a virtuous cycle as more customers are satisfied with renewable power generation, and more people are employed in these industries.”

Bye-bye nuclear – no place for you in this new power market

For nuclear power though – even for ‘small modular reactors’ (which actually are not so small: most are much larger than early US commercial reactors like Big Rock Point and Yankee Rowe, and some are as large as Fukushima Daiichi Unit 1) – and for fossil fuels as well, the transformation means extinction.

By definition, SMRs are also baseload power plants; despite being smaller than today’s behemoth reactors, they are designed to run 24/7 and like their larger brethren, cannot power up and down quickly.

Before they even exist, they are obsolete. Their polluting ‘baseload’ means of providing their product (electricity) will be unneeded and functionally and economically irrelevant – unable to compete with those offering electricity as part of a set of services, rather than as an end in itself.


Michael Mariotte is Executive Director at Nuclear Information and Resource Service (NIRS).

This article originally ran on Green World, a news and information service of NIRS.


Jeremy Corbyn’s innovative energy policies are no 1980s throwback

Have you heard the one about Jeremy Corbyn’s plans to renationalise the energy system?

In an interview with Greenpeace, the Labour MP and leadership candidate said: “I would personally wish that the Big Six were under public ownership, or public control in some form.”

It would be easy to take this quote out of context, add up the market value of the Big Six and suggest the Corbyn campaign wants to spend £124 billion renationalising the utilities. In the next breath, however, Corbyn added:

“But I don’t want to take into public ownership every last local facility because it’s just not efficient and it wouldn’t be a very good way of running things.”

So what does the Corbyn camp suggest instead? The only hard evidence is in his Protecting Our Planet manifesto, which sets out ten energy pledges and details some key policies.

It’s no aggressive nationalisation plan. Rather, it is a manifesto for a more decentralised and democratically accountable system, inspired more by present-day Germany than 1980s Britain.

So does Corbyn’s energy policy look like a throwback or a revolution? There are four reasons to suspect the latter.

Introducing genuine competition

‘Competition’ in the UK energy market has left consumers bamboozled and overcharged. Our choices are like a shopping mall food court: you can have anything you like, as long as it’s fast food. The energy market is similar: most suppliers operate the same big utility model with the same options, so you can have anything you like, so long as it’s a national tariff from a large private utility.

Corbyn’s manifesto cites Germany, which allows consumers the option to buy energy from municipal utilities or co-operatives. Some new consumer options are being seen in the UK. Smarter ways of buying green energy are appearing, and Nottingham City Council has set up its own energy company with a name that sends a clear message: Robin Hood Energy.

But while it’s easy enough to build a wind turbine these days – or even a whole wind farm – it’s significantly harder for innovative new businesses to actually join the market. Corbyn’s manifesto commitment to growing municipal and co-operative models would mean consumers face more meaningful choices.

Help for smaller energy startups

The manifesto pledges to create a “route-map into tomorrow’s ‘smart energy’ systems” to “use smart technologies to run localised storage, balancing and distribution mechanisms” and allow customers the “right to have first use of the energy they generate themselves”. But why isn’t this happening already?

It’s useful to think of our electricity system like a big swimming pool. Everyone’s electricity has to go into this big pool and a vast amount of market regulation is needed to make sure the pool stays ‘balanced’ at the right level, with all the buying and trading and using of power going on underneath the surface.

This means small scale solutions to generating and using power locally are extremely difficult to set up, as they all incur the costs of trading in the big pool. To stretch the metaphor, this means little fish have to swim in the big pool too.

Such a setup creates barriers to innovation and is holding back new technologies. There is no technical reason why you shouldn’t be able to choose to buy energy from local sources these days – what stands in the way is the requirement that everyone has to swim in the big pool first. By creating local energy markets, smaller but still viable businesses can flourish.
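The ‘big pool’ point can be made concrete with a toy netting calculation: in a local energy market, generation is matched against local demand first, and only the residual is traded through – and pays the fees of – the national pool. A minimal sketch; the volumes and fee are made up for illustration:

```python
# Toy local-market netting: local generation covers local demand first;
# only the residual flows through the national pool and incurs its fees.
# All figures are illustrative.
def net_local(local_gen, local_demand, pool_fee=5.0):
    """Return (self-supplied MWh, residual MWh traded in the pool, pool fees $)."""
    self_supplied = min(local_gen, local_demand)
    residual = abs(local_gen - local_demand)   # shortfall to import or surplus to export
    return self_supplied, residual, residual * pool_fee

# A community generating 70 MWh against 100 MWh of local demand:
print(net_local(70, 100))   # only 30 MWh touches the pool
```

Under today’s rules, by contrast, all 100 MWh would effectively be traded and charged through the pool – which is the barrier the manifesto’s local markets are meant to remove.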

Cheap access to green investment

The manifesto commits to pursue energy investment through a National Investment Bank. While this model has seen success in Germany, what is less well understood is how important citizen banks have been in deploying this investment.

The UK doesn’t have a citizen banking sector like Germany’s. This means you can only invest in renewables by either buying shares in a green energy company or investing in a co-operative. However, new models are emerging: Abundance Generation, an online crowdfunding platform, allows investors to participate in renewable energy schemes for as little as £5, and a German-style local bank is being developed in Hampshire.

While Corbyn’s manifesto sees the benefit of establishing a state investment bank to invest in the energy transition, it will be important to deliver this investment through the right institutions at the right level so citizen investment can complement state finance.

Democratising the energy sector

Throughout, the manifesto argues for more citizen influence over the energy system – and not just through supposed consumer ‘choice’.

It is not only the German system that can be drawn on to change this. Energy decision-making can be brought closer to citizens by, for instance, looking at public value energy governance which draws on Danish and North American examples, direct action to take back ownership of key infrastructure, or reframing energy as a public good.

It is clear from the manifesto that the energy policies of the Corbyn camp are anything but a throwback to monolithic state utilities. There is potential for more competition through more diverse energy business models, a clear willingness to make space for smart energy innovation, a call for different approaches to energy system finance, and a platform for more plural approaches to energy governance.

Whether or not the reader agrees with these proposals, it should be clear that they are not old solutions to old problems, but provocative responses to increasingly urgent challenges.


Stephen Hall is Research Fellow in energy economics and policy, University of Leeds.

This article was originally published on The Conversation. Read the original article.