Monthly Archives: August 2015

China’s emissions 14% lower than IPCC thought

New estimates show that for more than a decade China’s greenhouse gas emissions have been overestimated by international agencies, while the country’s energy consumption has been underestimated.

The research, published today in Nature, shows that from 2000 to 2013 China produced 2.9 gigatonnes less carbon than previous estimates of its cumulative emissions, meaning that its true emissions may have been around 14% lower than calculated.

Meanwhile, with a population of almost 1.4 billion, China’s energy consumption grew 10% faster during 2000-12 than reported by its national statistics.

China, the world’s biggest greenhouse gas emitter, has been praised for responsible leadership on climate for its recent pledge to peak its emissions by 2030, but its faster-than-expected energy consumption growth means meeting this target may present an even bigger challenge.

The researchers, led by Dabo Guan of UEA’s School of International Development, used independently assessed data on the amount of fuel burned, and new measurements of emission factors, to re-evaluate two major sources of China’s carbon dioxide emissions – the burning of fossil fuels and cement production – from 1950 to 2013.

Guan said the new estimates were compiled by considering fuel quality when establishing emissions inventories – something that had previously been overlooked by the Intergovernmental Panel on Climate Change (IPCC) and most international data sources.

“While China is the largest coal consumer in the world, it burns much lower-quality coal, such as brown coal, which has a lower heat value and carbon content compared to the coal burned in the US and Europe”, said Guan.

Counting coal

According to the paper, “We find that total energy consumption in China was 10 per cent higher in 2000-2012 than the value reported by China’s national statistics, that emission factors for Chinese coal are on average 40 per cent lower than the default values recommended by the Intergovernmental Panel on Climate Change, and that emissions from China’s cement production are 45 per cent less than recent estimates.

“Altogether, our revised estimate of China’s CO2 emissions from fossil fuel combustion and cement production is 2.49 gigatonnes of carbon in 2013, which is 14 per cent lower than the emissions reported by other prominent inventories. Over the full period 2000 to 2013, our revised estimates are 2.9 gigatonnes of carbon less than previous estimates of China’s cumulative carbon emissions.”
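At heart, an inventory like this is activity data (fuel burned) multiplied by emission factors (energy and carbon content per tonne of fuel), so a lower heat value and carbon content for Chinese coal scales the whole total down. The sketch below illustrates only that arithmetic – the tonnages and factors in it are invented placeholders, not the values used by Liu et al or the IPCC defaults.

```python
# Illustrative only: how revised emission factors change an inventory total.
# All numbers below are placeholder values chosen to show the arithmetic,
# not the figures from Liu et al (2015) or the IPCC default tables.

def coal_emissions_mtc(coal_mt, heat_value_tj_per_kt, carbon_t_per_tj, oxidation=1.0):
    """Carbon emissions (Mt C) = activity data x energy content x emission factor.

    coal_mt              -- coal burned, million tonnes (activity data)
    heat_value_tj_per_kt -- net calorific value, TJ per kilotonne of coal
    carbon_t_per_tj      -- emission factor, tonnes of carbon per TJ
    oxidation            -- fraction of the carbon actually oxidised
    """
    energy_tj = coal_mt * 1_000 * heat_value_tj_per_kt          # Mt -> kt -> TJ
    return energy_tj * carbon_t_per_tj * oxidation / 1_000_000  # t C -> Mt C

coal_burned = 3_000.0  # hypothetical coal burned in a year, Mt

# "Default-style" factors vs. factors measured for lower-quality coal
# (lower heat value, lower carbon content, incomplete oxidation).
default_total = coal_emissions_mtc(coal_burned, 25.8, 26.4)
measured_total = coal_emissions_mtc(coal_burned, 20.9, 24.0, oxidation=0.96)

print(f"default factors:  {default_total:,.0f} Mt C")
print(f"measured factors: {measured_total:,.0f} Mt C")
print(f"revision: {100 * (measured_total - default_total) / default_total:+.1f}%")
```

The same multiplication applies to cement, where the emission factor depends on how much clinker the cement contains.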

Pep Canadell, executive director of the Global Carbon Project at CSIRO, who was not involved in the study, said a lack of research resources meant that estimates of China’s emissions relied on default values from global databases.

Guan’s research team “visited thousands of mines and by actually exploring the coal they found there was less emissions”, Canadell said.

Many countries compile such detailed emissions inventories, but for developing nations like China this important task has historically been too expensive.

“The default values can be quite far away from the real values”, Canadell said. “In the future we would need real values for other places such as India.”

Corinne Le Quéré, director of the Tyndall Centre for Climate Change Research at UEA, said there were a lot of uncertainties in China’s data, especially given the discrepancies between national and provincial figures.

“The strong message here is that as we refine our estimates of carbon emissions we get closer to an accurate picture of what is going on and we can improve our climate projections and better inform policy on climate change.”

The good news and the bad news

The new findings are a positive step towards accurately measuring emissions, but their effect on climate policy requires acknowledging the negatives – China’s rapidly growing energy needs.

Frank Jotzo, director of the ANU Centre for Climate Economics and Policy, described continued work on primary data as important but said the finding that emissions were overestimated does not change the challenge China faces in moving away from coal.

“For global climate change mitigation to succeed, a shift from coal to other energy sources in China is essential,” he said. “China is making good progress towards that goal.”

With the Paris meeting of the UNFCCC taking place in November this year, China’s pledge to peak emissions from all activities by 2030 will require it to address electricity demand across its production, transport and industrial systems.

According to Canadell, cement production accounts for about 5% of global greenhouse emissions, but because China is “building so much” its share is much larger: China accounts for more than half of global steel and cement production. Yet China’s cement emissions, the study found, are 45% lower than previously estimated.

“I don’t think this news is making it easier or harder” for China to meet its climate targets, said Canadell. “The most important thing is to measure the speed and trends of energy consumption.”

 


 

The paper, ‘Reduced carbon emission estimates from fossil fuel combustion and cement production in China’, is by Zhu Liu et al and published in Nature.

Eliza Berlage is Editor at The Conversation.

This article was originally published on The Conversation. Read the original article.

 

The perfect pinta vs. the TTIP trade tanker

Cows invaded supermarkets last week, and milk ice-bucket challenges spread across the UK this week.

It’s clear farmers are up for a protest with something louder than placards about ‘every day’ low prices for their milk.

Alarming headlines such as “Milk cheaper than water” and “Dairy farmers’ state of emergency” filled the media.

In reality, dairy farmers are in a dire situation. Over the past year alone, nine dairy farmers went bust each week (a loss of 421 dairy farmers). Sadly, the outcome of their protest is likely to be weak.

The likely result: fragile or empty promises from a food industry structurally incapable of delivering a sustained fair deal to farmers here or overseas, and limp statements from a Government disinclined to upset big food businesses.

Also looming over farmers is the threat of more and cheaper US dairy imports produced at lower standards under the new trade agreement – the Transatlantic Trade and Investment Partnership (TTIP) – currently being negotiated between the US and EU. Farmers have already raised alarm bells at the risk – including the National Farmers Union.

A wider crisis in food

Headlines and short lived price promises aside, it is clear that farmers (and consumers) are stuck in an impossibly unfair system. And they have been for decades.

Farmer protests – often involving manure or pallets full of peaches dumped on ministry doorsteps – have long been a common sight in Europe as farmers resort to desperate tactics to try and gain attention for their plight.

Thousands of small businesses (farms) are stuck in one end of an hourglass and millions of consumers in the other, with a tiny handful of supermarket buyers and dairy companies in control of the middle. To extend the comparison, the hourglass gets shaken regularly as the global price for dairy commodities is managed by a handful of multinational corporations.

Shifts in global demand add the final uncertainty to the market. China for instance has created a dairy industry of its own so buys less of ours. Russia’s sanctions against EU farm imports have also wiped out what used to be a major export market for dairy and other produce.

Somewhere in this complex system, ordinary farmers (and their animals) get left behind. As a result they are forced to sell up, so bigger farms are created which have to produce more and sell it for a lower price, pushing the land and animals ever harder to gain even less of the food pound.

We can buy cheaper produce, but at a high hidden cost: damage to public health, pollution of natural resources, rural decline and severe hardship for farm communities here and overseas.

The true extent of corporate control of food commodities – or milk to you and me – is huge. They don’t just want a say in the global price. They work hard to influence how food is produced. They want cheaper raw materials from wherever they can get them. TTIP may deliver.

New trade rules – in whose interest?

It is these largely unaccountable bodies that will benefit from trade negotiations like the Transatlantic Trade and Investment Partnership (TTIP). Anxious to weaken food regulations by ‘mutual recognition’ of different standards or ‘harmonizing’ standards to the ones they like, they are very active in the TTIP debate.

As noted by The Ecologist, in food that means pushing to end food border inspections, controls on chemicals, antibiotics and hormone use in livestock production, and allowing GM crops. US meat and dairy industries are pushing to eliminate or weaken animal welfare standards that they say are ‘barriers to trade’.

In dairy it’s all about animal health and milk quality. The US dairy exporters would like to see EU limits on somatic cell counts (in effect, pus) in milk removed – yet the cell count indicates mastitis, a painful infection of the breast tissue in cows. The EU standard requires better herd health so lessens the likelihood of herds being unhealthy, but means higher costs.

Protecting consumers, farmers, the environment and animals is central to a resilient, safe and healthy food system. War on Want advocates a new way to manage our food system – based on food sovereignty – an alternative food system that creates practical, sustainable and democratic solutions to the failed industrialised food model.

But if farmers, like those dairy farmers, are unable to make a living and the very companies that are squeezing them to oblivion are setting the rules in a new trade treaty, our food system will go downhill faster than M&S can push a cow out of the chilled yogurt section.

 


 

Please join the TTIP campaign now to promote a healthy, sustainable food system which is good for us and farmers worldwide.

Vicki Hird (@vickihird) is Director of Policy and Campaigns at War on Want. She has over 20 years’ experience working on environment, justice, food and farming issues. She is an expert consultant for NGOs and institutions and was Senior Campaigner heading up the Land use, Food and Water Programme for Friends of the Earth. Previously she was Policy Director of Sustain and Coordinator of the SAFE Alliance. She has an academic background in pest management and is a Fellow of the Royal Entomological Society and the RSA. Vicki is Chair of Eating Better, a trustee of Pesticides Action Network and the Keo Foundation.

 

Do the UK government’s sums on Hinkley and climate change add up?

Deep in the bowels of the UK Government’s Department for Energy and Climate Change (DECC) are probably some very stressed civil servants.

Those who aren’t currently seeking alternative employment face what may be an even harder task: making the government’s energy sums add up in the face of changes to just about everything, from the gas price to the UK’s willingness to build any more wind turbines.

The gas price has fallen – which makes subsidising nuclear (and offshore wind) much more expensive. Cheaper options for cutting emissions – like onshore wind and efficiency measures – have, for various reasons, been parked. The spreadsheet must be all over the shop.

When we asked to see the modelling so far for the UK’s flagship Hinkley Point C nuclear project we were told we could – but only after the deal was done and dusted.

Previous versions of the UK’s energy modelling have at times strained credulity on the speed of the phaseout of coal power, or the construction of new nuclear power, but nonetheless they serve as a possible baseline for what might happen in future.

Now those modellers have some fresh, even tougher challenges. One is to demonstrate how the various policy changes and roll-backs recently announced – such as the end of support for onshore wind and solar power – can be squared with Prime Minister Cameron’s commitment to deliver on carbon budgets and the Climate Change Act.

If DECC follows last year’s publication schedule, updated projections would be expected in September. If they model the current state of play, they would be bad for UK climate credibility in the run-up to the key global climate talks in Paris this December.

Why are these modelled projections particularly difficult? There are several reasons.

1) The cheap stuff is gone

First there’s policy, especially low-carbon policy, which will drive change in the energy world. In the UK that policy is going through a meltdown of a kind we haven’t previously experienced.

So for example the zero-carbon homes policy was due to produce carbon savings of 4.7 million tonnes (p.28) in the key period of the UK’s 4th carbon budget (2023-27). That policy doesn’t exist any more.

Other savings in energy efficiency programmes and renewable heat are either on the back burner or already on the scrap-heap. So carbon savings will need to be found somewhere else to keep Cameron’s credibility intact.

2) Everything about energy has changed (but DECC may not have noticed)

Secondly, from smart grids to Tesla batteries, there is a wave of change running through energy globally (local examples are here and here) which other countries are relatively aware of, but which doesn’t seem to have had any impact on the policy discussion in the UK Government. And so not in DECC modelling either.

These changes have been enough to cause energy giants like Centrica (slide 13 onwards) and E.ON to restructure, and RWE to say the approach in the UK needs revamping. Perhaps those few civil servants who remain in post are too busy to read the papers.

3) Does anyone know what is happening with Europe?

Thirdly there’s the UK’s relationship with Europe. Many of the changes in UK energy and electricity systems have been driven by policy flowing from the EU.

Efficiency standards in products and buildings and vehicles, the Emissions Trading Scheme, the renewables targets, and now interconnection targets all flow from Brussels.

They have not been unqualified successes but – for those concerned with reducing our use of fossil fuels – they have been an effective force overall. Will that drive still be in place as the UK reconsiders its place in the EU? If those EU policies don’t apply, what will take their place?

4) Some very dodgy assumptions – especially about the gas price

In that context, the fourth reason those projections are difficult is very relevant – policies and projections which the UK has on paper, but which few really believe will ever happen. Assumptions, as they say, are the mother of all … mistakes.

One example would be the Carbon Floor Price, the key driver of removing existing coal stations from the electricity mix. In theory it rises to £70 per tonne of CO2 in 2030, but it is frozen at the 2015 level until 2020, with very little genuine expectation that its rise will continue after that date.

Another example would be the continued build-out of clean energy which requires subsidy to top up the wholesale electricity price that these renewable projects get. But there’s no money on the table post 2020 and no clear timeline for producing it.

In fact there’s a considerable lack of clarity about how much money is available pre-2020, as the state of the Levy Control Framework (as this fund is known) seems to have moved from being a bit tight to being in crisis in a few short months.

One reason why the money might be short is the (relatively) low gas price, which makes the clean energy ‘top-up’ more costly. But we don’t know what DECC is estimating for the price of gas in the 2020s.

In fact, we don’t know a lot about how DECC justifies its actions, because a lot of it is kept secret. Especially around another of those policies which DECC thinks will happen but most people don’t – the 35% of power to be met by the proposed nuclear new-build programme by 2030, including the proposed new nuclear power station at Hinkley Point C.

Hinkley secrecy

A lot of modelling was seemingly done by DECC to justify why Hinkley was a good deal for the consumer – apparently saving households £75 a year – and to justify to the EU why it should be getting so much support from bill payers.

Unfortunately our civil servant friends are keeping it all secret, and the Freedom of Information request we submitted to see what modelling justified the investment has been rejected, so the data is being kept secret until the contract is signed.

It is not reassuring to know that only once a 35-year contract involving payments of around £80 billion from UK consumers is irrevocably signed will we be able to see the justification. It could be brilliant, of course, or it could be delusional.
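For a rough sense of scale, here is a minimal back-of-envelope sketch. The £80 billion and 35 years are the figures quoted above; the household count (roughly 27 million UK households) is an assumption introduced here for illustration, and payments are simply spread evenly, ignoring discounting and the fact that actual support would vary with the plant’s output and wholesale prices.

```python
# Rough scale check on the Hinkley contract figures quoted in the text.
# The total and duration come from the article; the household count is an
# assumption used here purely for illustration.

total_payments_gbp = 80e9   # approximate payments from UK consumers (article figure)
contract_years = 35         # contract length (article figure)
uk_households = 27e6        # assumed number of UK households

per_year = total_payments_gbp / contract_years
per_household_per_year = per_year / uk_households

print(f"about £{per_year / 1e9:.1f} billion per year")                # ~£2.3bn/year
print(f"about £{per_household_per_year:.0f} per household per year")  # ~£85/household/year
```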

The fact that the world is changing and that the government’s economic justification for Hinkley may therefore rest on foundations of sand is one of the reasons that HSBC and energy giant RWE have recently come out against it.

Even the ‘very pro-nuclear’ former energy minister in the government of Margaret Thatcher who gave the go-ahead to the last round of attempted new-build has said it’s “one of the worst deals ever”.

The high cost of the new nuclear programme is justified by Secretary of State Amber Rudd on the basis that it provides reliable supply, unlike that from renewables: this is undoubtedly an important additional value provided by nuclear, but because policy appears to be being made up as they go along, there is no underpinning justification for it.

Moreover the ‘always on’ property of nuclear power (except when it’s off) has a downside too: nuclear plants provide lots of high-cost power at night, when it’s least needed and prices are low.

So how valuable is that reliability, and what would the alternatives be to nuclear baseload? The International Renewable Energy Agency has already looked at this (see Fig 2.10, p.42) for 30-40% of power provided by wind and concluded that, even including all the extra costs of ensuring you can keep the lights on when the wind doesn’t blow, onshore wind is still cheaper than Hinkley.

Note too that big stations like Hinkley impose costs on the grid, due to the need to secure 3.2GW of instant backup in the event of an unscheduled shutdown – costs which all consumers have to bear and which are generally much less talked about.

In short, DECC modelling is deeply challenging because there is no coherent or credible policy to model. But it is also behind the curve on technological development, fails to take a system-wide view, and often key material is kept secret.

Which risks leaving poor decisions being justified on the basis of soundbites. And remember, if you’re reading this in the UK, you’re paying.

 


 

Dr Doug Parr is Scientific Director of Greenpeace UK.

This article was originally published on Greenpeace EnergyDesk.

 

Welcoming refugees is the first step to freedom and justice

The mainstream right wing press is awash with racist fear-mongering, reminiscent of the darkest periods of human history.

Refugees are abused daily in the popular press – compared to cockroaches in The Sun – and even the Prime Minister has described vulnerable, displaced people in Calais as “a swarm”.

Under this swathe of noxious abuse the Tory cutting machine dismantles our nation’s capacity to help those in need, whilst our borders are fortified.

For those of us who seek to practise compassion in our daily lives, believe that the stronger in society should help the weaker and hold that our state should be founded on morality, this is a horrifying time, one that calls upon us all to personally and purposefully present an alternative.

Fundamentally, the grave ongoing mistake causing so much misery is the insistence on The Other. We all live on one planet, we are all one species – dependent on one biosphere and the problems threatening the foundations of our society are global.

Retreating into violent parochialism doesn’t just fail to rise to the challenge … it fails to recognise the challenge at all. We have become a planetary species, now facing planetary challenges that, by definition, cannot be solved through national self-interest. We have to think globally.

The all pervasive human influence on our planet

The human influence on the planet is so pronounced that the climate is changing. Deserts around the equator are growing, droughts and wildfires are increasing, and food production capabilities in many countries that are already failing to meet their people’s needs are diminishing. People in these countries are expected to move so that they can live.

The spark of genius that ignited the industrial revolution, that created the engines, that burn the fossils, that loads our atmosphere with greenhouse gas happened in an Englishman’s brain on these shores. For over 300 years we used this technology to subjugate much of the world in an empire upon which ‘the sun never set’. Against this historical backdrop, aggression to climate refugees is especially cruel.

More recently, our governments, in close partnership with the USA and other neo-liberal powers, have waged wars to secure oil to continue the endless expansion of a highly-polluting, linear, industrial economy. Iraq, Afghanistan, Libya and other nations have all been catastrophically attacked. Now, many millions flee the terrible war-torn remains.

The recent film Bitter Lake brilliantly demonstrates how decades of Western foreign policy have simply seen a new version of colonial warfare, domination and resource grabbing, further dividing the world and exacerbating a ‘them and us’ narrative in which the white western world is ‘good’ and any opposition is terrorist.

The ruinous state of the Middle East and North Africa is essentially down to ‘us’ – the UK, Europe, the USA and NATO. Furthermore, every conflict on earth is compounded by resource depletion and climate change. Scientists now link the 2012 droughts in Russia, Ukraine, China and Argentina, which drove up global food prices, with the Arab Spring, widespread civil unrest and subsequent carnage.

We need solutions, not wars

We need our leaders to end the aggression and create the clean energy infrastructure that will spare future generations the apocalyptic scenarios we are pushing up against. Instead, the oil we have seized fuels the ongoing war machine.

Domestically, these backward policies go hand in hand with shredding support for renewables and fast-tracking fracking. This is the road to hell.

The fossil fuel agenda doesn’t just deny the perilous state of the world; it causes it. This wilful obfuscation of responsibility has a human face when politicians, media barons and pundits try to outdo each other dehumanising the desperate people on our borders.

It is cowardly, callous and miserable to cause gross suffering abroad and then cower behind fences, refusing to acknowledge the catastrophe we have visited on our fellow human beings. This might be the behaviour of over-privileged Etonians, bankers or Tony Blair – but most ordinary folk help those in need.

If we drop the antagonism for five seconds we can see the people in Calais are not a “swarm” of insects but fellow humans in need. Stigmatising traumatised refugees on our borders is nasty, but it is what we have come to expect from a corrupt, toxic and politically-biased media machine.

Moving beyond fear and hatred

For those of us not personally invested in oil, war and racism we have to think differently. If we want to create a better world in which there is less war and poverty, the environment recovers and people can live meaningful and decent lives, we have to think above and beyond the nation state – an outdated concept.

Linking up with fellow activists and citizens around the world – we can transition out of this doomsday economy. Working together and caring for those in need we can demonstrate that global cooperation is more important and effective than competition.

We need to rapidly manifest a sane alternative to nationalist posturing because it is clear that any civilisation attempting to sustain itself by barricading itself against the rest of a world – in which so many have nothing – is doomed from the outset.

The first step out of this dismal dark place is to provide properly for the worst off – migrants and refugees. This kernel of compassion could help reignite our imaginations about what it means to be human.

We are not just tooled-up, territorial monkeys any more – we are capable of great things. The greatest of all is kindness.

 


 

Matt Mellen is Founder and Editor of EcoHustler.

 

Obama’s ‘clean power plan’ is feeble and fragile

No doubt you heard the good news. Barack Obama has announced the US is pushing through plans to reduce emissions of greenhouse gases. Rejoice! Rejoice! We’ve got this climate problem licked – hurrah!

Hold the champagne – and not just because it’s full of bubbles of carbon dioxide – while we do a reality check. This is a distinctly underwhelming development. Let’s pick apart the spin from the reality.

First, the way the story has been told – the US commits to a 32% reduction in emissions of greenhouse gases by 2030. This is being pushed through by tightening the rules governed by the Environmental Protection Agency (EPA) – a federal agency that the US president can instruct, without the need to get past those pesky filibusterers in the dysfunctional, Republican-dominated houses of Congress. The EPA is confident that its rules have a firm legal footing and will be able to withstand the inevitable court challenges.

The effect of the rules will be to clobber the production of electricity using coal. This is certainly a ‘good thing’. Quite apart from coal’s high carbon-intensity as a means of producing power, it is dirty in other ways – resulting in pollution that is harmful to human health as well as to the environment.

With this commitment, the US can enter the climate negotiations in Paris with its head held high and can push for a global deal to head off dangerous climate change. Whoo-hoo!

The reality: it’s a development that is both fragile and feeble

It’s fragile, because unless every occupant of the White House between now and 2030 is a Democrat, it can be unpicked. Remember the bit about the EPA being a federal agency that the US president can instruct? Well, if the president happens to be a climate-denying Republican (the two words are almost, but not completely, interchangeable), he or she could countermand the previous instruction.

Of course, the ruling now will inform business decisions and will have a long-lasting effect, regardless of whether the rules are subsequently reversed, but believing that this announcement sets in stone the target of a 32% reduction in emissions requires a degree of optimism that lies somewhere between the heroic and the delusional.

It’s feeble too. A 32% reduction by 2030 sounds quite impressive until you realise it is baselined on 2005 figures. By 2013 (the latest year with available data) emissions had already fallen by 15%, so the rate of improvement required over the next 15 years is actually slower than what has already been achieved.

From 2005 to 2013, emissions fell at a rate of 2.0% per year – to meet the commitment, emissions would need to fall by just 1.3% per year between 2013 and 2030. And the 32% reduction figure relates only to emissions from power generation, which make up less than a third of total US emissions.

How does the rate of committed reductions compare with what is actually required to achieve temperature rises of less than 2°C above pre-industrial levels? According to trillionthtonne.org – a website that tracks these things – to have a better than 50% chance of avoiding such dangerous climate change would require emissions to decrease by more than 2.6% per year. For the rest of time. Globally.
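Those annual rates follow from the percentages above if emissions are assumed to fall by a constant proportion each year – a simplification used here for illustration, not the method of trillionthtonne.org or any official projection. A minimal sketch:

```python
# Back-of-envelope check of the annual decline rates quoted above, assuming
# a constant percentage (geometric) fall in emissions each year.

def annual_decline_pct(start_fraction, end_fraction, years):
    """Average annual percentage decline taking emissions from start to end."""
    return (1 - (end_fraction / start_fraction) ** (1 / years)) * 100

# US power-sector emissions relative to the 2005 baseline (figures from the text)
by_2013 = 0.85   # already down 15% by 2013
by_2030 = 0.68   # the pledged 32% cut

print(f"achieved 2005-2013: {annual_decline_pct(1.00, by_2013, 8):.1f}% per year")     # ~2.0%
print(f"required 2013-2030: {annual_decline_pct(by_2013, by_2030, 17):.1f}% per year")  # ~1.3%

# For comparison, the article cites trillionthtonne.org's figure for a better-than-even
# chance of staying under 2 degrees C: global emissions falling by more than 2.6% per year.
```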

Not that the US should come in for special criticism – the commitments from the EU and from China are similarly insufficient to head off the threat of dangerous climate change.

No time for early nights

So why hasn’t this been reported? I think what is going on here is partisanship and a well-intentioned desire to boost the prospects of a meaningful deal in Paris.

Climate change-denying Republicans hate this plan (of course), therefore all good climate realists see it as a triumph. But it is a tiny, tiny step in the right direction and climatically immaterial.

Ah yes, you say, but it’s politically important – the world’s hegemonic power has made a commitment, and that creates a foundation upon which greater progress can be made. Let’s not be pessimistic – this could be the start of a global deal.

Well, this emperor has no clothes. The pronouncement reminds me of the words of Neville Chamberlain on his return from the Munich Conference in 1938: “I believe it is ‘peace for our time’. Go home and get a nice quiet sleep.”

 


 

Tim Kruger is James Martin Fellow, Oxford Martin School at the University of Oxford.

This article was originally published on The Conversation. Read the original article.

 

It’s not just me on trial – it’s British democracy and British justice

Last December as part of the peaceful weekend Occupy Democracy protest in Parliament Square, I was standing quietly with a placard stating “Arrest Nick Clegg for Selling Stolen Peerages”.

Private security wardens working for Boris Johnson had been dangerously pushing protesters back over the fencing that the Greater London Authority had erected around the square to prevent the protest from taking place.

At about 7.30pm a protester whom I did not know jumped over the fence and ran past me. The warden who had been standing beside me lunged at the protester, and instinctively I reached out to protect the protester.


STOP PRESS 14/08/15 – Case Dismissed! The judge intervened half way through Donnachadh’s evidence, said it was clear that contact with the chief Heritage Warden was accidental and to continue was a waste of time. She also noted Donnachadh’s good character.


Whilst doing so I lightly bumped into the warden, who triumphantly turned to me and declared something along the lines of “I am going to get you for assault”. I stared open-mouthed at him and said something like “You cannot be serious.”

About an hour later, a number of police officers approached me whilst I was giving a Livestream interview about how corruption works in Britain’s political system. I was then taken to a police station and charged with ‘assault and battery’.

The trial finally takes place tomorrow, Friday 14th August at Westminster Magistrates Court.

Video: Donnachadh McCarthy’s arrest, 19th December 2014.

So why was I standing there with that placard?

The story starts nearly 20 years ago when, as a member of the Lib Dems’ national executive, I was astonished at how so many rich corporate donors, corporate lobbyists and corporate directors that I did not even know were members of the party would suddenly be appointed by the leadership to be Lib Dem peers in the House of Lords, with the power to pass legislation on our behalf for life!

As part of a series of measures aimed at cleaning up the corruption and dishonesty I found at the heart of the party, I decided with colleagues to seek to change the rules so that the leadership would no longer appoint the Party’s nominees to the Lords through patronage, but instead would have to appoint those elected by the party to be its nominees.

Ashdown was the leader at the time and, along with the party’s parliamentary parties in both houses of Parliament, he vigorously opposed the proposed elections.

However, after numerous attempts we finally got enough local member support for the proposal to be debated at conference. Amazingly, despite the leadership trashing the proposal, we won! The party subsequently elected a panel of nominees for the Lords.

By then the party had a new leader, Charles Kennedy, who largely respected the process in his first list of nominees to the Lords – six out of eight were from the list. But then, outrageously, he completely reneged on his second list, where the majority of the nominees were the usual donors and cronies and not from the elected list.

I was outraged and submitted a formal complaint to the party’s Federal Appeals Panel. Kennedy told them that he made the appointments in his capacity as an MP and leader of the Parliamentary Party and so the panel had no jurisdiction over him on the issue. The Panel accepted this and so the complaint failed.

I therefore submitted a formal complaint to the Commissioner on Parliamentary Standards, reporting Kennedy’s betrayal of his pledge to honour the electoral process. This time Kennedy told the Commissioner that he appointed the peers in his capacity as party leader and so they had no jurisdiction. The Commissioner accepted this and so the complaint was not upheld.

‘The system lacks integrity – but what can I do?’

A few years later, I left the party after my resignation had been demanded by the party president Navnit Dholakia, for whistle-blowing on the party’s corrupt refusal to implement the rules on Lib Dem peers selling their services as corporate lobbyists to the nuclear, arms, GMO and alcohol corporations.

Despite the Kennedy betrayal, the party kept electing its nominees to the Lords. And Nick Clegg, the new leader, kept ignoring them by continuing to appoint the donors and cronies he chose.

In November 2014, I wrote to Lord Bew, chair of the supposed Committee on Standards in Public Life, reporting a statement by the LibDem Peer and former party treasurer, Lord Razzall, that the system of party donors was “quasi-corrupt” and asked him to conduct an inquiry into the selling of peerages, stolen from those duly elected by the party, by Nick Clegg.

Lord Bew replied stating that he agreed that the system of party donors becoming peers raised suspicions and that the party funding system was corrosive and lacked integrity. But he added that, as the political parties had refused to address this lack of integrity, there was nothing further his committee could do.

So having exhausted all political and procedural means at my disposal, I was left with nothing but my democratic right to protest, which is why I ended up standing there with that banner calling for Nick Clegg’s arrest for selling stolen peerages, on the night I was arrested.

These abuses are corrupting our politics from top to bottom

But why have I spent so much time over so many years campaigning on this political issue?

It’s because I believe that the corrupt system of selling peerages by our major party leaders goes to the heart of the corruption of our political system.

It has resulted in a House of Lords that is full of

  • private health care lobbyists selling off our precious NHS,
  • fossil fuel lobbyists trashing our planet,
  • bankers skewing the system in favour of the 1% ultra-rich so that they can accrue huge fortunes in off-shore tax-havens etc etc.

I have written about this corruption and its horrendous consequences in greater depth in the Chapter ‘House of Lordly Prostitutes’ in my book on how Britain’s democracy has been bought, The Prostitute State.

And that is the reason why I was standing there protesting with my banner last December and that is the reason why I will end up standing in the dock on Friday.

An important test of Tim Farron, and of British justice

It will be an interesting integrity test for the new Lib Dem leader Tim Farron, as to whether he will stick with the corrupt patronage system or respect the party’s electoral procedures when he makes any new LibDem appointments to the Lords.

The case against me will also put British justice itself to the test. The police and the Crown Prosecution Service are in possession of video footage of the ‘incident’ which led to my arrest. I have never seen the video but, knowing what actually took place, I’m certain that it would, if shown to the court, lead to the collapse of the case against me.

However the police and CPS have refused to disclose this evidence to me, in a clear violation of the Attorney General’s Guidelines On Disclosure, which state that

“Every accused person has a right to a fair trial, a right long embodied in our law and guaranteed under Article 6 of the European Convention on Human Rights (ECHR). A fair trial is the proper object and expectation of all participants in the trial process. Fair disclosure to an accused is an inseparable part of a fair trial.”

Specific examples include any material “casting doubt upon the accuracy of any prosecution evidence”, “that might go to the credibility of a prosecution witness” or that “might support a defence that is either raised by the defence or apparent from the prosecution papers.”

In an astonishing judgment that appears to be in complete violation of these guidelines, a District Judge this week refused my request for an order on the prosecution to disclose the video evidence. The matter will be raised again as it is impossible for me to receive a fair trial if such important evidence is withheld from the court.

Another development troubling me is that instead of having a District Judge presiding over the trial, my case will be heard by three lay magistrates.

One of my main defences is that the Greater London Authority was acting illegally to block freedom to assemble and protest in the public square that evening. Thus a deep knowledge of human rights and freedom to protest law will be required to give me a fair trial. I am not convinced that lay magistrates will be up to the task.

But while the legal noose tightens around my neck, what is clearly a corrupt, illegal and deeply anti-democratic practice – the sale of peerages by political parties at a going rate of some £300,000 a pop – goes uninvestigated and unprosecuted.

Truly, something is rotten in the state of Britain.

 


 

Donnachadh McCarthy is a founder of Stop Killing Cyclists, a member of Occupy Democracy, co-organiser for Occupy Rupert Murdoch Week, a former Deputy Chair of the Liberal Democrats, and author of ‘The Prostitute State – How Britain’s Democracy Has Been Bought‘. He can be reached via his website 3acorns. Follow on Facebook.

Author’s note: Any Occupy protester who would like a free ebook version of ‘The Prostitute State – How Britain’s Democracy Has Been Bought‘, please email me on contact AT 3acorns.co.uk.

 

Do not disturb! Persecuting badgers may perpetuate TB hotspots

A paper published today in the journal Scientific Reports shows that badger persecution may have a role in perpetuating ‘bovine TB hotspots’ by repeatedly disrupting the animals’ social structure.

Moreover the main risk factors for cattle infection with bovine TB are those linked to cattle, not badgers.

According to the paper, ‘Herd-level bovine tuberculosis risk factors: assessing the role of low-level badger population disturbance‘:

“Cattle risk factors (movements, international imports, bTB history, neighbours with bTB) were more strongly associated with herd risk than area-level measures of badger social group density, habitat suitability or persecution (sett disturbance).

“Highest risks were in areas of high badger social group density and high rates of persecution, potentially representing both responsive persecution of badgers in high cattle risk areas and effects of persecution on cattle bTB risk through badger social group disruption.”

Co-author Rowland Kao of the Boyd Orr Centre for Population and Ecosystem Health at the University of Glasgow said: “What we know from the Randomised Badger Culling Trial (RBCT) is that intense culling of badgers over a small area can have an overall negative impact on cattle bTB.

“Here, we show that badger persecution over a very broad area does not appear to reduce the risk for cattle – further it is illegal, and may even make matters worse.”

The RBCT, which took place in England, showed that while intensive culling was associated with a decrease in cattle bTB inside cull areas, it also prompted an increase in bTB in neighbouring herds. The likely reason is that disruption to the social structure of badger populations increases the spread of bTB from infected badgers to cattle.

Farm level risk factors are the key to bTB

The Wellcome Trust-funded research was carried out by the University of Glasgow, Queen’s University Belfast (QUB) and the Agri-Food and Biosciences Institute (AFBI). Dr David Wright of QUB, who led the study, said:

“Whilst the incidence of badger persecution was low, we hypothesised that those taking pre-emptive action against badgers may contribute to maintaining the disease. We were interested in investigating the interactions between cattle and disturbed and undisturbed badger populations.”

The analysis was based on surveys of badger setts in Northern Ireland, of which about one in 20 showed signs of interference: digging indicative of badger baiting, entrances blocked with soil, boulders, branches or other debris inserted directly into holes, or farm debris, including bricks, dumped on top of setts.

Also recorded were agricultural disturbance, such as setts being ploughed over or damaged by livestock trampling; development, such as the construction of roads or newly built houses; and slurry being pumped into holes.

The UK Government spends more than £100m per annum on bTB testing, slaughter and compensation. Badger culling trials in the UK and Ireland have failed to show definitive benefits in terms of bTB reduction – and persecution may be one reason why.

Farmers must be aware of the risks of disturbing badgers

According to the scientists, badger persecution was more common in areas that had a history of high cattle bTB risk, indicating that responsive persecution is taking place in areas where badgers are perceived to be a threat. The paper states:

“Our results do not exclude the possibility that bTB risk differentials among areas are maintained by continued high levels of persecution, potentially through badger population perturbation. The two processes, responsive persecution and perturbation may operate in parallel, leading to positive feedbacks which may contribute to the persistence of bTB hotspots in certain areas independent of established cattle risk factors.”

The paper also advises farmers and others against disturbing badger setts: “Persecution did not appear to substantially reduce cattle bTB risk and may have exacerbated the problem by triggering perturbation. Therefore, it may be beneficial to inform stakeholders of the risks incurred by disturbing setts.

“These findings should also be considered when designing bTB control programmes that use sub-lethal interventions in the badger population (including proposed badger vaccination programme) and efforts should be made to minimise disturbance of badger social group structure in the implementation of such programmes.”

The study, the authors conclude, also highlights “the importance of preventing transmission within the primary population through discouraging unnecessary cattle movement and increasing further the efficacy of testing programmes.

“Interventions to address these issues, including risk-based trading and bTB testing programmes are likely to be considerably less expensive and more publicly acceptable than schemes based on culling of badgers and may be more cost-effective and easier to monitor.”

The paper, ‘Herd-level bovine tuberculosis risk factors: assessing the role of low-level badger population disturbance’, by David M Wright et al, is published in Scientific Reports and is available open access.

Oliver Tickell edits The Ecologist.

Fukushima: thousands have died, thousands more will die

Official data from Fukushima show that nearly 2,000 people died from the effects of evacuations necessary to avoid high radiation exposures from the disaster.

The uprooting to unfamiliar areas, cutting of family ties, loss of social support networks, disruption, exhaustion, poor physical conditions and disorientation can and do result in many people, in particular older people, dying.

Increased suicide has occurred among younger and older people following the Fukushima evacuations, but the trends are unclear.

A Japanese Cabinet Office report stated that, between March 2011 and July 2014, 56 suicides in Fukushima Prefecture were linked to the nuclear accident. This should be taken as a minimum, rather than a maximum, figure.

Mental health consequences

It is necessary to include the mental health consequences of radiation exposures and evacuations. For example, Becky Martin has stated that her PhD research at Southampton University in the UK shows that “the most significant impacts of radiation emergencies are often in our minds.”

She adds: “Imagine that you’ve been informed that your land, your water, the air that you have breathed may have been polluted by a deadly and invisible contaminant. Something with the capacity to take away your fertility, or affect your unborn children.

“Even the most resilient of us would be concerned … many thousands of radiation emergency survivors have subsequently gone on to develop Post-Trauma Stress Disorder (PTSD), depression, and anxiety disorders as a result of their experiences and the uncertainty surrounding their health.”

It is likely that these fears, anxieties, and stresses will act to magnify the effects of evacuations, resulting in even more old people dying or people committing suicide.

Such considerations should not be taken as arguments against evacuations, however. They are an important, life-saving strategy. But, as argued by Becky Martin,

“We need to provide greatly improved social support following resettlement and extensive long-term psychological care to all radiation emergency survivors, to improve their health outcomes and preserve their futures.”

Untoward pregnancy outcomes

Dr Alfred Körblein from Nuremberg in Germany recently noticed and reported a 15% drop (highly statistically significant) in the number of live births in Fukushima Prefecture in December 2011, nine months after the accident.

This might point to higher rates of early spontaneous abortions. He also observed a (statistically significant) 20% increase in the infant mortality rate in 2012, relative to the long-term trend in Fukushima Prefecture plus six surrounding prefectures, which he attributes to the consumption of radioactive food:

“The fact that infant mortality peaks in May 2012, more than one year after the Fukushima accident, suggests that the increase is an effect of internal rather than external radiation exposure.

“In Germany [after the Chernobyl nuclear disaster] perinatal mortality peaks followed peaks of cesium burden in pregnant women with a time-lag of seven months [2]. May 2012 minus seven months is October 2011, the end of the harvesting season. Thus, consumption of contaminated foodstuff during autumn 2011 could be an explanation for the excess of infant mortality in the Fukushima region in 2012.”

These are indicative rather than definitive findings and need to be verified by further studies. Unfortunately, such studies are notable by their absence.

Cancer and other late effects from radioactive fallout

Finally, we have to consider the longer-term health effects of the radiation exposures from the radioactive fallout after the four explosions and three meltdowns at Fukushima in March 2011. Views on this issue differ widely in Japan, making it difficult for lay people and journalists to understand what the real situation is.

The Japanese Government, its advisors, and most radiation scientists in Japan (with some honourable exceptions) minimise the risks of radiation. The official, widely observed policy is that small amounts of radiation are harmless: scientifically speaking, this is untenable.

For example, the Japanese Government is attempting to increase the public limit for radiation in Japan from 1 mSv to 20 mSv per year. Its scientists are trying to force the ICRP to accept this large increase. This is not only unscientific, it is also unconscionable.

Part of the reason for this policy is that radiation scientists in Japan (in the US, as well) appear unable or unwilling to accept the stochastic nature of low-level radiation effects. ‘Stochastic’ means an all-or-nothing response: you either get cancer etc or you don’t.

As you decrease the dose, the effects become less likely: your chance of cancer declines all the way down to zero dose. The corollary is that tiny doses, even well below background, still carry a small chance of cancer: there is never a safe dose, except zero dose.
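
To make the dose-response explicit (an illustrative sketch only, with notation of my own rather than from any of the cited studies): under the Linear No Threshold (LNT) model referred to below, the excess risk of a stochastic effect is directly proportional to the dose received:

R(D) = \alpha D

where D is the dose, R(D) the excess risk (of fatal cancer, say) and \alpha a risk coefficient, roughly 10% per sievert for fatal cancer, the figure used in the calculation further down. The only dose carrying zero risk is D = 0.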

But, as observed by Spycher et al (2015), some scientists “a priori exclude the possibility that low dose radiation could increase the risk of cancer. They will therefore not accept studies that challenge their foregone conclusion.”

One reason why such scientists refuse to accept radiation’s stochastic effects (cancers, strokes, CVS diseases, hereditary effects, etc) is that they only appear after long latency periods – often decades for solid cancers. For the Japanese Government and its radiation advisors, it seems out-of-sight means out-of-mind.

This conveniently allows the Japanese Government to ignore radiogenic late effects. But the evidence for them is absolutely rock solid. Ironically, it comes primarily from the world’s largest ongoing epidemiology study, the Life Span Study of the Japanese atomic bomb survivors carried out by the Radiation Effects Research Foundation (RERF), based in Hiroshima and Nagasaki.

The lessons of Chernobyl

The mass of epidemiological evidence from the Chernobyl disaster in 1986 clearly indicates that cancer etc increases will very likely also occur at Fukushima, but many Japanese (and US) scientists deny this evidence.

For example, much debate currently exists over the existence and interpretation of increased thyroid cancers, cysts, and nodules in Fukushima Prefecture resulting from the disaster. From the findings after Chernobyl, thyroid cancers are expected to start increasing 4 to 5 years after 2011.

It’s best to withhold comment until clearer results become available in 2016, but early indications are not reassuring for the Japanese Government. After that, other solid cancers are expected to increase as well, but it will take a while for these to become manifest.

The best way of forecasting the numbers of late effects (ie cancers etc) is by estimating the collective dose to Japan from the Fukushima fallout. We do this by envisaging that everyone in Japan exposed to the radioactive fallout from Fukushima has thereby received lottery tickets: but they are negative tickets. That is, if your lottery number comes up, you get cancer [1].

If you live far away from Fukushima Daiichi NPP, you get few tickets and the chance is low: if you live close, you get more tickets and the chance is higher. You can’t tell who will be unlucky, but you can estimate the total number by using collective doses.

The 2013 UNSCEAR Report has estimated that the collective dose to the Japanese population from Fukushima is 48,000 person Sv: this is a very large dose: see below.
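
For clarity (again my notation, added for illustration): the collective dose S is simply the sum of the individual doses D_i over everyone exposed, expressed in person-sieverts:

S = \sum_i D_i

A collective dose of 48,000 person-Sv could therefore arise, for instance, from 48 million people each receiving 1 mSv, or from any other combination of individual doses summing to the same total.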

Unfortunately, pro-nuclear Japanese scientists also criticise the concept of collective dose, as it relies on the stochastic nature of radiation’s effects and on the Linear No Threshold (LNT) model, which they also reject. But almost all official regulatory bodies throughout the world recognise the stochastic nature of radiation’s effects, the LNT model, and collective doses.

Summing up Fukushima

About 60 people died immediately during the actual evacuations in Fukushima Prefecture in March 2011. Between 2011 and 2015, an additional 1,867 people [2] in Fukushima Prefecture died as a result of the evacuations following the nuclear disaster [3]. These deaths were from ill health and suicides.

From the UNSCEAR estimate of 48,000 person-Sv, it can be reliably estimated (using a fatal cancer risk factor of 10% per Sv) that about 5,000 fatal cancers will occur in Japan in future from Fukushima’s fallout. This estimate from official data agrees with my own personal estimate using a different methodology.
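
For readers wishing to check the arithmetic, the estimate is a single multiplication of the collective dose by the risk factor just quoted:

48{,}000 \text{ person-Sv} \times 0.10 \text{ fatal cancers per person-Sv} \approx 4{,}800

which is rounded here to about 5,000 fatal cancers.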

In sum, the health toll from the Fukushima nuclear disaster is horrendous. At a minimum:

  • Over 160,000 people were evacuated, most of them permanently.
  • Many cases of post-traumatic stress disorder (PTSD), depression, and anxiety disorders arising from the evacuations.
  • About 12,000 workers exposed to high levels of radiation, some up to 250 mSv.
  • An estimated 5,000 fatal cancers from radiation exposures in future.
  • Plus similar (unquantified) numbers of radiogenic strokes, CVS diseases and hereditary diseases.
  • Between 2011 and 2015, about 2,000 deaths from radiation-related evacuations due to ill-health and suicides.
  • An as yet unquantified number of thyroid cancers.
  • An increased infant mortality rate in 2012 and a decreased number of live births in December 2011.

Non-health effects include:

  • 8% of Japan (30,000 sq.km), including parts of Tokyo, contaminated by radioactivity.
  • Economic losses estimated at between $300 billion and $500 billion.


Catastrophes that must never be repeated

The Fukushima accident is still not over and its ill-effects will linger long into the future. However, we can say now that the nuclear disaster at Fukushima delivered a huge blow to Japan and its people.

2,000 Japanese people have already died from the evacuations and another 5,000 are expected to die from future cancers.

It is impossible not to be moved by the scale of Fukushima’s toll in terms of deaths, suicides, mental ill-health and human suffering. Fukushima’s effect on Japan is similar to Chernobyl’s massive blow against the former Soviet Union in 1986.

Indeed, several writers have expressed the view that the Chernobyl nuclear disaster was a major factor in the subsequent collapse of the USSR in 1991.

It is notable that Mikhail Gorbachev, the Soviet leader at the time of Chernobyl, and Naoto Kan, Prime Minister of Japan at the time of Fukushima, have both expressed their opposition to nuclear power. Indeed, Kan has called for all nuclear power to be abolished.

Has the Japanese Government, and indeed other governments (including the UK and US), learned from these nuclear disasters? The US philosopher George Santayana (1863-1952) once stated that those who cannot learn from history are doomed to repeat it.

Dr Ian Fairlie is an independent consultant on radioactivity in the environment. He has a degree in radiation biology from Bart’s Hospital in London and his doctoral studies at Imperial College in London and Princeton University in the US concerned the radiological hazards of nuclear fuel reprocessing.

Ian was formerly a DEFRA civil servant working on radiation risks from nuclear power stations. From 2000 to 2004, he was head of the Secretariat to the UK Government’s CERRIE Committee on internal radiation risks. Since retiring from Government service, he has acted as a consultant to the European Parliament, local and regional governments, environmental NGOs, and private individuals.

See also Ian Fairlie’s blog, where this article was originally published.

Thanks to Azby Brown, Yuri Hiranuma, Dr Tadahiro Katsuta, Dr Alfred Körblein, Becky Martin, and Mycle Schneider for comments on early drafts. Any errors are mine.

1