Monthly Archives: February 2016

Time to crack down on car pollution – the silent killer with powerful friends

Faced with a public health crisis, responsible for nearly half a million premature deaths in Europe each year, we would expect an emergency response.

We would not expect those responsible for creating such a deadly crisis to be allowed to continue getting away with it.

And it would certainly be reasonable to expect those with the power to curb such a catastrophe to take all necessary action to deal with it, rather than colluding with the perpetrators.

Yet this is a tale of a silent killer stalking our streets, backed by a powerful industry with friends in high places. This faceless slayer is air pollution, the health impacts of which cost society up to €940bn annually.

Members of the European Parliament have a chance to tackle this outrage in a crucial vote on Wednesday and we call on them to do the right thing.

One of the most significant causes of air pollution is cars. Nitrogen oxide emissions from diesel vehicles alone are responsible for around 75,000 premature deaths in Europe each year, while air pollution contributes to the deaths of 29,000 people a year in the UK, with cars a key contributor.

Yet measures being introduced by the European Commission, taken by non-elected officials in a technical working group, and backed by national governments, will allow cars to exceed legal EU limits on pollutants. Indeed, the Tory government – in a private briefing leaked to The Ecologist – have expressed “strong support for the current agreement.”

VW scandal heralds rewriting of pollution laws

This is the first European regulatory measure to be introduced in the wake of the Volkswagen scandal in which the German car manufacturer was found to have used software to purposely deceive Nitrogen Oxide (NOx) tests.

But rather than clamping down on the car industry’s irresponsible approach to pollution, EU governments and the Commission instead want to rewrite existing law, providing loopholes which will allow cars to legally pollute more.

They want to apply what are termed ‘conformity factors’, which will permit a discrepancy between the regulatory limits enshrined in law and the newly introduced Real Driving Emissions (RDE) tests designed to assess vehicle emissions on the road. This would override the legal pollutant limits set out in EU law since 2007 and allow the limits for nitrogen oxide (NOx) to be exceeded by a whopping 110% until 2021, and by 50% thereafter.
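For concreteness, the conformity-factor arithmetic can be sketched as follows. The 80 mg/km baseline used here is the Euro 6 NOx limit for diesel cars, an assumption not stated in the text; the factors 2.1 and 1.5 correspond to the 110% and 50% exceedances cited above:

```python
# Sketch of the 'conformity factor' arithmetic described above.
# Assumes the Euro 6 diesel NOx limit of 80 mg/km (not stated in the text);
# factors of 2.1 and 1.5 correspond to the 110% and 50% exceedances cited.
EURO6_NOX_LIMIT_MG_PER_KM = 80

def allowed_nox_mg_per_km(year):
    """Effective on-road NOx ceiling once the conformity factor is applied."""
    conformity_factor = 2.1 if year <= 2021 else 1.5
    return EURO6_NOX_LIMIT_MG_PER_KM * conformity_factor

print(allowed_nox_mg_per_km(2019))  # more than double the 80 mg/km legal limit
print(allowed_nox_mg_per_km(2023))  # still half as much again as the legal limit
```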

The 2007 legislation already requires that the limit values for pollutants be met in “normal use”, so there is no reason why these tests should allow the limits to be dramatically weakened.

The car industry lobby – a deadly influence

The decision to undermine pollutant limits agreed by directly elected members of the European Parliament underlines the damaging, and deadly, influence that car industry lobbying has on the European Commission and EU governments.

Greens in the European Parliament recently commissioned a study on the influence of the car lobby. The report shows that key Member States and car companies have colluded to fight against progressive measures, while the European Parliament’s Environment Committee and civil society organisations have been working to protect human health and the environment.

The report finds that car manufacturers exert high level political influence in Europe, using as leverage their economic clout and the fact they employ large numbers of skilled employees. The relationship between car manufacturers and government is particularly strong in Germany, but the report cites plenty of examples of a cosy relationship in the UK too.

The report says the UK government advocated for low compliance standards and a delay in implementing emissions standards. It also, apparently, agreed privately with German Chancellor Angela Merkel to undermine EU emissions policy. Meanwhile, the Department for Environment, Food and Rural Affairs (DEFRA) has questioned scientific evidence that NOx is harmful to health.

Not to be left out of the act, the UK auto industry trade association, the Society of Motor Manufacturers and Traders (SMMT), appears to have misrepresented its actual knowledge of real-world diesel emissions in its public-facing messaging. The SMMT launched a campaign in 2015 promoting diesel engines as the “cleanest ever”.

Getting away with murder

This Wednesday the European Parliament will get a say on whether or not to reject the proposed measures. This is a vital opportunity for MEPs to follow the recommendation of the Parliament’s environment committee and vote the measures down.

Weakening the existing EU pollution limits for cars will make it impossible for European cities to meet the standards for air quality set out in EU law. We cannot let car manufacturers, EU technicians and compliant politicians undo the progress made to protect public health and the environment.

In short, we cannot let them continue to get away with murder. 


Jean Lambert is Green MEP for London.

Molly Scott Cato is Green MEP for the South West of England and Gibraltar.

Keith Taylor is Green MEP for the South East of England.

 

‘Renewable energy highways’ offer quick fix for US emissions

The US could reduce greenhouse gas emissions from electricity generation by 80% below 1990 levels within 15 years just by using renewable sources such as wind and solar energy, according to a former government research chief.

The nation could do this using only technologies available right now, by introducing a national grid connected by high voltage direct current (HVDC) lines that could carry power with minimal loss to the places that need it most, when they need it.

This utopian vision – and it has been dreamed at least twice before by researchers in Delaware and in Stanford, California – comes directly from a former chief of research in a US government agency, the National Oceanic and Atmospheric Administration (NOAA).

Their ideas are set out in a report in Nature Climate Change by Dr Alexander MacDonald – a distinguished meteorologist who was until recently head of NOAA’s Earth System Research Laboratory in Colorado – and colleagues at the University of Colorado.

Taking a network approach on a continental scale

Instead of factoring in fossil fuel backup, or yet-to-be-invented methods of storing electricity from wind and solar sources, they took a new look at the simple problems of supply and demand in a nation that tends to be sunny and warm in the south and windy in the north, but not always reliably so in either place.

Their reasoning was that storage technologies could only increase the cost of renewable energy and make the task of reducing carbon emissions harder. So they modelled US weather on timescales of one hour, over divisions of the nation as small as 13 kilometres across, to see what costs, demand and carbon dioxide emissions would be, and how easily renewable power could meet the demand.

They reasoned that even though wind turbines are vulnerable to periods of calm and that solar energy sources don’t do much in rainy weather or at night, there would always be some parts of the country that could be generating energy from a renewable source.

They then factored in future costs – the cost of both wind and solar has been falling steadily – and scaled up renewable energy to match the available wind and sunlight in the US at any time.

“Our research shows a transition to a reliable, low-carbon, electrical generation and transmission system can be accomplished with commercially available technology, and within 15 years”, says Dr MacDonald. The model embraced fossil fuel sources as well as renewable ones, for purposes of comparison. It revealed that low cost and low emissions are not mutually exclusive. The US could have both.

“The model relentlessly seeks the lowest-cost energy, whatever constraints are applied”, says Christopher Clack, a physicist and mathematician with the Cooperative Institute for Research in Environmental Sciences at the University of Colorado Boulder, and a co-author of the study. “And it always installs more renewable energy on the grid than exists today.”
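The least-cost logic Clack describes can be illustrated with a toy ‘merit order’ dispatch, in which each hour’s demand is filled from the cheapest available source first. This is not the study’s actual model, and all the numbers below are invented for illustration:

```python
# Toy illustration (NOT the study's model) of least-cost dispatch:
# fill each hour's demand from the cheapest available source first.
def cheapest_mix(demand_mw, available):
    """available maps source -> (cost_per_mwh, capacity_mw); returns dispatch."""
    dispatch, remaining = {}, demand_mw
    # Walk the sources in ascending order of cost (the 'merit order').
    for source, (cost, cap) in sorted(available.items(), key=lambda kv: kv[1][0]):
        used = min(cap, remaining)
        dispatch[source] = used
        remaining -= used
    return dispatch

# A windy, sunny hour: renewables cover demand and gas backup sits idle.
hour = {"solar": (25.0, 60.0), "wind": (30.0, 80.0), "gas": (50.0, 1000.0)}
print(cheapest_mix(100.0, hour))  # {'solar': 60.0, 'wind': 40.0, 'gas': 0.0}
```

Run hour by hour over a continent-scale grid, an optimiser of this general character will keep adding cheap wind and solar wherever transmission can deliver it, which is the behaviour Clack reports.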

HVDC power links are the ‘new interstate highways’

Even in a scenario where renewable energy cost more than experts predicted, the model produced a system that cut carbon dioxide emissions 33% below 1990 levels by 2030, and delivered electricity at about 8.6¢ per kilowatt hour (kWh). By comparison, electricity cost 9.4¢ per kWh in 2012.

If renewable energy costs were lower and natural gas costs higher, as is expected in the future, the modelled system sliced carbon dioxide emissions by 78% from 1990 levels and delivered electricity at 10¢ per kWh. The year 1990 is the baseline for greenhouse gas calculations.

The model achieved its outcome without relying on any new electrical storage systems. The national grid did need augmentation from nuclear energy, hydropower and natural gas, but the real innovation would be the connection of large numbers of low-cost renewable energy sources to high-energy-demand centres, using efficient new transmission systems.

It seems that HVDC transmission is the key to keeping costs down, and Dr MacDonald compared such power links to the interstate highways that cross the US, and which transformed the US economy 50 years ago. “With an ‘interstate for electrons’, renewable energy could be delivered anywhere in the country while emissions plummet”, he says.

“An HVDC grid would create a national electricity market in which all types of generation, including low-carbon sources, compete on a cost basis. The surprise was how dominant wind and solar could be.”


Tim Radford is a founding editor of Climate News Network and has worked for The Guardian for 32 years, for most of that time as science editor. He has been covering climate change since 1988.

This article was originally published by Climate News Network.


Science Museum must get out of bed with anti-science Big Fossil funders

Climate change is one of the most serious challenges human civilisation has ever faced.

Despite a continued effort by vested interests to undermine or redirect attention away from the science of climate change, we are more certain than ever that the problem is as urgent as it is complex to tackle.

It is not impossible to navigate our way through this crisis, but doing so will require deep introspection across almost all aspects of our society and economy.

We will need to consider deeply different and sometimes drastic changes to the status quo. The way in which we prioritise, conduct and communicate science is no exception.

If we were to ask non-scientists whether they thought scientists were doing research that would further damage our environment, they would likely assume not.

The reality, sadly, falls short of that ideal: despite knowing our oil reserves cannot all be burned, geologists continue to look for more reserves. Physicists, chemists and chemical engineers continue to research technology to better extract more oil from the reserves and access new sources of gas through ‘fracking’.

Big Fossil arts sponsorship is not ‘harm free’

In the area of science communication, fossil fuel companies continue to use their money and influence to appear to be allies of science, when in reality the only solutions they are interested in are those that fit their profit-driven business model.

This problem is clear too in the sponsorship of museums and cultural institutions by fossil fuel companies. By sponsoring the Science Museum, BP (and previously Shell) gains a legitimacy it does not deserve. BP’s (and Shell’s) own forecasts for fossil fuel use are consistent with over 2C of warming.

We are now at a stage where both the scientific consensus and the political consensus (through the Paris agreement) acknowledge that not only should we not exceed 2C but that we should start to aim lower, towards 1.5C – a stark contrast with the future business forecasts of BP and others.

Video: PSI sponsorship protest at the Science Museum (PSI, Vimeo).

It is also important to note that BP has a history with the climate-science-denying lobbying group the American Legislative Exchange Council (ALEC), which has been behind model legislation attempting to limit renewable energy in the US, has stated that human-caused climate change is “uncertain”, and has described attempts to curb greenhouse gas emissions as a “train wreck”.

BP claimed in 2015 to be cutting ties with ALEC but BP still funds climate misinformation through other means.

Finally, no discussion of BP can be complete without examining its environmental record. The Deepwater Horizon oil spill was the largest in the history of the oil industry. Before it was sealed, it had leaked nearly 5 million barrels of oil.

In 2014 BP announced the clean-up was “substantially complete”, but US authorities countered that much work still remained. Responsibility for the spill was found to lie predominantly with BP’s management of the rig. The resulting fines totalled $18.7 billion – the biggest in corporate history. Undeterred, BP is planning to drill deeper wells in the Great Australian Bight.

It was for this reason that I, along with other members of the Progressive Science Institute (PSI) and members of BP or not BP, gatecrashed the Science Museum’s ‘Late’ event themed around the BP-sponsored exhibit Cosmonauts. As a scientist, I cannot allow this incompatibility with the Science Museum’s aim of communicating science to go unchallenged.

The Science Museum has a duty to represent and communicate the scientific consensus. In 2016, we now have a well-established scientific consensus on climate change and even an international political one through the Paris Agreement, and this has to be reflected at every level of the Science Museum, including its funding partners.

The Paris Agreement won’t deliver unless we make it!

We cannot stop here – the Paris Agreement is nothing if left alone. It relies both on a future ramping up of commitments and on wider society holding governments to account for keeping the commitments they have made.

Looking forward to 2100, it is clear that if we are to come through this challenge, the fossil fuel companies cannot possibly exist in anything like their current form. Today, then, we must fight for this to be reflected in our financial, technological and cultural investments in them.

This means campaigning to move our money away from fossil fuels. It also needs something else: a radical change in how we understand science. Science is not a process independent of the society it is part of, and it cannot afford to cut itself off from the public.

The public must be able to see and input into the priorities and values behind what science is funded. It also means a much louder, self-aware scientific voice in societal and political issues that affect everyone.

 


 

Drew Pearce is a PhD student at Imperial College London and a member of the Progressive Science Institute. PSI was founded to challenge the assumption that science is, in and of itself, a progressive force in society. The reality is that scientific research is subject to the same political, economic and societal pressures and biases faced in any area of human endeavour, and only by examining this reality critically can science live up to its ideals.

 

My Spiritual Journey

A former monk and long-term peace and environment activist, Satish Kumar has been quietly setting the Global Agenda for change for over 50 years. The event is hosted by Alternatives, an independent not-for-profit organisation, based in central London, that is dedicated to raising awareness and offering practical, inspiring solutions for everyday living.

Satish’s introduction to his talk
As our bodies make a physical journey, our soul is on a spiritual journey. The two journeys complement one another, but often we forget the spiritual dimension of our journey and get bogged down in the problems and trials of the physical journey. By making a spiritual journey, the physical journey also becomes joyful. Satish Kumar will relate his own journey, both physical and spiritual, in this talk.

About Satish Kumar
A former monk and long-term peace and environment activist, Satish Kumar has been quietly setting the Global Agenda for change for over 50 years. He was just nine when he left his family home to join the wandering Jains and 18 when he decided he could achieve more back in the world, campaigning for land reform in India and working to turn Gandhi’s vision of a renewed India and a peaceful world into reality.

Inspired in his early 20s by the example of the British peace activist Bertrand Russell, Satish embarked on an 8,000-mile peace pilgrimage together with E.P. Menon. Carrying no money and depending on the kindness and hospitality of strangers, they walked from India to America, via Moscow, London and Paris, to deliver a humble packet of ‘peace tea’ to the then leaders of the world’s four nuclear powers.

Pioneering educational ventures
In 1973 Satish settled in the United Kingdom taking up the post of editor of Resurgence magazine, a position he has held ever since, making him the UK’s longest-serving editor of the same magazine. During this time, he has been the guiding spirit behind a number of now internationally-respected ecological and educational ventures including Schumacher College in South Devon where he is still a Visiting Fellow.

In his 50th year, Satish undertook another pilgrimage – again carrying no money. This time, he walked 2,000 miles to the holy places of Britain, a venture he describes as a celebration of his love of life and nature.

In July 2000 he was awarded an Honorary Doctorate in Education from the University of Plymouth. In July 2001, he received an Honorary Doctorate in Literature from the University of Lancaster.  And in the November of that same year, he was presented with the Jamnalal Bajaj International Award for Promoting Gandhian Values Abroad.

Publishing with Green Books
His autobiography, No Destination, first published by Green Books in 1978, has sold over 50,000 copies. He is also the author of You Are, Therefore I Am: A Declaration of Dependence and The Buddha and the Terrorist.

In 2005, Satish was Sue Lawley’s guest on Radio 4’s Desert Island Discs. In 2008, as part of BBC2’s Natural World series, he presented a 50-minute documentary from Dartmoor, Earth Pilgrim, which was watched by over 3.6 million people. He also appears regularly in the media, on a range of programmes including Thought for the Day and Midweek.

Internationally-renowned speaker and teacher
Satish is on the Advisory Board of Our Future Planet, a unique online community sharing ideas for real change, and in recognition of his commitment to animal welfare and compassionate living he was recently elected a vice-president of the RSPCA. He continues to teach and run workshops on reverential ecology, holistic education and voluntary simplicity, and is a much sought-after speaker both in the UK and abroad.

Event timetable
18.30 Doors open
19.00 Talk commences
20.30 Book Signing

Venue
St James’s Church, 197 Piccadilly, London W1J 9LL

For more information and bookings visit the Alternatives website


Satish Kumar is editor-in-chief at Resurgence & Ecologist which is published by The Resurgence Trust, an educational charity (no. 1120414). The Resurgence Trust also publishes the Ecologist website.

 

Brussels biotech lobby’s last push for ‘GM 2.0’ technologies to escape regulation

The European Commission is close to issuing a crucial decision on whether or not a new generation of genetic engineering techniques will be covered by EU GMO legislation.

If it turns out that one or more techniques are excluded from regulation, this means that the resulting GM products will go untested, unmonitored, and unlabelled – the mission of a well-rehearsed, below-the-radar industry lobby offensive.

A report by Corporate Europe Observatory published today, based on numerous documents released under freedom of information requests, illuminates how industry has attempted to bend the rules to let new GMOs slide through.

Industry has resurrected the pitch it used 20 years ago with GM 1.0 to help usher in the era of ‘GM 2.0’. This pitch predictably cites key challenges we face today, notably “rapid world population growth, climate change, and increasing scarcity of resources such as soil and water.”

New GM techniques, industry claims yet again, will come to the rescue by massively improving the precision and speed of the plant breeding process. Important objectives allegedly include pest resistance, drought tolerance …

Yet the very first new GM crop in the pipeline, developed by the Canadian company Cibus, is another herbicide-tolerant oilseed rape. Herbicide-tolerant GM crops are wreaking social and environmental havoc in the countries where they are mass produced. Furthermore, the claims about the benefits of GMOs have been refuted time and again.

Never mind the law – the pressure is on

As Janet Cotter and Ricarda Steinbrecher recently argued in The Ecologist, products from so-called ‘gene editing’ technologies clearly fall within the definition of a GMO in both European and international law, and present real risks to the environment and human health.

Environmental and farming groups have united to demand that products obtained through these new GM techniques will be regulated, and that GM rules are ultimately strengthened.

And the pressure is on. Industry has set up an EU-level lobbying vehicle with the goal of having as many of the new GM techniques as possible excluded from EU regulations. This ‘New Breeding Techniques (NBT) Platform’ is run by Schuttelaar & Partners, a Dutch lobby and PR firm with a shady reputation for pro-GM lobbying.

This is the crew that was responsible for coaching Monsanto in the smooth introduction of the first Roundup Ready (herbicide-tolerant) crops on the European market in 1996. The firm’s lobby campaign relied on unfounded claims about environmental benefits such as reduced pesticide use.

The big biotech corporations like Bayer, Monsanto and Dow AgroSciences have filed dozens of patent applications on new GM techniques. These techniques allow developers to make one or more changes to an existing variety with a strong market position – like the Gala apple – but then charge higher fees to growers due to the patents.

Lobbyists’ plan – influence key decision makers

In a meeting with EU decision makers in 2012, the NBT Platform stressed the fact that “the EU occupies the second place in the world for patent applications, with the UK and the Netherlands contributing most significantly.”

Companies appear to be deliberately investing in techniques designed to circumvent the EU’s GMO regulations. An NBT Platform lobby document sent to EU decision makers in 2013 states that they were developed “as a response to the de facto moratorium on GMOs that currently exists in Europe.” These investments, and the many related patent applications, now demand a financial return.

Although the NBT Platform has been active since 2011, it only recently entered cyberspace. Its official aim is to liberate as many new GM techniques as possible from the EU’s GMO law. This it hopes to achieve with new, industry-friendly interpretations of the EU GMO Directive (2001/18), which would conveniently result in all new techniques under discussion escaping GM regulation.

Its website visualises a three-stage lobby campaign of the most remarkable banality (see image, above right) culminating in ‘Agreement’. But the most important element by far must be in Phase Three, still under way: “influencing key decision makers”. That’s what lobbyists do best – out of public reach and behind closed doors.

The industry has also set a number of lobbying tactics into action in its campaign for deregulation. These include:

1. Rebrand your product, rebrand yourself

Rebranding new genetic engineering as ‘new breeding techniques’ (NBTs) was industry’s first step in making this new generation of GM appear friendly and kindred to classical plant breeding.

Other labels in the new lexicon – such as ‘gene editing’ and ‘high precision breeding’ – were adopted to suggest absolute technological control of the genetic engineering process. However, precision in changing an organism’s genetic makeup does not equate safety if the impacts are not fully understood.

Furthermore, the NBT Platform does not let an opportunity go by to stress that it represents the interests of small and medium-sized enterprises (SMEs) and (public) research institutes. However, only three companies in the Platform actually qualify as SMEs, and the research institutes that are represented, like Rothamsted Research, have strong financial ties with industry.

Moreover, these SMEs and research institutes often play the role of technology suppliers for big multinationals. Some multinationals, like Bayer CropScience and Dow AgroSciences, are not on the membership list yet actively participate in Platform lobby meetings.

2. Stay below the radar

Although the NBT Platform was launched in 2011, it was all but invisible until recently. It only surfaced in the EU Transparency Register in April 2015, and finally launched a dedicated website just a few months later.

Little information is available on the register: no membership list, no funding data, and no link to Schuttelaar & Partners, which runs the show. The NBT Platform website does provide a list of members, but it is questionable how complete it is. (See previous point.)

3. Legal creativity

The NBT Platform has developed a carefully constructed ‘legal questionnaire’ that should guide decision makers to the desired outcome: that all new GM techniques now in question are exempt from regulation, and by extension from labelling.

For instance, the questionnaire ensures that GMOs which did not have ‘foreign’ DNA from unrelated species inserted (such as products of gene editing or cisgenesis) would no longer be regulated. However, making changes to the genetic make-up of an organism using GM techniques needs to be regulated in order to discover any unintended and unexpected effects, and their potential impact on the environment.

4. Jobs and TTIP

The lobby for the new GM has threatened decision makers that the European plant breeding sector would lose much of its competitiveness and innovativeness if new GM were to be regulated.

These are magical words in Brussels, and the TTIP negotiations have been a welcome additional source of political pressure to this end. Correspondence has shown that biotech and seed industry groups have identified GM 2.0 as a trade concern to both US and EU officials, claiming that the new techniques should go unregulated “as they don’t pose any safety concerns.”

5. Doing an end-run on EU GM regulations

Individual companies have been pressing various European governments to clarify the legal status of the new GM techniques, and at the same time announcing plans to field trial them in those countries.

Canadian company Cibus, for example, followed this strategy in Germany for a new GM product developed with the ODM technique that had already been commercialised in the US.

The German government succumbed to the pressure, but thanks to a court case by environmental and sustainable farming organisations the planned field trials could not go ahead in 2015.

Helping hands at the national level

Of course, a little help from your friends is always welcome, and certain governments have joined industry’s side, actively advocating the deregulation of new GM techniques at the EU level.

The British, German and Irish governments have been particularly helpful in peddling industry’s flawed legal argumentation. They communicated their viewpoints to the Commission on two occasions last year, arguing that certain new GM techniques should go unregulated.

The Dutch Government has been at the forefront of the push for Brussels to deregulate a technique called cisgenesis (where an organism is engineered using genes from the same species or a crossable one). €10 million were given to Wageningen University to develop a cisgenic ‘national GM potato’ – even though there is virtually no commercial interest from Dutch firms to commercialise the product.

Wageningen University has consequently played an active role both in The Hague and in Brussels to push for the deregulation of cisgenesis.

Grand finale or ongoing battle?

All invested parties are now awaiting the Commission’s decision, to be presented in March 2016. As Brussels itself realises, the Commission release is likely to be just the beginning – and not the resolution – of this contentious issue.

In all likelihood, it will be the European Court of Justice that ultimately determines the regulatory fate of new GM techniques. The court case around Cibus’ herbicide-tolerant ODM oilseed rape in Germany will therefore be of great importance.

In the meantime, neither the biotech industry nor its financiers are likely to secure the certainty they have been striving for. Other actors may come into play, such as food distributors demanding direct liability for new GM products from those who put them on the market.

In addition, national parliaments could still demand labelling for new GM products, as has happened in the Netherlands.

With TTIP negotiations proceeding apace, environmental and sustainable farming groups must stay active to protect our hard-won – albeit imperfect – GM laws.

 


 

The report: ‘Biotech lobby’s push for new GMOs to escape regulation‘.

Nina Holland is a researcher and campaigner at Corporate Europe Observatory, an organisation that aims to expose and challenge the corporate power exerted by transnational corporations over EU decision making in Brussels. She focuses on the agribusiness lobby, including the biotech and pesticide industries.

Further reading: ‘Genetic Engineering in plants and the “New Breeding Techniques” (NBTs)‘ by Ricarda Steinbrecher, December 2015.

 

From salt to GMOs – resistance is fertile

How can we broaden our movement to appeal to and involve the majority of people out there who do not seem to be aware, do not seem to care or are just too apathetic?

This has long been an issue for many a campaign group or activist.

Many groups have long offered a credible analysis of how the world functions. But many of them use language and theoretical constructs that leave them preaching to the converted, making little headway in galvanising mass protest, action or resistance to capitalism, especially among more affluent or politically unaware sections of the population.

Take Syria, for instance. What is happening on the ground there is too abstract for many, too far away or seemingly too unconnected from their everyday lives to have much meaning (except when the issue of immigration rears its head, whose solution, according to the mainstream media, politicians and pundits, does not involve bombing Syria less but more).

Similarly, Ukraine or Afghanistan is also often regarded as being too removed, and any talk about empire or imperialism does not strike much of a chord with many people, who after a long day do not have the time, energy or inclination to sit down and research the machinations of empire, read Brookings Institution reports on how to deal with Iran or gen up on the Project for the New American Century.

The problem revolves around how to raise informed awareness (not spoon-fed mainstream media narratives) of these and other issues and how to make the public connect world events with their everyday lives.

Gandhiji’s powerful message of peaceful resistance

Gandhi knew how to connect everyday concerns with wider issues. In 1930, he led a ‘salt march’ to the coast of Gujarat to symbolically collect salt on the shore. His message of resistance against the British Empire revolved around a simple everyday foodstuff.

His focus on salt was questioned by sections of the press and prominent figures on his side (even the British weren’t much concerned about a march about salt), who felt that protest against British rule in India should for instance focus more directly on the heady issues of rights and democracy.

However, Gandhi knew that by concentrating on an item of daily use among ordinary Indians, such a campaign could resonate more with all classes of citizens than an abstract demand for greater political rights.

Even though salt was freely available to those living on the coast (by evaporation of sea water), Indians were forced to purchase it from the colonial government. The tax on salt represented 8.2% of the British Raj tax revenue. The issue of salt encapsulated the essence of colonial oppression at the time.

Explaining his choice, Gandhi said that next to air and water, salt is perhaps the greatest necessity of life. The prominent Congress statesman and future Governor-General of India, C. Rajagopalachari, understood what Gandhi was trying to achieve. He said:

“Suppose, a people rise in revolt. They cannot attack the abstract constitution or lead an army against proclamations and statutes…Civil disobedience has to be directed against the salt tax or the land tax or some other particular point – not that that is our final end, but for the time being it is our aim, and we must shoot straight.”

With the British imposing heavy taxes on salt and monopolising its production, Gandhi felt he could strike a chord with the masses by highlighting an issue that directly affected everyone in the country: access to and control over a daily essential. His march drew not only national but international attention to India’s struggle for independence.

From salt to the entire food chain

Today, we find the issue of food in general playing a similar role in people’s struggle for independence – but this time it is independence from the corporate tyranny of global agribusiness and, for much of the world, independence from the US, which has long used food as a geopolitical tool to create food-deficit areas, boost reliance on US exports and create dependence on oil-based, chemical-intensive agriculture and ultimately the petro-dollar.

Vandana Shiva draws a parallel between the seed sovereignty movement and Gandhi’s civil disobedience ‘salt march’:

“Gandhi started the independence movement with the salt satyagraha. Satyagraha means ‘struggle for truth’. The salt satyagraha was a direct action of non-cooperation. When the British tried to create salt monopolies, he went to the beach at Dandi, picked up the salt and said, ‘Nature has given us this for free, it was meant to sustain us, we will not allow it to become a monopoly to finance the Imperial Army …

“Nature has gifted this rich biological diversity to us. We will not allow it to become the monopoly of a handful of corporations … For us, not cooperating in the monopoly regimes of intellectual property rights and patents and biodiversity – saying ‘no’ to patents on life, and developing intellectual ideas of resistance – is very much a continuation of Gandhian satyagraha …

“That is the satyagraha for the next millennium. It is what the ecology movement must engage in, not just in India, but in the United States as well.”

At the heart of the debate: patented GM seeds

With genetically modified seeds now a major issue, the debate on food has in recent years meant that the issues of food sovereignty and food independence have been given a sharper focus.

What the debate on GM has done is create increased public awareness concerning how food is produced, what is in it, who is controlling it and for what purpose. At one end of the spectrum, we have groups that were already highly politically aware about food and the geopolitics of food and agriculture.

At the other end, however, there are people who may have not been too politically aware or attracted to politics or political issues but who are being drawn towards issues like the ‘right to know’ what is in their food and the need to label GM foodstuffs on supermarket shelves.

As a result, many are being politicised as they get drawn into the great food debate because, once they begin talking about the need to label, they soon begin to realise there are powerful state-corporate forces preventing this. By delving into the politics of labelling and GM, people will hopefully be drawn towards wider debates about Monsanto and agribusiness and in turn to how these entities are shaping the global system of food and agriculture.

The basic ‘right to know’ could and should logically lead people to consider issues pertaining to seed sovereignty and patenting of seeds, petro-chemical farming and the role of oil, the destruction of indigenous agriculture across the world and corrupt trade deals like TTIP.

Resistance is fertile

For too long, so many people in the West have acted like ‘mob wives’, displaying a willingness to remain blissfully ignorant while living well from the fruits of imperialism or knowing that something might be amiss but turning a blind eye because life (for them) is good.

There is, however, a growing recognition that their food is not only killing them as consumers but others too, and that this is part of an agenda to capture the food supply by a powerful cartel – an agenda that began many decades ago and is still being played out throughout the globe, from Africa and India to Ukraine and beyond.

Protest and action against widespread oppression, violence and exploitation has to be focussed. As in Gandhi’s time, it is again food that is playing a central role in raising awareness and provoking resistance.

 


 

Colin Todhunter is an extensively published independent writer and former social policy researcher, based in the UK and India. You can support his work here.

This article was originally published on Colin’s website.