Monthly Archives: January 2016

Greenpeace ‘peer review’ climate sting’s first scalp?

Climate denier Professor Ross McKitrick has resigned as chairman of the academic advisory council of Lord Lawson’s Global Warming Policy Foundation (GWPF), according to a statement released by the charity.

The senior fellow of the ExxonMobil- and Koch-funded Fraser Institute in Canada told DeSmog UK he had informed Lord Lawson at the GWPF in September that he intended to stand down, citing increased work commitments.

The academic confirmed his stepping down had nothing to do with the high-profile investigative sting by Greenpeace last month, which raised serious concerns about the council’s claims to conduct peer reviews of its publications.

McKitrick said: “I decided [to step down] late last summer, because of other commitments I had. And so, I notified them at the beginning of September that I would be leaving at the end of December.”

Asked about the GWPF’s peer review claims, he said: “I am confident the process was as described and I don’t have any further comment to make to you other than what was released before Christmas.”

‘I’m afraid Greenpeace has got it wrong’ – or maybe not …

DeSmog UK emailed the GWPF on 8th December 2015, hours after Greenpeace published its agenda-setting investigation, asking whether McKitrick would be asked to resign in light of the charity’s academic publications apparently being sold out to an oil and gas man.

Dr Benny Peiser, the former part-time sports anthropologist and director of the GWPF, said then: “I’m afraid Greenpeace has got it wrong.”

An undercover investigator from Greenpeace telephoned and exchanged emails with Professor Will Happer – also a member of the GWPF advisory council – posing as a Middle East oil and gas company representative.

During the exchange, Happer discussed producing a report praising CO2 which would then be passed through the GWPF’s “peer review” process. He explained this was significantly easier than trying to publish in a journal. He also asked that any fee be paid to another climate denial charity.

The scandal made the front page in Britain and France during the COP21 climate conference in Paris. The Charity Commission has confirmed it will consider the Greenpeace evidence as part of an ongoing inquiry into the GWPF.

‘Peer review’ or rubber stamp?

Adam Levy, from the respected Nature journal, told openDemocracy a report described as peer reviewed by the GWPF had not gone through the process in any meaningful way.

“The review process received by this paper is not what would generally be described as peer-review. I have not encountered other instances of similar publications being cited as peer-reviewed in either academia, or science journalism”, he said.

McKitrick is something of a hero in climate denial circles for recruiting Steve McIntyre to the cause and collaborating on a decade-long attack on Professor Michael Mann and the hockey stick graph, which climaxed with the Climategate hacking.

The resignation comes almost exactly a year after McKitrick took up the post, replacing David Henderson who also stood down at his own request. Professor Christopher Essex, a longtime collaborator, will take over. McKitrick will remain a member of the council.

Lord Lawson and Dr Benny Peiser were contacted by DeSmog UK but have not yet responded with a comment. However, Lawson stated in a press release:

“I am most grateful to Ross, whose work over the past year has been outstanding. He will be a hard act to follow, but I am confident that Chris is the man to do it.”
This article was originally published on DeSmog.uk.

Brendan Montague writes for DeSmog.uk. Follow Brendan @brendanmontague.

Also on The Ecologist: ‘Climate academics for hire conceal fossil fuel funding’ by Lawrence Carter & Maeve McClenaghan / Greenpeace Energydesk.
New research exposes hidden cocktail of bee-killing pesticides in hedgerows and wildflowers

Scientists at Sussex University have discovered that bees are exposed to a toxic chemical cocktail when feeding from wildflowers growing next to neonicotinoid-treated crops in UK farmland.

The findings are published today in a paper in the journal Environment International, ‘Widespread contamination of wildflower and bee-collected pollen with complex mixtures of neonicotinoids and fungicides commonly applied to crops’ by Arthur David et al.

The research by Sussex University and supported by the Soil Association reveals that pollinators consuming pollen from these crops or from nearby wildflowers will ingest a cocktail of fungicides and insecticides.

“In summary, our study confirms that bees foraging in arable farmland are exposed to a complex cocktail of neonicotinoid insecticides and fungicides in the pollen they collect”, they conclude.

Dave Goulson, Professor of Biology at the University of Sussex, and one of the authors of the paper, said: “It is clear that insects visiting wildflowers in field margins are chronically exposed to a cocktail of chemicals.

“The effects that this has on their health have never been studied, and there is an urgent need to do so. In the meantime, the precautionary principle would suggest that we should take steps to reduce this exposure as much as possible.”

One result is that honey from bees in urban areas is actually much cleaner than that from the countryside, at least in areas where arable crops are grown.

“Pesticide levels in pollen gathered in urban areas were much lower, but still contaminated, demonstrating both the value of the urban environment and need to minimise garden pesticide use”, commented Dr Christopher Connolly of the School of Medicine at the University of Dundee.

The countryside is now a dangerous place for bees!

One in ten species of Europe’s wild bees is facing extinction, and neonicotinoid insecticides are increasingly seen as contributing to these declines. In addition to neonicotinoids, farmers may spray some non-organic crops a dozen or more times while they are growing, with anything up to 23 different chemicals.

The wild flowers found by researchers to be contaminated by neonicotinoids and other toxic sprays included: Creeping Buttercup, Bramble, White Campion, Scented Mayweed, Hogweed, Hawthorn, Poppy, Wild Rose, Burnet Saxifrage and Fool’s Parsley.

“Overall, these results and other similar studies in France and the USA indicate that these mixtures of insecticides and fungicides appear ubiquitous in pollen samples and that even higher concentrations than the ones observed in our study can be encountered”, the authors write.

“For both species [honey bees and bumble bees], pollen from hawthorn represents a major part of the collected pollen (up to 87%) and that the pollen from hawthorn collected by honey bees was often contaminated by several pesticides (up to six) and notably at concentrations up to 29 ng/g for carbendazim.”

A prior study also suggests fungicides can act synergistically with other pesticides, making the insecticides up to 1,000 times more deadly than they are on their own. (See References)

Peter Melchett, Policy Director of the Soil Association, added: “These findings are shocking. Neonicotinoids are supposedly highly targeted insecticides yet the researchers have found that they are turning up in the pollen of poppies, blackberries and hawthorn blossom in hedges, at levels that on their own are enough to cause harm to bees.

“Worse still, they are present along with a whole cocktail of chemicals, some of which could increase the toxicity of neonicotinoids up to 1,000 times.”

UK’s ‘safe havens’ policy doomed to failure

To combat bee decline, the UK Government’s Pollinator Strategy has focused on creating ‘safe havens’ for bees by increasing flower habitats next to fields – yet this research shows these flowers may be laden with dangerous chemicals.

As a result, says Melchett, the government must urgently rethink its strategy, which already appears doomed to failure: “The UK government must act. Until now, the government’s main solution to the bee crisis is to pay farmers a small chunk of the £900 million Common Agricultural Policy (CAP) money, available to help wildlife, to create flower rich habitat next to crops.

“Yet this research suggests that these supposedly safe havens for bees are actually potentially dangerous chemical cocktail bars. These flower margins must be protected with a full ban on neonicotinoids; the current EU ban is only partial, and in the UK mainly applies to just one crop – oilseed rape. Neonicotinoids are still used on other crops, for example on over 25% of all UK cereals. Neonicotinoids will be poisoning the field margins of many of these crops.

“We also want to see the government finally setting out a strategy for reducing pesticide use in farmland – as is required by EU law. This has always been the gaping hole in the government’s strategy to save our bees”.

And in fact, the impact of the pesticides on bees could be even worse than revealed by this study, which examines only the pollen gathered by bees:

“While quantifying realistic levels of exposure via pollen as we have done here is an important step forwards, we did not examine exposure via nectar, which we intend to address in future work”, the study notes. “A major challenge which has yet to be tackled is attempting to understand what effects simultaneous exposure to multiple pesticides has upon bees in the field.”

Critics confounded

Earlier work highlighting the dangers of neonicotinoids has been criticised for exaggerating the concentrations of the pesticides that pollinators would encounter in farmed landscapes – and the authors take the opportunity to put the record straight:

“Experimental studies such as Whitehorn et al. (2012), which describe severe impacts of neonicotinoids on bumble bees, have been criticised for using unrealistically high concentrations of pesticide (in this example 6 ng/g of imidacloprid) (Carreck and Ratnieks, 2014).

“Our data suggest that real-world exposure may often be much higher than this, for the mean concentration of thiamethoxam in our samples from 5 nests located in farmland was 18 ng/g, and one of the nests located in urban environment showed more than 19 ng/g for imidacloprid.

“It has also been demonstrated that there are synergies between neonicotinoids and DMI fungicides such as flusilazole (Iwasa et al., 2004; Schmuck et al., 2003), so the presence of both compounds at high concentrations in pollen stores of bumble bees is a cause for concern.”

Dr Connolly, who was not involved in the research, observed: “Validating semi-field studies on neonicotinoids (artificial provision of pesticide) requires confirmation that bees do actually get exposed to these levels of pesticide in the field where multiple sources of pollen and nectar exist. This study provides such a bridge to validate previous studies.

“Furthermore, of the 20 pesticides examined, most OSR pollen samples contained 7-12 different pesticides, whilst pollen gathered by honeybees and bumblebees contained 2-10 pesticides. This does not represent the full pesticide load, only a reflection on the few examined.

“The work by David et al is very important as, although in Europe, farmers are legally obliged to keep pesticide use records (EC No. 1107/2009), the data are not collected and so scientists have no access to local cocktail use. Therefore, studies like this should help governments realise the potential importance of gathering this information.”
Oliver Tickell edits The Ecologist.

The paper, ‘Widespread contamination of wildflower and bee-collected pollen with complex mixtures of neonicotinoids and fungicides commonly applied to crops’ by Arthur David et al, is published in the journal Environment International.

Principal source: Soil Association.

The 2016 Oxford Real Farming Conference will be held in Oxford Town Hall on 6th & 7th January. View the programme, get a flavour of the event from previous years by watching the film or exploring past conferences, and book your tickets for either or both days here.

References

Iwasa T, Motoyama N, Ambrose JT, Roe RM. ‘Mechanism for the differential toxicity of neonicotinoid insecticides in the honey bee, Apis mellifera’. Crop Prot. 2004;23:371-378.

Schmuck R, Stadler T, Schmidt HW. ‘Field relevance of a synergistic effect observed in the laboratory between an EBI fungicide and a chloronicotinyl insecticide in the honeybee (Apis mellifera L, Hymenoptera)’. Pest Manag Sci. 2003;59:279-286.
Food industry must get behind ‘right to know’ on GMO

I just don’t get it.

Over the more than 20 years I have worked as a business journalist, I’ve always been motivated by a simple premise: Knowledge is power, and that power belongs with the public.

The spread of information that people can use to make decisions – what to buy, what to eat, where to invest, etc. – helps support and promote the principles of freedom and democracy, I believe.

That’s why the fear and loathing emanating from the food industry over the public’s right to information about the food they consume is so hard for me to grasp.

As we kick off 2016, the leaders of many of the nation’s largest and most powerful food companies are doubling down on their commitment to block mandatory labeling of foods made with genetically engineered crops, and they are seeking Agriculture Secretary Tom Vilsack’s help to do so.

The issue has become urgent for the industry as what would be the nation’s first mandatory labeling measure is set to go into effect on 1st July in Vermont. The industry has thus far failed to convince a federal court to block the law’s implementation, though the fight could go to trial this spring.

Citizens in many other states continue to try to pass similar mandatory labeling measures. A GMO label would allow a consumer to know at a glance information that many consider important.

Exactly why is a $2.1 trillion industry so fearful?

Given that knowledge, some people might shy away from GMO-labeled foods; others may not care. Some may seek out GMO-labeled foods if they feel they provide special value or are helping ‘feed the world’, as GMO seed developers such as Monsanto Co. claim.

But the public’s right to that knowledge – to that decision-making ability – terrifies many in an industry that generates sales of roughly $2.1 trillion annually. The fear is so strong that they have enlisted teams of legal and public relations professionals to help try to convince regulators and federal lawmakers to override Vermont’s law and prohibit any future laws like it.

The Grocery Manufacturers Association, whose members include PepsiCo, Kellogg Co. and hundreds of other large food companies, leads the charge against GMO labeling, saying it would be too costly to implement and is unnecessary because GMOs are proven safe. The organization says it is “hopeful that compromise will establish a uniform national standard for foods made with genetically engineered crops.”

The group recently put forth a proposed initiative that would add barcodes to products that consumers could scan with their smartphones to access information. But whether or not the presence of GMO ingredients would ever be required to be included in that information is unclear.

Those fighting for mandatory labeling include members of the organic and natural foods industry, but also consumer groups, environmentalists and lots of regular moms and dads who want to know what they are feeding their children.

Information wars – can a just compromise be reached?

Many of these labeling supporters cite pesticide residues on GMO foods as a concern, and contradictory science on the safety of GMOs. Some opponents say they don’t want to buy products that they feel contribute to corporate control of the world’s food supply.

A barcode won’t cut it, many of the leading GMO labeling proponents say. They point to a national survey conducted in November by the Mellman Group that concluded 88% of people want a printed GMO label rather than having to use a smartphone app to scan a barcode.

Agriculture Secretary Vilsack looks set to sit down with representatives from both sides of the issue in January to try to forge a compromise if one can be found. Both sides say they are willing to meet in the middle.

Millions of dollars have been spent lobbying for and against labeling and fighting the issue out in the courts, and both sides are weary of the war. Details of the discussions to be held are being kept confidential, according to some participants, to give the process the greatest chance of success.

As the discussions loom, we should not lose sight of the fact that this issue – like many others – comes down to the power of information, and the critical nature of who controls that information.

Those companies developing and profiting from GMOs have the information they need to patent their creations and track where and how they are used. Farmers planting GMOs are provided a range of information about the seeds, their limitations and their benefits, and can easily choose non-GMO seeds because varieties are labeled and tracked.

Systems are in place to allow food manufacturers to know whether or not they are purchasing ingredients made from GMO crops. It seems consumers are the only ones left out of the information pipeline.

Public ‘too dumb to understand’ say GMO industry boosters

Indeed, some advocating against GMO labeling argue that consumers aren’t smart enough to understand or use GMO labeling information effectively. They argue that consumers are being conned into fearing GMOs.

In a 27th December blog posting opposing GMO labeling, GMO supporters Jon Entine and retired University of Illinois professor Bruce Chassy wrote of consumers “who can’t define what a GMO is” and said that pro-labeling efforts are driven by “small groups of well-financed professional activists.” Chassy and Entine argue that these “activists” use “misinformation and fear-mongering to whip up support for their agenda.”

Such pro-GMO advocates may hope consumers also are not well informed about their connections to the corporate food industry. Chassy doesn’t mention in that blog, for instance, that for years while working as a professor of food safety at the University of Illinois, he collaborated quietly with Monsanto executives on multiple projects aimed at countering concerns about health and environmental impacts of GMOs.

Monsanto has acknowledged that it provided several unrestricted grants to the biotechnology outreach program that Chassy helped lead, but said there was nothing improper about the relationship.

That is information some might want to know. But it only became public after the non-profit group US Right to Know obtained emails between Chassy and several other university professors and Monsanto, and shared them with media outlets.

Another batch of emails recently disclosed shows discussions between Kevin Folta, chairman of the horticultural sciences department at the University of Florida, and a public relations agency about how to counter a Canadian teenager who developed a website questioning the safety of genetically modified foods. Folta also received grant money from Monsanto.

You don’t have to be anti-GMO to want to know

I don’t know about you, but this is all information I think is important. Knowing what goes on behind the scenes helps me make decisions about who I trust and what I believe about the food I buy for myself and my family.

As a journalist I’ve been fortunate enough to get behind those scenes a time or two myself: I’ve toured Monsanto’s laboratories; visited Dow AgroSciences’ test plots; and spent more time than I can calculate with farmers in their fields.

I’ve also spent countless hours with scientists on both sides of this debate; waded through stacks of legal and regulatory documents; and sat down with government regulators to talk over the myriad issues.

The knowledge I have gained leaves me straddling the fence a bit. I see benefits to GMOs, and I see risks. And I know with certainty that I want more information, not less.

Whatever one’s views are about GMOs, or other aspects of the food industry, the right to information is essential, and not one to be abridged. 
Carey Gillam has been recognized as one of the top food and agriculture journalists in the United States, winning several awards for her coverage of the industry, and appearing as an expert commentator on radio and television broadcasts. After a 17-year career at Reuters, one of the world’s largest news organizations, Gillam joined U.S. Right to Know as Research Director on Jan. 4.

This article was originally published on US Right to Know (CC BY-SA).

 

Food industry must get behind ‘right to know’ on GMO

I just don’t get it.

Over the more than 20 years I have worked as a business journalist, I’ve always been motivated by a simple premise: Knowledge is power, and that power belongs with the public.

The spread of information that people can use to make decisions – what to buy, what to eat, where to invest, etc. – helps support and promote the principles of freedom and democracy, I believe.

That’s why the fear and loathing emanating from the food industry over the public’s right to information about the food they consume is so hard for me to grasp.

As we kick off 2016 the leaders of many of the nation’s largest and most powerful food companies are doubling down on their commitment to block mandatory labeling of foods made with genetically engineered crops, and they are seeking Agriculture Secretary Tom Vilsack’s help to do so.

The issue has become urgent for the industry as what would be the nation’s first mandatory labeling measure is set to go into effect on 1st July in Vermont. The industry has thus far failed to convince a federal court to block the law’s implementation, though the fight could go to trial this spring.

Citizens in many other states continue to try to pass similar mandatory labeling measures. A GMO label would allow a consumer to know at a glance information that many consider important.

Exactly why is a $2.1 trillion industry so fearful?

Given that knowledge, some people might shy away from GMO-labeled foods; others may not care. Some may seek out GMO-labeled foods if they feel they provide special value or are helping ‘feed the world’, as GMO seed developers such as Monsanto Co. claim.

But the public’s right to that knowledge – to that decision-making ability – terrifies many in an industry that generates sales of roughly $2.1 trillion annually. The fear is so strong that they have enlisted teams of legal and public relations professionals to help try to convince regulators and federal lawmakers to override Vermont’s law and prohibit any future laws like it.

The Grocery Manufacturers Association, whose members include PepsiCo., Kellogg Co. and hundreds of other large food companies, leads the charge against GMO labeling, saying it would be too costly to implement and is unnecessary because GMOs are proven safe. The organization says it is “hopeful that compromise will establish a uniform national standard for foods made with genetically engineered crops.”

The group recently put forth a proposed initiative that would add barcodes to products that consumers could scan with their smartphones to access information. But whether or not the presence of GMO ingredients would ever be required to be included in that information is unclear.

Those fighting for mandatory labeling include members of the organic and natural foods industry, but also consumer groups, environmentalists and lots of regular moms and dads who want to know what they are feeding their children.

Information wars – can a just compromise be reached?

Many of these labeling supporters cite pesticide residues on GMO foods as a concern, and contradictory science on the safety of GMOs. Some opponents say they don’t want to buy products that they feel contribute to corporate control of the world’s food supply.

A barcode won’t cut it, many of the leading GMO labeling proponents say.  They point to a national survey conducted in November by the Mellman Group that concluded 88% of people want a printed GMO label rather than having to use a smartphone app to scan a bar code.

Agriculture Secretary Vilsack looks set to sit down with representatives from both sides of the issue in January to try to forge a compromise if one can be found. Both sides say they are willing to meet in the middle.

Millions of dollars have been spent lobbying for and against labeling and fighting the issue out in the courts, and both sides are weary of the war. Details of the discussions to be held are being kept confidential, according to some participants, to give the process the greatest chance of success.

As the discussions loom, we should not lose sight of the fact that this issue – and many others – comes down to the power of information, and the critical nature of who controls that information.

Those companies developing and profiting from GMOs have the information they need to patent their creations and track where and how they are used. Farmers planting GMOs are provided a range of information about the seeds, their limitations and their benefits, and can easily choose non-GMO seeds because varieties are labeled and tracked.

Systems are in place to allow food manufacturers to know whether or not they are purchasing ingredients made from GMO crops. It seems consumers are the only ones left out of the information pipeline.

Public ‘too dumb to understand’ say GMO industry boosters

Indeed, some advocating against GMO labeling argue that consumers aren’t smart enough to understand or use GMO labeling information effectively. They argue that consumers are being conned into fearing GMOs.

In a 27 December blog post opposing GMO labeling, GMO supporters Jon Entine and retired University of Illinois professor Bruce Chassy wrote of consumers “who can’t define what a GMO is” and said that pro-labeling efforts are driven by “small groups of well-financed professional activists.” Chassy and Entine argue that these “activists” use “misinformation and fear-mongering to whip up support for their agenda.”

Such pro-GMO advocates may hope consumers also are not well informed about their connections to the corporate food industry. Chassy doesn’t mention in that blog, for instance, that for years while working as a professor of food safety at the University of Illinois, he collaborated quietly with Monsanto executives on multiple projects aimed at countering concerns about health and environmental impacts of GMOs.

Monsanto has acknowledged that it provided several unrestricted grants to the biotechnology outreach program that Chassy helped lead, but said there was nothing improper about the relationship.

That is information some might want to know. But it only became public after the non-profit group US Right to Know obtained emails between Chassy and several other university professors and Monsanto, and shared them with media outlets.

Another batch of emails recently disclosed shows discussions between Kevin Folta, chairman of the horticultural sciences department at the University of Florida, and a public relations agency about how to counter a Canadian teenager who developed a website questioning the safety of genetically modified foods. Folta also received grant money from Monsanto.

You don’t have to be anti-GMO to want to know

I don’t know about you, but this is all information I think is important. Knowing what goes on behind the scenes helps me make decisions about who I trust and what I believe about the food I buy for myself and my family.

As a journalist I’ve been fortunate enough to get behind those scenes a time or two myself: I’ve toured Monsanto’s laboratories, visited Dow AgroSciences’ test plots, and spent more time than I can count with farmers in their fields.

I’ve also spent countless hours with scientists on both sides of this debate; waded through stacks of legal and regulatory documents; and sat down with government regulators to talk over the myriad issues.

The knowledge I have gained leaves me straddling the fence a bit. I see benefits to GMOs, and I see risks. And I know with certainty that I want more information, not less.

Whatever one’s views are about GMOs, or other aspects of the food industry, the right to information is essential, and not one to be abridged. 

 


 

Carey Gillam has been recognized as one of the top food and agriculture journalists in the United States, winning several awards for her coverage of the industry, and appearing as an expert commentator on radio and television broadcasts. After a 17-year career at Reuters, one of the world’s largest news organizations, Gillam joined U.S. Right to Know as Research Director on Jan. 4.

This article was originally published on US Right to Know (CC BY-SA).

 

The great bathroom debate: paper towel or hand dryer?

It’s the age-old question that continues to baffle many of us in the bathroom: when you come to drying your hands, should you reach for the paper towel, or the electric dryer?

For some, this decision might be related to hygiene, and for others, drying performance. For many, environmental concerns are also an important consideration, no doubt motivated by the fact that our daily activities contribute to the complex web of growing sustainability pressures facing the planet.

So how might we decide which of the two most common methods of drying our hands – paper towel or an electric dryer – is the most effective, and environmentally friendly, without resorting to the convenient wipe on the trousers?

Life cycle analysis is a method long used to identify life-cycle environmental impacts of products and services, including materials, manufacturing, transport, use, and end of life (e.g. disposal).

Using this analysis, we can search out ‘hot spots’ – those parts of the life cycle which have higher impacts – to identify the most important aspects for our analysis.

The heat on hand dryers …

So let’s cut to the chase: what are the hot spots for the most common hand drying systems?

Life-cycle research consistently shows that the environmental impacts of the electricity and towels used at the point when we dry our hands dwarf the impacts throughout the rest of the life cycle. These include the materials, manufacturing, and disposal of hand-dryers and towel dispensers.

This is because we use dryers and dispensers many times before they are replaced. But every time we dry our hands we consume resources, either paper or electricity.

The environmental impact of hand drying is therefore most significantly affected by how much and what type of paper towel we use, or how much energy is consumed by the electric hand dryer.

Research comparing these two methods of drying concluded that both the conventional hand dryer and the paper towel performed roughly the same, environmentally speaking.

Each method, however, gained a small advantage over the other depending on changes to critical factors such as:

  • weight and number of paper towels used per dry (the average is two)
  • proportion of recycled paper
  • power rating and length of time for drying using an electric dryer
  • other regional electricity impacts

So in some contexts a paper towel is the slightly better option, and in others, the conventional electrical hand dryer. This depends largely on how the electricity is generated, and how the towels are produced and disposed of.

The new ‘high performance’ contenders

You might have noticed a proliferation of fancy new dryers in bathrooms in recent years. While conventional dryers use a combination of warmth and air flow to evaporate and blow water off your hands, these newer dryers use a non-heated rapid air stream to simply strip the water off. Do they make the grade?

Several recent studies, independently peer reviewed by experts – including one from the Massachusetts Institute of Technology (MIT) and one I conducted in 2011 – have compared several high-speed dryers to paper towels and conventional electric hand dryers.

At first glance, the two high-speed dryers investigated – namely, the XLERATOR and Dyson Airblade – already have an advantage over conventional electric dryers. They have a much shorter drying time (between 12 and 20 seconds, compared with 20-40 seconds for conventional dryers) and a lower power rating (around 1.5 kilowatts, compared with 2.4kW).
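Those headline figures already imply a sizeable gap in energy per dry. As a rough, purely illustrative back-of-envelope check – using mid-range values from the figures above, not measurements taken from the studies themselves:

```python
# Rough energy-per-dry comparison, using mid-range figures quoted above.
# These are illustrative assumptions, not the studies' measured values.

def energy_per_dry_kwh(power_kw: float, seconds: float) -> float:
    """Electrical energy consumed in one hand dry, in kilowatt-hours."""
    return power_kw * seconds / 3600.0

high_speed = energy_per_dry_kwh(1.5, 16)    # ~1.5 kW, 12-20 s dry time
conventional = energy_per_dry_kwh(2.4, 30)  # ~2.4 kW, 20-40 s dry time

print(f"High-speed:   {high_speed:.4f} kWh per dry")
print(f"Conventional: {conventional:.4f} kWh per dry")
print(f"Ratio: {conventional / high_speed:.1f}x")
```

On these assumptions the conventional dryer uses roughly three times the energy per dry, which is consistent with the advantage the studies report.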

The studies mentioned above have confirmed this advantage, even when potentially lower energy consumption by the conventional dryer is considered.

The researchers also compared the impacts associated with generating and using electricity for the dryer with the impacts and emissions related to paper production, manufacturing, and disposal.

And again the high-speed dryers came out on top. This result held even when fewer than two towels per dry were used, and when the paper was 100% recycled, both in manufacturing and disposal.

Overall, these life-cycle studies found that using a high-speed dryer reduced environmental impacts markedly. This included global warming potential, land use, water use, solid waste, ecosystem quality, and embodied energy, when compared with conventional dryers and paper towels.

Which is the greenest of them all?

It seems a compelling argument can be made that, when faced with the choice, we should reach for the high-speed electrical dryer over the conventional dryer, and even the humble paper towel.

As electricity grids become less greenhouse-intensive, the environmental benefits of high-speed electrical dryers over paper towels may even increase.

However, this trend could change in the future: towels may become lighter and smaller; social marketing campaigns may highlight how towels can be better used and reused; new technologies may surpass the benefits of high-speed drying.

Regardless, the key point here is that products, such as those for hand drying, should be considered within the broader context in which they occur; that is, across the entire life cycle from cradle to grave.

Only once we take into account the whole system can we make informed decisions that can secure better environmental outcomes, now and into the future.

And at least we can now feel a little less anxious the next time we’re faced with this hand-wringing dilemma.

 


 

Simon Lockrey is Research Fellow at RMIT University.

This article was originally published on The Conversation. Read the original article here.

 

Truss’s decision: badger culling will continue, with no evidence it works

Roll badger culls out across the country, with far fewer criteria to control the gunmen – that’s what Liz Truss wants.

The Environment Secretary’s statement to Parliament on the 2015 badger culls in Somerset, Gloucestershire and Dorset – made just as MPs were about to go home for their Christmas jollifications – would have been laughable if it wasn’t such a dire repeat of the previous two years’ misinformation and bad science.

She cites the Chief Veterinary Officer as saying that “industry-led badger control” – a chilling term – will achieve disease control benefits. She says the government’s approach to dealing with bTB has worked in other countries.

It hasn’t. The only country that has seriously culled its badger population is Ireland, and the facts from there are very dodgy.

The one welcome announcement was that they are finally going to introduce statutory post-movement testing for cattle, something many farmers have been crying out for. But even that only goes so far.

An unsubstantiated claim by Truss

Answering MPs’ questions, Truss claimed that more than half of England (the Low Risk Area) will be officially bTB free by 2020, but ignored the fact that:

  • Scotland has been officially bTB free for quite some years, without culling badgers.
  • Wales took the decision in 2009 NOT to cull badgers but to have strict bio-security measures, cattle movement controls and annual testing for all cattle. This has almost halved their cattle slaughter rate and they are on the way to becoming bTB free without killing badgers.
  • The Low Risk northern and eastern regions, although they currently have little bTB, have not had the benefit of annual testing and tighter cattle movement controls, and the incidence of TB there is rising.

Asked by Labour MP David Hanson how many of the thousands of killed badgers had been tested for bTB, she first blamed Labour for creating the problem of bTB and then said,

“I am following the advice of the Chief Veterinary Officer, who says that culling is an important part of dealing with it. Why do Labour Members not congratulate the hard-working farmers in Somerset, Gloucestershire and Dorset who have delivered this year, and who are helping us to deal with this terrible disease?”

Untested badgers were not mentioned (bar one in the first year of culling, none have been tested).

Neil Parish, the pro-culling Devon Tory MP, said: “In Gloucestershire and Somerset, there has been a very beneficial reduction in the number of cattle suffering from TB in the badger culling areas.” He then asked, “When will the Secretary of State be able to release the figures that will show what is happening?”

Maybe when the moon turns blue, because if there genuinely were figures to support his statement Liz Truss would have been touting them around every media outlet she could find.

Let’s use some facts

Truss claims that the badger culls in Somerset and Gloucestershire (their third year of culling) and Dorset (experiencing its first) have been successful. What does that mean? Successful in killing lots of badgers? Or successful in lowering the incidence of bovine TB among cattle?

The culls are being carried out in very small areas of each county (Somerset approx. 4% of the total land mass, Gloucestershire approx. 7% and Dorset approx. 8%). One really cannot claim that culling badgers on such a small percentage of land is affecting TB rates enough to be counted as ‘successful’.

Defra’s own statistics show that annual testing of cattle and other bTB control measures in Dorset were reducing TB without culling. And there is evidence, slight though it is, that perturbation of badger populations in Somerset has resulted in new incidents of bTB around the edge of the culling area.

This evidence comes from a website that maps bTB outbreaks in England for the last 5 years. It is worth noting that according to this map there was a total of 9-10 farms in the North Dorset culling area that had bTB breakdowns in 2015, only three of which were still under restrictions at the time of the badger cull. Compared to the spread of incidents in parts of Devon and Cornwall, this looks pretty sparse, and makes one wonder just why Dorset was allowed to have a cull.

The NFU was not happy when campaigners found and used this site. But it is factual, unlike claims based on hearsay rather than figures.

In 2014 the then Environment Secretary Owen Paterson was foolish enough to repeat to a farming journalist, as fact, something a Gloucestershire farmer had claimed: that since badger culling had started there had been a huge increase in ground-nesting birds (dead badgers don’t eat birds, and birds are not a staple food for live ones). This was news to the RSPB and embarrassing for Defra when they were queried about it.

This is, of course, a ‘science-led’ control of badgers

These culls are no longer pretending to be ‘pilot badger culls’, due to run for four years before being rolled out across the country. Until they are completed there can be no properly assessed scientific evidence that culling badgers will result in less bTB. To have any roll-out without that evidence is utterly unscientific. Nor is it bovine TB control. It is just ‘badger control’.

Defra launched a consultation on 28 August 2015 on their plans to ‘update’ the criteria for culling even more badgers. The 2015 culls started just three days later, on the August Bank Holiday.

They solicited responses by emailing over 300 ‘interested parties’. Others had to find out for themselves, which meant that some badger groups only had a few days to send in their responses before the month-long consultation closed.

There were 1378 responses, 90% of them from the public. Farmers and farming organisations accounted for just 3%. The fact that the 2010 consultation on badger culling elicited over 59,000 responses demonstrates how unpublicised government consultations can be, particularly when they don’t want to hear the answers.

Three proposals were offered:

  1. The length of the culls should not be limited to the current 6 weeks;
  2. Allowing culling in a much smaller area (100 sq.km rather than the current 200 sq.km plus);
  3. Providing more flexibility (or ‘anything goes’) for licensing of new areas of culling.

Having dismissed those who were against badger culling in principle (“many responses appeared to have been submitted in response to campaigns … “), it must have been clear to Defra what the majority opinion was:

All three proposals could increase the perturbation of badger populations, leading to increases of TB in cattle (as proved by the Randomised Badger Culling Trials). All three proposals were moving away from the criteria set by the RBCT – a legitimate argument seeing that the government relied heavily on the RBCT to justify culling badgers, while happily misquoting its findings.

There were also worries about welfare issues and the possibility that local populations could be wiped out. Of the several hundred responses to each question, only 40-46 people broadly supported the proposals – mostly, judging from the reasons given, because they would hamper those trying to protect badgers.

To all of which Defra replied that the responses “have helped inform the Secretary of State’s decision to implement the proposals”, which is a horrifying prospect for England’s badgers. It will almost amount to badgers being shot wherever and whenever the gunmen choose.

And, seeing that the government has refused to release the true costs of culling badgers, it will cost unknown sums in policing. On only one thing have they given way – they have apparently agreed to test culled badgers for bovine TB.

And what will they do if it is found that too few badgers have bTB? Apart from staying very, very silent. Or making use of that statement (attributed to Truss) about failing defences in last month’s disastrous floods:

“Our defences worked really well right up to the point at which they failed.”

 


 

Lesley Docksey is a freelance writer who writes for The Ecologist and other media on the badger cull and other environmental topics.

Also on The Ecologist: ‘So badger culls are working? Liz Truss, produce your evidence!’ by Oliver Tickell.

 

Nuclear power too slow for China’s low carbon switch

Efforts to speed carbon cuts pledged under the Paris climate deal will require a fast, mass mobilisation of low carbon technology at costs that can be competitive with coal, a task to which nuclear power will likely be unsuited.

So says a new version of the World Nuclear Energy Industry Status Report, which tracks developments in the sector and provides outlooks based on developments in energy and climate policy.

Time is the main enemy of the world nuclear industry, says Mycle Schneider, the author of the report, an abridged version of which was published in Beijing.

“Everyone needs to speed up energy transition, and cheap quick technology is going to be the first choice”, he said, pointing to figures in the report which indicate that 70% of the 60 or so reactors currently under construction worldwide are delayed. Five of these have been listed as ‘under construction’ for over 30 years.

The 40 reactor units built between 2005 and July 2015 had an average construction time of 9.4 years, suggesting that a fast roll-out of the technology in future decades is highly unlikely. In China, where 18 units are under construction, the average construction time is faster, at 5.7 years.

And as nuclear power has already suffered three major accidents (Three Mile Island in the US, Chernobyl in the former Soviet Union, and Fukushima in Japan), higher safety demands have raised standards for construction and operation, meaning that the majority of postponements or cancellations of nuclear power plants were due to excessive costs, Schneider adds.

Nuclear share of power generation on a long term decline

Meanwhile, wind and solar power are being rolled out at a much faster pace. Between the signing of the Kyoto Protocol in 1997 and 2014, growth in solar and wind capacity outstripped that in nuclear power. Nuclear power peaked in terms of share of energy production in 1996, at 17.6%, falling to 10.8% in 2014.

Even in China, which is bucking the trend by persisting with plans for nuclear power, the outlook for build-out is looking increasingly uncertain, as nuclear power faces both high costs and fierce competition from other energy sources in the electricity sector.

Wang Yinan, a researcher at the State Council’s Development Research Centre, is opposed to large-scale development of nuclear power, believing the risks involved are too great. The latest technology, known as Generation III+, is unproven in practice, with none of these new reactors yet in operation.

To compound the problem, regulatory staff and engineers are not yet up to speed with the new technology, raising concerns about safety.

As Schneider puts it, those capabilities take time to build up – it’s not just a matter of taking a training course in nuclear power. And in a country as densely populated as China, an accident could be catastrophic.

Nuclear target – 58GW by 2020 – unlikely to be achieved

A frenzy of construction of new coal-fired power stations and a rapid expansion of wind and solar power have left China with an electricity market in surplus. But China has approved new reactors in all of the last six years, with the exception of 2011, when a moratorium was imposed after the Fukushima disaster.

National Development and Reform Commission plans would see China reach 58 gigawatts (GW) of installed nuclear power generating capacity by 2020. However, as only 51 GW of capacity is in operation or under construction, it is very unlikely this target will be reached, given the length of time it takes to build new nuclear capacity.
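The arithmetic behind that scepticism can be sketched simply. This is a hypothetical illustration using only the figures above, and it treats the 5.7-year average Chinese build time as fixed:

```python
# Feasibility sketch for the 58 GW-by-2020 target, using figures quoted above.
# Illustrative assumptions only; real build times and capacities vary.

TARGET_GW = 58.0
OPERATING_OR_BUILDING_GW = 51.0
AVG_BUILD_YEARS = 5.7  # average construction time in China, per the report

gap_gw = TARGET_GW - OPERATING_OR_BUILDING_GW
latest_start = 2020 - AVG_BUILD_YEARS  # latest year a new build could begin

print(f"Capacity still to be started: {gap_gw:.0f} GW")
print(f"A reactor finishing by 2020 needed to start by ~{latest_start:.1f}")
```

On these assumptions, the remaining ~7 GW would have had to begin construction around 2014 to be generating by 2020, which is the gap underlying the report’s doubt.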

However some think that, on current trends at least, nuclear power is essential. Lan Ziyong, chief engineer for the China Nuclear Energy Association, is one. He outlines two reasons why it is needed. “First, to meet our emissions commitments; second because wind and solar power are not stable enough.”

The future will depend on whether or not technology can keep up with the trends. Yang Fuqiang, Senior Adviser on Climate, Energy and Environment for the Natural Resources Defense Council, says that it would be feasible for nuclear power to provide 10%-12% of China’s energy, up from the current 3%.

“If there are new breakthroughs in nuclear technology and current safety concerns are resolved, nuclear power may yet have a longer lifespan”, Yang points out.

 


 

Zhang Chun is an editor in China Dialogue’s Beijing office.

This article was originally published by China Dialogue under a Creative Commons’ Attribution-NonCommercial-NoDerivs 2.0 England & Wales License and 2.5 China License.


 

 
