Category Archives: Vulnerability Paradigm

New Book “Institutions And Incentives In Regulatory Science” (Edited by Jason Scott Johnston, 2012).

There is a very informative new book that has appeared. It is

Institutions and Incentives in Regulatory Science (Edited by Jason Scott Johnston, 2012).

Available at http://www.amazon.com/Institutions-Incentives-Regulatory-Science-Johnston/dp/0739169467, as well as other online sites.

The book summary reads [highlight added]

From endangered species protection to greenhouse gas regulations, modern regulatory interventions are justified by science.  Indeed, legislators look to science for simple answers to complex regulatory questions.  This regulatory demand for scientific answers collides with the scientific reality that on the frontiers of science, there are no simple answers, only competing hypotheses and accumulating but as yet often inconclusive evidence.   Given inevitable scientific uncertainty, regulatory agencies such as the U.S. Environmental Protection Agency are put in the position of adjudicating unresolved scientific controversies.  As the contributions to this volume show conclusively and in great detail, such agencies (and other assessment organizations such as the Intergovernmental Panel on Climate Change or IPCC) are far from unbiased in how they assess regulatory science.   They instead act as advocates for those scientific positions that further the regulatory agenda of promulgating new regulations and increasing the scope of the regulatory state.

The book describes many facts about how regulatory agencies use science to justify their regulations that may surprise and even shock many readers:

  • In the area of climate science, where the IPCC is advertised as an objective and unbiased assessment body, the facts are that the Lead Authors for IPCC Assessment Reports are chosen by political representatives on the IPCC, and have no duty to respond in any way to the comments of outside reviewers of IPCC draft chapters. The oft-repeated claim that there are “thousands” of scientists involved in outside review of IPCC Assessment Reports is patently false, with generally only a few dozen truly independent outside reviews submitted even on key chapters. Perhaps most strikingly, the Review Editors with responsibility for overseeing the decisions of chapter Authors are themselves chosen by the same people (Working Group Chairs) who pick the Authors. An outside audit of the IPCC commissioned by the IPCC itself (done by the InterAcademy Council) concluded that some body other than the IPCC should choose the Review Editors but acknowledged that there is no such outside body.
  • Perhaps more than any other U.S. environmental law, the Endangered Species Act looks to science for clear answers regarding which species are imperiled and how to protect them. But as this book shows, for even the most basic threshold question – whether a population constitutes a species or sub-species – there is no scientific answer. As for the definition of a species, there are over a dozen competing definitions, and the categorization of a sub-species is even more problematic, with a plethora of approaches that have allowed the United States Fish and Wildlife Service (USFWS) and its biological advisers in the U.S. Geological Survey (USGS) to effectively declare sub-species at will, as even slight morphological or genetic differences are seized upon to indicate reproductive isolation and the propriety of categorizing a population as a sub-species. Even more seriously, the book recounts how USFWS peer review in cases of controversial taxonomic classification has involved the selective disclosure of underlying data to outside peer reviewers and has been actively controlled by USGS scientists with a strong self-interest in USFWS determinations. The book’s ESA chapters clearly show how supposedly scientific disagreements about whether a population is or is not a legally protected sub-species in fact reflect differing policy preferences – the different weights that scientists attach to potential errors in triggering, or failing to trigger, legal protection.
  • Perhaps the most dramatic case studies in the book come from the area of chemical toxicity assessment by the U.S. EPA and the National Institute of Environmental Health Sciences (NIEHS). The book shows how the EPA has made determinations of chemical toxicity that deliberately ignore the most recent and most methodologically sound studies when those studies fail to support the agency’s preferred, pro-regulatory result of significant health risk at low doses. The case studies here include formaldehyde, where the National Academy of Sciences (NAS) itself concluded that EPA’s risk assessment “was based on a subjective view of the overall data” and failed to provide a plausible method by which exposures could cause cancer, a failure especially problematic given “inconsistencies in the epidemiological data, the weak animal data, and the lack of mechanistic data.” Equally dramatic is the story of EPA risk assessment for dioxin. Here, the agency continues to apply its decades-old assumptions that cancer risks at low doses can be extrapolated linearly from those actually observed in animal studies at high doses, and that there is no threshold level of exposure below which excess risk falls to zero. EPA continues to maintain these assumptions despite the NAS’s admonition that “EPA’s decision to rely solely on a default linear model lacked adequate scientific support.” Perhaps most disturbingly, the book provides examples of how supposedly unbiased outside scientific advisory panels are tainted by conflicts of interest. In the case of bisphenol A, for example, the NIEHS awarded $30 million in grants to study that chemical to scientists who had already publicly stated that the chemical’s toxicity was well-researched and reasonably certain.
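The difference between the two dose-response assumptions described above – linear no-threshold extrapolation versus a threshold model – can be illustrated with a short sketch. All numbers here are purely hypothetical and chosen only to show how the two assumptions diverge at low doses:

```python
# Two competing low-dose extrapolation models (illustrative numbers only).

def lnt_excess_risk(dose, slope):
    """Linear no-threshold (LNT): any nonzero dose implies nonzero excess risk."""
    return slope * dose

def threshold_excess_risk(dose, slope, threshold):
    """Threshold model: excess risk is zero below the assumed threshold dose."""
    return max(0.0, slope * (dose - threshold))

# Hypothetical slope fitted to high-dose animal data, extrapolated downward.
slope = 0.2        # excess risk per unit dose (invented for illustration)
low_dose = 0.005   # far below the doses actually observed (invented)

print(lnt_excess_risk(low_dose, slope))              # nonzero risk at any dose
print(threshold_excess_risk(low_dose, slope, 0.01))  # zero below the threshold
```

The policy stakes are visible in the sketch: under the LNT assumption every exposure, however small, contributes some calculated risk, while the threshold model the NAS discussion points toward yields zero excess risk for the same low dose.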

All told, the institutional details and facts provided by the authors of Institutions and Incentives in Regulatory Science paint a picture of a serious crisis in the scientific foundations of the modern regulatory state. But the authors go beyond this, by providing suggestions for reform. These proposals span a wide range. In climate science, author proposals range from calling for a much more open and adversarial presentation of competing work in climate science to the abolition of the IPCC as a standing body. In endangered species regulation, proposals range from more strictly science-based thresholds for sub-species determination to a separation of the science of species determination from the legal consequences of listing under the ESA. In environmental regulation, some authors call for a more open and transparent process of scientific assessment in which agencies such as the EPA publicly acknowledge and fully discuss the science on both sides of complex regulatory decisions, while others call for the strict separation of scientific assessment from regulatory authority.

The authors possess a unique combination of expertise and experience: Jamie Conrad is a principal of Conrad Law & Policy Counsel and author and editor of the Environmental Science Deskbook (1998);

Susan Dudley, former Administrator of the Office of Information and Regulatory Affairs in OMB, is the founding Director of the Regulatory Studies Center at George Washington University’s Trachtenberg School of Public Policy;

George Gray, Professor of environmental and occupational health and director of the Center for Risk Science and Public Health at the George Washington University School of Public Health and Health Sciences, was formerly science advisor at the U.S. E.P.A. and Executive Director of the Harvard Center for Risk Analysis;

Jason Scott Johnston is the Henry L. and Grace Doherty Charitable Foundation Professor of Law and the Nicholas E. Chimicles Research Professor in Business Law and Regulation at the University of Virginia Law School and the author of numerous articles appearing in both peer-edited law and economics journals and law reviews;

Gary E. Marchant, formerly a partner at Kirkland & Ellis, is Lincoln Professor of Emerging Technologies, Law, and Ethics and Executive Director and faculty fellow at the Center for Law, Science and Innovation in the Sandra Day O’Connor College of Law at Arizona State University;

Ross McKitrick, Professor of Economics at the University of Guelph, is the author of Taken by Storm: The Troubled Science, Policy and Politics of Global Warming (2003) and of numerous articles appearing in peer-edited climate science journals such as Geophysical Research Letters;

Rob Roy Ramey II, principal of Wildlife Science International, has consulted on several of the most significant Endangered Species Act listing decisions of the past decades and is the author of numerous scientific papers appearing in journals such as Science and Animal Conservation;

Katrina Miriam Wyman, Professor of Law at New York University Law School, is the editor and author (with David Schoenbrod and Richard Stewart) of Breaking the Logjam: Environmental Protection that Will Work (2010).


Filed under Books, Vulnerability Paradigm

New Report “Gulf Coast Climate Information Needs Assessment” By Hal Needham and Lynne Carter

There was an interesting survey of stakeholders in the report

Gulf Coast Climate Information Needs Assessment

by Hal Needham and Dr. Lynne Carter.

Here are two examples of questions and the answers:

Research Question: Coast region today …. Identify present weather/climate issues impacting your community today AND if you have any interest in how other communities are dealing with similar issues.

Most significant climate-related issues in region today:

1. Hurricanes
2. Storm Surge
3. Rainfall Flood
4. Wind Storm
5. Sea level rise

Most significant climate-related issues for the region in the future?

1. Hurricanes
2. Storm surge
3. Rainfall Flood
4. Sea level rise
5. Windstorm

They also asked

Have you noticed any changes related to a changing climate

and they summarized as

Nearly even split between ‘yes’ and ‘no’ responses

Many ‘yes’ respondents provided specific examples

Of the 26 ‘no’ responses, 16 further noted that they had seen changes but they were due to natural cycles.

With respect to the use of climate models, they wrote

About one-half said they might be interested in using climate outputs but their time frames were much shorter than the 25 and 100 year outputs

While few respondents are using long-term climate projections in their decision making, many provided examples of how they might use such information in the future.

Of the total of 48 responses to this question, 21 stated clearly they were not interested in using long-term climate model projections.

and

Time frames: mostly shorter than the 25-100 years for most climate models, rather more like two weeks to one year.

We need more such assessments by stakeholders, as this fits into the bottom-up, resource-based perspective that we have urged be adopted in our paper

Pielke, R. A., Sr., R. Wilby, D. Niyogi, F. Hossain, K. Dairuku, J. Adegoke, G. Kallos, T. Seastedt, and K. Suding (2012), Dealing with complexity and extreme events using a bottom-up, resource-based vulnerability perspective, in Extreme Events and Natural Hazards: The Complexity Perspective, Geophys. Monogr. Ser., vol. 196, edited by A. S. Sharma et al., pp. 345–359, AGU, Washington, D. C., doi:10.1029/2011GM001086. [the article can also be obtained from here]



Filed under Vulnerability Paradigm

New Paper “Decision Scaling: Linking Bottom-Up Vulnerability Analysis With Climate Projections In The Water Sector” By Brown Et Al 2012

I was alerted by Faisal Hossain to a new paper that has adopted part of the bottom-up perspective we have proposed in our article

Pielke, R. A., Sr., R. Wilby, D. Niyogi, F. Hossain, K. Dairuku, J. Adegoke, G. Kallos, T. Seastedt, and K. Suding (2012), Dealing with complexity and extreme events using a bottom-up, resource-based vulnerability perspective, in Extreme Events and Natural Hazards: The Complexity Perspective, Geophys. Monogr. Ser., vol. 196, edited by A. S. Sharma et al., pp. 345–359, AGU, Washington, D. C., doi:10.1029/2011GM001086. [the article can also be obtained from here]

The new paper is

Brown, C., Y. Ghile, M. Laverty, and K. Li (2012), Decision scaling: Linking bottom-up vulnerability analysis with climate projections in the water sector, Water Resour. Res., 48, W09537, doi:10.1029/2011WR011212.

The abstract reads [highlight added]

There are few methodologies for the use of climate change projections in decision making or risk assessment processes. In this paper we present an approach for climate risk assessment that links bottom-up vulnerability assessment with multiple sources of climate information. The three step process begins with modeling of the decision and identification of thresholds. Through stochastic analysis and the creation of a climate response function, climate states associated with risk are specified. Climate information such as available from multi-GCM, multi run ensembles, is tailored to estimate probabilities associated with these climate states. The process is designed to maximize the utility of climate information in the decision process and to allow the use of many climate projections to produce best estimates of future climate risks. It couples the benefits of stochastic assessment of risks with the potential insight from climate projections. The method is an attempt to make the best use of uncertain but potentially useful climate information. An example application to an urban water supply system is presented to illustrate the process.
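The three-step process described in the abstract can be sketched in outline. Everything in the sketch below is a hypothetical toy example – the response function, the decision threshold, and the ensemble values are all invented – intended only to show how the steps fit together:

```python
import random

# Step 1: model the decision and identify the threshold.
# Toy response: a water supply fails its reliability target when mean
# precipitation declines by more than 10% (an invented threshold).
def supply_is_reliable(precip_change_pct):
    return precip_change_pct > -10.0

# Step 2: stochastic analysis -- sample many plausible climate states and map
# them through the (toy) climate response function to locate the risky states.
random.seed(1)
states = [random.uniform(-25.0, 15.0) for _ in range(10_000)]
risky_fraction = sum(not supply_is_reliable(s) for s in states) / len(states)

# Step 3: tailor climate information -- use an (invented) multi-GCM ensemble
# of projected precipitation changes to estimate the probability that the
# climate ends up in the risky region identified in step 2.
ensemble = [-3.0, -12.0, 2.0, -8.0, -15.0]
risk_estimate = sum(not supply_is_reliable(p) for p in ensemble) / len(ensemble)

print(round(risk_estimate, 2))  # fraction of ensemble members past the threshold
```

The key design point, as the abstract notes, is that the risky climate states are identified from the decision model first (steps 1-2), and the climate projections are used only afterward, to weight those states (step 3).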

The article contains the following statement with respect to the top-down global climate model predictions:

A problem with this approach is that GCM projections are relatively poor scenario generators.

I agree, and would add that they have provided NO demonstration of skillful regional and local predictions of changes in climate statistics from those in the historical and paleorecord.

The authors also write

“A novel aspect of the approach is that it uses decision analysis as a framework for characterizing the climate future, and consequently, climate projections, in terms of their position relative to decision thresholds. In doing so, it uses stochastic analysis for risk identification and uses GCM projections for risk estimation, assigning probabilities to hazards, thus linking the two methods.”

but they do, at least, recognize the need to move beyond just the GCM runs. They write

“Appropriately tailored climate information, including GCM projections and stochastically generated conditions from historical and paleodata, and the application of expert judgment, may provide informative answers to this question when approached in the manner described here.”

While the paper still seems to accept the robustness of the global climate model predictions, it does recognize that there are other approaches to assess climate risk.

The article, unfortunately, does not consider other environmental and social risks, relative to climate risks. In the Pielke et al 2012 paper we wrote

“We discuss the adoption of a bottom-up, resource–based vulnerability approach in evaluating the effect of climate and other environmental and societal threats to societally critical resources. This vulnerability concept requires the determination of the major threats to local and regional water, food, energy, human health, and ecosystem function resources from extreme events including climate, but also from other social and environmental issues. After these threats are identified for each resource, then the relative risks can be compared with other risks in order to adopt optimal preferred mitigation/adaptation strategies.

This is a more inclusive way of assessing risks, including from climate variability and climate change than using the outcome vulnerability approach adopted by the IPCC. A contextual vulnerability assessment, using the bottom-up, resource-based framework is a more inclusive approach for policymakers to adopt effective mitigation and adaptation methodologies to deal with the complexity of the spectrum of social and environmental extreme events that will occur in the coming decades, as the range of threats are assessed, beyond just the focus on CO2 and a few other greenhouse gases as emphasized in the IPCC assessments.”

Nonetheless, it is refreshing to see the much-needed start of a movement away from the top-down IPCC approach of assessing risks to key resources, which, as we have shown in our papers and in my weblog posts, is a fundamentally flawed approach.

I have sent the first author a copy of our 2012 paper and hope they will move to the complete adoption of the bottom-up, resource-based perspective.



Filed under Research Papers, Vulnerability Paradigm

Our Chapter “Dealing with complexity and extreme events using a bottom-up, resource-based vulnerability perspective” By Pielke Sr Et Al 2012 Has Appeared

Our article

Pielke, R. A., Sr., R. Wilby, D. Niyogi, F. Hossain, K. Dairuku, J. Adegoke, G. Kallos, T. Seastedt, and K. Suding (2012), Dealing with complexity and extreme events using a bottom-up, resource-based vulnerability perspective, in Extreme Events and Natural Hazards: The Complexity Perspective, Geophys. Monogr. Ser., vol. 196, edited by A. S. Sharma et al., pp. 345–359, AGU, Washington, D. C., doi:10.1029/2011GM001086. [the article can also be obtained from here]

has appeared in

Sharma, A. S., A. Bunde, P. Dimri, and D. N. Baker (Eds.) (2012), Extreme Events and Natural Hazards: The Complexity Perspective, Geophys. Monogr. Ser., vol. 196, 371 pp., AGU, Washington, D. C., doi:10.1029/GM196.

The description of the book is given on the AGU site as [highlight added]

Extreme Events and Natural Hazards: The Complexity Perspective examines recent developments in complexity science that provide a new approach to understanding extreme events. This understanding is critical to the development of strategies for the prediction of natural hazards and mitigation of their adverse consequences. The volume is a comprehensive collection of current developments in the understanding of extreme events. The following critical areas are highlighted: understanding extreme events, natural hazard prediction and development of mitigation strategies, recent developments in complexity science, global change and how it relates to extreme events, and policy sciences and perspective. With its overarching theme, Extreme Events and Natural Hazards will be of interest and relevance to scientists interested in nonlinear geophysics, natural hazards, atmospheric science, hydrology, oceanography, tectonics, and space weather.

The abstract of our article reads

“We discuss the adoption of a bottom-up, resource–based vulnerability approach in evaluating the effect of climate and other environmental and societal threats to societally critical resources. This vulnerability concept requires the determination of the major threats to local and regional water, food, energy, human health, and ecosystem function resources from extreme events including climate, but also from other social and environmental issues. After these threats are identified for each resource, then the relative risks can be compared with other risks in order to adopt optimal preferred mitigation/adaptation strategies.

This is a more inclusive way of assessing risks, including from climate variability and climate change than using the outcome vulnerability approach adopted by the IPCC. A contextual vulnerability assessment, using the bottom-up, resource-based framework is a more inclusive approach for policymakers to adopt effective mitigation and adaptation methodologies to deal with the complexity of the spectrum of social and environmental extreme events that will occur in the coming decades, as the range of threats are assessed, beyond just the focus on CO2 and a few other greenhouse gases as emphasized in the IPCC assessments.”

In the assessment of climate risks, the approach we recommend is an inversion of the IPCC process, where the threats from climate, and from other environmental and social risks, are assessed first, before one inappropriately and inaccurately runs global climate models to provide the envelope of future risks to key resources.


Filed under Research Papers, Vulnerability Paradigm

New Paper “Climate Feedback–Based Provisions For Dam Design, Operations, And Water Management In The 21st Century” By Hossain Et Al 2012

We have a new paper under the leadership of Faisal Hossain of the Department of Civil and Environmental Engineering at Tennessee Technological University,

Hossain, F., A.M. Degu, W. Yigzaw, S.J. Burian, D. Niyogi, J.M. Shepherd and R.A. Pielke Sr., 2012: Climate feedback–based provisions for dam design, operations, and water management in the 21st century. J. Hydro. Eng., DOI: 10.1061/(ASCE)HE.1943-5584.0000541, in press.

The conclusion reads in part [highlight added]

The purpose of this article is to shed light on the need for climate feedback-based considerations in dam design, operations, and water management for the 21st century. It first overviewed the known impacts on climate from changes in land use and land cover that are typically anticipated once a dam is constructed. Recent research was presented on the first-order signature around dams on local climate using observational evidence. A global overview of the location of large dams was presented to highlight the need to treat each dam uniquely according to its location and the larger setting. It is now obvious that the observational data associated with current dams, combined with the rich body of research of LCLU impact on climate, can provide the planning and engineering professions with insightful guidance for both operations and more robust dam-building in the 21st century as well as modifications of local design guidelines to account for climate feedback.

The conclusion includes the recommendation that

One way to maximize the ability of future generation of engineers to assimilate knowledge on climate modification for dam design and operations is to enhance the baccalaureate curriculum by adding prerequisite courses on atmospheric sciences and climate.

In this context, we do not mean climate education that starts from the incorrect premise that the human addition of CO2 and a few other greenhouse gases dominates the climate system response in the coming decades. We subscribe to the robust view of the climate system as reported in the 2005 NRC assessment report

National Research Council, 2005: Radiative forcing of climate change: Expanding the concept and addressing uncertainties. Committee on Radiative Forcing Effects on Climate Change, Climate Research Committee, Board on Atmospheric Sciences and Climate, Division on Earth and Life Studies, The National Academies Press, Washington, D.C., 208 pp.

and summarized in

Pielke Sr., R., K. Beven, G. Brasseur, J. Calvert, M. Chahine, R. Dickerson, D. Entekhabi, E. Foufoula-Georgiou, H. Gupta, V. Gupta, W. Krajewski, E. Philip Krider, W. K.M. Lau, J. McDonnell, W. Rossow, J. Schaake, J. Smith, S. Sorooshian, and E. Wood, 2009: Climate change: The need to consider human forcings besides greenhouse gases. Eos, Vol. 90, No. 45, 10 November 2009, 413. Copyright (2009) American Geophysical Union.



Filed under Research Papers, Vulnerability Paradigm

Article Titled “Water Resource Management During Prolonged Drought Periods” By Will Alexander

Professor Will Alexander alerted me to a presentation he will be giving in Pretoria in October titled

Water resource management during prolonged drought periods [there is no url for it, but it can be obtained by requesting from Professor Alexander at alexwjr@iafrica.com].

WJR (Will) Alexander is Professor Emeritus of the Department of Civil Engineering of the University of Pretoria, South Africa, and Honorary Fellow of the South African Institution of Civil Engineering. Other posts on his perspective appear in

Guest Post “Global Floods – Why Were They Not Predicted?” By Will Alexander

Climate Change: The West vs The Rest by Will Alexander

A Guest Weblog By Will Alexander “Climategate Chaos”

He clearly has the credentials to present his viewpoint on climate science.

His article starts with [highlight added]

South Africa’s surface water resources are rapidly approaching depletion. It is essential that practitioners in these fields should have an advanced knowledge of multi-site, multi-year, periodical hydrological statistics and practical experience in these fields. Details are provided in the author’s substantial handbook on analytical methods for water resource development and management that will be distributed during the symposium.

The text includes his sobering assessment that

The following diagram shows the projected future water demand in South Africa in relation to available conventional resources. It is based on estimates by du Plessis and van Robbroeck in 1978. These estimates have been confirmed by other investigators. The diagram shows that usage will exceed the economically available runoff before the year 2020. It also shows that by 2050 the demand will exceed the total runoff from all South African rivers!

He further writes that

I cannot emphasise strongly enough, that attempts to apply climate change theory before and during drought conditions will have disastrous consequences on the welfare of this country and its poor and disadvantaged peoples for whom I have the greatest concern….

South Africa and its peoples face severe humanitarian, social and economic consequences as a result of this thoroughly unscientific policy.

The entire preprint can be obtained from Professor Alexander at alexwjr@iafrica.com



Filed under Vulnerability Paradigm

Article In Physics Today By David Kramer Titled “Scientists Poke Holes In Carbon Dioxide Sequestration”

There is an interesting article that has appeared in the August 2012 issue of Physics Today titled

Scientists poke holes in carbon dioxide sequestration

The article starts with the text [highlight added]

Newly published geophysical research and a committee of experts have cast doubts on whether carbon capture and storage (CCS) can play the major role that some scientists and coal producers had hoped for in mitigating climate change. A report released by the National Research Council (NRC) in mid-June warns that the injection of millions of tons of supercritical liquid carbon dioxide from fossil-fuel plants into deep geological formations is likely to create earthquakes that will fracture the surrounding impermeable rock and allow the greenhouse gas to work its way back toward the surface. Separately, Stanford University geophysicists Mark Zoback and Steven Gorelick write in a 26 June article in the Proceedings of the National Academy of Sciences that “there is a high probability that earthquakes will be triggered by injection of large volumes of CO2  into the brittle rocks commonly found in continental interiors.” They argue that “large-scale CCS is a risky, and likely unsuccessful, strategy for significantly reducing greenhouse gas emissions.”

Colorado School of Mines geologist Murray Hitzman, who chaired the NRC committee that wrote Induced Seismicity Potential in Energy Technologies, told a 19 June hearing of the Senate Committee on Energy and Natural Resources that two factors, “net fluid balance” and the volume of the injected liquid, largely determine whether an earthquake will result when liquids are pumped into underground formations. According to the NRC report, oil and gas development projects that take into account the balance between fluid injected and fluid withdrawn produce significantly fewer seismic events than projects that ignore the fluid balance. In CCS, CO2 is injected without any corresponding extraction of the brine that’s often present in the formation.
Zoback, who also appeared at the Senate committee hearing, said that for CCS to contribute significantly to mitigating climate change, about 3.5 billion metric tons worldwide would have to be sequestered annually. Right now, a few large-scale CCS operations, including one at a Norwegian gas well in the North Sea and another at a gas well in Algeria, are each storing around 1 million tons a year.

Other information in the article includes

Beginning in 2017 the FutureGen Alliance, a US-based industry–government consortium, plans to capture and store 1.3 million tons of CO2 per year at a coal-burning power plant in Meredosia, Illinois. Lawrence Pacheco, a spokesman for the $1.3 billion venture, says that at the injection site both the porosity of the sandstone formation nearly a mile below the surface and the caprock permeability are ideal for CO2 storage. In addition to a $1 billion pledge to FutureGen, DOE is funding three industrial-scale CCS projects, including a plan to capture and store 4.5 million tons a year from a methanol refinery and another to sequester 1 million tons annually from ethanol production. Two of the three projects will use the CO2 in enhanced oil recovery.

The article ends with the text

In the big picture, seismicity pales in comparison to cost as an impediment to the adoption of CCS, says Rachel Cleetus, a climate economist with the Union of Concerned Scientists. “Honestly, the challenges to CCS are so significant on the economic front that this is just going to be one more thing that makes people question the risk of going down that path versus other options that are readily available and much less risky, such as wind and solar,” she says.
“The difficulty is that carbon isn’t priced in a meaningful way,” adds GeoScience’s Batchelor. “Until carbon has a price, it bears down on the renewables, and it bears down on CCS. And the US, UK, and most European governments are not going to put their industries at a competitive disadvantage by saying we insist you do [CCS] and double the price of power on a unilateral basis.”
I was on an advisory committee to the Biological and Environmental Sciences Division review panel at Oak Ridge National Laboratory in 2008, where CCS was presented (I will have more to say about the review we completed in a later post). We were told that the goal is to sequester ALL of the emissions into the ground. This seemed an unrealistic goal when it was presented to us in 2008, and it still is, in my view. The Physics Today article provides new information on the practicality of CCS and the risks involved.


Filed under Climate Change Forcings & Feedbacks, Vulnerability Paradigm

Another Example Of Weather Risks Due To Atmospheric Circulation Patterns “Argentine Wheat Sowing Slowed By Cold, Dry Weather”

As the USA drought and heat continues to significantly affect crops, I came across an interesting news article on a weather threat in South America that is due to cold and dry weather. The article by Hugh Bronstein of Reuters is titled

Argentine wheat sowing slowed by cold, dry weather

Excerpts read [highlight added]

* CBOT wheat prices rise for four straight weeks

* Adverse global crop weather fans supply worries

* Argentine growers shy from wheat to avoid export curbs

“BUENOS AIRES, July 13 (Reuters) – Dry, cold weather slowed Argentine wheat planting last week as farmers struggled to penetrate their frost-covered fields, the government said on Friday, further complicating a season marked by low output expectations. Argentina is the world’s No. 6 wheat exporter and principal supplier to neighboring Brazil. But plantings are set to fall 17 percent versus the previous crop year to 3.82 million hectares.”

“The lack of rain over the last seven days was aggravated by low temperatures and frost throughout Buenos Aires province,” the Agriculture Ministry said in its weekly crop report. Buenos Aires accounts for more than half of Argentina’s total wheat output. In the district of Bragado, in the northern part of the province, “frosts have delayed the advance in the planting of winter wheat,” the report said. Chicago Board of Trade wheat prices have risen for four straight weeks, up 38.1 percent in that period, as adverse crop weather in major producers such as the United States and Australia fans supply worries.”

“Argentina, the world No. 3 soybean exporter, suffered a six-week drought in the December-January dog days of the Southern Hemisphere summer. The heat wave struck just as 2011/12 soy and corn plants were in their most delicate stage of flowering. The dry spell melted original expectations of a bumper crop and heavy May rains swamped some fields in Buenos Aires province, bogging down harvesting combines and forcing farmers to leave their late-seeded soy to rot.”

“….heat and drought continued to eat away at U.S. crop prospects. Argentina is also the world’s No. 2 corn exporter and the government estimates this season’s production at 20.1 million tonnes after the drought dashed early expectations of a 2011/12 crop well over the 23 million tonnes harvested in 2010/11.”

In terms of risks from weather extremes, the current threat to crops further illustrates that a global average surface temperature anomaly is not a useful metric to assess risk. Agriculture has always been at risk from weather extremes, and this threat will continue into the future regardless of whether or not there are alterations in local and regional climate from human and/or natural forcings and feedbacks. A prudent way to reduce risk is to first develop mitigation and adaptation policies for the weather extremes we have already experienced, and then build in a buffer in case more extreme events actually occur in the coming decades.

As the Reuters news article wrote

But the United Nations expects global food demand to double by 2050 as world population hits 9 billion. Argentina, which boasts a fertile Pampas grains belt bigger than the size of France, will be key to feeding an increasingly hungry world.

which means risk would increase even in the absence of changes in local and regional climate statistics.

source of image

Comments Off

Filed under Climate Science Reporting, Vulnerability Paradigm

An Example Of The Failure To Properly Respond To Climate Risk By The Obama Administration

source of image 

These last few weeks have seen wildfires destroy hundreds of homes, an organized thunderstorm system called a derecho leave several million homes without electric power, and a drought cause agricultural losses across large areas of the central USA. So how does the US government respond?

As reported in the Hill in the article by Ben Geman (h/t Marc Morano) [highlight added]

Obama official: US climate views shifting amid wild weather

A senior Obama administration scientist said this year’s heat and Western wildfires are altering perceptions of climate change in the United States.

Jane Lubchenco, who heads the National Oceanic and Atmospheric Administration, said in Australia on Friday that many have previously regarded climate change as a “nebulous concept,” The Associated Press reports.

“Many people around the world are beginning to appreciate that climate change is under way, that it’s having consequences that are playing out in real time and, in the United States at least, we are seeing more and more examples of extreme weather and extreme climate-related events,” she said at a university in Canberra, AP reports.

“People’s perceptions in the United States, at least, are in many cases beginning to change as they experience something first-hand that they at least think is directly attributable to climate change,” she said.

Lubchenco “said that while it was impossible to attribute any single weather event to climate change, the pattern of extreme events was consistent with forecast consequences of increasing greenhouse gas emissions,” AP reports.

She is the second Obama administration official to weigh in this week on the nexus between the violent U.S. weather and climate change.

Homeland Security Secretary Janet Napolitano linked climate change with the wildfires hitting Colorado.
Napolitano said “there’s a pattern here” as she noted the summer wildfires as well as the East Coast heat wave and the high-velocity winds that whipped through the mid-Atlantic late last week.

For other comments on the extreme weather by senior members of the Obama administration see Judy Curry’s post

Week in review 7/6/12

with statements by Jane Lubchenco, Under Secretary for Oceans and Atmosphere and NOAA Administrator, Harris Sherman, Under Secretary of Agriculture for Natural Resources and Environment, and Janet Napolitano, Secretary of Homeland Security.

The clear implication is that the Obama administration is going to continue with the top-down, global climate model approach to respond to extreme weather events. Their focus will be on mandating reductions in CO2 emissions as a way to reduce the occurrence of these extreme events.

However, as discussed in our article

Pielke Sr., R.A., R. Wilby, D. Niyogi, F. Hossain, K. Dairuku, J. Adegoke, G. Kallos, T. Seastedt, and K. Suding, 2012: Dealing  with complexity and extreme events using a bottom-up, resource-based  vulnerability perspective. AGU Monograph on Complexity and  Extreme Events in Geosciences, in press.

the top-down approach is too narrow, and will likely result in poor policy choices since not all mitigation and adaptation responses to weather extremes are being considered.

For example, with respect to the three extreme weather events listed earlier in this post, there are a number of bottom-up responses that should be adopted regardless of how, or whether, weather patterns change in the future:

1. With respect to homes lost in wildfires, one way to reduce risk is to require that homes built in those areas have fire-resistant construction. This means that shake roofs should be prohibited. When I lived in Fort Collins, our covenants actually required us to have shake roofs! This is no better than having kindling for a rooftop. A number of the homes lost in Colorado Springs appeared to have shake roofs, which will often combust from just a single ember!

source of image 

source of image 

2. With respect to the recent power outages in the eastern USA, this has been a perennial problem. Tropical storms and hurricanes, ice storms and thunderstorms have caused large losses of power in the past due to trees and branches breaking electric lines (e.g. see hurricanes for Maryland). The obvious solution is to place the electric lines underground as much as possible, as is done in Colorado, Florida and elsewhere.  The cost of this risk reduction will likely be less than the cumulative losses from the power outages that will otherwise inevitably occur again.

source of image 

source of image 

3. With respect to the drought, crop insurance certainly is a response used by many farmers.  However, this is just a short-term stop-gap. What is needed is the development of pipelines to ship water across large distances. This has been proposed in Colorado and California (the Big Straw project) and is worth considering throughout the agricultural regions of the country.  Canada, for example, with its vast fresh water supplies from inland lakes, could provide the USA with a source of irrigation water during times of drought.

image from WUWT 

source of image 

None of these approaches depends on whether weather patterns are changing or not; they make sense regardless. This approach is much better than the one the Obama administration appears to have adopted. In the upcoming election, it could be another point of contrast in policy, if the Romney campaign adopts a broader-based, resource-focused approach to reducing society's risks from climate.

Comments Off

Filed under Climate Science Reporting, Politicalization of Science, Vulnerability Paradigm

2012 IGBP Article “Cities Expand By Area Equal To France, Germany And Spain Combined In Less Than 20 years”

There is an article in the March 2012 issue of the IGBP Newsletter

Fragkias, M. and K.C. Seto, 2012: The rise and rise of urban expansion: Urban land area has expanded globally during the past few decades – a trend that looks set to continue in the foreseeable future. IGBP Newsletter, 78, March 2012.

that documents the dynamic character of urbanization. This land use change not only affects local and regional climate, but also has a time-varying effect on the surface temperature record that the IPCC and others have used as the iconic metric of global warming. As we reported in

Montandon, L.M., S. Fall, R.A. Pielke Sr., and D. Niyogi, 2011: Distribution of landscape types in the Global Historical Climatology Network. Earth Interactions, 15:6, doi: 10.1175/2010EI371

GHCNv.2 station locations are biased toward urban and cropland (>50% stations versus 18.4% of the world’s land) and past century reclaimed cropland areas (35% stations versus 3.4% land).

This bias is only going to increase in coming years as urban areas continue to expand.
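The overrepresentation implied by those percentages can be expressed as a simple ratio of station share to land share. The following is a back-of-envelope sketch; its only inputs are the figures quoted above from Montandon et al. (2011), and the function name is mine, not from the paper:

```python
# Back-of-envelope overrepresentation ratios for GHCNv.2 station siting,
# using only the percentages quoted from Montandon et al. (2011) above.

def overrepresentation(station_share, land_share):
    """Ratio of a land-use type's share of stations to its share of land area."""
    return station_share / land_share

# Urban and cropland areas: >50% of stations vs. 18.4% of the world's land.
urban_cropland = overrepresentation(0.50, 0.184)

# Past-century reclaimed cropland: 35% of stations vs. 3.4% of land.
reclaimed_cropland = overrepresentation(0.35, 0.034)

print(f"Urban/cropland stations: ~{urban_cropland:.1f}x overrepresented")
print(f"Reclaimed-cropland stations: ~{reclaimed_cropland:.1f}x overrepresented")
```

The ratios work out to roughly 2.7x for urban/cropland sites and over 10x for reclaimed cropland, which is why the station network's sampling of landscape types matters for trend interpretation.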

The press release on the article has the title

Cities expand by area equal to France, Germany and Spain combined in less than 20 years

Text in the press release includes [highlight added]

Unless development patterns change, by 2030 humanity’s urban footprint will occupy an additional 1.5 million square kilometres – comparable to the combined territories of France, Germany and Spain, say experts at a major international science meeting underway in London.

UN estimates show human population growing from 7 billion today to 9 billion by 2050, translating into some 1 million more people expected on average each week for the next 38 years, with most of that increase anticipated in urban centres. And ongoing migration from rural to urban living could see world cities receive yet another 1 billion additional people. Total forecast urban population in 2050: 6.3 billion (up from 3.5 billion today).

Fragkias [Dr. Michail Fragkias of Arizona State University] notes that while there were fewer than 20 cities of 1 million or more a century ago, there are 450 today. While urban areas cover less than five per cent of Earth’s land surface, “the enlarged urban footprint forecast is far more significant proportionally when vast uninhabitable polar, desert and mountain regions, the world breadbasket plains and other prime agricultural land and protected areas are subtracted from the calculation.”
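The press release's "1 million more people each week" figure can be checked directly from the population numbers it quotes (a minimal arithmetic sketch; all inputs are from the excerpt above):

```python
# Sanity check on the press release's population arithmetic:
# growth from 7 billion today to 9 billion by 2050, expressed per week.

growth = 9e9 - 7e9          # 2 billion additional people over the period
weeks = 38 * 52             # 38 years, in weeks
per_week = growth / weeks   # roughly one million, matching the press release

print(f"~{per_week:,.0f} more people per week")
```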

This article provides support for a statement in an earlier chapter of this IGBP Newsletter

Syvitski, J., 2012: An epoch of our making. IGBP Newsletter, 78, March 2012.

which highlights, with respect to the human role in the environment (including climate), that

“….the Anthropocene isn’t as well known as global warming, which two out of three people had heard of by 2008, according to a Gallup Poll (http://www.gallup.com/poll/117772/Awareness-Opinions-Global-Warming-Vary-Worldwide.aspx). But the former is a more effective paradigm in describing the cumulative impact of civilisation, making global warming and its consequences but one of many ways in which humans have modified the Earth. Narrow focus on global warming might suggest that we simply need to stop emitting greenhouse gases and use renewable energy to abate the planet’s pressures. The human footprint is much larger than that.”

This is the viewpoint that we presented in our article

Pielke Sr., R., K.  Beven, G. Brasseur, J. Calvert, M. Chahine, R. Dickerson, D.  Entekhabi, E. Foufoula-Georgiou, H. Gupta, V. Gupta, W. Krajewski, E.  Philip Krider, W. K.M. Lau, J. McDonnell,  W. Rossow,  J. Schaake, J.  Smith, S. Sorooshian,  and E. Wood, 2009: Climate change: The need to consider human forcings besides greenhouse gases.   Eos, Vol. 90, No. 45, 10 November 2009, 413. Copyright (2009) American   Geophysical Union.

The new IPCC reports should heed this growing call for a broader, more complete assessment of threats to the environment and society, rather than continuing their scientifically flawed focus on the radiative effects of CO2 and a few other greenhouse gases.

source of image

Comments Off

Filed under Climate Science Reporting, Definition of Climate, Vulnerability Paradigm