Category Archives: Vulnerability Paradigm

New Book “Institutions And Incentives In Regulatory Science” (Edited by Jason Scott Johnston, 2012).

A very informative new book has appeared:

Institutions and Incentives in Regulatory Science (Edited by Jason Scott Johnston, 2012).

It is available at http://www.amazon.com/Institutions-Incentives-Regulatory-Science-Johnston/dp/0739169467, as well as at other online sites.

The book summary reads [highlight added]

From endangered species protection to greenhouse gas regulations, modern regulatory interventions are justified by science.  Indeed, legislators look to science for simple answers to complex regulatory questions.  This regulatory demand for scientific answers collides with the scientific reality that on the frontiers of science, there are no simple answers, only competing hypotheses and accumulating but as yet often inconclusive evidence.   Given inevitable scientific uncertainty, regulatory agencies such as the U.S. Environmental Protection Agency are put in the position of adjudicating unresolved scientific controversies.  As the contributions to this volume show conclusively and in great detail, such agencies (and other assessment organizations such as the Intergovernmental Panel on Climate Change or IPCC) are far from unbiased in how they assess regulatory science.   They instead act as advocates for those scientific positions that further the regulatory agenda of promulgating new regulations and increasing the scope of the regulatory state.

The book describes many facts about how regulatory agencies use science to justify their regulations that may surprise and even shock many readers:

  • In the area of climate science, where the IPCC is advertised as an objective and unbiased assessment body, the facts are that the Lead Authors for IPCC Assessment Reports are chosen by political representatives on the IPCC, and have no duty to respond in any way to the comments of outside reviewers of IPCC draft chapters. The oft-repeated claim that there are “thousands” of scientists involved in outside review of IPCC Assessment Reports is patently false, with generally only a few dozen truly independent outside reviews submitted even on key chapters. Perhaps most strikingly, the Editors with responsibility for overseeing the decisions of chapter Authors are themselves chosen by the same people (Working Group Chairs) who pick the Authors. An outside audit of the IPCC commissioned by the IPCC itself (done by the InterAcademy Council) concluded that some body other than the IPCC should choose the Review Editors, but acknowledged that there is no such outside body.
  • Perhaps more than any other U.S. environmental law, the Endangered Species Act looks to science for clear answers regarding which species are imperiled and how to protect them. But as this book shows, for even the most basic threshold question – as to whether a population constitutes a species or sub-species – there is no scientific answer. As for the definition of a species, there are over a dozen competing definitions, and the categorization of a sub-species is even more problematic, with a plethora of approaches that have allowed the United States Fish and Wildlife Service (USFWS) and its biological advisers in the U.S. Geological Survey (USGS) to effectively declare sub-species at will, as even slight morphological or genetic differences are seized upon to indicate reproductive isolation and the propriety of categorizing a population as a sub-species. Even more seriously, the book recounts how USFWS peer review in cases of controversial taxonomic classification has involved the selective disclosure of underlying data to outside peer reviewers and has been actively controlled by USGS scientists with a strong self-interest in USFWS determinations. The book’s ESA chapters clearly show how supposedly scientific disagreements about whether a population is or is not a legally protected sub-species in fact reflect differing policy preferences: the different weights that scientists attach to potential errors in triggering, or failing to trigger, legal protection.
  • Perhaps the most dramatic case studies in the book come from the area of chemical toxicity assessment by the U.S. E.P.A. and the National Institute of Environmental Health Sciences (NIEHS). The book shows how the EPA has made determinations of chemical toxicity that deliberately ignore the most recent and most methodologically sound studies when those studies fail to support the agency’s preferred, pro-regulatory result of significant health risk at low doses. The case studies here include formaldehyde, where the National Academy of Sciences (NAS) itself concluded that EPA’s risk assessment “was based on a subjective view of the overall data” and failed to provide a plausible method by which exposures could cause cancer, a failure especially problematic given “inconsistencies in the epidemiological data, the weak animal data, and the lack of mechanistic data.” Equally dramatic is the story of EPA risk assessment for dioxin. Here, the agency continues to apply its decades-old assumptions that cancer risks at low doses can be extrapolated linearly from those actually observed in animal studies at high doses, and that there is no threshold level of exposure below which excess risk falls to zero. EPA continues to maintain these assumptions despite the NAS’s admonition that “EPA’s decision to rely solely on a default linear model lacked adequate scientific support.” Perhaps most disturbingly, the book provides examples of how supposedly unbiased outside scientific advisory panels are tainted by conflicts of interest. In the case of bisphenol A, for example, the NIEHS awarded $30 million in grants to study that chemical to scientists who had publicly stated that the chemical’s toxicity was already well-researched and reasonably certain.

All told, the institutional details and facts provided by the authors of Institutions and Incentives in Regulatory Science paint a picture of a serious crisis in the scientific foundations of the modern regulatory state. But the authors go beyond this, by providing suggestions for reform. These proposals span a wide range. In climate science, author proposals range from calling for a much more open and adversarial presentation of competing work in climate science to the abolition of the IPCC as a standing body. In endangered species regulation, proposals range from more strictly science-based thresholds for sub-species determination to a separation of the science of species determination from the legal consequences of listing under the ESA. In environmental regulation, some authors call for a more open and transparent process of scientific assessment in which agencies such as the EPA publicly acknowledge and fully discuss the science on both sides of complex regulatory decisions, while others call for the strict separation of scientific assessment from regulatory authority.

The authors possess a unique combination of expertise and experience: Jamie Conrad is a principal of Conrad Law & Policy Counsel and author and editor of the Environmental Science Deskbook (1998);

Susan Dudley, former Administrator of the Office of Information and Regulatory Affairs in OMB, is the founding Director of the Regulatory Studies Center at George Washington University’s Trachtenberg School of Public Policy;

George Gray, Professor of environmental and occupational health and director of the Center for Risk Science and Public Health at the George Washington University School of Public Health and Health Sciences, was formerly science advisor at the U.S. E.P.A. and Executive Director of the Harvard Center for Risk Analysis;

Jason Scott Johnston is the Henry L. and Grace Doherty Charitable Foundation Professor of Law and the Nicholas E. Chimicles Research Professor in Business Law and Regulation at the University of Virginia Law School and the author of numerous articles appearing in both peer-edited law and economics journals and law reviews;

Gary E. Marchant, formerly a partner at Kirkland & Ellis, is Lincoln Professor of Emerging Technologies, Law, and Ethics and Executive Director and faculty fellow at the Center for Law, Science and Innovation in the Sandra Day O’Connor College of Law at Arizona State University;

Ross McKitrick, Professor of Economics at the University of Guelph, is the author of Taken by Storm: The Troubled Science, Policy and Politics of Global Warming (2003) and of numerous articles appearing in peer-edited climate science journals such as Geophysical Research Letters;

Rob Roy Ramey II, principal of Wildlife Science International, has consulted on several of the most significant Endangered Species Act listing decisions of the past decades and is the author of numerous scientific papers appearing in journals such as Science and Animal Conservation;

Katrina Miriam Wyman, Professor of Law at New York University Law School, is the editor and author (with David Schoenbrod and Richard Stewart) of Breaking the Logjam: Environmental Protection that Will Work (2010).

Filed under Books, Vulnerability Paradigm

New Report “Gulf Coast Climate Information Needs Assessment” By Hal Needham and Lynne Carter

There is an interesting survey of stakeholders reported in

Gulf Coast Climate Information Needs Assessment

by Hal Needham and Dr. Lynne Carter.

Here are two examples of the questions and answers:

Research Question: Coast region today … Identify present weather/climate issues impacting your community today AND if you have any interest in how other communities are dealing with similar issues.

Most significant climate-related issues in region today:

1. Hurricanes
2. Storm Surge
3. Rainfall Flood
4. Wind Storm
5. Sea level rise

Most significant climate-related issues for the region in the future?

1. Hurricanes
2. Storm surge
3. Rainfall Flood
4. Sea level rise
5. Windstorm

They also asked

Have you noticed any changes related to a changing climate

and they summarized as

Nearly even split between ‘yes’ and ‘no’ responses

Many ‘yes’ respondents provided specific examples

Of the 26 ‘no’ responses, 16 further noted that they had seen changes but they were due to natural cycles.

With respect to the use of climate models, they wrote

About one-half said they might be interested in using climate outputs but their time frames were much shorter than the 25 and 100 year outputs

While few respondents are using long-term climate projections in their decision making, many provided examples of how they might use such information in the future.

Of the total of 48 responses to this question, 21 stated clearly they were not interested in using long-term climate model projections.

and

Time frames: mostly shorter than the 25-100 years for most climate models, rather more like two weeks to one year.

We need more such stakeholder assessments, as this fits into the bottom-up, resource-based perspective that we have urged be adopted in our paper

Pielke, R. A., Sr., R. Wilby, D. Niyogi, F. Hossain, K. Dairuku, J. Adegoke, G. Kallos, T. Seastedt, and K. Suding (2012), Dealing with complexity and extreme events using a bottom-up, resource-based vulnerability perspective, in Extreme Events and Natural Hazards: The Complexity Perspective, Geophys. Monogr. Ser., vol. 196, edited by A. S. Sharma et al., pp. 345–359, AGU, Washington, D. C., doi:10.1029/2011GM001086. [the article can also be obtained from here]

Filed under Vulnerability Paradigm

New Paper “Decision Scaling: Linking Bottom-Up Vulnerability Analysis With Climate Projections In The Water Sector” By Brown Et Al 2012

I was alerted by Faisal Hossain to a new paper that has adopted part of the bottom-up perspective that we have proposed in our article

Pielke, R. A., Sr., R. Wilby, D. Niyogi, F. Hossain, K. Dairuku, J. Adegoke, G. Kallos, T. Seastedt, and K. Suding (2012), Dealing with complexity and extreme events using a bottom-up, resource-based vulnerability perspective, in Extreme Events and Natural Hazards: The Complexity Perspective, Geophys. Monogr. Ser., vol. 196, edited by A. S. Sharma et al., pp. 345–359, AGU, Washington, D. C., doi:10.1029/2011GM001086. [the article can also be obtained from here]

The new paper is

Brown, C., Y. Ghile, M. Laverty, and K. Li (2012), Decision scaling: Linking bottom-up vulnerability analysis with climate projections in the water sector, Water Resour. Res., 48, W09537, doi:10.1029/2011WR011212.

The abstract reads [highlight added]

There are few methodologies for the use of climate change projections in decision making or risk assessment processes. In this paper we present an approach for climate risk assessment that links bottom-up vulnerability assessment with multiple sources of climate information. The three step process begins with modeling of the decision and identification of thresholds. Through stochastic analysis and the creation of a climate response function, climate states associated with risk are specified. Climate information such as available from multi-GCM, multi run ensembles, is tailored to estimate probabilities associated with these climate states. The process is designed to maximize the utility of climate information in the decision process and to allow the use of many climate projections to produce best estimates of future climate risks. It couples the benefits of stochastic assessment of risks with the potential insight from climate projections. The method is an attempt to make the best use of uncertain but potentially useful climate information. An example application to an urban water supply system is presented to illustrate the process.
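The three-step process described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; the threshold, the response function, and the ensemble numbers below are all hypothetical toy values, chosen only to show the flow from decision threshold, to climate response function, to risk probability estimated from projections.

```python
import random

# Step 1: model the decision and identify a threshold - e.g., a reservoir
# yield (hypothetical units) below which the supply system is deemed to fail.
YIELD_THRESHOLD = 100.0

def climate_response(precip_change_pct):
    """Toy climate response function: system yield as a function of one
    climate state variable (percent change in mean precipitation)."""
    baseline_yield = 110.0
    return baseline_yield * (1.0 + 0.01 * precip_change_pct)

# Step 2: stochastic analysis - sample many climate states and find which
# ones put the system at risk (yield falls below the decision threshold).
random.seed(0)
sampled_states = [random.uniform(-30, 30) for _ in range(10_000)]
risky_states = [s for s in sampled_states
                if climate_response(s) < YIELD_THRESHOLD]
risk_boundary = max(risky_states)  # least-severe sampled state that still fails

# Step 3: tailor climate information - estimate the probability of the risky
# climate states from a (hypothetical) multi-model ensemble of projected
# precipitation changes, rather than treating any one projection as truth.
ensemble_projections = [-15.0, -8.0, -2.0, 1.0, 4.0, -11.0, 6.0, -9.5]
p_risk = sum(1 for p in ensemble_projections
             if climate_response(p) < YIELD_THRESHOLD) / len(ensemble_projections)

print(f"risk boundary: precip change below about {risk_boundary:.1f}%")
print(f"estimated probability of a risky climate state: {p_risk:.2f}")
```

The point of the ordering is the one the abstract makes: the decision threshold and the risky climate states are identified first, and the GCM ensemble is used only at the end, to attach probabilities to states already known to matter.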

The article contains the statement with respect to the top-down global climate model predictions that

A problem with this approach is that GCM projections are relatively poor scenario generators.

I agree, and would add that they have provided NO demonstration of skillful regional and local predictions of changes in climate statistics relative to those in the historical record and the paleorecord.

The authors also write

“A novel aspect of the approach is that it uses decision analysis as a framework for characterizing the climate future, and consequently, climate projections, in terms of their position relative to decision thresholds. In doing so, it uses stochastic analysis for risk identification and uses GCM projections for risk estimation, assigning probabilities to hazards, thus linking the two methods.”

The authors do, at least, recognize the need to move beyond just the GCM runs. They write

“Appropriately tailored climate information, including GCM projections and stochastically generated conditions from historical and paleodata, and the application of expert judgment, may provide informative answers to this question when approached in the manner described here.”

While the paper still seems to accept the robustness of the global climate model predictions, it does recognize that there are other approaches to assessing climate risk.

The article, unfortunately, does not consider other environmental and social risks relative to climate risks. In the Pielke et al. (2012) paper we wrote

“We discuss the adoption of a bottom-up, resource-based vulnerability approach in evaluating the effect of climate and other environmental and societal threats to societally critical resources. This vulnerability concept requires the determination of the major threats to local and regional water, food, energy, human health, and ecosystem function resources from extreme events including climate, but also from other social and environmental issues. After these threats are identified for each resource, then the relative risks can be compared with other risks in order to adopt optimal preferred mitigation/adaptation strategies.

This is a more inclusive way of assessing risks, including from climate variability and climate change, than using the outcome vulnerability approach adopted by the IPCC. A contextual vulnerability assessment, using the bottom-up, resource-based framework is a more inclusive approach for policymakers to adopt effective mitigation and adaptation methodologies to deal with the complexity of the spectrum of social and environmental extreme events that will occur in the coming decades, as the range of threats are assessed, beyond just the focus on CO2 and a few other greenhouse gases as emphasized in the IPCC assessments.”

Nonetheless, it is refreshing to see the much-needed start of a movement away from the top-down IPCC approach of assessing risks to key resources, which, as we have shown in our papers and in my weblog posts, is a fundamentally flawed approach.

I have sent the first author a copy of our 2012 paper and hope they will move to the complete adoption of the bottom-up, resource-based perspective.

Filed under Research Papers, Vulnerability Paradigm

Our Chapter “Dealing with complexity and extreme events using a bottom-up, resource-based vulnerability perspective” By Pielke Sr Et Al 2012 Has Appeared

Our article

Pielke, R. A., Sr., R. Wilby, D. Niyogi, F. Hossain, K. Dairuku, J. Adegoke, G. Kallos, T. Seastedt, and K. Suding (2012), Dealing with complexity and extreme events using a bottom-up, resource-based vulnerability perspective, in Extreme Events and Natural Hazards: The Complexity Perspective, Geophys. Monogr. Ser., vol. 196, edited by A. S. Sharma et al., pp. 345–359, AGU, Washington, D. C., doi:10.1029/2011GM001086. [the article can also be obtained from here]

has appeared in

Sharma, A. S., A. Bunde, P. Dimri, and D. N. Baker (Eds.) (2012), Extreme Events and Natural Hazards: The Complexity Perspective, Geophys. Monogr. Ser., vol. 196, 371 pp., AGU, Washington, D. C., doi:10.1029/GM196.

The description of the book is given on the AGU site as [highlight added]

Extreme Events and Natural Hazards: The Complexity Perspective examines recent developments in complexity science that provide a new approach to understanding extreme events. This understanding is critical to the development of strategies for the prediction of natural hazards and mitigation of their adverse consequences. The volume is a comprehensive collection of current developments in the understanding of extreme events. The following critical areas are highlighted: understanding extreme events, natural hazard prediction and development of mitigation strategies, recent developments in complexity science, global change and how it relates to extreme events, and policy sciences and perspective. With its overarching theme, Extreme Events and Natural Hazards will be of interest and relevance to scientists interested in nonlinear geophysics, natural hazards, atmospheric science, hydrology, oceanography, tectonics, and space weather.

The abstract of our article reads

“We discuss the adoption of a bottom-up, resource-based vulnerability approach in evaluating the effect of climate and other environmental and societal threats to societally critical resources. This vulnerability concept requires the determination of the major threats to local and regional water, food, energy, human health, and ecosystem function resources from extreme events including climate, but also from other social and environmental issues. After these threats are identified for each resource, then the relative risks can be compared with other risks in order to adopt optimal preferred mitigation/adaptation strategies.

This is a more inclusive way of assessing risks, including from climate variability and climate change, than using the outcome vulnerability approach adopted by the IPCC. A contextual vulnerability assessment, using the bottom-up, resource-based framework is a more inclusive approach for policymakers to adopt effective mitigation and adaptation methodologies to deal with the complexity of the spectrum of social and environmental extreme events that will occur in the coming decades, as the range of threats are assessed, beyond just the focus on CO2 and a few other greenhouse gases as emphasized in the IPCC assessments.”
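The framework in the abstract above — identify the major threats to each societally critical resource, then compare the relative risks before choosing mitigation/adaptation strategies — can be sketched as a simple ranking exercise. The five resources come from the abstract; the threat lists and the 0-10 scores are purely hypothetical illustrations, standing in for whatever local assessment process would produce them.

```python
# Hypothetical relative-risk scores (0-10) for threats to each societally
# critical resource named in the abstract. Note that climate change appears
# as one threat among several for every resource, not as the sole focus.
threats = {
    "water":              {"drought": 8, "pollution": 6, "climate change": 4},
    "food":               {"pests": 7, "soil loss": 5, "climate change": 3},
    "energy":             {"demand growth": 8, "fuel supply": 6, "climate change": 2},
    "human health":       {"disease outbreak": 9, "air quality": 5, "climate change": 3},
    "ecosystem function": {"land-use change": 8, "invasive species": 7, "climate change": 5},
}

# For each resource, rank all threats together, so climate risk is compared
# against the other environmental and societal risks rather than assessed
# in isolation - the "bottom-up" inversion of the top-down IPCC approach.
for resource, scores in threats.items():
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    top_threat, top_score = ranked[0]
    print(f"{resource}: top threat = {top_threat} (score {top_score})")
```

The design point is the ordering of steps: the threat inventory per resource comes first, and only then are relative risks compared to pick preferred strategies.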

In the assessment of climate risks, the approach we recommend is an inversion of the IPCC process: the threats from climate, and from other environmental and social risks, are assessed first, before one inappropriately and inaccurately runs global climate models to provide the envelope of future risks to key resources.

Filed under Research Papers, Vulnerability Paradigm

New Paper “Climate Feedback–Based Provisions For Dam Design, Operations, And Water Management In The 21st Century” By Hossain Et Al 2012

We have a new paper under the leadership of Faisal Hossain of the Department of Civil and Environmental Engineering at Tennessee Technological University,

Hossain, F., A.M. Degu, W. Yigzaw, S.J. Burian, D. Niyogi, J.M. Shepherd and R.A. Pielke Sr., 2012: Climate feedback–based provisions for dam design, operations, and water management in the 21st Century. J. Hydro. Eng., DOI: 10.1061/(ASCE)HE.1943-5584.0000541, in press.

The conclusion reads in part [highlight added]

The purpose of this article is to shed light on the need for climate feedback-based considerations in dam design, operations, and water management for the 21st century. It first overviewed the known impacts on climate from changes in land use and land cover that are typically anticipated once a dam is constructed. Recent research was presented on the first-order signature around dams on local climate using observational evidence. A global overview of the location of large dams was presented to highlight the need to treat each dam uniquely according to its location and the larger setting. It is now obvious that the observational data associated with current dams, combined with the rich body of research of LCLU impact on climate, can provide the planning and engineering professions with insightful guidance for both operations and more robust dam-building in the 21st century as well as modifications of local design guidelines to account for climate feedback.

The conclusion includes the recommendation that

One way to maximize the ability of future generation of engineers to assimilate knowledge on climate modification for dam design and operations is to enhance the baccalaureate curriculum by adding prerequisite courses on atmospheric sciences and climate.

In this context, we do not mean climate education that starts from the incorrect premise that the human addition of CO2 and a few other greenhouse gases dominate the climate system response in the coming decades. We subscribe to the robust view of the climate system as reported in the 2005 NRC assessment report

National Research Council, 2005: Radiative forcing of climate change: Expanding the concept and addressing uncertainties. Committee on Radiative Forcing Effects on Climate Change, Climate Research Committee, Board on Atmospheric Sciences and Climate, Division on Earth and Life Studies, The National Academies Press, Washington, D.C., 208 pp.

and summarized in

Pielke Sr., R., K. Beven, G. Brasseur, J. Calvert, M. Chahine, R. Dickerson, D. Entekhabi, E. Foufoula-Georgiou, H. Gupta, V. Gupta, W. Krajewski, E. Philip Krider, W. K. M. Lau, J. McDonnell, W. Rossow, J. Schaake, J. Smith, S. Sorooshian, and E. Wood, 2009: Climate change: The need to consider human forcings besides greenhouse gases. Eos, Vol. 90, No. 45, 10 November 2009, 413. Copyright (2009) American Geophysical Union.

Filed under Research Papers, Vulnerability Paradigm

Article Titled “Water Resource Management During Prolonged Drought Periods” By Will Alexander

Professor Will Alexander alerted me to a presentation he will be giving in Pretoria in October titled

Water resource management during prolonged drought periods [there is no url for it, but it can be obtained by requesting from Professor Alexander at alexwjr@iafrica.com].

WJR (Will) Alexander is Professor Emeritus of the Department of Civil Engineering of the University of Pretoria, South Africa, and Honorary Fellow of the South African Institution of Civil Engineering. Other posts on his perspective appear in

Guest Post “Global Floods – Why Were They Not Predicted?” By Will Alexander

Climate Change: The West vs The Rest by Will Alexander

A Guest Weblog By Will Alexander “Climategate Chaos”

He clearly has the credentials to present his viewpoint on climate science.

His article starts with [highlight added]

South Africa’s surface water resources are rapidly approaching depletion. It is essential that practitioners in these fields should have an advanced knowledge of multi-site, multi-year, periodical hydrological statistics and practical experience in these fields. Details are provided in the author’s substantial handbook on analytical methods for water resource development and management that will be distributed during the symposium.

The text includes his sobering assessment that

The following diagram shows the projected future water demand in South Africa in relation to available conventional resources. It is based on estimates by du Plessis and van Robbroeck in 1978. These estimates have been confirmed by other investigators. The diagram shows that usage will exceed the economically available runoff before the year 2020. It also shows that by 2050 the demand will exceed the total runoff from all South African rivers!

He further writes that

I cannot emphasise strongly enough, that attempts to apply climate change theory before and during drought conditions will have disastrous consequences on the welfare of this country and its poor and disadvantaged peoples for whom I have the greatest concern….

South Africa and its peoples face severe humanitarian, social and economic consequences as a result of this thoroughly unscientific policy.

The entire preprint can be obtained from Professor Alexander at alexwjr@iafrica.com

Filed under Vulnerability Paradigm

Article In Physics Today By David Kramer Titled “Scientists Poke Holes In Carbon Dioxide Sequestration”

There is an interesting article in the August 2012 issue of Physics Today titled

Scientists poke holes in carbon dioxide sequestration

The article starts with the text [highlight added]

Newly published geophysical research and a committee of experts have cast doubts on whether carbon capture and storage (CCS) can play the major role that some scientists and coal producers had hoped for in mitigating climate change. A report released by the National Research Council (NRC) in mid-June warns that the injection of millions of tons of supercritical liquid carbon dioxide from fossil-fuel plants into deep geological formations is likely to create earthquakes that will fracture the surrounding impermeable rock and allow the greenhouse gas to work its way back toward the surface. Separately, Stanford University geophysicists Mark Zoback and Steven Gorelick write in a 26 June article in the Proceedings of the National Academy of Sciences that “there is a high probability that earthquakes will be triggered by injection of large volumes of CO2  into the brittle rocks commonly found in continental interiors.” They argue that “large-scale CCS is a risky, and likely unsuccessful, strategy for significantly reducing greenhouse gas emissions.”

Colorado School of Mines geologist Murray Hitzman, who chaired the NRC committee that wrote Induced Seismicity Potential in Energy Technologies, told a 19 June hearing of the Senate Committee on Energy and Natural Resources that two factors, “net fluid balance” and the volume of the injected liquid, largely determine whether an earthquake will result when liquids are pumped into underground formations. According to the NRC report, oil and gas development projects that take into account the balance between fluid injected and fluid withdrawn produce significantly fewer seismic events than projects that ignore the fluid balance. In CCS, CO2 is injected without any corresponding extraction of the brine that’s often present in the formation.
Zoback, who also appeared at the Senate committee hearing, said that for CCS to contribute significantly to mitigating climate change, about 3.5 billion metric tons worldwide would have to be sequestered annually. Right now, a few large-scale CCS operations, including one at a Norwegian gas well in the North Sea and another at a gas well in Algeria, are each storing around 1 million tons a year.
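The scale gap Zoback describes can be checked with back-of-the-envelope arithmetic using only the figures quoted above: roughly 3.5 billion tons per year would need to be sequestered for CCS to contribute significantly, while each current large-scale operation stores around 1 million tons per year.

```python
# Back-of-the-envelope scale check using the figures quoted in the article.
required_annual_tons = 3.5e9   # tons of CO2/year cited for CCS to matter
per_site_annual_tons = 1.0e6   # roughly what each current large project stores

sites_needed = required_annual_tons / per_site_annual_tons
print(f"large-scale CCS operations needed: {sites_needed:,.0f}")  # 3,500
```

In other words, thousands of operations on the scale of the North Sea and Algeria projects would be required, which puts the handful of existing sites in perspective.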

Other information in the article includes

Beginning in 2017 the FutureGen Alliance, a US-based industry–government consortium, plans to capture and store 1.3 million tons of CO2 per year at a coal-burning power plant in Meredosia, Illinois. Lawrence Pacheco, a spokesman for the $1.3 billion venture, says that at the injection site both the porosity of the sandstone formation nearly a mile below the surface and the caprock permeability are ideal for CO2 storage. In addition to a $1 billion pledge to FutureGen, DOE is funding three industrial-scale CCS projects, including a plan to capture and store 4.5 million tons a year from a methanol refinery and another to sequester 1 million tons annually from ethanol production. Two of the three projects will use the CO2 in enhanced oil recovery.

The article ends with the text

In the big picture, seismicity pales in comparison to cost as an impediment to the adoption of CCS, says Rachel Cleetus, a climate economist with the Union of Concerned Scientists. “Honestly, the challenges to CCS are so significant on the economic front that this is just going to be one more thing that makes people question the risk of going down that path versus other options that are readily available and much less risky, such as wind and solar,” she says.
“The difficulty is that carbon isn’t priced in a meaningful way,” adds GeoScience’s Batchelor. “Until carbon has a price, it bears down on the renewables, and it bears down on CCS. And the US, UK, and most European governments are not going to put their industries at a competitive disadvantage by saying we insist you do [CCS] and double the price of power on a unilateral basis.”

I was on an advisory committee for the 2008 review of the Biological and Environmental Sciences Division at Oak Ridge National Laboratory, where CCS was presented (I will have more to say about the review we completed in a later post). We were told that the goal is to sequester ALL of the emissions into the ground. This seemed an unrealistic goal when it was presented to us in 2008, and it still is, in my view. The Physics Today article provides new information on the practicality of CCS and the risks involved.

Filed under Climate Change Forcings & Feedbacks, Vulnerability Paradigm