Category Archives: Vulnerability Paradigm

Research Paper “Discursive Stability Meets Climate Instability: A Critical Exploration Of The Concept Of ‘Climate Stabilization’ In Contemporary Climate Policy” By Boykoff Et Al 2010

In response to the post

The Need For Precise Definitions In Climate Science – The Misuse Of The Terminology “Climate Change”

I was alerted to a paper on climate stabilization which adds significant insight into this subject. The paper is

Boykoff, M. T., D. Frame, and S. Randalls, 2010. Discursive stability meets climate instability: A critical exploration of the concept of ‘climate stabilization’ in contemporary climate policy, Global Environmental Change, Vol. 20, pp. 53-64.

The abstract reads [highlight added]

The goals and objectives of ‘climate stabilization’ feature heavily in contemporary environmental policy and in this paper we trace the factors that have contributed to the rise of this concept and the scientific ideas behind it. In particular, we explore how the stabilization-based discourse has become dominant through developments in climate science, environmental economics and policymaking. That this discourse is tethered to contemporary policy proposals is unsurprising; but that it has remained relatively free of critical scrutiny can be associated with fears of unsettling often-tenuous political processes taking place at multiple scales. Nonetheless, we posit that the fundamental premises behind stabilization targets are badly matched to the actual problem of the intergenerational management of climate change, scientifically and politically, and destined to fail. By extension, we argue that policy proposals for climate stabilization are problematic, infeasible, and hence impede more productive policy action on climate change. There are gains associated with an expansion and reconsideration of the range of possible policy framings of the problem, which are likely to help us to more capably and dynamically achieve goals of decarbonizing and modernizing the energy system, as well as diminishing anthropogenic contributions to climate change.

The conclusion reads

In this paper we have argued that the elegant attraction of ‘climate stabilization’ discourses has culminated in a focus on long term mitigation targets and a cost-effective climate policy that does not address broader political and ethical questions about the timescale, actors and costs involved. It seems appropriate, scientifically, historically and socially, to question this discursive hegemony and open up debates on more productive and effective framings of climate policy. This paper therefore argues that while the climate stabilization discourse (and associated ways of thinking/proposing/acting) has been valuable in drawing greater attention to human influences on the global climate, it is time to explicitly move to more productive ways of considering minimizing detrimental impacts from human contributions to climate change.

The perspective that “it is time to explicitly move to more productive ways of considering minimizing detrimental impacts from human contributions to climate change” fits with the view we express in our paper below. However, in my view, we need to broaden the recommended viewpoint so that we consider

more productive ways of minimizing detrimental impacts from human contributions to climate (not just the change part), as well as from all other environmental and social threats.

In our paper

Pielke Sr., R.A., R. Wilby, D. Niyogi, F. Hossain, K. Dairuku, J. Adegoke, G. Kallos, T. Seastedt, and K. Suding, 2011: Dealing with complexity and extreme events using a bottom-up, resource-based vulnerability perspective. AGU Monograph on Complexity and Extreme Events in Geosciences, in press.

we write in the abstract

We discuss the adoption of a bottom-up, resource–based vulnerability approach in evaluating the effect of climate and other environmental and societal threats to societally critical resources. This vulnerability concept requires the determination of the major threats to local and regional water, food, energy, human health, and ecosystem function resources from extreme events including climate, but also from other social and environmental issues. After these threats are identified for each resource, then the relative risks can be compared with other risks in order to adopt optimal preferred mitigation/adaptation strategies.

This is a more inclusive way of assessing risks, including from climate variability and climate change than using the outcome vulnerability approach adopted by the IPCC. A contextual vulnerability assessment, using the bottom-up, resource-based framework is a more inclusive approach for policymakers to adopt effective mitigation and adaptation methodologies to deal with the complexity of the spectrum of social and environmental extreme events that will occur in the coming decades, as the range of threats are assessed, beyond just the focus on CO2 and a few other greenhouse gases as emphasized in the IPCC assessments.
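To make concrete what the last step of this approach (comparing relative risks across resources) could look like in practice, here is a minimal sketch in Python. The resources, threats, and scores below are purely illustrative assumptions on my part; they are not taken from our paper.

```python
# Minimal, illustrative sketch of a bottom-up risk register: for each societally
# critical resource, list the major threats (climate and non-climate) with a rough
# relative-risk score, then rank all resource/threat pairs so that mitigation and
# adaptation effort can be compared across the full spectrum of threats.
# All resources, threats, and scores here are hypothetical.

risk_register = {
    "water":        {"drought": 8, "groundwater depletion": 7, "multi-decadal climate change": 4},
    "food":         {"pest outbreaks": 6, "monsoon variability": 7, "multi-decadal climate change": 3},
    "energy":       {"demand growth": 8, "heat-wave peak loads": 5},
    "human health": {"heat waves": 6, "vector-borne disease": 5},
    "ecosystems":   {"land-use change": 9, "invasive species": 7, "multi-decadal climate change": 5},
}

# Rank every (resource, threat) pair by its relative-risk score, highest first.
ranked = sorted(
    ((score, resource, threat)
     for resource, threats in risk_register.items()
     for threat, score in threats.items()),
    reverse=True,
)

for score, resource, threat in ranked:
    print(f"{score:2d}  {resource:12s}  {threat}")
```

The point of this simple exercise is only that climate change enters as one threat among many, rather than as the sole organizing principle of the assessment.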


Filed under Definition of Climate, Research Papers, Vulnerability Paradigm

The Great Fire Of 1910 Places The Current 2012 Fire Season In Perspective

source of image – fire area as of June 12 2012

This summer promises to be a season of high fire danger. The upcoming heat wave (see) is certainly going to dry out the forests even more than they already are. Colorado has already experienced a fire of 52,068 acres (as of June 15) west of Fort Collins (see also).

However, it is useful to place fires in the western forests in perspective. In 1910 there was a truly massive fire [h/t Bill Neff]. As written at Wikipedia

The Great Fire of 1910 (also commonly referred to as the Big Blowup or the Big Burn) was a wildfire which burned about three million acres (12,000 km², approximately the size of Connecticut) in northeast Washington, northern Idaho (the panhandle), and western Montana.

This 1910 fire [also called the Big Blowup] precipitated a US Forest Service policy that has led to today’s risk of extreme fires when fires do inevitably occur. The effect of the 1910 fire is described in the

U.S. Forest Service History

where they write

 Contemporary critics, however, pointed out the flaws in the fire suppression policy.  Elers Koch, a forester who had fought the Big Blowup on the Lolo National Forest in Montana, argued afterward in favor of letting backcountry fires burn themselves out.  Professor Herman Chapman of Yale Forest School, who was studying Southern forests, argued that fire had an important if little understood ecological role in the landscape.  And Secretary of the Interior Richard Ballinger, whose department had seen part of the newly created Glacier National Park burn along with surrounding national forest land, argued for allowing annual burning to reduce fuel loads like the Native Americans did.

But Forest Service leadership and forestry leaders like Gifford Pinchot thought otherwise and worked for years to suppress and discredit such arguments.  In the aftermath of 1910, Chief Graves staked the agency’s continued existence on the belief that it could in fact defeat fire.  Toward that end, Graves embraced a cooperative approach with state and private associations to fight fire (realized the next year through the Weeks Act) and soon launched a fire protection campaign that involved removing fire from the landscape and changing how Americans viewed fire.  The campaign, which would lead to the creation of Smokey Bear, would last for more than half a century and completely change forest ecology throughout the country during its lifetime.  Other nations adopted the American fire suppression model, with equally devastating results.  Now the folly of fighting backcountry fires is widely accepted and the role of fire in maintaining forest health is understood.  The impact of the campaign is the most important legacy of the 1910 Fires and the Big Blowup—and it is a legacy that we are still coping with today.


Filed under Vulnerability Paradigm

EOS Article On Sea Level Rise “Sea Level Rise And The Ongoing Battle Of Tarawa” By S. Donner 2012

The issue of sea level rise is not an area in which I have completed original research. Nonetheless, I was intrigued by an article in the April 24 issue of EOS since it provides an example of the multi-dimensional bottom-up assessments that are needed to robustly assess risk to key societal resources (in this case, coastal vulnerability to sea level changes). We discuss this bottom-up approach in our paper

Pielke Sr., R.A., R. Wilby, D. Niyogi, F. Hossain, K. Dairuku, J. Adegoke, G. Kallos, T. Seastedt, and K. Suding, 2012: Dealing  with complexity and extreme events using a bottom-up, resource-based  vulnerability perspective. AGU Monograph on Complexity and  Extreme Events in Geosciences, in press.  https://pielkeclimatesci.files.wordpress.com/2011/05/r-365.pdf

The EOS article is

S. Donner, 2012: Sea level rise and the ongoing Battle of Tarawa. EOS Volume 93, Number 17, 24 April 2012

The article contains insightful text in a section titled “Communicating About Sea Level Rise” including [highlight added]

The failure to consider the contribution of natural variability and direct human modifications can lead to misattribution of flooding events or shoreline changes to sea level rise. Tarawa, the most easily accessible atoll in Kiribati, is a popular destination for journalists and activists interested in observing and communicating the impacts of sea level rise on a low-lying nation. For example, a Greenpeace slide show within an explanation of what sea level rise means that depicts the 2005 flooding remains among the top responses to an Internet query of “Kiribati” and “sea level rise.” These common images of flooded homes and waves crashing across the causeways—collected during an anomalous event on islets susceptible to flooding due in part to local modifications to the environment—can provide the false impression that Tarawa is subject to constant flooding because of sea level rise……Many individual observations of erosion, flooding, or groundwater salinization, recorded in community consultations for internationally funded climate change adaptation programs, are thus attributed to climate change without scientific analysis [e.g., Mackenzie, 2004]……These events are presented as examples of climate change impacts in promotional materials and at international events (e.g., “Our Road to Copenhagen,” a Kiribati side event at COP15 in Copenhagen), without any mention of ENSO-driven natural variability or local shoreline modification.

Such unverified attribution can inflame or invite skepticism of the scientific evidence for a human-caused increase in the global sea level. After Webb and Kench [2010] reported that the area of 23 atoll islets in Kiribati and neighboring countries had remained stable or increased over the past 20–60 years, some of the international news media reported that the effects of sea level rise on atoll nations were exaggerated and that Kiribati is not threatened by future sea level rise (e.g., R. Callick, Coral islands left high and dry, The Australian, 2010……). Though the study did show evidence that atoll islets were dynamic and do not necessarily decrease in area in response to sea level rise, the islets in question remain vulnerable to inundation from global mean sea level rise in the future, as the authors stressed in a subsequent briefing note….

The article concludes with the text

Instead of incorrectly attributing individual flood events or shoreline changes to global sea level rise, scientists and climate communicators can use such occurrences to educate the public about the various natural and human processes that affect sea level, the shoreline, and the shape of islands. This would better prepare the public and policy makers for the changes that societies are likely to experience as global sea level rises in the coming decades.

Regardless of whether the author is correct that global sea level will continue to rise in the coming decades, this article is a refreshing presentation on the need to provide an assessment of the entire spectrum of risks that exist.


Filed under Vulnerability Paradigm

A New Paper “Vulnerability To Temperature-Related Hazards: A Meta-Analysis And Meta-Knowledge Approach” By Romero-Lankao Et Al 2012

I was alerted by Professor Karen O’Brien of the University of Oslo to an important new paper that reports on the need to complete contextual vulnerability assessments of risks to key societal and environmental resources, as we propose in our article

Pielke Sr., R.A., R. Wilby, D. Niyogi, F. Hossain, K. Dairuku, J. Adegoke, G. Kallos, T. Seastedt, and K. Suding, 2012: Dealing  with complexity and extreme events using a bottom-up, resource-based  vulnerability perspective. AGU Monograph on Complexity and  Extreme Events in Geosciences, in press.

We refer to Karen’s very important research on this topic in our AGU article. In our article we discuss the two approaches to vulnerability:

  • the top-down approach (comparative vulnerability)
  • the bottom-up, resource-based approach (contextual vulnerability)

The top-down approach (comparative vulnerability) is the approach adopted in the IPCC reports.

The new paper is

Romero-Lankao, P., et al., 2012: Vulnerability to temperature-related hazards: A meta-analysis and metaknowledge approach. Global Environ. Change.

The abstract reads [highlight added]

Research on urban vulnerability has grown considerably during recent years, yet consists primarily of case studies based on conflicting theories and paradigms. Assessing urban vulnerability is also generally considered to be context-dependent. We argue, however, that it is possible to identify some common patterns of vulnerability across urban centers and research paradigms and these commonalities hold potential for the development of a common set of tools to enhance response capacity within multiple contexts. To test this idea we conduct an analysis of 54 papers on urban vulnerability to temperature-related hazards, covering 222 urban areas in all regions of the world. The originality of this effort is in the combination of a standard meta-analysis with a meta-knowledge approach that allows us not only to integrate and summarize results across many studies, but also to identify trends in the literature and examine differences in methodology, theoretical frameworks and causation narratives and thereby to compare “apples to oranges.” We find that the vast majority of papers examining urban vulnerability to temperature-related hazards come from an urban vulnerability as impact approach, and cities from middle and low income countries are understudied. One of the challenges facing scholarship on urban vulnerability is to supplement the emphasis on disciplinary boxes (e.g., temperature–mortality relationships) with an interdisciplinary and integrated approach to adaptive capacity and structural drivers of differences in vulnerability.

The authors report that “the vast majority of papers examining urban vulnerability to temperature-related hazards come from an urban vulnerability as impact approach”. This is the top-down (comparative vulnerability) approach. The authors argue for a bottom-up (contextual vulnerability) approach, which Pielke et al 2012 also concluded is needed.

The highlights listed by the authors are:

  • Studies on urban vulnerability are based on conflicting theories and paradigms.
  • Thirteen factors account for 66% of the tallies of urban vulnerability determinants.
  • Reviewed papers mostly come from the urban vulnerability as impact paradigm.
  • Scholarship focuses on short time horizons and the city as level of analysis.
  • Cities from middle and low-income countries are understudied.

Among the conclusions are

The urban vulnerability as impact lineage is dominated by epidemiological studies and top-down assessments.

The central message of our study is that what we know depends fundamentally on what questions we ask and how we go about answering those questions (i.e., the kind of methods and data we use or have available to us). Our combined meta-analysis and meta-knowledge exercise highlights the fact that while a great deal of research has been done addressing urban vulnerability to temperature-related hazards, the vast majority of studies fall under a single research paradigm – the urban vulnerability as impacts approach. Although this paradigm has made important contributions to the understanding of urban vulnerability, it tends to ignore other equally fundamental dimensions and determinants; to produce a set of explanatory variables that are tightly constrained by the availability of data, particularly in developing countries; and it omits any attempt to gain ethnographic knowledge of behavioral norms, social networks and risk perceptions that are equally relevant to understanding urban vulnerability.

The dominance of the urban vulnerability as impact paradigm suggests that more studies should be undertaken that apply the inherent urban vulnerability and urban resilience approaches. For instance, studies under an inherent urban vulnerability paradigm can explore underlying societal processes by which assets and options at the individual, family or community level (e.g., self-help housing or access to social networks) allow urban households to adapt, but can also shed light on why in many cases these personal assets are not enough to reduce urban populations’ vulnerability because of the role the state plays in shaping adaptive capacity through such means as promoting economic growth and poverty reduction. Meanwhile, an urban resilience framework holds promise to integrate across disciplines and illuminate a more complete set of drivers of urban vulnerability.


Filed under Research Papers, Vulnerability Paradigm

Candid Statement On The Shortcomings Of Multi-Decadal Climate Model Predictions By Early Career Scientists At An NCAR Workshop

There was a candid statement about climate models made at the Advanced Study Program/Early Career Scientist Assembly Workshop on Regional Climate Issues in Developing Countries, held in Boulder, Colorado on 19–22 October 2011. The Workshop is reported on in the April 3, 2012 issue of EOS on page 145.

The relevant text reads [highlight added]

One recurring issue throughout the workshop was that of managing complex impact assessments with a large range of results from global and regional models; variations between models are often not fully understood, accounted for, and/or communicated. Also problematic is the discrepancy between the spatial and temporal scales on which regional climate projections are made (tens of kilometers and ~30–100 years) and the scales that are of primary interest to many communities in developing countries (kilometers and 0–10 years) that are presently affected by climate change.

My Comment: I agree with this comment except I would delete “change” in the last sentence. Climate is always changing, and the use of the word “change” itself miscommunicates the actual threats faced by developing countries, even from the climate they have already experienced in the past.

The EOS article continues with

Approaches for addressing uncertainty and scaling issues might include cost-effective ensemble dynamical-statistical approaches and/or coupling regional modeling efforts to better meet specific objectives (e.g., improved integration of hydrologic models). Facilitating effective “end-to-end” communication was identified as a critical research component to increase awareness of the wider challenges and opportunities facing scientists and end users alike. Such end-to-end communication would also help to ensure that research addresses the particular needs of the communities that are its focus.

My Comment:

There is a critically important requirement, however, that is left out of these approaches. Before modeling results are even used, they must first show skill at predicting changes in climate statistics on the spatial and temporal scales needed by the impacts communities. As we present in our paper

Pielke Sr., R.A., and R.L. Wilby, 2012: Regional climate downscaling – what’s the point? Eos Forum,  93, No. 5, 52-53, doi:10.1029/2012EO050008

no regional predictive skill (of changes in climate statistics) has yet been shown on yearly, decadal or multi-decadal time scales.
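As a minimal sketch of the kind of hindcast test I have in mind (my illustration, not an analysis from the Eos article), one can compare a model’s predicted multi-decadal changes in a regional climate statistic with the observed changes, scored against the naive reference forecast of no change. All numbers and names below are hypothetical.

```python
# Illustrative hindcast skill test: does a model's predicted change in a regional
# climate statistic beat the naive assumption of "no change"?
# Skill > 0 means the model outperforms the no-change reference; 1 is a perfect score.
# All values are hypothetical.

def mse(predicted, observed):
    return sum((p - o) ** 2 for p, o in zip(predicted, observed)) / len(observed)

def change_skill_score(predicted_changes, observed_changes):
    no_change_reference = [0.0] * len(observed_changes)
    return 1.0 - mse(predicted_changes, observed_changes) / mse(no_change_reference, observed_changes)

# Hypothetical 30-year changes in seasonal rainfall (mm) for five regions.
predicted = [ 25.0, -10.0,  40.0,   5.0, -30.0]
observed  = [  5.0,  15.0,  10.0, -20.0, -25.0]

print(f"Skill relative to 'no change': {change_skill_score(predicted, observed):+.2f}")
```

In this hypothetical case the score comes out negative, i.e., the predicted changes are worse than assuming no change at all, which is exactly the kind of result that needs to be ruled out before downscaled projections are handed to the impacts communities.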

Their “end-to-end” communication, however, is appropriately the focus, as we emphasize in our article

Pielke Sr., R.A., R. Wilby, D. Niyogi, F. Hossain, K. Dairuku, J. Adegoke, G. Kallos, T. Seastedt, and K. Suding, 2012: Dealing  with complexity and extreme events using a bottom-up, resource-based  vulnerability perspective. AGU Monograph on Complexity and  Extreme Events in Geosciences, in press.

As we wrote in our abstract

We discuss the adoption of a bottom-up, resource–based vulnerability approach in evaluating the effect of climate and other environmental and societal threats to societally critical resources. This vulnerability concept requires the determination of the major threats to local and regional water, food, energy, human health, and ecosystem function resources from extreme events including climate, but also from other social and environmental issues. After these threats are identified for each resource, then the relative risks can be compared with other risks in order to adopt optimal preferred mitigation/adaptation strategies.

This is a more inclusive way of assessing risks, including from climate variability and climate change than using the outcome vulnerability approach adopted by the IPCC. A contextual vulnerability assessment, using the bottom-up, resource-based framework is a more inclusive approach for policymakers to adopt effective mitigation and adaptation methodologies to deal with the complexity of the spectrum of social and environmental extreme events that will occur in the coming decades, as the range of threats are assessed, beyond just the focus on CO2 and a few other greenhouse gases as emphasized in the IPCC assessments.

Hopefully, the attendees of the Workshop will be made aware of our bottom-up, resource-based approach for developing robust effective responses to environmental threats in their countries.


Filed under Vulnerability Paradigm

Presentation Titled “Promoting The Value Of Water Cycle Remote Sensing Missions And Climate Studies To Non-Traditional Consumers” By Faisal Hossain

I want to alert you to an excellent PowerPoint slide presentation by Faisal Hossain of Tennessee Technological University titled

Promoting the Value of Water Cycle Remote Sensing Missions and Climate Studies to Non-Traditional Consumers

The talk was presented on March 12 2012 at the Jet Propulsion Laboratory.

As written in the seminar announcement

Dr. Faisal Hossain is an Associate Professor in the Civil Engineering Department of Tennessee Technological University. He holds a B.S in Civil Engineering from the Indian Institute of Technology, an M.S. from the National University of Singapore and a Ph.D. from the University of Connecticut. His research interests lie in the field of water resources, remote sensing and education. He received a NASA New Investigator Program Award in 2008 and an American Society of Engineering Education Outstanding New Faculty Research Award in 2009. Currently, he is leading a capacity-building initiative to train staff in developing nations to better harness the potential of satellite remote-sensing missions.

The slides go through three topics:

  • Societal (Application) Value for Non-traditional Consumers
  • Key Findings of Recent Application-driven Research
  • Packaging the Research as a Product for Consumers: Lessons Learned and Way Forward

with a focus on

  • Water Cycle Remote Sensing: Tactical Scale of Decision Making (Transboundary Flood Management)
  • Climate Studies: Strategic Scale of Decision Making (Design and Operations of Large Dams)

With respect to water cycle remote sensing, the consumers he considers are the public community (transboundary flood management), whose decision-making time scales are days to weeks. For climate studies, he considers the engineering community, in terms of the design and operation of large dams, where the time scale for decision making is years to decades.

His recommendations for a way forward on these issues are:

  • More Hands-on Education Effort involving (active learning) of consumers (stakeholders)
  • Co-design of Research Experiments with input from consumers.
  • Working with Philanthropic Institutions: Broaden the value of water cycle satellites (beyond water – health, food, poverty) to increase appeal to non traditional consumers. Is it possible to make massive amounts of satellite water data freely accessible on a daily basis to people around the world (much like Google Earth –intuitive design)?
  • Search Engine Optimization – Simple Issue involving social science (but can reach out to millions of web users)

I recommend viewing the entire talk as it fits with our bottom-up, resource-based focus that we discuss in our paper

Pielke Sr., R.A., R. Wilby, D. Niyogi, F. Hossain, K. Dairuku, J. Adegoke, G. Kallos, T. Seastedt, and K. Suding, 2012: Dealing  with complexity and extreme events using a bottom-up, resource-based  vulnerability perspective. AGU Monograph on Complexity and  Extreme Events in Geosciences, in press.


Filed under Climate Science Presentations, Vulnerability Paradigm

News Article “Global Climate Models ‘Need Regional Sensitivity’” by Christine Ottery In SciDev.Net

There is an informative news article that illustrates why multi-decadal regional predictions of changes in climate statistics are of no value. The news article, by Christine Ottery on SciDev.Net, is titled

Global climate models ‘need regional sensitivity’

The organization that is reported on is described as

The CGIAR Research Program on Climate Change, Agriculture and Food Security (CCAFS) is a 10-year research initiative launched by the Consultative Group on International Agricultural Research (CGIAR) and the Earth System Science Partnership (ESSP).

CCAFS seeks to overcome the threats to agriculture and food security in a changing climate, exploring new ways of helping vulnerable rural communities adjust to global changes in climate.

Thus CCAFS already started, before performing their study, from a perspective that accepts the model results as being able to accurately simulate changes in climate. They are to be commended, however, for doing an evaluation of model skill and reporting on the issues that they have found. The news article is, apparently, based on a video presentation titled “How good are current climate models for predicting agricultural impacts in Africa and South Asia?”

The article reads [highlight added]

Global climate change models are of limited use to agricultural policymakers in some regions of the developing world, according to a report by the CGIAR Research Program on Climate Change, Agriculture and Food Security (CCAFS).

The report was launched at the Climate Models and Farm Crop Forecasting in South Asia and Africa meeting last month (21 February).

It focused on Eastern and Western Africa, and the South Asian Indo-Gangetic plains which span parts of Bangladesh, India, Nepal and Pakistan.

The researchers, from Oxford University, United Kingdom, and the University of Cape Town, South Africa, studied the ability of global climate models to predict regional climate events such as monsoon rains and temperatures — and found mixed results.

“The models have a reasonable capability in terms of reproducing [trends in the] East African climate,” said Richard Washington, professor of climate science at the University of Oxford.

But in West Africa, particularly in the Sahel region, the models predicted more monsoon rains, of different duration, to those that were actually observed.

Similar difficulties were encountered with India’s monsoons, the authors said. Global models were generally accurate in predicting large-scale horizontal movement of air (atmospheric flow), but not the specific timing or patterns of monsoon rainfall, according to Mark New, professor of climate science at the University of Cape Town.

The authors said global models often failed to take account of complex regional climatic factors — making them less useful for policymakers.

For example, Asia’s monsoons are affected by many region-specific factors, such as El Niño events, atmospheric pressure over the North Atlantic Ocean, and Asia’s so-called ‘brown cloud’ air pollution.

“If you want a climate model that predicts monsoon rainfall variability you need one that gets each of these factors right,” New told SciDev.Net.

The authors suggested greater use of ‘ensemble’ models which combine results from several models to generate averages; and also the use of global models in conjunction with regional ones, to enable regional information to be factored in alongside larger-scale processes.

Washington said it was also important to improve field data on which models are based, and remove any existing biases.

“Otherwise we are left to choose between models that are different [without knowing] which one is better,” he said.

Philip Thornton, a senior scientist at the International Livestock Research Institute in Kenya and a modelling tools leader at CCAFS told SciDev.Net: “The more we understand [uncertainty in models], the better we can deal with it”.

My Comment: It is refreshing to see that the impacts community is starting to assess the skill of the multi-decadal climate model predictions. This article highlights that climate model predictions of multi-decadal changes in climate statistics are misleading policymakers in parts of the world. The authors of the report, however, also need to recognize that model predictions will be correct in some regions part of the time by chance; one does not know whether they will be correct in the future. Their finding that trends in East Africa were correctly predicted, but that the trends were not accurate in West Africa and Asia, should be a red flag that the agreement for the one region may be fortuitous. An evaluation of the predictive skill of the models, however, appears to be a goal of CCAFS, which should be emulated by other stakeholder communities.


Filed under Vulnerability Paradigm

Guest Post By Madhav Khandekar – “Record Grain Yield Estimated By Indian PM For 2011/12”

Madhav Khandekar requested posting the following information on Indian agriculture

Record Grain Yield estimated by Indian PM for 2011/12 by Madhav Khandekar

India’s PM Mr Manmohan Singh announced in New Delhi on February 20th that India expects a record grain yield of about 250 million tonnes for the agricultural year 2011/12. The PM was addressing the Golden Jubilee celebrations of ICAR (the Indian Council of Agricultural Research), a government-funded research center which has provided innovative techniques in recent years in helping boost agricultural products across the breadth and depth of India. As the news item further states, “Indian agriculturists have also boosted production of fruits, vegetables, milk and cotton this year. The production of pulses (beans and related proteins) has also gone up by 18 million tonnes.”

It may be noted that India is primarily a “vegetarian country,” with a large majority of people eating mostly vegetarian food, with occasional inclusion of meat products like chicken, lamb/goat or beef. Coastal regions like the State of Kerala in the southwest and Bengal in the east are “fish-eaters,” mostly fresh-water fish in Bengal and salt-water (sea) fish in Kerala.

The record grain yield may be attributed to well-distributed Monsoon (June-September) rains two years in a row, in 2010 and 2011. It may be recalled that the 2009 Monsoon season was a severe drought, primarily due to the El Niño in the equatorial Pacific. The drought was also exacerbated by other factors, like heavier Eurasian winter snow cover and unfavourable positioning of the IOD (Indian Ocean Dipole) in the equatorial Indian Ocean (Francis & Gadgil, Current Science, 2009). In contrast, the ongoing La Niña since early 2010 and a favorable positioning of the IOD in 2011 have led to well-distributed rains in the last two monsoon seasons. Also, increased winter rains in the northwest region of Punjab (more frequent WD, Western Disturbances: mid-latitude low pressure systems percolating through Himalayan passes in the west) have helped improve winter wheat yields in recent years.

In summary, well-distributed rains due to the prevailing La Niña and favorable IOD have helped produce a record grain, fruit and vegetable yield for India for 2011/12. The IPCC science has not adequately analyzed the impact of well-distributed (summer and winter) rains on grain, fruit and vegetable yields, especially in the monsoonal climate of south Asia.


Filed under Guest Weblogs, Vulnerability Paradigm

An Example Of The Application Of The Bottom-Up, Resource-Based Approach To The Assessment Of Vulnerability

We have been alerted by Robert Webb of NOAA to a study which applies the bottom-up, resource-based perspective that is discussed in my February 9 2011 post

An Insightful Post By Bill Hooke And Judy Curry – “Human Choice And Climate Change”

This study is reported in the

Central Valley Flood Management Planning Program

Although there is still too much acceptance of the multi-decadal IPCC climate predictions in part of the report, the report does include the insightful text [highlight added]

Most of the existing climate change impacts analysis uses a projection-oriented “top-down” approach that considers a range of scenarios of world development. These scenarios include greenhouse gas emissions that serve as input to GCMs. GCM output serves as input to impact models (with or without inclusion of adaptive actions). Under this approach, analysis of the probability of certain impacts could largely depend on the ability of the GCMs to characterize that probability, which may be more subjective than the level of rigor required to support a risk-based analysis (Dessai and Hulme, 2003). In flood management, risk-based analysis is often based on probabilities derived from event frequency documented in historical records. However, the extreme events and their corresponding climate signals are the most uncertain elements of the climate change research. As a result, additional consideration is necessary of an appropriate approach for a climate change vulnerability analysis in the context of flood management.

“Another approach, the “bottom-up” approach, has seen greater development and application in recent years. The bottom-up approach reflects a focus on the underlying adaptive capacity of the system under study, emphasizing broader social impacts. It is place-based and deals with specific resources of interest. Flood managers could start with existing knowledge of the system and use evaluation tools to identify changes in climate that may be most threatening to long-term management goals and practices – critical system vulnerabilities. GCM outputs are then used as a reference to assess the likelihood of such system-critical vulnerabilities (Ray et al, 2008; Dessai and Hulme, 2003). This approach may ease concerns for policy makers who are hesitant to move forward with policy decisions while climate uncertainties remain.”

My Comment: Here is an example of the overconfidence in the multi-decadal global climate model predictions as covering the envelope of possible future climate conditions.

The report, however, in Section 2.2.1 has a good discussion as to the value of the bottom-up approach, along with specifics as to how it should be applied. An excerpt from this section reads

Assess Vulnerability

Vulnerability can be assessed from various different levels and with different focus. Critical components of the flood management system have associated thresholds of vulnerability, the crossing of which can cause undesirable consequences. The first step is to identify components and thresholds that exist on several spatial scales. Examples include a reservoir losing capacity to regulate flows downstream, a reservoir (or a system of reservoirs) exceeding its objective release, or an infrastructure (e.g., dam, levee) failure.

Once thresholds for critical system components are identified, the consequences of exceeding the thresholds on a community level can be quantified. For example, a reservoir losing its capacity to regulate downstream flows would have large-scale, systemwide consequences. The effects of crossing a systemwide threshold would likely cascade through the system, causing other thresholds to be crossed. Other critical thresholds would have more moderate, regional consequences, such as a reservoir exceeding its objective release. At the smallest, most local scale, a levee failure may have severe impacts to a specific protection area, but less impact on other parts of the flood management system and operations.

Defining critical thresholds that will need analysis requires a level of agreement among the various State, federal, and local entities with flood risk management responsibilities. It is conceivable that components with potential broader damages to communities (including natural communities) would be easier for broad agreement for CVFPP systemwide application. However, for local flood management studies with a more finite project scope, the local critical thresholds could be used without exhausting available resources.

Identify Causal Conditions

The next step is to define the hydrologic conditions required for a given threshold to be exceeded. These conditions can be described by a set of hydrologic metrics. Critical thresholds for large-scale, systemwide components will be affected by relatively fewer sets of hydrologic metrics. In contrast, critical thresholds for local components will be influenced by significantly more sets of hydrologic metrics at various locations throughout the flood management system.

Hydrologic conditions leading to threshold exceedence are linked to atmospheric patterns that can be affected by climate change. These patterns can be described by a set of atmospheric metrics that can be sampled from a future projection of climate and translated into hydrologic metrics for planning purposes. Subject to additional investigation, it is anticipated that for systemwide components, relatively fewer sets of atmospheric metrics will correspond to the hydrologic metrics, which in turn, correspond to critical thresholds, and more sets for critical thresholds for local components.

My Comment: The statement that “[t]hese patterns can be described by a set of atmospheric metrics that can be sampled from a future projection of climate and translated into hydrologic metrics for planning purposes” is an example in the report where they still accept the multi-decadal global climate model predictions as robust. However, the rest of the text is excellent.

The Report continues

Assess Likelihood of Exceedence

The final step in the approach is to assess the likelihood of threshold exceedence. It is anticipated that this would be an assessment against baseline conditions or other base of comparison, and would be conducted qualitatively based on available GCMs. It remains to be determined whether current climate change science can provide adequate information to inform the process. If so, an analysis of the likelihood of crossing critical thresholds can be performed, and the results will inform planning analysis for further investment in the flood management system. If not, identification of vulnerabilities will help identify areas of needed climate science investment to obtain adequate information.

My Comment: Instead of writing that the assessment “would be conducted qualitatively based on available GCMs”, the authors of the report should state, in addition to their examination of the current climate, that “[it] remains to be determined whether multi-decadal climate predictions can provide adequate information to inform the process”.
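For the empirical, historical-records side of such a likelihood assessment, a minimal sketch (my illustration, with hypothetical thresholds and flow values; not material from the CVFPP report) might estimate the annual probability of exceeding a critical flow threshold and compare a baseline period with a later one:

```python
# Illustrative threshold-exceedance calculation: estimate the empirical annual
# exceedance probability of a hypothetical critical flow threshold from a record of
# annual peak flows, and compare a baseline period against a later period.
# All numbers and names are hypothetical.

def exceedance_probability(annual_peaks, threshold):
    """Fraction of years in which the annual peak flow exceeded the threshold."""
    if not annual_peaks:
        raise ValueError("need at least one year of record")
    exceedances = sum(1 for q in annual_peaks if q > threshold)
    return exceedances / len(annual_peaks)

# Hypothetical annual peak flows (cubic meters per second) for two periods.
baseline_peaks = [810, 1220, 950, 1480, 700, 1120, 990, 1630, 870, 1010]
recent_peaks   = [920, 1540, 1010, 1750, 880, 1290, 1180, 1910, 940, 1330]

critical_threshold = 1500.0  # e.g., a reservoir's objective release capacity

p_baseline = exceedance_probability(baseline_peaks, critical_threshold)
p_recent   = exceedance_probability(recent_peaks, critical_threshold)

print(f"Baseline exceedance probability: {p_baseline:.2f} per year")
print(f"Recent exceedance probability:   {p_recent:.2f} per year")
print(f"Change relative to baseline:     {p_recent - p_baseline:+.2f}")
```

Whether GCM output can credibly supply the future analogue of the “recent” record is, of course, the open question raised in my comment above.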

Final Comment:

While, unfortunately, the change-over to the bottom-up approach, and the recognition that the multi-decadal climate models cannot provide skill at predicting changes in climate statistics, are incomplete, the Central Valley Flood Management Planning Program report is a movement in the direction that we propose in our paper

Pielke Sr., R.A., R. Wilby, D. Niyogi, F. Hossain, K. Dairuku, J. Adegoke, G. Kallos, T. Seastedt, and K. Suding, 2011: Dealing  with complexity and extreme events using a bottom-up, resource-based  vulnerability perspective. AGU Monograph on Complexity and  Extreme Events in Geosciences, in press.

However, there is still too much acceptance of the multi-decadal global models as having predictive skill, even in the bottom-up vulnerability part of the report. The requirement for the models to be of value is actually quite high, as they not only have to predict the current climate statistics correctly, but also the changes in these statistics over multi-decadal time scales. They have never done that, to my knowledge, even in a hindcast mode.


Filed under Climate Science Reporting, Vulnerability Paradigm

An Insightful Post By Bill Hooke And Judy Curry – “Human Choice And Climate Change”

UPDATE February 9, 2012: Bill Hooke has announced my post on his weblog Living on the Real World (see). Since comments are permitted there, if you would like to comment, you can do so there, and I will respond there if needed.

*****************************************************************

Bill Hooke has an insightful post on his weblog titled

Human choice and climate change

which Judy Curry has also posted an excellent follow on [with the same title] in

Human choice and climate change

I want to add to this discussion here.

As Bill lists in his weblog post, Rayner and Malone, in “Ten suggestions for policymakers”, offer ten suggestions to complement and challenge existing approaches to public and private sector decision making. I have reproduced these below with my comments inserted. On Judy’s post, she wrote

Wow, this is just too sensible for words.  Too bad nobody(?) seems to have paid attention to this back in 1997 (or since).   Its difficult to imagine such sensible ideas emerging from today’s currently hyperpoliticized situation.

The ten suggestions are:

1. View the issue of climate change holistically, not just as the problem of emissions reductions.

My Comment:

This was the focus of the report

National Research Council, 2005: Radiative forcing of climate change: Expanding the concept and addressing uncertainties. Committee on Radiative Forcing Effects on Climate Change, Climate Research Committee, Board on Atmospheric Sciences and Climate, Division on Earth and Life Studies, The National Academies Press, Washington, D.C., 208 pp

with the holistic figure [which I often repost on my weblog :-) ]

Our paper

Pielke Sr., R., K.  Beven, G. Brasseur, J. Calvert, M. Chahine, R. Dickerson, D.  Entekhabi, E. Foufoula-Georgiou, H. Gupta, V. Gupta, W. Krajewski, E.  Philip Krider, W. K.M. Lau, J. McDonnell,  W. Rossow,  J. Schaake, J.  Smith, S. Sorooshian,  and E. Wood, 2009: Climate change: The need to consider human forcings besides greenhouse gases.   Eos, Vol. 90, No. 45, 10 November 2009, 413. Copyright (2009) American   Geophysical Union.

provides another example  of the need for a holistic approach.

The suggestions continue:

2. Recognize that, for climate policymaking, institutional limits to global sustainability are at least as important as environmental limits.

3. Prepare for the likelihood that social, economic, and technological change will be more rapid and have greater direct impacts on human populations than climate change.

4. Recognize the limits of rational planning.

5. Employ the full range of analytical perspectives and decision aids from natural and social sciences and the humanities in climate change policymaking.

6. Design policy instruments for real world conditions rather than try to make the world conform to a particular policy model.

7. Incorporate climate change into other more immediate issues, such as employment, defense, economic development, and public health.

My Comment: These subjects are very effectively discussed in my son’s book

Pielke, R.A. Jr., 2010: The Climate Fix: What Scientists and Politicians Won’t Tell You About Global Warming. Basic Books

I recommend the statements regarding the focus of this book written on the back cover by John Marburger, Neal Lane, James Baker, and Kerry Emanuel.

The list of suggestions continue:

8. Take a regional and local approach to climate policymaking and implementation.

9. Direct resources into identifying vulnerability and promoting resilience, especially where the impacts will be largest.

My Comment: We have urged the adoption of this approach in weblog posts and papers. One example is from our paper

Pielke Sr., R.A., R. Wilby, D. Niyogi, F. Hossain, K. Dairuku, J. Adegoke, G. Kallos, T. Seastedt, and K. Suding, 2012: Dealing  with complexity and extreme events using a bottom-up, resource-based  vulnerability perspective. AGU Monograph on Complexity and  Extreme Events in Geosciences, in press

where we wrote [highlight added]

We discuss the adoption of a bottom-up, resource–based vulnerability approach in evaluating the effect of climate and other environmental and societal threats to societally critical resources. This vulnerability concept requires the determination of the major threats to local and regional water, food, energy, human health, and ecosystem function resources from extreme events including climate, but also from other social and environmental issues. After these threats are identified for each resource, then the relative risks can be compared with other risks in order to adopt optimal preferred mitigation/adaptation strategies.

This is a more inclusive way of assessing risks, including from climate variability and climate change than using the outcome vulnerability approach adopted by the IPCC. A contextual vulnerability assessment, using the bottom-up, resource-based framework is a more inclusive approach for policymakers to adopt effective mitigation and adaptation methodologies to deal with the complexity of the spectrum of social and environmental extreme events that will occur in the coming decades, as the range of threats are assessed, beyond just the focus on CO2  and a few other greenhouse gases as emphasized in the IPCC assessments.

The final suggestion is

10. Use a pluralistic approach to decision-making.

My Comment: This is another topic that is covered effectively in my son’s book The Climate Fix: What Scientists and Politicians Won’t Tell You About Global Warming.

Final Comment: The ten suggestions reported by Bill Hooke and Judy Curry provide a way forward, even though the IPCC community (including, in the USA, the NSF, EPA and NOAA) has chosen to ignore them. Only if we move towards the broader-based approach will there be a movement towards constructive environmental and social policies.

Figure from Pielke et al 2012


Filed under Vulnerability Paradigm