Monthly Archives: July 2005

What is Climate Change?

The differing definitions of climate have done much to confuse policymakers in discussions of climate science.

The American Meteorological Society (AMS) definition of “climate change” is

“(Also called climatic change.) Any systematic change in the long-term statistics of climate elements (such as temperature, pressure, or winds) sustained over several decades or longer.
Climate change may be due to natural external forcings, such as changes in solar emission or slow changes in the earth’s orbital elements; natural internal processes of the climate system; or anthropogenic forcing.”

The AMS defines anthropogenic climate change as

“Climate change that occurs as a result of human activities.”

The AMS defines the climate system as

“The system, consisting of the atmosphere, hydrosphere, lithosphere, and biosphere, determining the earth’s climate as the result of mutual interactions and responses to external influences (forcing).
Physical, chemical, and biological processes are involved in the interactions among the components of the climate system.”

Here we have an inconsistency in the definitions even within a very distinguished professional society! Climate change, as defined by the AMS, is focused on atmospheric elements, while the climate system consists of the atmosphere, hydrosphere, lithosphere, and biosphere. No wonder policymakers misapply this terminology.

As one example of the misuse by policymakers, the Royal Society released the following statement by Lord May:

“The science points to the need for a Herculean effort to make massive cuts in the amount of greenhouse gases that we pump into the atmosphere. So, while this encouraging new deal may play a role in this, it will only be part, and not all, of the solution.

“But we have serious concerns that the apparent lack of targets in this deal means that there is no sense of what it is ultimately trying to achieve or the urgency of taking action to combat climate change. And the developed countries involved with this agreement must not be tempted to use it as an excuse to avoid tackling their own emissions.”

“All eyes should be on the United Nations Framework Convention on Climate Change in Montreal at the end of November. Top of the agenda at this meeting should be the initiation of a study into what concentration of greenhouse gases in the atmosphere we can allow without suffering the most catastrophic effects of climate change. This would allow us to plan cuts in worldwide emissions accordingly and provide direction to such efforts to tackle what is the biggest environmental threat we face today.”

Here the conclusion is made that to “combat climate change” we must initiate “a study into what concentration of greenhouse gases in the atmosphere we can allow without suffering the most catastrophic effects of climate change.”

Ignored in this statement is the role of the other anthropogenic climate forcings that we identified in the National Research Council report.

Lord May, President of the Royal Society, has clearly overlooked a very critical definition of what really constitutes the climate system and what anthropogenic forcings and feedbacks influence climate. He is, unfortunately, cherry-picking climate science.


Filed under Definition of Climate

What is the Importance to Climate of Heterogeneous Spatial Trends in Tropospheric Temperatures?

The 2005 National Research Council report concluded that:

“regional variations in radiative forcing may have important regional and global climate implications that are not resolved by the concept of global mean radiative forcing.”

And furthermore:

“Regional diabatic heating can cause atmospheric teleconnections that influence regional climate thousands of kilometers away from the point of forcing.”

This regional diabatic heating produces temperature increases or decreases in the layer-averaged regional troposphere, which necessarily alters the regional pressure fields and thus the wind patterns. These altered pressure and wind patterns then affect the pressure and wind patterns at large distances from the region of the forcing, a linkage we refer to as a teleconnection.
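
The link from layer heating to altered pressure fields can be made concrete with the hypsometric equation, which relates the thickness of a layer between two pressure surfaces to the layer-mean temperature. Below is a minimal sketch, not from the post itself; the pressure levels, layer-mean temperature, and physical constants are standard illustrative values.

```python
# Hypsometric-equation sketch: warming a layer's mean (virtual) temperature
# increases its thickness, raising heights aloft and thereby altering the
# horizontal pressure gradients that drive the winds.
import math

RD = 287.05  # gas constant for dry air, J/(kg*K)
G = 9.81     # gravitational acceleration, m/s^2

def thickness_m(p_bottom_hpa, p_top_hpa, layer_mean_temp_k):
    """Thickness (m) of the layer between two pressure levels."""
    return (RD * layer_mean_temp_k / G) * math.log(p_bottom_hpa / p_top_hpa)

# 1000-500 hPa layer before and after a 1 K layer-mean warming
z_before = thickness_m(1000.0, 500.0, 260.0)
z_after = thickness_m(1000.0, 500.0, 261.0)
print(z_before, z_after, z_after - z_before)  # ~20 m of added thickness per K
```

A regionally confined 1 K warming therefore perturbs the height (and hence pressure) field by tens of meters relative to its surroundings, which is ample to modify the regional wind pattern.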

The regional diabatic forcing can be caused by land-use/land-cover change (e.g., Chase et al. 2000a) or by aerosol emissions. Even natural surface variations, such as changes in ocean color, produce such teleconnections in a general circulation model (see Atmospheric response to solar radiation absorbed by phytoplankton, Shell et al. 2003).

There is debate, however, regarding whether the magnitude of the regional diabatic forcing is large enough to result in long-distance teleconnections. Observed multi-decadal trends in tropospheric-averaged temperatures are, however, large enough to result in large-scale circulation trends (see, for example, A Comparison of Regional Trends in 1979-1997 Depth-Averaged Tropospheric Temperatures for the magnitude of the 1979-1997 regional trends). Thus land-use/land-cover changes and aerosol clouds that produce regional tropospheric temperature anomalies of a similar (or larger) magnitude would be expected to have significant teleconnection effects.

If this is true, then regional diabatic heating due to human activities represents a major but under-recognized climate forcing on long-term global weather patterns. Indeed, this heterogeneous climate forcing may be more important to the weather that we experience than changes in weather patterns associated with the more homogeneous spatial radiative forcing of the well-mixed greenhouse gases (see the NASA press release, which is based on the multi-authored paper The influence of land-use change and landscape dynamics on the climate system: relevance to climate change policy beyond the radiative effect of greenhouse gases).


Filed under Climate Change Metrics

What is a Record Heat Wave, or a Record in Any Climate Metric?

The discussion on the significance of the recent heat wave in eastern Colorado continues (see the Rocky Mountain News article “Hot streak has experts divided“). This article illustrates an important issue in climate science: What measures do we use to identify a heat wave (or other climate extreme) as an all-time record? Can we make such claims from a record at an individual location?

The use of a single station, of course, is fraught with problems. The Denver site, which is referred to in the news article, is located at Denver International Airport (its location is given in this figure). Its position close to runways and buildings raises the issue of its exposure (we have requested photographs of the site). The site has also been moved twice since it was installed. Klaus Wolter and I (who are both referred to in the Rocky Mountain News article) will be completing a co-authored scientific paper over the next several weeks in order to place the heat wave in context, and we will report on our conclusions then. This is the proper procedure to determine if the heat wave in Colorado was really “unprecedented”, or is just an artifact of one observation site.

The highlighting of data from single sites has been misused in the past. For example, Alward et al. titled their paper “Grassland vegetation changes and nocturnal global warming“. This sweeping title was based on a reported increase in growing season at a single observation site in eastern Colorado. When we investigated other sites in eastern Colorado, we found that the site used in the Alward et al. study was clearly not representative of the trends in growing season at other locations in eastern Colorado.

In our paper Spatial Representativeness of Temperature Measurements from a Single Site, we concluded that “It is unlikely that one or a few weather stations are representative of regional climate trends…” and that “the assessment of a group of stations for …more qualitative trends….provides a reasonably robust procedure to evaluate climate trends and variability”.

We used threshold metrics, such as the number of days with temperatures above or below given values (for example, the number of days above 90°F and 100°F).
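
The multi-station threshold approach can be sketched as follows; the station names and daily maximum temperatures below are hypothetical illustration data, not observations from our paper.

```python
# Count exceedance days at each station in a group, rather than relying
# on a single site, so that one unrepresentative record cannot dominate.
daily_tmax_f = {
    "Station A": [88, 91, 95, 102, 99, 87, 90],  # hypothetical daily maxima (F)
    "Station B": [85, 89, 93, 100, 96, 84, 88],
}

def days_above(temps, threshold_f):
    """Number of days whose maximum temperature exceeds the threshold (deg F)."""
    return sum(1 for t in temps if t > threshold_f)

for name, temps in daily_tmax_f.items():
    print(name, "days >90F:", days_above(temps, 90),
          "days >100F:", days_above(temps, 100))
```

Agreement (or disagreement) in such counts across the group of stations is what indicates whether an apparent record reflects a regional trend or a single-site artifact.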

Melillo used the results of the Alward et al. paper as further evidence that the central grasslands and the Earth were warming. The title of his paper was “Perspectives: climate change – warm, warm on the range“. Escalating the issue further, the Associated Press report on this work sensationalized it: “Global warming could mean trouble for ranchers on the plains of Colorado and New Mexico.” All of this was based on data from one observation site!

Clearly, the interpretation of the recent heat wave in Colorado needs to be investigated more rigorously than was reported in the Rocky Mountain News article. Also, other states would benefit from placing their reports of records in context using data from the full set of available observations and not just from individual sites.


Filed under Climate Change Metrics

What Role Does Land-Use Change Have in Climate Science?

An excellent paper has just appeared on July 22, 2005 in Science by Jon Foley and colleagues entitled “Global Consequences of Land Use” (subscription required). Its abstract starts with

“Land use has generally been considered a local environmental issue, but it is becoming a force of global importance.”

The recognition of the importance of land use is overdue (see, for example, R-258 and R-267). The Foley et al. paper and its supporting online material provide new insight into the global importance of land-use change. Tables S2A and S2B in their online material summarize how large-scale land-cover change associated with the removal of different biomes could affect climate, including temperature and precipitation.

The paper goes beyond climate issues, however, and discusses the far-reaching and significant diversity of other environmental and societal effects of land-use change. This focus fits with the discussion of vulnerability presented on our Climate Science blog by Dev and Dita Niyogi on July 19th. The need for a bottom-up, vulnerability-based perspective, rather than a top-down emphasis in which the global circulation models are used as the starting point of an examination of environmental effects, was also emphasized in CB-37.pdf; CB-38.pdf; CB-39.pdf; CB-40.pdf; CB-41.pdf and CB-42.pdf.


Filed under Vulnerability Paradigm

What Are The Major Recommendations of the 2005 National Research Council Report Entitled Radiative Forcing of Climate Change: Expanding The Concept And Addressing Uncertainties?

Since these recommendations were not communicated to the July 21, 2005 U.S. Senate Committee Hearing on “Climate Change Science and Economics”, I have reproduced below, without comment, text from Chapter 7 of the report:

The current global mean top-of-the-atmosphere (TOA) radiative forcing concept with adjusted stratospheric temperatures has been used extensively in the climate research literature over the past few decades and has also become a standard tool for policy analysis endorsed by the Intergovernmental Panel on Climate Change (IPCC). It is a useful index for estimating global average surface temperature change resulting from changes in well-mixed greenhouse gases, solar irradiance, surface albedo, and non-absorbing aerosols. The relative ease of calculating radiative forcing and the associated temperature response has enabled the use of climate models, simpler versions of those models, and chemical transport models to investigate the many factors that may influence climate. In short, the TOA radiative forcing concept still has considerable value and should be retained as a standard metric in future climate research.

Nonetheless, the traditional radiative forcing concept has major limitations that have been revealed by recent research on non-conventional forcing agents and regional studies. It is limited in its ability to describe the climate effects of absorbing aerosols, aerosol interactions with clouds, ozone, land-surface modification, and surface biogeochemical effects. Also, it diagnoses only one measure of climate change: equilibrium response of global mean surface temperature. It does not provide information on nonradiative climate effects, spatial or temporal variation of the forcing, or nonlinearity in the relationship between forcings and surface temperature response. Recent extensions of the concept that allow surface temperatures to adjust have refined the radiative forcing concept to address deficiencies in the original approach. Although currently applied to global mean conditions, this method could be extended for regional conditions.

The strengths of the traditional radiative forcing concept warrant its continued use in scientific investigations, climate change assessments, and policy applications. At the same time, its limitations call for using additional metrics that account more fully for the nonradiative effects of forcing, the spatial and temporal heterogeneity of forcing, and nonlinearities. The committee believes that these limitations can be addressed effectively through the introduction of additional forcing metrics in climate change research and policy. This chapter provides several recommendations for extending the traditional radiative forcing concept in the scientific and policy arenas. It identifies research needed to improve quantification and understanding of different forcings and their impacts on climate, to better inform climate policy discussions, and to obtain reliable observations of climate forcings and responses in the past and future. A large number of recommendations are provided because many research avenues need to be explored in order to improve understanding of climate forcings.

The text continues in Chapter 7 of the report.


Filed under Climate Science Reporting

Did The July 21, 2005 U.S. Senate Committee Hearing On “Climate Change Science And Economics” Provide A Balanced Perspective On The Climate Science Issues?

On July 21, 2005, the U.S. Senate Energy and Natural Resources Committee held a Full Committee Hearing entitled “Climate Change Science and Economics.” The Hearing was:

“To receive testimony regarding the current state of climate change scientific research and the economics of strategies to manage climate change. Issues to be discussed include: the relationship between energy consumption and climate change, new developments in climate change research and the potential effects on the U.S. economy of climate change and strategies to control greenhouse gas emissions.”

I am particularly interested in learning what testimony was given, since I was called on July 11 and invited to present testimony at this Hearing. However, on July 13, I was e-mailed:

“Dr. Pielke: we have had a change in plans. We have decided to ask NCAR
to provide a senior scientist from that organization for the hearing.

As a result we won’t be asking you to drop everything and appear at our
hearing. My apologies for the confusion.”

When I read the testimony that was presented, I learned that Dr. Jim Hurrell of NCAR was my “replacement.” He provided a much different perspective on the science issues than I would have given. For example, he reported:

“…. The CCSP Assessment Product on Temperature Trends in the Lower Atmosphere is assessing these new data, and the preliminary report (which has been reviewed by the NRC) finds that the surface and upper-air records of temperature change can now, in fact, be reconciled. Moreover, the overall pattern of observed temperature change in the vertical is consistent with that simulated by today’s climate models.”

The CCSP Report he refers to has not been finalized, nor has the final version been subjected to public comment (I am a Convening Lead Author on the chapter “What measures can be taken to improve our understanding of observed changes?”). The revised Executive Summary has not even been circulated to the Committee. In the draft version that was reviewed by the National Research Council, there were major issues with the draft Executive Summary, which were so serious that I authored a report on its deficiencies (Pielke Sr., Roger A., 2005: Minority Report, Comments Provided to the NRC Review Committee of the U.S. Climate Change Science Program’s Synthesis and Assessment Product on Temperature Trends in the Lower Atmosphere). His statements “that the surface and upper-air records of temperature change can now, in fact, be reconciled” and “the overall pattern of observed temperature change in the vertical is consistent with that simulated by today’s climate models” oversimplify and mischaracterize the text as it currently exists. Moreover, these are not scientifically balanced conclusions. This testimony is an example of cherry-picking information to promote a particular view of climate science.

Indeed, other testimony similarly cherry-picked information. For instance, while Dr. Ralph Cicerone included some information from the National Research Council report in his testimony, he missed the opportunity to educate the Committee on the spectrum of newly recognized human climate forcings, as reported in the NRC (2005) report, and on how this complicates our ability to achieve skillful climate forecasts. He should have summarized the findings of that report in his testimony. As President of the National Academy of Sciences, it is particularly important that he provide a balanced presentation of climate science. He did not do so.

If my invitation to present had not been withdrawn, I would have built on my 2002 testimony to the U.S. House Subcommittee on Oversight and Investigations which is part of the Energy and Commerce Committee. This testimony was given in my capacity as President-Elect of the American Association of State Climatologists. I would have used the Findings in the National Research Council report, my invited essay, and other recent work in the science community to prepare my testimony.

Unfortunately, the Senators were not provided a balanced Hearing on climate science.


Filed under Climate Science Reporting

Are Multi-decadal Climate Forecasts Skillful?

In one of our July 11, 2005 posts, climate was defined so that climate forecasts are forecasts of the future state of the atmosphere, oceans, land, and continental glaciers, as defined using physical, chemical, and biological variables that we can measure. We can apply local, regional, or global averages over any time period we choose to characterize the future state of the climate. Weather forecasts are a subset of climate forecasts, in that we limit our forecasts to weather conditions, averaged over 12-hour periods, for example, out to a week or more, and generally assume a number of climate variables, such as vegetation and sea-surface temperatures, are invariant over this time period. It is important to note that the averaging time is not what distinguishes weather from climate (e.g., although called “seasonal climate predictions”, these forecasts are more accurately “seasonal-averaged weather predictions”).

As a necessary condition, climate forecasts must be able to skillfully reconstruct the observed temporal and spatial variability and change of local, regional, and global climate variables, when the forecast models are only given the external forcings (such as solar irradiance, volcanic eruptions, CO2 concentrations) as illustrated in Figure 1-2 in Radiative Forcing of Climate Change: Expanding the Concept and Addressing Uncertainties (2005). See also Tables 1 and 2 in Dynamical downscaling: Assessment of value retained and added using the Regional Atmospheric Modeling System (RAMS) where climate forecasts are called a Type 4 model simulation.

In 2000, we published a paper which demonstrated that the general circulation models were unable to skillfully reconstruct even the globally-averaged mid-tropospheric temperature trend during the 1979-2000 time period. Thus, as of that date, the climate prediction models were shown to be unable to skillfully forecast the future climate even with respect to a single globally-averaged climate variable. (I am on a CCSP committee entitled “Temperature Trends in the Lower Atmosphere: Steps for Understanding and Reconciling Differences” and will update our assessment of the issue of climate prediction skill as soon as the report is public.)

Mike MacCracken in his essay response to my Climatic Change essay seeks to distinguish a “prediction” from a “projection.” However, this only obscures the discussion, as GCM results are obviously packaged as forecasts in that specific time periods in the future are presented (see, as just one example, the 2070-2100 forecasts of the United Kingdom Hadley Centre). Even Mike recognizes there is no regional predictive skill in his paper entitled “Reliable regional climate model not yet on horizon.”

A conclusion of our evaluation is that papers which appear in the literature that present future values of a subset of (or all) climate variables are misrepresenting their results by implying that they are forecasts. They should be presented as sensitivity studies (as a process study; see my July 15 post on the types of model applications).

We can illustrate their misuse as forecasts by an analog. If we ran a numerical weather prediction model to provide a forecast of rainfall for tomorrow and published a paper on it today, would this be considered sound science justifying a paper? Of course not. First we would want to wait to see if the forecast was skillful. This is possible with weather forecasts for tomorrow, but we cannot yet verify a climate model’s skill for decadal-averaged weather conditions decades into the future.

The climate modeling community runs ensembles of multi-decadal predictions (with different initial conditions, different models) and they average their results over decadal time periods, which they claim distinguishes their simulations from the numerical weather prediction community’s application. Of course the numerical weather prediction community also runs ensembles of simulations. The fundamental difference is that the weather community can validate their model results thousands of times. There is no such ability with multi-decadal climate prediction models.

Our conclusions are the following:

  1. Peer-reviewed papers, and national and international assessments, which present model results for decades into the future, or provide impact studies in response to these model simulations, should never be interpreted as skillful forecasts (or skillful projections). They should be interpreted as process (sensitivity) studies, even though the authors use definitive words (such as stating that something “will” occur) and display model output for specific time periods in the future.
  2. The US National Assessment, which provided model simulations on regional scales for the coming decades, is inaccurately portrayed when its results are given to stakeholders with the interpretation that they bracket what is expected in the future. This is misleading when transmitted to policymakers, as process studies are inappropriately interpreted to be forecasts.
  3. Climate forecasts (projections) decades into the future have not demonstrated skill in forecasting local, regional, and global climate variables. They have shown that human climate forcing has the capacity to alter the climate system, but we should not present these model simulations as forecasts. To present them as forecasts is misleading to policymakers and others who use this information.


Filed under Climate Models

Did Denver Tie Its All-Time Measured Heat Record on July 21, 2005?

The news in Colorado is highlighting the 105°F temperature recorded at Denver International Airport as tying the all-time Denver record. However, the Airport site was established in 1995. Thus, we do not know if this was a long-term temperature record at this site. Other sites in eastern Colorado were hot, but they did not all exceed an all-time record (the Fort Collins observing site, for example, did not even reach 100°F yesterday, although it was still hot!). An accurate media perspective on this “all-time record” value is given in this Rocky Mountain News article.

This heat wave again illustrates why we also need to monitor moist enthalpy, as discussed in the posting of July 18th. The dewpoint temperatures were in the upper 20s F, while the temperature was above 100°F at the airport. The actual heat content of the air should also be included when discussing heat waves.

The answer to the question is that the official site for the Denver measurement tied its all-time record temperature. However, the location of this official measurement has moved over time, so we do not know whether it really was the day with the highest temperature for the city in general. At the Colorado Climate Office, we will collect data from around the state to place this heat wave in context (we expect to report on this in August after all of the data arrive from the cooperative weather observers). We also recommend that the heat content of the air (moist enthalpy) be tracked as an important climate metric to really determine what is the hottest day.


Filed under Climate Change Metrics

Water Vulnerability – A Topic for the Next G8?

The recent G8 meeting will be remembered, amongst the other items discussed, for the unfortunate London bombings and the somewhat lame climate change initiative that resulted after all the fanfare about this being the place to highlight the issue of the century — climate change.

But are we really correct in calling climate change the only critical issue we are dealing with in the Earth system today? Clearly climate change is an issue that needs a framework and policy developed by the global community to help solve some fundamental issues, such as reduction in GHG emissions, technology adaptation, and development of scientific concepts to sequester GHGs.

It is important that the scientific community demonstrate its ability and willingness to adopt a broader, more holistic perspective that considers the vulnerability of the Earth’s resources and possible adaptation and mitigation strategies. It may then become feasible to bridge the disconnect between science and the compromises that policymakers and the populace must make in deciding among a myriad of choices for abating climate change.

The vulnerability of water resources is indeed a critical global issue even as the world debates climate change. There is growing evidence (Douglas et al. 2005, Nat. Haz., in review) that between 1990 and 2025 the number of people living in countries without adequate water is projected to rise from 131 million to 817 million. India is projected to fall into the water-stress category long before 2025 (Shiva, 2002).

Let’s continue with the example of India to illustrate a few areas where the hydrological vulnerability problem lies. Climate change can contribute to variations in the natural water cycle and cause stress on water resources. Over and beyond that, there are significant societal issues which have more direct impacts on water resources (and vice versa). For instance, the increasing trend toward privatizing water sources has played a daunting role in the inequity of water access. Private ownership, rather than collective sharing, has left many villagers paying exorbitant rates for water (nearly impossible to afford for subsistence farmers) or spending hours locating alternate water sources. As water availability decreases, malnutrition, disease, and infant mortality increase.

Another problem plagues communities stressed by the difficult trade-off between water vulnerability and economic returns. For instance, rural India is shifting from farming food crops to cash crops. Sainath (1999) notes that in many areas water-intensive sugarcane is replacing traditional crops such as wheat. Sugarcane requires ten times as much water as wheat!

As water needs increase, more resources are utilized and consumption goes well beyond the recharge potential of water sources.
Urban areas are not immune to water vulnerability either. Manufacturing requires water, while industrial waste pollutes rivers and watersheds. Villagers migrate to cities due to water shortages, land and water ownership issues, and lack of economic opportunity, increasing the burden on already overpopulated urban areas. Excess population leads to water scarcity since the resources remain nearly constant. Often improper sanitation facilities add to the contamination of water, leaving even less water for human consumption.
Note that water vulnerability is not only a problem for the developing world. In the United States, wells have dried up from water depletion in places like Texas, Oklahoma, and Kansas (Brown, 2003).

So whether or not the G8 faces up to the fact that water resource vulnerability is a severe environmental threat, rural and urban communities across the world will live through a decade that will make or break their social infrastructure, what it can develop into, and what it can provide to their populations.

Brown, L. (2003) World Creating Food Bubble Economy Based on Unsustainable Use of Water.
Douglas, E., D. Niyogi, S. Frolking, J.B. Yeluripati, R.A. Pielke Sr., N. Niyogi, C.J. Vörösmarty, and U.C. Mohanty (2005) Changes in moisture and energy fluxes due to agricultural land use and irrigation in the Indian Monsoon Belt, J. Natural Hazards (Monsoon Special Issue), in review.
Sainath, P. (1999) Everybody Loves a Good Drought, Headline Book Publishing, London, GB, pp. 255-292.
Shiva, V. (2002) Water Wars, South End Press, Cambridge, MA, pp. 1, 20.


Filed under Vulnerability Paradigm

What Does Moist Enthalpy Tell Us?

In our blog of July 11, we introduced the concept of moist enthalpy (see also Pielke, R.A. Sr., C. Davey, and J. Morgan, 2004: Assessing “global warming” with surface heat content. Eos, 85, No. 21, 210-211. ). This is an important climate change metric, since it illustrates why surface air temperature alone is inadequate to monitor trends of surface heating and cooling. Heat is measured in units of Joules. Degrees Celsius is an incomplete metric of heat.

Surface air moist enthalpy does capture the proper measure of heat. It is defined as CpT + Lq where Cp is the heat capacity of air at constant pressure, T is air temperature, L is the latent heat of phase change of water vapor, and q is the specific humidity of air. T is what we measure with a thermometer, while q is derived by measuring the wet bulb temperature (or, alternatively, dewpoint temperature).

To illustrate how important it is to use moist enthalpy, we can refer to the current heat wave in the southwest United States. The temperatures in Yuma, Arizona, for example, have reached 110°F (43.3°C), but with dewpoint temperatures around 32°F (0°C). In terms of moist enthalpy, if the temperature falls to 95°F (35°C) but the dewpoint temperature rises to 48°F (8.9°C), the moist enthalpy is the same. Temperature by itself, of course, is critically important for many applications. However, when we want to quantify the heat in the surface air in its proper physical units, we must use moist enthalpy.
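
The two cases above can be checked numerically. The sketch below is not from the Eos paper: the constants, the assumed surface pressure of 1013.25 hPa, and the Tetens formula used to convert dewpoint to specific humidity are standard textbook values chosen for illustration.

```python
# Moist enthalpy h = Cp*T + L*q for the two heat-wave cases in the text.
import math

CP = 1005.0      # heat capacity of air at constant pressure, J/(kg*K)
L = 2.5e6        # latent heat of vaporization of water, J/kg
P_HPA = 1013.25  # assumed surface pressure, hPa

def specific_humidity(dewpoint_c):
    """Specific humidity (kg/kg) from dewpoint, via the Tetens formula."""
    e_hpa = 6.1078 * math.exp(17.27 * dewpoint_c / (dewpoint_c + 237.3))
    return 0.622 * e_hpa / (P_HPA - 0.378 * e_hpa)

def moist_enthalpy(temp_c, dewpoint_c):
    """Cp*T + L*q in J/kg; Celsius suffices when comparing two states."""
    return CP * temp_c + L * specific_humidity(dewpoint_c)

h_hot_dry = moist_enthalpy(43.3, 0.0)       # 110 F air, 32 F dewpoint
h_cooler_humid = moist_enthalpy(35.0, 8.9)  # 95 F air, 48 F dewpoint
print(h_hot_dry, h_cooler_humid)  # both near 53 kJ/kg
```

The two values agree to within a few hundred J/kg out of roughly 53,000, so the cooler but moister air carries essentially the same heat content as the hotter, drier air.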

In terms of assessing trends in globally-averaged surface air temperature as a metric to diagnose the radiative equilibrium of the Earth, neglecting moist enthalpy therefore necessarily produces an inaccurate metric, since the water vapor content of the surface air will generally have different temporal variability and trends than the air temperature.

There are quite a few other issues with using the global-averaged surface temperature to characterize climate change (see NRC 2005, Radiative Forcing of Climate Change: Expanding the Concept and Addressing Uncertainties). The realization that temperature is an incomplete measure of heat adds another problem to its use.


Filed under Climate Change Metrics