Monthly Archives: January 2007

Christine Todd Whitman On The Importance of Land Use Changes

There is an interesting weblog post, dated January 2, 2007, by Christine Todd Whitman on the weblog of the Weather Channel [you need to scroll down to see it]. It is titled “Land Use Changes” and begins with the text,

“Land use changes have as great an impact as greenhouse gas emissions. Changes in farming practices, the deforestation of large areas and the rapid pace of development are stressing our environment and reducing nature’s natural ability to absorb carbon.”

Ms. Whitman, of course, was EPA Administrator in the early years of the current President Bush administration and the 50th Governor of New Jersey (see).

While her weblog is brief and she does not go into any depth on her statement, it certainly agrees with the conclusion presented in the NASA press release (Landcover Changes May Rival Greenhouse Gases As Cause Of Climate Change).

This is a conclusion that has been documented in a number of peer reviewed papers that have been summarized on Climate Science (see).

We hope that the upcoming IPCC report will be as enlightened in its assessment of the diverse role of human climate forcings.

Filed under Climate Change Metrics

Consequences Of Missing Climate Forcings In Analyses Of Attribution Of Multi-decadal Surface Temperature Trends

There is a recent paper that illustrates how an extensive analysis of surface temperature change can miss critically important information. The paper by M. R. Allen, N. P. Gillett, J. A. Kettleborough, G. Hegerl, R. Schnur, P. A. Stott, G. Boer, C. Covey, T. L. Delworth, G. S. Jones, J. F. B. Mitchell and T. P. Barnett is

“Quantifying anthropogenic influence on recent near-surface temperature change,” Surveys in Geophysics, Volume 27, Number 5, September 2006, Springer Netherlands, ISSN 0169-3298 (Print), 1573-0956 (Online).

The abstract reads,

“We assess the extent to which observed large-scale changes in near-surface temperatures over the latter half of the twentieth century can be attributed to anthropogenic climate change as simulated by a range of climate models. The hypothesis that observed changes are entirely due to internal climate variability is rejected at a high confidence level independent of the climate model used to simulate either the anthropogenic signal or the internal variability. Where the relevant simulations are available, we also consider the alternative hypothesis that observed changes are due entirely to natural external influences, including solar variability and explosive volcanic activity. We allow for the possibility that feedback processes, other than those simulated by the models considered, may be amplifying the observed response to these natural influences by an unknown amount. Even allowing for this possibility, the hypothesis of no anthropogenic influence can be rejected at the 5% level in almost all cases. The influence of anthropogenic greenhouse gases emerges as a substantial contributor to recent observed climate change, with the estimated trend attributable to greenhouse forcing similar in magnitude to the total observed warming over the 20th century. Much greater uncertainty remains in the response to other external influences on climate, particularly the response to anthropogenic sulphate aerosols and to solar and volcanic forcing. Our results remain dependent on model-simulated signal patterns and internal variability, and would benefit considerably from a wider range of simulations, particularly of the responses to natural external forcing.”
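The attribution framework summarized in this abstract amounts to estimating scaling factors on model-simulated response patterns. As a heavily simplified sketch of that idea (ordinary least squares on invented patterns, rather than the optimal fingerprinting the authors actually use; all numbers are made up for illustration), consider:

    # Simplified sketch of scaling-factor ("fingerprint") attribution.
    # Real studies use optimal fingerprinting weighted by estimates of
    # internal variability; here plain least squares is used for clarity.
    import numpy as np

    rng = np.random.default_rng(0)

    n_points = 50                                              # crude "space-time" dimension
    ghg_pattern = np.linspace(0.2, 1.0, n_points)              # hypothetical GHG response pattern (K)
    nat_pattern = 0.1 * np.sin(np.linspace(0.0, 6.0, n_points))  # hypothetical natural response (K)

    # Synthetic "observations": a mix of both signals plus internal variability.
    obs = 0.9 * ghg_pattern + 1.2 * nat_pattern + rng.normal(0.0, 0.1, n_points)

    # Regress the observations onto the model-simulated signal patterns.
    X = np.column_stack([ghg_pattern, nat_pattern])
    beta, _, _, _ = np.linalg.lstsq(X, obs, rcond=None)

    print("estimated scaling factors (GHG, natural):", beta)
    # If the uncertainty range on the GHG scaling factor excludes zero, the
    # "no anthropogenic influence" hypothesis is rejected in this framework.

The point to keep in mind for what follows is that such scaling factors are only as meaningful as the observations being regressed and the set of forced response patterns supplied to the regression.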

However, this paper neglects important information on the robustness of the surface air temperature trend data that the authors used, as well as the absence from their study of recognized first order climate forcings and feedbacks. Thus their attribution study perpetuates the incorrect view that the first order human climate forcings are dominated by the well-mixed greenhouse gases and anthropogenic sulphate aerosols.

If I were a referee of this paper, I would have noted that, despite their extensive statistical attribution procedures, they neglected an assessment of:

1. Whether existing peer reviewed papers document systematic biases in the surface air temperature trend record.

2. Whether all of the important first order human climate forcings, as documented in the peer reviewed literature, were included in their study.

They accepted the surface temperature trend data without question, and failed to discuss the other climate forcings, such as those documented in the 2005 National Research Council report.

To illustrate this failure with respect to #1, an excerpt from their paper states

“Systematic errors in the observations would have a much more substantial impact on results. Quantitative estimates of the impact of known sources of potential systematic error, such as the so-called ‘‘urban heat island’’ effect, indicate they are likely to have only a minor effect on results Parker (2004). The possibility of a completely unknown source of bias contaminating the early century instrumental temperature record will always remain a caveat, but more recent studies using multiple strands of climate proxy data, not all of which need to be calibrated against the instrumental record, provide some independent support (e.g., Hegerl et al. 2003, 2006).”

If there were no peer reviewed documentation of systematic biases, this would be an appropriate statement. However, there is a developing literature that indicates a positive temperature bias in the land surface air minimum temperatures (e.g. see the Climate Science summary), which necessarily means there is a warm bias in the data that they used to compare with the models. They should have addressed this issue, and as a referee, I would have strongly urged the Editor that they do so.

To illustrate #2, the excerpt from their paper on climate forcings reads,

“A range of external climate forcings were considered, including anthropogenic greenhouse gases (G); the direct radiative forcing due to sulphate aerosols (S); the combination of indirect sulphate forcing with tropospheric ozone changes (I); variations in total incoming solar irradiance from the Hoyt and Schatten (1993), reconstruction (H), extended with satellite data (W. Ingram, pers. comm.); solar variations from the Lean et al. (1995) reconstruction (L); and volcanic aerosol forcing from the Sato et al. (1993), reconstruction (V).”

However, the National Research Council report documents a variety of human climate forcings that are ignored (e.g. see the summary). These include:

1. aerosol black carbon
2. black carbon deposition on snow and ice
3. the semi-direct aerosol effect
4. the glaciation effect of aerosols
5. land use/land cover change
6. the biogeochemical effect of added CO2
7. nitrogen deposition

There are peer reviewed papers that support each of these climate forcings. The Allen et al. paper has, deliberately or inadvertently, ignored these climate forcings in its model simulations.

Thus, the attribution paper uses a data set (average surface temperature trends) with a now recognized warm bias to compare with model results which incompletely represent the actual climate forcings. This paper, unfortunately, perpetuates an inaccurate evaluation of climate forcing signatures, with the result that policymakers are poorly informed with respect to the actual role of the diverse range of climate forcings on regional and global climate.

Filed under Climate Change Forcings & Feedbacks, Climate Models, Climate Science Misconceptions

Comment on “Climate resets ‘Doomsday Clock’ “

The BBC news article by Molly Bentley entitled “Climate resets ‘Doomsday Clock’” is a disappointing example of the lack of balance in the media (thanks to Eric Harmsen for alerting me to this). The article includes the highlight

“Experts assessing the dangers posed to civilisation have added climate change to the prospect of nuclear annihilation as the greatest threats to humankind.”

The “Clock” is prepared by the “Bulletin of the Atomic Scientists”.

The article further states that

“Not since the darkest days of the Cold War has the Bulletin, which covers global security issues, felt the need to place the minute hand so close to midnight.”

and

“Growing global nuclear instability has led humanity to the brink of a ‘Second Nuclear Age,’ the group concluded, and the threat posed by climate change is second only to that posed by nuclear weapons.

‘When we think about what technologies besides nuclear weapons could produce such devastation to the planet, we quickly came to carbon-emitting technologies,’ said Kennette Benedict, executive director of the Chicago-based BAS.”

This type of news coverage perpetuates misinformation on climate issues to the public. Statements such as

“This is the first time it has included climate change as an explicit threat to the future of civilisation”,

are not consistent with any existing climate assessment.

Claims such as

“”Whether it’s a threat of the same magnitude or slightly less or greater is beside the point,” said Michael Oppenheimer, a geoscientist from Princeton University, US.”

have absolutely no basis in the published climate science literature. That the BBC news article chose not to question this conclusion (nor the climate science credentials of the individuals who prepare the “Doomsday Clock”) demonstrates an obvious and very serious bias in perspective.

Filed under Climate Science Reporting

A New Paper On The Role of Land Surface Processes Within The Climate System

A new article has been published that provides yet another excellent example of the major role of land surface processes within the climate system. The paper is

Haydee Salmun and Andrea Molod, 2006: Progress in modeling the impact of land cover change on the global climate. Progress in Physical Geography, 30, pp. 737–749.

The abstract reads,

“The prediction of the impact of anthropogenic land use change on the climate system hinges on the ability to properly model the interaction between the heterogeneous land surface and the atmosphere in global climate models. This paper contains a review of techniques in general use for modeling this interaction in general circulation models (GCMs) that have been used to assess the impact of land use change on climate. The review includes a summary of GCM simulations of land cover change using these techniques, along with a description of the simulated physical mechanisms by which land cover change affects the climate. The vertical extent to which surface heterogeneities retain their individual character is an important consideration for the land atmosphere coupling, and the description of a recently developed technique that improves this aspect of the coupling is presented. The differences in the simulated climate between this new technique and a technique in general use include the presence of a boundary layer feedback mechanism that is not present in simulations with the standard technique. We postulate that the new technique when implemented in a GCM has the potential to guide an improved understanding of the mechanisms by which anthropogenic land use change affects climate.”

Excerpts from the conclusion read,

“A newly developed technique to model the land atmosphere coupling in a GCM was discussed, which addresses some existing limitations in the models and may affect the results of LCC studies. We reviewed a study conducted using the new technique which revealed the presence of a boundary layer eddy diffusion feedback mechanism that was not present in simulations with the standard technique.”

“An element of the boundary layer feedback mechanism found in the study comparing EM [extended mosaic technique] to a standard mosaic (Figure 3) is the possibility that intensifying the turbulence can lead to either an increase or a decrease in the intensity of the hydrological cycle, depending on the character of the underlying surface. This suggests that in a deforestation simulation using EM the impact of the change in the intensity of the boundary layer turbulence due to the lowered surface roughness may result in an increase in evaporation if the surface is wet enough. If, for example, at the early stages of deforestation the surface is moist, it is possible to enhance the evaporation process despite the removal of the trees. As deforestation increases and the other stresses on the moisture become more important, the drying of the surface may result in reversal of the causality, whereby a decreased roughness will result in a decreased evapotranspiration. The use of EM allows for the possibility of this alternative response, which may be relevant for investigating the impact of LCC such as deforestation.”
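For context on the modeling techniques being compared, here is a minimal sketch of the standard “mosaic” treatment of subgrid land-cover heterogeneity (the tile fractions and fluxes are invented for illustration). In the extended mosaic (EM) discussed in the paper, as described in the abstract, the tiles instead retain their individual character some distance up into the boundary layer rather than being blended into a single value at the surface:

    # Minimal sketch of the standard mosaic approach: each land-cover tile
    # computes its own surface flux, and the single atmospheric column sees
    # only the area-fraction-weighted average.  Values are illustrative only.
    def gridbox_flux(tiles):
        """tiles: list of (area_fraction, flux in W m-2) pairs."""
        total_fraction = sum(frac for frac, _ in tiles)
        assert abs(total_fraction - 1.0) < 1e-6, "tile fractions must sum to 1"
        return sum(frac * flux for frac, flux in tiles)

    # Hypothetical grid box: 60% forest, 30% cropland, 10% urban latent heat flux.
    latent_heat = gridbox_flux([(0.6, 120.0), (0.3, 90.0), (0.1, 20.0)])
    print(f"grid-box mean latent heat flux: {latent_heat:.1f} W m-2")

Because the standard approach blends the tiles immediately at the surface, feedbacks that depend on each surface type interacting with its own overlying boundary layer (such as the eddy diffusion feedback described above) cannot develop.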

This study provides yet another reason that land use processes must be an integral component of climate assessments such as those of the IPCC.

Filed under Climate Change Forcings & Feedbacks

The Complex Role of Dust In Weather and Climate

An interesting paper on the role of aerosols in weather has appeared. While the authors focus on short-term weather, the significance of their study for multi-decadal climate simulations is obvious, since long term climate is the integration of short-term components of the climate system.

The paper (brought to my attention by Dev Niyogi) is

Grini A., P. Tulet, L. Gomes (2006), Dusty weather forecasts using the MesoNH mesoscale atmospheric model, J. Geophys. Res., 111, D19205, doi:10.1029/2005JD007007.

The abstract reads

“Several studies have shown the importance of aerosols in the Earth’s radiative balance. The radiative effect of dust has earlier been quantified in global and regional climate models. Fewer studies have included prediction of aerosols online in numerical weather prediction (NWP) models. Predicting climate effect of aerosols is different from including aerosols actively in weather prediction because climate models give average responses over long timescales, whereas the purpose of weather prediction models is to calculate the right weather on short timescales. In this paper, we run a mesoscale NWP with four different assumptions on dust aerosols: (1) no dust, (2) climatological data set for dust aerosols, (3) forecasted dust using the Dust Entrainment and Deposition model (DEAD) and (4) same as assumption 3 but with decreased single scattering albedo. We show that the assumption on dust is important for predicting ground temperature and convective precipitation in the model. Dust changes the model physics through changing the radiative fluxes. We interpret the changes through analyzing the energy budgets for four zones close to the major dust sources. The zones include both convective and nonconvective areas. In convective areas over ocean, dust can decrease convective activity. The vertical gradient of the aerosols and their single scattering albedo determine how efficient they are. Over land, dust also influences the surface energy budget so that decreased latent heat fluxes result if dust plumes pass over areas where water evaporates from the surface. ”

Excerpts from the conclusion are

“We find that our modeled dust plume has a ToA radiative forcing of maximum 100 W/m2 over ocean. The maximum shortwave atmospheric forcing is around 400 W/m2 and the surface net SW forcing of around 550 W/m2. The maximum greenhouse effect of dust is 10–15 W/m2 over hot desert surfaces.”

“We calculated budgets of potential temperature for four different regions to evaluate the effect of dust on the atmospheric circulation. The effect of dust is very different depending on geographical region, dust optical properties and dust vertical distribution.”

“Over ocean, in the ITCZ, the response to dust SW radiative heating is less convection when the aerosols have a distinct vertical distribution, whereas there is no particular response when they are homogeneously distributed. Over less convective areas over ocean, the response to dust SW radiative heating is change in advection.”

“Over land, dust aerosols not only stabilize the atmosphere. Dust also interacts with the surface energy budget since surface temperatures can change on shorter timescales than the ocean temperature. Over the dry Saharan areas, the dust changes the boundary layer mixing, and modifies advection patterns. In the Sahel region where evaporation from the surface is an important source for atmospheric water vapor, the presence of dust reduces convection significantly.”

“The interactions between the dust and the weather forecast is sufficiently large that we recommend that dust forecasts is included in atmospheric models for which the goal is to predict the weather in the Sahel region.”

These conclusions also require that dust (and its changes over decades) be an integral component of multi-decadal global and regional climate predictions. The spatially varying character of its effect also shows why a regional focus on climate variability and change is required, as has been concluded on Climate Science (see).
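As a minimal illustration of why the single scattering albedo assumption (experiment 4 in the abstract) matters, consider a single-scattering, direct-beam sketch; this is an illustration only, not the MesoNH radiative transfer, and the optical depth and albedo values are invented:

    # Of the direct solar beam extinguished by a dust layer of optical depth tau,
    # a fraction (1 - omega) is absorbed (heating the layer) and a fraction
    # omega is scattered, in a single-scattering approximation.
    import math

    def direct_beam_partition(tau: float, omega: float, mu0: float = 1.0):
        """Return (fraction extinguished, fraction absorbed) for overhead sun."""
        extinguished = 1.0 - math.exp(-tau / mu0)
        absorbed = (1.0 - omega) * extinguished
        return extinguished, absorbed

    for omega in (0.95, 0.85):   # illustrative "bright" and "dark" dust
        ext, absorbed = direct_beam_partition(tau=1.0, omega=omega)
        print(f"omega = {omega:.2f}: {ext:.0%} of the beam extinguished, "
              f"{absorbed:.0%} absorbed in the dust layer")

Lowering the single scattering albedo shifts extinguished sunlight from scattering toward absorption, that is, toward heating within the dust layer itself, which is why the authors' experiment with decreased single scattering albedo changes the simulated energy budgets.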

Filed under Climate Change Forcings & Feedbacks

A January 14 2007 New York Times Article By Andy Revkin Contains Climate Science Oversimplifications And Errors

Andy Revkin had an article on January 14, 2007 in the New York Times entitled “The Basics: Connecting the Global Warming Dots”.

Unlike his earlier article, which I was impressed with and which Climate Science posted a weblog on (see), this new article presents, at best, a grossly oversimplified summary of the science of the human role in climate change and, more specifically, the human role in global warming.

Perhaps this article was written in response to the pressure he must have received on his balanced January 1 2007 article. In any case, this new January 14 article is a disappointment to anyone who values objectivity in news articles.

I will document four of his oversimplifications/errors below (but these are not even all of his mistakes!). The first clear oversimplification/error in his presentation is his statement that,

“The impact of a buildup of carbon dioxide and other greenhouse gases is now largely undisputed. Almost everyone in the field says the consequences can essentially be reduced to a formula: More CO2 = warmer world = less ice = higher seas. (Throw in a lot of climate shifts and acidifying oceans for good measure.)”

I assume I must fit outside of his category of “almost everyone”. His presentation of a simple linear model between added CO2 and less ice and higher seas ignores the complications associated with the biogeochemical effects of added CO2, as well as the requirement for a positive water vapor/cloud feedback in the hydrologic cycle in order for significant warming to occur.

It is true that if ONLY the radiative effect of added CO2 were considered, the lower atmosphere would warm. However, the radiative effect of added CO2 is only one of the ways in which humans influence the climate system, as shown, for example, in the 2005 National Research Council report,

“Radiative Forcing of Climate Change: Expanding the Concept and Addressing Uncertainties”

which Andy has chosen to ignore in every one of his news articles, and certainly has not considered in writing his January 14 article. The effect of the human input of CO2 into the climate system needs to be considered together with the entire spectrum of diverse human- and natural-climate forcings and feedbacks.
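To be concrete about what the CO2-only line of reasoning involves, here is a minimal sketch using the simplified radiative forcing expression of Myhre et al. (1998) together with an illustrative no-feedback sensitivity; these are textbook numbers used for illustration, not values from Mr. Revkin's article:

    # Simplified CO2 radiative forcing (Myhre et al. 1998) and the warming that
    # would follow with no feedbacks.  The sensitivity value (~0.3 K per W m-2)
    # is an illustrative no-feedback figure.
    import math

    def co2_radiative_forcing(c_ppm: float, c0_ppm: float = 280.0) -> float:
        """Simplified CO2 radiative forcing in W m-2: 5.35 * ln(C/C0)."""
        return 5.35 * math.log(c_ppm / c0_ppm)

    NO_FEEDBACK_SENSITIVITY = 0.3   # K per (W m-2), illustrative

    for c in (380.0, 560.0):        # ~current concentration and doubled pre-industrial
        forcing = co2_radiative_forcing(c)
        print(f"CO2 = {c:5.0f} ppm  forcing = {forcing:4.2f} W m-2  "
              f"no-feedback warming ~ {forcing * NO_FEEDBACK_SENSITIVITY:4.2f} K")

The much larger warming values commonly quoted depend on the water vapor, cloud and ice feedbacks, and on the neglect of the other human climate forcings listed in the National Research Council report; that is precisely why the simple “more CO2 = warmer world” formula is an oversimplification.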

The second oversimplification/error in his news article is his claim that

“As in a pointillist painting, the meaning emerges from the broadest view, from the “balance of evidence,” as the scientific case is described in the periodic reports issued by an enormous international network of experts: the Intergovernmental Panel on Climate Change, www.ipcc.ch.”

These assessment reports, as has been documented on Climate Science (e.g. see) and elsewhere (e.g. see), are managed by only a small subset of climate scientists, who often use their platform as Lead Authors to promote their research and their particular perspective. The authorship is hardly “an enormous international network of experts”.

The third oversimplification/error in his article is his statement that,

“The global average minimum nighttime temperature has risen. (This is unlikely to be caused by some variability in the sun, for example, and appears linked to the greenhouse gases that hold in heat radiating from the earth’s surface, even after the sun has gone down.)”

He has ignored peer reviewed research that has shown a wide range of problems with the surface temperature records, including a significant warm bias in the minimum temperatures over land (e.g. see the papers listed in

Pielke Sr., R.A., C. Davey, D. Niyogi, K. Hubbard, X. Lin, M. Cai, Y.-K. Lim, H. Li, J. Nielsen-Gammon, K. Gallo, R. Hale, J. Angel, R. Mahmood, S. Foster, J. Steinweg-Woods, R. Boyles, S. Fall, R.T. McNider, and P. Blanken, 2006: Unresolved issues with the assessment of multi-decadal global land surface temperature trends. J. Geophys. Research, submitted).

He also incorrectly reports that,

“There has been a parallel warming trend over land and oceans. (In other words, the increase in the amount of heat-trapping asphalt cannot be the only culprit.)”

Contrary to his claim, most of the increase in temperature has actually been in the minimum temperatures at the higher latitudes (as discussed on Climate Science; e.g. see). The temperature trends for the oceans and land have not been parallel.

His fourth oversimplification/error is the statement that

“The stratosphere, high above the earth’s surface, has cooled, which is an expected outcome of having more heat trapped by the gases closer to the surface, in the troposphere. (Scientists say that variations in the sun’s output, for example, would instead cause similar trends in the two atmospheric layers instead of opposite ones.)”

The data through December 2006 (see) show that actual global stratospheric temperatures, after cooling from the Mount Pinatubo eruption in the early 1990s, have remained more-or-less constant since then. With added CO2, the stratosphere should be continuing to cool; an issue that the article chose not to mention.

I could go on, but it is clear that this article, unlike much of Andy Revkin's often good past news reporting, is a biased presentation of the current peer reviewed understanding of climate change science.

Filed under Climate Science Op-Eds, Climate Science Reporting

More On The European Commission Climate Change Strategy

As communicated earlier on Climate Science (see), I was invited to be a monthly columnist on the website Scitizen. My January 2007 column reports on my Friday weblog at Climate Science on the new Climate Change plan of the European Commission.

The Scitizen column is titled

“The European Commission has released its Climate Change Strategy - Is It A Good Plan?”

with the abstract

“The European Commission has issued a Climate Change Strategy which focuses on controls on the emissions of carbon dioxide to limit the human intervention in global warming. However, by incorrectly emphasizing carbon dioxide as the dominant human climate forcing with respect to climate impacts that affect society and the environment, the Plan will not provide the benefit that is expected. A more appropriate climate change strategy needs to consider the diverse range of human and natural climate forcings, as identified in a 2005 USA National Research Council report. Moreover, if the goal is actually to require major changes in energy sources that should occur, irrespective of the effect on climate, the Commission should be straightforward in presenting this perspective.”

Filed under Climate Science Reporting

Comments On EU’s Climate Change Strategy

Thanks to one of the readers of Climate Science [Eric Harmsen] for alerting me to the release of the EU climate change strategy.

The January 10, 2007 issue of Science News has an article by Jeff Mason entitled “EU challenges world with new climate change target”, which reads in part,

“BRUSSELS (Reuters) – The European Commission presented ‘the most ambitious policy ever’ to fight climate change on Wednesday, challenging the world to follow Europe’s lead in cutting greenhouse gas emissions.

The European Union’s executive branch proposed the 27-nation bloc reduce emissions by at least 20 percent by 2020 compared to 1990 levels, with the possibility of going to 30 percent if other developed countries join in.

The targets are part of new proposals for a broad EU energy policy that aims to boost production of renewable fuels, cut energy consumption, and reduce the dominance of big utility companies over EU gas and electricity markets.

With oil imports hit by the latest dispute involving Russia, the Commission’s vision for an EU-wide energy policy also seeks to ease dependence on foreign suppliers and push the bloc to speak with one voice on the world stage.

But Brussels made fighting global warming the core of its strategy.

“If this was adopted it would be by far the most ambitious policy ever — not only in Europe but the world — against climate change,” European Commission President Jose Manuel Barroso told a news conference.

The plan needs to be approved by EU governments and the European Parliament.

The new goal goes beyond an existing target for an eight percent cut in emissions from 1990 levels in the 2008-2012 period adopted by the 15 members of the EU before its 2004 enlargement, which several countries are struggling to meet…..”

This action by the EU perpetuates the narrow focus of climate mitigation on the human input of carbon dioxide (“to fight climate change”). They continue to ignore the extensive peer reviewed studies which document that the radiative effect of CO2 is just one of the human climate forcings, and that other forcings, such as from the diverse effects of anthropogenic aerosols and land use/land cover change (e.g. as discussed in the 2005 National Research Council report entitled “Radiative Forcing of Climate Change: Expanding the Concept and Addressing Uncertainties” , which has been discussed extensively on Climate Science) are at least as important.

The reality is that, even if successful, the EU climate change plan would have little significant effect on those aspects of climate variability and change that impact society and the environment.

There are at least two possible reasons for the EU ignoring climate science and continuing to promote a plan which would have such little effect on climate variability and change.

First, they could be unaware of the considerable peer-reviewed research which demonstrates that controlling CO2 emissions alone is an inadequate policy for significantly influencing regional and local-scale climate. European Commission President Jose Manuel Barroso certainly does not yet recognize that global warming is but a subset of climate change.

Or, while they state that

“But Brussels made fighting global warming the core of its strategy”

the reality is that they are using the “currency” of limits on CO2 emissions as the opportunity to make major changes in energy policy.

If they really intend (and honestly have concluded) that they are offering an effective plan to deal with climate, they are quite naive. If the goal, however, is a new energy policy, the EU should be honest and state that they really want to make major changes in how energy is provided, and that this would be so even if there were only minor climate effects associated with the anthropogenic increases in atmospheric carbon dioxide concentrations.

Filed under Climate Science Op-Eds

A Paper Which Illustrates The Close Coupling Between Water and Carbon Across Space and Time Scales

One of the papers in the special issue of Global and Planetary Change is

“Evidence for carbon dioxide and moisture interactions from the leaf cell up to global scales: Perspective on human-caused climate change” by P. Alpert, D. Niyogi, R.A. Pielke Sr., J.L. Eastman, Y.K. Xue and S. Raman.

The abstract reads,

“It is of utmost interest to further understand the mechanisms behind the potential interactions or synergies between the greenhouse gases (GHG) forcing(s), particularly as represented by CO2, and water processes and through different climatic scales down to the leaf scale. Toward this goal, the factor separation methodology introduced by Stein and Alpert [Stein U. and Alpert, P. 1993. Factor separation in numerical simulations, J. Atmos. Sci., 50, 2107–2115.] that allows an explicit separation of atmospheric synergies among different factors, is employed. Three independent experiments carried out recently by the present authors, are reported here, all strongly suggest the existence of a significant CO2–water synergy in all the involved scales. The experiments employed a very wide range of up-to-date atmospheric models that complement the physics currently introduced in most Global Circulation Models (GCMs) for global climate change prediction.

Three modeling experiments that go from the small/micro scale (leaf scale and soil moisture) to mesoscale (land-use change and CO2 effects ) and to global scale (greenhouse gases and cloudiness) all show that synergies between water and CO2 are essential in predicting carbon assimilation, minimum daily temperature and the global Earth temperature, respectively. The study also highlights the importance of including the physics associated with carbon–water synergy which is mostly unresolved in global climate models suggesting that significant carbon–water interactions are not incorporated or at least well parameterized in current climate models. Hence, there is a need for integrative climate models. As shown in earlier studies, the climate involves physical, chemical and biological processes. To only include a subset of these processes limits the skill of local, regional and global models to simulate the real climate system.

In addition, our results provide explicit determination of the direct and the interactive effect of the CO2 response on the terrestrial biosphere response. There is also an implicit scale interactive effect that can be deduced from the multiscale effects discussed in the three examples. Processes at each scale-leaf, regional and global will all synergistically contribute to increase the feedbacks — which can decrease or increase the overall system’s uncertainty depending on specific case/setup and needs to be examined in future coupled, multiscale studies. ”
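For readers unfamiliar with the factor separation methodology of Stein and Alpert (1993) referred to in the abstract, here is a minimal sketch of the bookkeeping for two factors; the simulated values below are invented purely to show the arithmetic:

    # Stein and Alpert (1993) factor separation for two factors (for example,
    # CO2 and a water-related process).  f00 is a control run with both factors
    # off, f10 and f01 each have one factor switched on, and f11 has both on.
    def factor_separation(f00, f10, f01, f11):
        """Return the pure contribution of each factor and their synergy."""
        pure_1 = f10 - f00                  # contribution of factor 1 alone
        pure_2 = f01 - f00                  # contribution of factor 2 alone
        synergy = f11 - (f10 + f01) + f00   # part present only when both act together
        return pure_1, pure_2, synergy

    # Hypothetical simulated quantity (say, daily carbon assimilation) from four runs:
    p1, p2, syn = factor_separation(f00=10.0, f10=12.0, f01=11.0, f11=14.5)
    print(f"pure factor 1: {p1}, pure factor 2: {p2}, synergy: {syn}")
    # -> pure factor 1: 2.0, pure factor 2: 1.0, synergy: 1.5

The nonzero synergy term is exactly the CO2–water interaction that the authors argue is poorly represented, or absent, in current global climate models.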

Among the conclusions is the statement that

“The need for integrative climate models is a major conclusion of our paper. As shown in the National Research Council (2005) and IGBP (2004) books, also in Pitman’s (2003) review, climate involves physical, chemical and biological processes. To only include a subset of these processes limits the skill of local, regional and global models to simulate the real climate system.”

The IPCC needs to adopt this perspective if it is to move beyond the narrow focus of the radiative effect of CO2 as being the dominant climate forcing.

Other papers from the same issue of Global and Planetary Change will be highlighted in upcoming weblogs.

Filed under Climate Change Forcings & Feedbacks

A Serious Problem With The Use Of The Global Averaged Surface Temperature Trend To Diagnose Global Warming and Cooling

The global average surface temperature trend is an icon of the climate change community (e.g. see). Global policies are based on this temperature.

The basic concept is that if the radiative forcing of the climate system is increased, the surface temperature will warm until the outgoing long wave radiation comes into balance with the new radiative forcing. There is a lag between when the radiative forcing is imposed and when an equilibrium is achieved with the new forcing (this has been referred to as a temperature increase which is still in the “pipeline”). If the forcing changes over time, the surface temperature will not, of course, ever reach an equilibrium.
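To illustrate this lag, here is a minimal sketch using the zero-dimensional energy balance relation dH/dt = f – T’/λ (Equation (1) in the National Research Council definition quoted later in this post). The heat capacity and feedback parameter are illustrative mixed-layer-ocean-like values, not results from any particular study:

    # Explicit Euler integration of dH/dt = f - T'/lambda for a step change in
    # radiative forcing, to show that the surface temperature response lags
    # the forcing.  All parameter values are illustrative.
    SECONDS_PER_YEAR = 3.15e7
    C = 4.2e8     # effective heat capacity, J m-2 K-1 (roughly a 100 m ocean mixed layer)
    LAM = 0.8     # climate feedback (sensitivity) parameter, K per (W m-2)
    F = 3.7       # step change in radiative forcing, W m-2 (roughly a CO2 doubling)

    dt = 0.1 * SECONDS_PER_YEAR
    heat_content = 0.0                      # heat content anomaly H, J m-2
    for step in range(int(20 * SECONDS_PER_YEAR / dt)):   # integrate 20 years
        t_prime = heat_content / C          # surface temperature response T', K
        heat_content += (F - t_prime / LAM) * dt          # Euler step of Equation (1)

    print(f"T' after 20 years: {heat_content / C:.2f} K "
          f"(equilibrium response would be {F * LAM:.2f} K)")
    # The response is still short of its equilibrium value; the remaining
    # warming is the part that is still "in the pipeline".

Note that the heat content anomaly itself, rather than the lagged surface temperature, is the quantity that is directly budgeted in this framework; that is the basis of the recommendation to use ocean heat content changes as the climate metric.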

This lag is one of the reasons that we have recommended ocean heat content changes as a more appropriate climate metric for global warming and cooling, as there are no lags involved; just an accounting for the Joules of heat in the climate system, as discussed in

Pielke Sr., R.A., 2003: Heat storage within the Earth system. Bull. Amer. Meteor. Soc., 84, 331-335.

Moreover, an average global temperature can only be diagnosed; it cannot be directly measured. The approach has been to sample air temperatures across the globe in order to construct a global average surface temperature trend. However, there is a major problem with the use of the sampling of surface air temperature trends as is discussed below, for example, for nighttime minimum temperatures over land (which are used as part of the construction of the global average trend).

In our submitted paper

Pielke Sr., R.A., C. Davey, D. Niyogi, K. Hubbard, X. Lin, M. Cai, Y.-K. Lim, H. Li, J. Nielsen-Gammon, K. Gallo, R. Hale, J. Angel, R. Mahmood, S. Foster, J. Steinweg-Woods, R. Boyles , S. Fall, R.T. McNider, and P. Blanken, 2006: Unresolved issues with the assessment of multi-decadal global land surface temperature trends. J. Geophys. Research, submitted,

we discuss this climate metric. A subsection of the text, reproduced from that paper, states,

“The first overarching question, of course, is what is meant by the “global average surface temperature”? The National Research Council Report [2005, see pages 19 and 21] provides a definition as

‘According to the radiative convective equilibrium concept, the equation for determining global average surface temperature of the planet is

dH/dt = f – T’/λ (1)

where H…….is the heat content of the land-ocean-atmosphere system……. Equation (1) describes the change in the heat content where f is the radiative forcing at the tropopause, T’ is the change in surface temperature in response to a change in heat content, and lambda is the climate feedback parameter [Schneider and Dickinson, 1974], also known as the climate sensitivity parameter, which denotes the rate at which the climate system returns the added forcing to space as infrared radiation or as reflected solar radiation (by changes in clouds, ice and snow, etc.).’

Thus T is the ‘global average surface temperature’ and T’ is a departure from that temperature in response to a radiative forcing f. It appears in Equation (1) as a thermodynamic proxy for the thermodynamic state of the earth system. As such, it must be tightly coupled to that thermodynamic state. Specifically, changes in T must be proportional to changes in the radiation emitted at the top of the atmosphere on climate time scales. However, where is this temperature and its change with time, T’, diagnosed and is it closely coupled?

At its most tightly coupled, T is the radiative temperature of the Earth. However, the outgoing longwave radiation is proportional to T**4. A 1°C increase in the polar latitudes in the winter, for example, would have much less of an effect on the change of longwave emission than a 1°C increase in the tropics. The spatial distribution matters, whereas Equation (1) ignores the consequences of this assumption. A more appropriate measure of global heat content would be to evaluate the change of the global average of T**4.
In most applications of (1), T is not a radiative temperature, but rather the temperature at a single level of the atmosphere, usually close to the ground. The CCSP [2006] report presents three separate analyses of the global surface temperature trend that use land- and ocean-based observations to evaluate T’. As they reported,

‘Over land, “near-surface” air temperatures are those commonly measured about 1.5 to 2.0 meters above the ground level at official weather stations, at sites run for a variety of scientific purposes, and by volunteer (“cooperative”) observers [e.g., Jones and Moberg, 2003]. These stations often experience relocations, changes in instrumentation and/or exposure (including changes in nearby thermally emitting structures), effects of land-use changes (e.g., urbanization), and changing observing practices, all of which can introduce biases into their long-term records. These changes are often undocumented.’………”
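To illustrate the point made in the excerpt above that outgoing longwave radiation scales as T**4, so that the spatial distribution of a temperature change matters, here is a minimal sketch; the two temperatures are illustrative only:

    # Because emission scales as T**4, the same +1 K changes the emitted
    # radiation by different amounts in cold and warm regions.
    SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m-2 K-4

    def olr(t_kelvin: float) -> float:
        """Blackbody emission, W m-2."""
        return SIGMA * t_kelvin ** 4

    t_polar_winter = 250.0   # K, illustrative
    t_tropics = 300.0        # K, illustrative

    for name, t in (("polar winter", t_polar_winter), ("tropics", t_tropics)):
        print(f"+1 K at {name:12s}: emission rises by {olr(t + 1.0) - olr(t):5.2f} W m-2")

The same +1 K produces a substantially larger change in emitted radiation in the tropics than at the winter pole, which is why evaluating the change in the global average of T**4, rather than of T, is suggested in the paper.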

Subsequently in our submitted JGR paper, we show that the use of 1.5-2 m nighttime temperatures over land as part of the measure is an example of a quantity that is not tightly coupled, given its strong sensitivity to the local land-surface conditions, the overlying boundary layer thermodynamic stability, and the wind speed.

Thus, the contribution of the minimum land surface temperatures to the construction of a global average surface temperature trend is NOT tightly coupled to the thermodynamic state of the Earth’s climate system. Its insertion into the diagnosis of Equation (1) will introduce an error which we have shown in our paper

Pielke Sr., R.A., and T. Matsui, 2005: Should light wind and windy nights have the same temperature trends at individual levels even if the boundary layer averaged heat content change is the same? Geophys. Res. Letts., 32, No. 21, L21813, 10.1029/2005GL024407,

to be a warm bias when the nighttime boundary layer has a reduction of long wave cooling at night.
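To see why this matters quantitatively, here is a back-of-envelope sketch; this is not the calculation in the Pielke and Matsui paper, and the heat content change and layer depths are purely illustrative:

    # If the same change in boundary-layer heat content Delta Q (J m-2) is
    # distributed over layers of different depth h, the resulting near-surface
    # temperature change scales roughly as 1/h.
    RHO = 1.2        # air density, kg m-3
    CP = 1004.0      # specific heat of air at constant pressure, J kg-1 K-1
    DELTA_Q = 1.0e5  # change in column heat content, J m-2 (illustrative)

    for depth in (50.0, 200.0, 1000.0):   # shallow calm night ... deep windy night (m)
        delta_t = DELTA_Q / (RHO * CP * depth)
        print(f"boundary layer depth {depth:6.0f} m -> temperature change {delta_t:5.2f} K")

The same heat content change produces a much larger temperature change when the nocturnal boundary layer is shallow (light winds) than when it is deep (windy nights), so trends in 2 m nighttime temperature need not reflect trends in the heat content of the atmosphere.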

This is a serious problem which has not been addressed by any existing climate change assessment.

Filed under Climate Change Metrics