Monthly Archives: January 2007

A Personal Call For Modesty, Integrity, and Balance by Hendrik Tennekes

Hendrik Tennekes, retired Director of Research at the Royal Netherlands Meteorological Institute, former Professor of Aeronautical Engineering at the Pennsylvania State University, and internationally recognized expert in atmospheric boundary layer processes, contributes another guest weblog today to Climate Science (see his first weblog on January 6, 2006). He has the professional qualifications and experience in climate science to comment on this issue. His guest weblog is given below.

Seventeen years ago, I wrote a column for Weather magazine, expressing my concerns about the lack of honesty, integrity and humility of many climate scientists. “I worry about the arrogance of scientists who claim they can help solve the climate problem, provided their research receives massive increases in funding”, reads one line from my text. Unknown to me, my friend Richard Lindzen was working on his famous paper “Some Coolness Concerning Global Warming”, which appeared in the Bulletin of the AMS at the same time. This was early 1990. It is 2007 now, and I want to ring the alarm bell again. There is a difference, though: then I was worried, now I am angry. I am angry about the Climate Doomsday hype that politicians and scientists engage in. I am angry at Al Gore, I am angry at the Bulletin of the Atomic Scientists for resetting its Doomsday clock, I am angry at Lord Martin Rees for using the full weight of the Royal Society in support of the Doomsday hype, I am angry at Paul Crutzen for his speculations about yet another technological fix, I am angry at the staff of IPCC for their preoccupation with carbon dioxide emissions, and I am angry at Jim Hansen for his efforts to sell a Greenland Ice Sheet Meltdown Catastrophe. Speaking of Hansen, Dick Lindzen and I wrote a lighthearted April Fools’ Day parody of his concerns, which was published on Fred Singer’s SEPP website (search for Greenland Green Again) last year (view pdf). I could go on much longer, but I will keep my anger in check.

I am more than a little bit worried about IPCC’s preoccupation with CO2. The scientific rationale behind this choice is obvious. Sophisticated climate models have been running for twenty years now. It has become evident that these models cannot be made to agree on anything except a possible relation between greenhouse gases and a slight increase in globally averaged temperatures. The number of knobs that can be twiddled in the parameterization of the radiation budget is not all that large. Seemingly realistic results can be achieved without much intellectual effort. I agree with IPCC that there is a likely link between fossil fuel consumption and increased temperatures. But this is where the much proclaimed consensus ends. Just one example: the models do not include feedbacks between changing farming and forest harvesting practices and the atmospheric circulation. Partly for that reason, they cannot seem to agree on precipitation patterns. It so happens that precipitation is far more relevant to the world’s food production than a slight increase in temperature. I owe this insight to my good friend Denny (Dennis W.) Thomson at Penn State. Like me, he speaks from decades of experience. Denny is the oldest son of a world-renowned arctic lichenologist. He and his wife had the good fortune to grow up on farms in southwestern Wisconsin. Still closely bound to the earth and its delicate ecosystems, they live on a 600-acre farm in Halfmoon Valley on the southeastern flank of Bald Eagle Ridge. A physicist/meteorologist, and former head of Penn State’s meteorology department, Denny has witnessed climate change in progress for most of his life. At the same time he is deeply concerned about the veracity of “physics-challenged” climate models.

Why is it so difficult to make precipitation forecasts fifty years into the future? Most precipitation in the middle latitudes is associated with low-pressure systems, which move along storm tracks carved out by the jet stream. The ever-shifting meanders in the jet stream occur at the edge of the slab of cold air over the poles. The specialists call this slab the Polar Vortex, and have christened the meandering behavior of the jet stream in the Northern hemisphere the Arctic Oscillation. Thirty years ago I worked with Mike (John M.) Wallace and his PhD student N.C. Lau at the University of Washington in Seattle on problems concerning eddy-flux maintenance in the North Atlantic storm track. It is evident to all turbulence specialists that the dynamics of very slowly evolving states is different from the dynamics of instantaneous states. So the moment one asks what keeps the jet stream going, one encounters the kind of problem that is at the core of all turbulence research. But the mainstream of dynamic meteorology refuses to study the slow evolution of the general circulation. It has become so easy to run General Circulation Models on supercomputers that most atmospheric scientists shy away from matters like a thorough study of the interaction between the Polar Vortex and the Arctic Oscillation. Mike Wallace mailed me a year ago, saying that there is not a beginning of consensus on a theory of the Arctic Oscillation. This was one of the highlights in an advanced senior-citizens class on climate change I taught a year ago. It was announced as “A Storm in the Greenhouse”, referring primarily to the increasingly bitter debates of the past fifteen years.

How does this problem affect climate forecasts? If there is not even a rudimentary theory of the Polar Vortex, much less an established relation between rising greenhouse gas concentrations and systematic changes in the Arctic Oscillation, one cannot possibly make inferences about changes in precipitation patterns. We do not know, and for the time being cannot know anything about changing patterns of clouds, storms and rain. Holland’s national weather service KNMI circumvented this impasse last year by issuing climate change scenarios with and without changes in the position of the North Atlantic storm track. It did not occur to the KNMI spokesmen that they should have been forthright about their lack of knowledge. They should have said: we know nothing of possible changes in the storm track, so we cannot say anything about precipitation. But it is entirely consistent with the IPCC tradition to weasel around such issues. One of my contacts at KNMI recently explained to me that their choice was based on the increasing agreement between simulations run with different GCMs. I had to answer that the IPCC spirit of consensus apparently was invading their supercomputers as well. It is bad enough that computer simulations cannot be checked against observations until after the fact. In the absence of a robust stochastic-dynamic theory of the general circulation, one cannot even check climate simulations against fundamental insights.

Actually, the monopoly of GCMs in the climate research business is an interesting object of inquiry, and not just for sociological reasons. A GCM is a weather forecasting model in which the coefficients and parameterizations are tuned so as to obtain long-term results that have an air of realism. The model is then run for several tens of years. There are no penetrating studies of the way slight software mismatches might affect the average values of key output parameters fifty years from now. A forecasting model can make do with relatively crude parameterizations because the short-time evolution of the atmospheric circulation is primarily governed by its internal dynamics. Sloppy representations of boundary conditions, clouds, convection, evaporation and condensation do not mess weather forecasts up all that fast. But the long-term evolution of the general circulation is to a large extent determined by boundary conditions. This realization struck me with some force when I discovered last year that a simple algorithm for inversion rise above the daytime boundary layer I conceived in 1973 is still in wide use today. How can one be sure that an ancient forecasting algorithm is capable of performing the task assigned to it in climate models? At times it seems that no one in this business has learned about Karl Popper’s falsifiability demand. This is why I cringe at WCRP documents promoting Forecasting at All Time Scales. The obvious purpose of such propaganda is to defend the monopoly position that GCMs have enjoyed for so long. It is strategy, not science. A whole generation of meteorologists is growing up with the idea that this is the only way to go. They were not exposed to Lorenz’ WMO monograph on the General Circulation; their faces turn blank when the terms Available Potential Energy and Eddy Kinetic Energy are used. Since they are offered no alternatives, they join those who claim that they need higher resolution and bigger computers. The job of having to think on one’s own feet is too hard to contemplate.

All of 2006 I have been corresponding with Tim Palmer, a leading scientist at the European Center for Medium-range Forecasts. The apparent focus of our discussion was the dynamics of vortex filaments around blocking highs. Palmer intuited that thin sheets of positive relative vorticity around a negative-vorticity core may serve to prolong the life of a high-pressure system. I felt this was an interesting hypothesis. For many years I have ridiculed the phraseology in which blocking highs are said to divert storms coming their way. More than once I have explained to a reporter that it would be equally appropriate to state that diverging storms sustain a blocking high.

Then came the rub. Thin vortex filaments can be simulated on a supercomputer only if the horizontal resolution is much improved. With the current mesh size of the ECMWF model at 40 kilometers if I am not mistaken, simulation of the vorticity microstructure in the troposphere would require a 10,000-fold increase in computer power. So this is how the propaganda for petaflop computing emanating from WCRP comes about, I thought. One hundred computers of the generation following the next would indeed generate the desired increase. This in turn would require a facility on the scale of CERN, ITER, or the preposterous Superconducting Supercollider.
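The arithmetic behind such estimates is easy to sketch. Assuming, as a rough rule of thumb (my gloss, not Tennekes’ own calculation), that model cost grows with the third to fourth power of the horizontal refinement factor (two horizontal dimensions plus a CFL-limited time step, with further overhead from vertical resolution and the like), a tenfold refinement of a 40 km mesh implies a thousand- to ten-thousand-fold increase in computer power:

```python
# Back-of-the-envelope sketch of GCM resolution scaling.
# Assumption (not from the original text): cost ~ refinement**p,
# with p = 3 (two horizontal dimensions + CFL time step) to
# p = 4 (additional overhead, e.g. vertical levels or I/O).

def cost_factor(old_mesh_km, new_mesh_km, p):
    """Relative compute cost of refining the horizontal mesh."""
    refinement = old_mesh_km / new_mesh_km
    return refinement ** p

for p in (3, 4):
    factor = cost_factor(40.0, 4.0, p)   # 40 km -> 4 km mesh
    print(f"exponent {p}: ~{factor:,.0f}x more compute")
```

With the exponent of 4, a 40 km to 4 km refinement gives exactly the 10,000-fold figure quoted above.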

Is this what John Houghton, Bert Bolin, Martin Rees, the IPCC staff and the like are aiming for? I parted company with these power brokers many years ago, so I cannot begin to imagine what they are up to this time. Palmer has convinced me he is not their puppet, fortunately. We continued our correspondence. “So you’re really lobbying for a massive computer facility?” I wrote. “You participate in the same song and dance that has annoyed me for so long.” In my years as Director of Research at KNMI, the scientists around me honestly felt that my only job was to promote the early purchase of the next supercomputer. They were eager to collude behind my back with the hardware crowd at KNMI and salesmen from computer manufacturers. This often resulted in seemingly attractive discounts being offered around October, just when the salesmen had heard through the grapevine that a budget surplus would soon be reported to the Management Team.

I might have been sympathetic to Palmer’s ideas if he had argued in favor of a much better representation of ocean eddies, or the atmospheric boundary layer, or the climatic effects of changing farming practices. The dynamics of storm tracks and blocking highs is only one of many interactions demanding more research. It is certainly not appropriate to focus much climate computer power on just this one issue. That can be done better on specialized computers. In the case of blocking highs, a forecasting computer would fit best, because it is dedicated to the internal dynamics of the atmosphere. In my mind, a sense of balance was missing in Palmer’s appeal.

I want to lobby for decency, modesty, honesty, integrity and balance in climate research. I hope and pray we lose our obsession with climate forecasting. Climate simulations are best seen as sensitivity experiments, not as tools for policy makers. I said it in 1990 and I am saying it now: the constraints imposed by the planetary ecosystem require continuous adjustment and permanent adaptation. Predictive skills are of secondary importance. We should stop our support for the preoccupation with greenhouse gases our politicians indulge in. Global energy policy is their business, not ours. We should not allow politicians to use fake doomsday projections as a cover-up for their real intentions. If IPCC does not come to its senses, I’ll be happy to let it stew in its own juices. There is plenty of other work to do.

In 1976, Steve (Stephen H.) Schneider published a book entitled The Genesis Strategy. It made quite an impact on me at the time, primarily because Schneider did not promote technological fixes, but a global strategy of what is now called Adaptation, an idea reluctantly and belatedly embraced by IPCC. Those were the days of Nuclear Winter, weather modification, Project Stormfury, stratospheric ozone destruction, and the sick idea of seeding all Arctic ice with soot to prevent the next ice age. In the preface to his book, Schneider quotes Harvey Brooks, then Harvard dean of engineering:

“Scientists can no longer afford to be naive about the political effects of publicly stated scientific opinions. If the effect of their scientific views is politically potent, they have an obligation to declare their political and value assumptions, and to try to be honest with themselves, their colleagues and their audience about the degree to which their assumptions have affected their selection and interpretation of scientific evidence”.

I rest my case.

Leave a comment

Filed under Guest Weblogs

Correction To Misstatement On My Views On Climate In A USA Today Article Entitled “Fossil fuels are to blame, world scientists conclude”

In the January 31, 2007 issue of USA Today, it is written,

“Climate scientist Roger Pielke Sr. of the University of Colorado at Boulder has suggested that development and deforestation, rather than the burning of fossil fuels, are the main drivers behind global warming. He says on his climate-science website that the IPCC should recognize the importance of these other factors.”

This is an inaccurate statement of my views. The statement should read

“Climate scientist Roger Pielke Sr. of the University of Colorado at Boulder has CONCLUDED THAT THERE ARE SEVERAL CLIMATE FORCINGS IN ADDITION TO the burning of fossil fuels, THAT are the main drivers behind global warming. THESE INCLUDE HUMAN BURNING OF FORESTS AND GRASSLANDS. He says on his climate-science website that the IPCC should recognize the importance of these other factors.”

These other climate forcings are summarized in the 2005 National Research Council Report “Radiative Forcing of Climate Change: Expanding the Concept and Addressing Uncertainties”.

The media, unfortunately, persists in miscommunicating to policymakers and the public, and ignores the correct understanding of climate change as articulated in the 2005 NRC Report. They even misquote my views when I refer them to my weblog conclusions, where I explicitly state,

“Humans are significantly altering the global climate, but in a variety of diverse ways beyond the radiative effect of carbon dioxide. The IPCC assessments have been too conservative in recognizing the importance of these human climate forcings as they alter regional and global climate.”

“Global warming is not equivalent to climate change. Significant, societally important climate change, due to both natural- and human- climate forcings, can occur without any global warming or cooling.”

and

“Attempts to significantly influence regional and local-scale climate based on controlling CO2 emissions alone is an inadequate policy for this purpose.”

Leave a comment

Filed under Climate Science Reporting

New Website Address For Climate Science

Climate Science has a new web address: http://www.climatesci.org/. Please update your bookmarks for this site.

Leave a comment

Filed under Uncategorized

A Climatologically Significant Aerosol Longwave Indirect Effect In The Arctic

A paper early in 2006 in Nature by Dan Lubin and Andrew M. Vogelmann entitled

“A climatologically significant aerosol longwave indirect effect in the Arctic”, Nature, Vol. 439, 26 January 2006, pages 453-456, doi:10.1038/nature04449,

presents further evidence on the complexity of climate forcings in the Arctic (thank you to Marcia Wyatt for alerting me to this paper!).

The abstract reads,

“The warming of Arctic climate and decreases in sea ice thickness and extent observed over recent decades are believed to result from increased direct greenhouse gas forcing, changes in atmospheric dynamics having anthropogenic origin, and important positive reinforcements including ice–albedo and cloud–radiation feedbacks. The importance of cloud–radiation interactions is being investigated through advanced instrumentation deployed in the high Arctic since 1997. These studies have established that clouds, via the dominance of longwave radiation, exert a net warming on the Arctic climate system throughout most of the year, except briefly during the summer. The Arctic region also experiences significant periodic influxes of anthropogenic aerosols, which originate from the industrial regions in lower latitudes. Here we use multisensor radiometric data to show that enhanced aerosol concentrations alter the microphysical properties of Arctic clouds, in a process known as the ‘first indirect’ effect. Under frequently occurring cloud types we find that this leads to an increase of an average 3.4 watts per square metre in the surface longwave fluxes. This is comparable to a warming effect from established greenhouse gases and implies that the observed longwave enhancement is climatologically significant.”

The paper states at the end,

“In conclusion, we provide observational evidence that the first aerosol indirect effect operates in low, optically thin, single-layered Arctic clouds with a concomitant increase in the downwelling longwave flux. The cloud amount during the Arctic spring generally exceeds 80%, which implies that the observed longwave enhancement has climatological significance.”

The conclusion that this aerosol cloud effect “…is comparable to a warming effect from established greenhouse gases and implies that the observed longwave enhancement is climatologically significant”,

along with the finding from

Hansen, J., and L. Nazarenko 2004. Soot climate forcing via snow and ice albedos. Proc. Natl. Acad. Sci. 101, 423-428, doi:10.1073/pnas.2237157100,

that the albedo effect of soot on snow and ice can result in a radiative forcing in the Northern Hemisphere of +0.3 W per meter squared, provides yet another example of why a focus on the radiative forcing of CO2 alone is inaccurately narrow.

Whether this peer-reviewed recognition of the diversity of human climate forcings in the Arctic appears in the IPCC report being released this Friday will be one benchmark to assess whether the IPCC assessment is an honest, balanced presentation of climate science, or, as it has been in the past, a document to be used for political advocacy.

Leave a comment

Filed under Climate Change Forcings & Feedbacks

Human Impacts on Weather and Climate Available at Discount to Blog Readers!

A special 20% discount is available for weblog visitors

Soon to be published by Cambridge University Press, the Second Edition of…

Human Impacts on Weather and Climate
Second Edition
William R. Cotton
Colorado State University
and Roger A. Pielke, Sr.
University of Colorado, Boulder

This new edition of Human Impacts on Weather and Climate examines the scientific and political debates surrounding anthropogenic impacts on the Earth’s climate and presents the most recent theories, data and modeling studies. The book discusses the concepts behind deliberate human attempts to modify the weather through cloud seeding, as well as inadvertent modification of weather and climate on the regional scale. The natural variability of weather and climate greatly complicates our ability to determine a clear cause-and-effect relationship to human activity. The authors describe the basic theories and critique them in simple and accessible terms. This fully revised edition will be a valuable resource for undergraduate and graduate courses in atmospheric and environmental science, and will also appeal to policymakers and general readers interested in how humans are affecting the global climate.

February 2007 247 x 174 mm 320pp 64 line diagrams 20 halftones
20 colour plates 104 figures

Hardback £65.00 978-0-521-84086-6
Paperback £29.99 978-0-521-60056-9

For more information or to order your inspection copy visit: www.cambridge.org/9780521600569

Leave a comment

Filed under Books

Comments On The Relative Roles of Global Average And Regional Climate Forcings

Eli Rabett has presented two well-posed comments on the relative roles of global average and regional climate forcings, which I am also presenting as a weblog since his contribution helps focus a very important climate change issue:

His comments (posted on January 28, 2007) are:

“Again, which is the forcing and which the response, which is the major effect locally, which the major effect globally. If you increase grassland in one area, and decrease it in another the effects balance globally, but not locally, since land use is inherently local. On the other hand greenhouse gas emissions rapidly diffuse throughout the atmosphere.”

“Bryan, while climate science is about much more that greenhouse gas forcing, your and Prof. Pielke’s insisting on ignoring the elephant in the room is curious. As a physicist, my inclination to spherical elephants is built in. First you look at the largest effects.”

Here is my reply

Eli – Thank you for your several constructive contributions to the weblog.

Your comments succinctly place the relative roles of the different human climate forcings in perspective. We differ on this issue. You are focusing on a global average, such that if, for example, large positive and negative excursions in multi-decadal regional tropospheric temperature trends sum to near zero, you regard this as inherently a local (regional) issue.

However, we are finding from theory and from models that such regional tropospheric temperature anomalies result in significant changes in atmospheric circulation patterns that can substantially alter precipitation, temperature and other aspects of the climate system at large distances (i.e., through teleconnections) from where a land-use/land-cover and/or aerosol emission change occurs. This is a global-scale climate change, which has been presented as a finding in the 2005 National Research Council Report “Radiative Forcing of Climate Change: Expanding the Concept and Addressing Uncertainties”.

It is these changes that would have a much greater impact on important social and environmental resources than a global average multi-decadal trend in tropospheric temperatures. These circulation changes are the “elephant” in the room, which has been inadequately discussed in past climate assessments.

The radiative effect of the more spatially homogeneous forcing of CO2 is still important, but it is “A” human climate forcing, not “THE” dominant human climate forcing on the local, regional, and global scales.

Leave a comment

Filed under Climate Change Forcings & Feedbacks, Climate Science Misconceptions

Effect of Deliberate Landscape Management

Another very important paper in the special issue of Global and Planetary Change is

Deepak K. Ray, Ronald M. Welch, Robert O. Lawton and Udaysankar S. Nair, 2006: Dry season clouds and rainfall in northern Central America: Implications for the Mesoamerican Biological Corridor. Global and Planetary Change Volume 54, Issues 1-2 , November 2006, Pages 150-162

The abstract reads,

“The proposed Mesoamerican Biological Corridor (MBC) is an ambitious effort to stem the erosion of biodiversity in one of the world’s biologically richest regions. The intent is to connect large existing parks and reserves with new protected areas by means of an extensive network of biological corridors within Mesoamerica/Central America to create an environment which provides better prospects for the long-term survival of native species while also addressing the region’s socioeconomic needs. While the forest types in northern Central America generally receive some dry season rainfall, in the proposed protected regions, however, it is unclear whether current rainfall has been altered by regional land-use change.

Based upon climatological rainfall records at 266 stations in Guatemala and adjacent areas, dry season rainfall in March is markedly lower in deforested areas than in forested areas of the same life zone for each of the widespread life zones. In general, dry season deforested habitats have higher daytime temperatures, are less cloudy, have lower estimated soil moisture and lower values of normalized difference vegetation index (NDVI) than do forested habitats in the same life zone. The result is hotter and drier air over deforested regions, with lower values of cloud formation and precipitation. Rainfall is predicted from the correlation of raingauge measurements and observed cloud cover; moreover, March rainfall deficiencies > 25 mm are found for several Holdridge life zones. The data suggest that deforestation is locally intensifying the dry season, increasing the risk of fire, especially for the long corridor connecting regions. In addition, forest regeneration in some parts of the MBC may not result in second-growth forest that is characteristic of that life zone but rather in forest regeneration more typical of drier conditions. The extent to which this would influence the conservation utility of any given corridor depends upon the ecological requirements of the organisms concerned.”

In their conclusion, they write,

“….dry season rainfall in March is markedly lower in deforested areas than in forested areas of the same life zone for each of the widespread life zones in Central America. In general, deforested habitats have higher daytime temperatures, are less cloudy, have lower estimated soil moisture and lower values of NDVI than do forested habitats in the same life zone.”

This paper illustrates why there can be unintended climate consequences associated with land use management undertaken for other purposes. A climate assessment should clearly be a component of planning for such landscape alteration.

Leave a comment

Filed under Climate Change Forcings & Feedbacks

A New Paper On The Statistics Of Record-breaking Temperatures

There is a recent paper on the statistics of record-breaking temperatures (thanks to Peter Schuck for alerting Climate Science to this paper!). The paper is

S. Redner and M.R. Petersen, 2006: “Role of global warming on the statistics of record-breaking temperatures”. Physical Review E, 74, 061114.

The abstract reads,

“We theoretically study the statistics of record-breaking daily temperatures and validate these predictions using both Monte Carlo simulations and 126 years of available data from the city of Philadelphia. Using extreme statistics, we derive the number and the magnitude of record temperature events, based on the observed Gaussian daily temperature distribution in Philadelphia, as a function of the number of years of observation. We then consider the case of global warming, where the mean temperature systematically increases with time. Over the 126-year time range of observations, we argue that the current warming rate is insufficient to measurably influence the frequency of record temperature events, a conclusion that is supported by numerical simulations and by the Philadelphia data. We also study the role of correlations between temperatures on successive days and find that they do not affect the frequency or magnitude of record temperature events.”

An excerpt reads,

“Our primary result is that we cannot yet distinguish between the effects of random fluctuations and long-term systematic trends on the frequency of record-breaking temperatures with 126 years of data. For example, in the 100th year of observation, there should be 365/100=3.65 record-high temperature events in a stationary climate, while our simulations give 4.74 such events in a climate that is warming at a rate of 0.6 °C per 100 years. However, the variation from year to year in the frequency of record events after 100 years is larger than the difference of 4.74–3.65, which should be expected because of global warming …..After 200 years, this random variation in the frequency of record events is still larger than the effect of global warming. On the other hand, global warming already does affect the frequency of extreme temperature events that are defined by exceeding a fixed threshold.”

This is an interesting study that appears in a journal not usually read by climate scientists. One comment, of course, is that the term “global warming” is not correct here. The statistics really concern a long-term change in the temperatures at one location; a global average temperature change is not the appropriate climate metric to use.
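The stationary-climate baseline quoted above follows from a classic result: for independent, identically distributed values, the probability that the n-th observation sets a new record is 1/n, so roughly 365/100 = 3.65 daily record highs are expected in year 100. A minimal Monte Carlo sketch of that baseline (my own illustration, not the authors’ code):

```python
import random

def records_in_year_n(n, days=365, trials=50, seed=42):
    """Monte Carlo count of daily record-high events in year n of a
    stationary climate, modeled as i.i.d. Gaussian daily temperatures.
    For i.i.d. values the chance that year n sets a new record is 1/n,
    so the theoretical expectation is days / n."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        for _ in range(days):
            # draw n years of this calendar day's temperature
            temps = [rng.gauss(0.0, 1.0) for _ in range(n)]
            if temps[-1] == max(temps):  # year n holds this day's record
                total += 1
    return total / trials

print(f"simulated: {records_in_year_n(100):.2f}  theory: {365/100:.2f}")
```

Simulating the warming case would simply add a small linear trend to the mean of each draw; as the authors report, at this record length the resulting excess of records is smaller than the year-to-year scatter.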

Their study raises the issue as to what is an extreme temperature (an “abnormal” value), and what is above average but still “normal” (i.e., falls within a certain number of standard deviations of the mean). We completed such an analysis in 1987;

Pielke, R.A. and N. Waage, 1987: A definition of normal weather. Natl. Wea. Dig., 12, 20-22,

with the abstract

“This paper clarifies the distinction between abnormal weather, and above and below average weather, using standard statistical analyses. Abnormal maximum and minimum temperatures are defined as requiring at least two standard deviations from the mean; otherwise even though they could be above or below average, the weather is still “normal.” July and January maximum and minimum temperatures for Denver, New York, Los Angeles, Miami, and Bismarck are presented as examples of this analysis.”

Such a distinction between what is above average and what is abnormal should be part of any assessment of multi-decadal temperature trends.
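The two-standard-deviation criterion from the Pielke and Waage abstract lends itself to a simple calculation. Here is an illustrative sketch (not the paper’s own analysis; the station record below is invented):

```python
from statistics import mean, stdev

def classify(obs, history):
    """Classify an observation against a station's historical record:
    'abnormal' if at least two standard deviations from the mean,
    otherwise 'normal' even when it is above or below average."""
    mu, sigma = mean(history), stdev(history)
    z = (obs - mu) / sigma
    if z >= 2.0:
        return "abnormally warm"
    if z <= -2.0:
        return "abnormally cold"
    return "above average but normal" if z > 0 else "below average but normal"

# Hypothetical July maximum temperatures (deg F) for one station:
july_maxima = [88, 90, 85, 92, 87, 89, 91, 86, 90, 88]  # mean 88.6, sd ~2.2
print(classify(94, july_maxima))   # -> abnormally warm  (z ~ 2.4)
print(classify(90, july_maxima))   # -> above average but normal
```

The point of the distinction is visible at once: 90 °F is above average yet entirely normal, while 94 °F crosses the two-sigma threshold.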

Leave a comment

Filed under Climate Change Metrics

Is There Any Location On Earth Without Air Pollution? An Important New Article On This Subject By Meinrat O. Andreae

There is an interesting new paper in Science, January 5, 2007, by Meinrat O. Andreae entitled “Aerosols Before Pollution” [subscription required]. The summary of the article states that

“No unpolluted regions remain in today’s atmosphere…..”

The article includes the text,

“Atmospheric aerosols play a large role in human-induced climate change because of their effects on solar radiation transfer and cloud processes. To assess the impact of human perturbations on the atmosphere’s aerosol content, we need to know the prehuman aerosol burden. This is especially important for understanding the cloud-mediated effects of aerosols on climate, because cloud properties respond to aerosols in a nonlinear way and are most sensitive to the addition of particles when the background concentration is very low…”

and

“….prehuman aerosol levels may have been very similar over continents and oceans, ranging from a few tens per cm3 in biogenically inactive regions or seasons to a few hundreds per cm3 under biologically active conditions. This conclusion renders invalid the conventional classification of air masses into maritime and continental according to their aerosol content. It also implies that, before the onset of human-induced pollution, cloud microphysical properties over the continents resembled those over the oceans, whereas nowadays, cloud processes over most of the continents are shaped by the effects of human perturbation.”

The importance of aerosols within the climate system is supported by a wide range of studies, including

Matsui, T., H. Masunaga, R.A. Pielke Sr., and W-K. Tao, 2004: Impact of aerosols and atmospheric thermodynamics on cloud properties within the climate system. Geophys. Res. Letts., 31, No. 6, L06109, doi:10.1029/2003GL019287.

Matsui, T., H. Masunaga, S.M. Kreidenweis, R.A. Pielke Sr., W.-K. Tao, M. Chin, and Y.J. Kaufman, 2006: Satellite-based assessment of marine low cloud variability associated with aerosol, atmospheric stability, and the diurnal cycle. J. Geophys. Res., 111, D17204, doi:10.1029/2005JD006097.

The conclusion that the composition of the global atmosphere is significantly altered by the human input of aerosols from industrial and vehicular activity, biomass burning, and landscape degradation should be a wake-up call to policymakers who have concluded that CO2 is the dominant human climate forcing and that, if we can just control its atmospheric concentrations, we can effectively “fight climate change”. The Andreae article illustrates the narrowness of such a CO2-centric view.

Leave a comment

Filed under Climate Change Forcings & Feedbacks

The Publication of the Gu et al JGR Paper On Two Important Largely Neglected Climate Processes

The very important new research paper on vegetation dynamics that was introduced on Climate Science on September 28, 2006 has now appeared:

Gu L., et al. (2007), Influences of biomass heat and biochemical energy storages on the land surface fluxes and radiative temperature, J. Geophys. Res., 112, D02107, doi:10.1029/2006JD007425.

As was discussed in an earlier Climate Science weblog, and as highlighted at the beginning of their abstract,

“The interest of this study was to develop an initial assessment on the potential importance of biomass heat and biochemical energy storages for land-atmosphere interactions, an issue that has been largely neglected so far.”

and at the end of their abstract,

“From these simulation results, we concluded that biomass heat and biochemical energy storages are an integral and substantial part of the surface energy budget and play a role in modulating land surface temperatures and must be considered in studies of land-atmosphere interactions and climate modeling.”

This study demonstrates that models without these two processes will have significant errors in their representation of the surface heat and moisture fluxes, and of the radiative temperatures, that are measured when vegetation is present. As they write,

“…..the radiative forcing of greenhouse gases (CO2, CH4, N2O, and halocarbons together) is about 2.43 W m−2 above the preindustrial level [IPCC, 2001]; this value could be smaller in the current atmosphere since some of the earlier imbalance presumably has already warmed the climate system. Thus at least at regional scales, biochemical energy storage is on the same order of magnitude as the radiative forcing of atmospheric greenhouse gases. Therefore, for long-term climate system modeling which includes vegetation processes, biochemical energy storage could be important, particularly at regional scales.”

We have shown in an observational study that incorporating the amount of transpiring vegetation significantly affects maximum and minimum temperatures:

Hanamean, J.R. Jr., R.A. Pielke Sr., C.L. Castro, D.S. Ojima, B.C. Reed, and Z. Gao, 2003: Vegetation impacts on maximum and minimum temperatures in northeast Colorado. Meteorological Applications, 10, 203-215.

The Gu et al study presents evidence of two climate processes that would contribute to such differences.

Their work also means that assessments of multi-decadal surface air temperature trends that do NOT factor in changes in vegetation at an observing site over the period of record must have errors in any attribution of temperature trends. They write on this subject,

“The diurnal temperature range (DTR) on land has been decreasing since the middle of the 20th century [Easterling et al., 1997]. The cause of this trend is not completely understood even though there have been many studies on this topic [e.g., Hansen et al., 1995; Dai et al., 1999]. Collatz et al. [2000] suggested that changes in vegetation cover may have contributed to this trend through controls on latent heat flux and atmospheric stabilities and feedbacks on atmospheric processes. We suggest that changes in biomass heat and biochemical energy storages may be another mechanism for vegetation to influence DTR. Biomass heat and biochemical energy storages act to reduce daytime surface temperature and increase nighttime temperature, thus leading to decreased DTR. Globally, vegetation productivity has been increasing [Myneni et al., 1997; Boisvenue and Running, 2006] and therefore should contribute to dampening DTR. We emphasize that our estimate of influences of biomass heat and biochemical energy storages on DTR (0.5°C) is conservative because we did not consider the feedback from changes in biomass temperature on the atmospheric forcing temperature. If this feedback is considered, the effect of biomass heat and biochemical energy storages on DTR might be even larger.”

They also conclude that,

“Finally, biomass distribution is spatially heterogeneous, which means that biomass heat and biochemical energy storages must be also spatially heterogeneous. This heterogeneity is in essence a form of gradient radiative forcing [Matsui and Pielke, 2006]. In conjunction with spatial variations in evapotranspiration, albedo, and surface roughness associated with vegetation cover, it can influence horizontal pressure gradients and mesoscale atmospheric circulations and therefore regional climates. More studies are needed in this area.”

These two climate processes are yet additional sources of uncertainty in the assessment of climate system heat changes (global warming or cooling) using land surface air temperatures as the selected metric.

Leave a comment

Filed under Climate Change Forcings & Feedbacks, Climate Change Metrics