Monthly Archives: August 2009

Guest Weblog By Syun Akasofu Of International Arctic Research Center At The University of Alaska Fairbanks

Dr. Syun Akasofu has provided us with a guest weblog based on a translation from Japanese of an article he wrote. I am pleased to use my weblog to communicate viewpoints on climate science issues from credentialed climate scientists.

GUEST WEBLOG By Syun Akasofu

Recommendation to postpone the 2009 Copenhagen Conference:

The so-called “global warming” issue viewed in the context of politics and the economy of the world.

Syun Akasofu
International Arctic Research Center

1. The US must have decided to drop the making of cars as its primary manufacturing activity and cede it to Japan. The Obama administration and the US public believe that enough has been done for the ailing car makers, and hope that they will be able to survive by making good electric (not fossil fuel powered) cars.
2. What does this mean? In the history of manufacturing, there has been a trend in which advanced countries lose their primary manufacturing capabilities one after another to developing countries. The textile industry in the UK was taken over by the US, then by Japan, then by China and others. The iron manufacturing industry in the UK was taken over by the US, then by Japan, and then China and other ‘catching-up’ countries. The car manufacturing industry in the UK was taken over by the US (mainly by GM), then Japan (Toyota and Honda), and some day perhaps China. This historical trend cannot be stopped. (The US tried to take over the world’s financing activities from the UK, which had lost interest in manufacturing altogether, but failed miserably recently and caused the current economic recession.)
3. Then, the question is what kind of primary manufacturing industry is the US going to choose to work on in the future? It is likely that the Obama administration has chosen the construction of atomic power plants as the next great US manufacturing effort.
4. The reasons for choosing atomic power plants are obvious. First of all, the US has to secure future electric power because electricity is needed for everything, including future electric cars. The US wants to get away from its reliance on oil (and the unstable oil-producing countries), which will undoubtedly either diminish or become very expensive within the next 50 years. Reducing oil imports will reduce the great deficit. It should be noted that the primary purpose of changing from carbon power to atomic power is not necessarily to reduce the release of CO2 and global warming. It is an excuse. This will become clearer as we look into the related issues.
5. How is global warming related to atomic power? In order to understand this question, it is important to learn how the global warming issue was born. In the 1980s, Margaret Thatcher, then the British Prime Minister, came to the conclusion that the UK needed atomic energy for its future, but she faced strong objections from her people. It was also about the time when the first crude computer simulation of the greenhouse effect of CO2 was made, and it predicted a great disaster and catastrophe due to the expected temperature rise, unless the release of CO2 could be greatly reduced.
     Margaret Thatcher must have taken this result into account in promoting atomic power, asking her people to choose either atomic power or a global disaster/catastrophe that would require a great sacrifice in their standard of living to avoid. Without her strong endorsement, the IPCC would not have been established. She also established the Hadley Climate Research Center for further study of the effects of CO2. Until that time, climatology was a rather quiet science (not something dealt with in newspaper headlines), but Thatcher put a great spotlight on it for her political purposes. Therefore, although the CO2 hypothesis is appropriate as a hypothesis in science, the IPCC was related to atomic power from its birth and its destiny was to predict a great disaster/catastrophe. This, in spite of the criticism that the IPCC is predicting the end of the world even though we are not doing very well at predicting even the next day’s weather or the severity of the next winter. Science was used for political purposes. At the same time, the world news media were looking for something exciting to report on because the Cold War was ending. Global warming, and reporting on imaginary disasters/catastrophes caused by CO2, has become one of their major headline topics.
6. How is the history of global warming and the IPCC related to the Obama administration’s interest in atomic power plants, making the construction of atomic power plants the new primary manufacturing industry of the US? This is because if they proposed atomic power plants by singling the issue out, they would face fierce opposition from the people. Since the Three Mile Island plant accident, no atomic plant has been built on US soil. Therefore, the Obama administration, like Thatcher, will ask the people to choose between atomic power plants (maintaining or improving their present standard of living) or a great disaster/catastrophe caused by CO2 (actually, drastically reducing the present living standard, including not being able to drive (electric) cars).
7. For these reasons, from the perspective of the Obama administration, the greater the disaster/catastrophe predicted due to CO2, the better it is for the purpose of promoting atomic energy. As a first step toward the goal of switching to atomic power, the Obama administration states that atomic energy is “green” (meaning no air pollution), that atomic energy is “non-carbon”, and even that CO2 is “unhealthy”. Note also that Obama uses the words “climate change”, not “global warming.”
    The physics of CO2 absorbing and re-emitting infrared radiation is clear. On the other hand, geophysicists must determine how much heating CO2 will cause when a given amount of it is released into the complex earth system. Thus, in this situation it is meaningless and useless for the real science of global warming/climate to face off against the political decisions and propaganda for the planning of atomic power plants.
8. One problem in this particular discipline of science is that scientists who base their research on computer simulations have become too arrogant, saying that they can predict the temperature in 2100, although too much is still unknown about the earth system. Ignoring natural causes of climate change and even unknown aspects of cloud physics, they rely on computer work in predicting the temperature rise in 2100. However, a computer is like a robot. It can perform only what it is instructed to do by the programs produced by the human brain. If a computer program is incorrect or inaccurate, the output will also be incorrect or inaccurate. In science, incorrect programs or hypotheses (produced by one scientist or a group of scientists) are criticized by other scientists and can thus be improved. That is the way science should progress. However, the IPCC regards those who criticize it as “skeptics” or “deniers”, etc., and has brought this newborn and immature science to the international stage. They stated in 2007 that scientists have done all they can, that the science is settled, and that the rest of the task should be in the hands of policy makers. Such a statement is very irresponsible.
9. However, even if the US decides that its next primary manufacturing industry is the construction of atomic power plants, there will be fierce competition between the US group (US, Japan, Russia) and the French group, which has more experience than the US, at least in the safety of operation. (A further problem is that Toshiba owns much of the Westinghouse stock.) There will eventually be uranium wars; energy-security wars will continue forever.
10. The Obama administration is promoting wind power and solar power. However, there is no way for them to supply more than 10% of US power needs (Obama says that they should try for 20%, but has he estimated the cost involved?). Their share is only about 2.5% at present. In any case, 80-90% of future electric power has to be found elsewhere.
11. The US has to rely on coal power plants (at present 40%), until a large number of atomic power plants can be built, perhaps about 15-20 years from now. Thus, there is no way for the US to agree on any international agreement on a near-future CO2 reduction at the present time. The US has been saying that unless China and India agree to a significant reduction of the release of CO2, any agreement is useless. On the other hand, the US has made China its factory, and furthermore the US owes a great debt to China. Unless China can remain healthy, politically and financially, and with sufficient energy, the US will have a serious problem. Therefore, the US cannot force China to reduce its CO2 emission. On the other hand, in spite of the fact that China is now “richer” than the US, it continues to claim that it is still one of the developing countries and that the developed countries should reduce their release of CO2 first. The US and China must surely understand each other, so that the above statements are only rhetorical. The IPCC chairman has stated recently that India will not agree to a “cap”. Further, global capitalism is such that the rest of the world relies on the US buying power (even if they are using credit cards), so that the US economy has to be healthy. EU officials have had a large number of conferences on the reduction of CO2, but they have not reached any conclusion they can agree on.
12. For the above reasons, is it useful to have any more conferences on global warming? How many international conferences with the heads of nations have been held in the past? There has been no substantive agreement on the amount of release of CO2 by individual countries, in spite of the fact that protecting the earth from the CO2-based disaster/catastrophe should be the most solemn duty of the heads of nations (although environmental destruction caused by global capitalism is conveniently forgotten). So far, all the past conferences have ended with a “fight” between rich nations and poor nations, with the latter trying to snatch money from the former using the so-called “cap and trade” as an excuse, and the former trying to protect themselves from such an assault, in spite of the fact that the “cap and trade” negotiations have no effect on reducing the overall release of CO2. It is suspected that the heads of nations do not really believe in the global disaster/catastrophe scenario caused by CO2. However, they have stated that they believe in the IPCC, so they cannot publicly say that they do not believe in the disaster scenario, because they and their countries would be called enemies of humanity, like George W. Bush.
13. It has been said that the only thing agreed on at the past conferences is the time and place for the next meeting. Such conferences are useless, although they are better than a world war. It is suggested that future meetings be postponed until the science of global warming advances further. It is not too late, contrary to what the proponents of global warming advocate, since none of the predicted disasters/catastrophes have occurred even though the release of CO2 began to increase rapidly in 1946. In the tropics and middle latitudes, there has been no discernible disaster/catastrophe so far. This is why the world media flock to the Arctic and report on erroneous global warming effects. None of the phenomena and changes they have reported are related even remotely to the CO2 effects. A good example is glacier calving at the terminus. Nevertheless, the world media report that the changes are caused by the CO2 effect.
14. In Japan, they are overjoyed by the statements of President Obama, saying that he is quite serious about “global warming” (actually, he says “climate change” instead of global warming). They interpret his statements as a sign that the US has finally become serious about the release of CO2, and that Obama is different from George W. Bush.
15. It is very unfortunate that science is being used for political purposes. Global warming is an imaginary product used for promoting the atomic power industry. When the truth eventually becomes apparent, the credibility of science will be seriously damaged, since so many scientists (not only climatologists, but many scientists in general) blindly trusted the IPCC and accused their opponents of being “skeptics” and “deniers”, etc.
16. Actually, judging by what has been described earlier, the IPCC is NOT a scientific research organization, although they skillfully mobilized 2500 “world experts in climatology”; they were used by the IPCC, some probably unwittingly. The IPCC skillfully created the impression of “consensus” among 2500 scientists. Their contribution, a large volume of publications, is conveniently used for the IPCC publication, “Summary for Policy Makers”, as an apparent back-up document, although the IPCC charter clearly states that they are not supposed to make recommendations to policy makers.
    The IPCC has tried to emphasize that global warming began unexpectedly and abruptly after 1900 because of the enhanced release of CO2. However, global warming began as early as 1800-1850, at the same rate as the present (0.5°C/100 years), namely about 100 years before the beginning of a rapid increase of CO2 release, as the earth began to recover from the Little Ice Age (1400-1800). The recovery from a cold period is warming. Actually, the warming until 2000 and the present cooling trend can reasonably be explained as natural changes. The IPCC has ignored natural changes as at least a partial cause of global warming, in order to promote their CO2 hypothesis.
17. The IPCC tried to ignore the fact that the earth experienced the Little Ice Age by using the so-called “hockey stick” figure, because it is not convenient for them that global warming began in 1800-1850, and not, as they claim, in the 20th century. The recovery from the Little Ice Age (a cold period) is warming. How many of the 2500 scientists trust the hockey stick figure? Perhaps only very few. Is this then the “consensus” of 2,500 experts in climatology? Unfortunately, the IPCC and the world media have presented this hypothesis as a fact.
18. There is another reason for proposing the postponement of future global warming conferences. After 1998 or 2000, the global temperature stopped rising and shows signs of cooling, in spite of the fact that the amount of CO2 in the atmosphere is still rapidly rising. This is an observed fact. Therefore, their temperature prediction for the year 2100 has already failed during the first 10 years. However, IPCC scientists have not recognized it, saying that it is just a temporary change; but 10 years of consistent change is considered climate change.
19. The world political leaders should be able to decide to postpone future conferences until scientists can find the causes of the present halting of global warming. Temporary or not, there must be unknown forces and causes that suppress the CO2 effect or even overcome it.
20. We should bring back the science of climate change to a basic science, avoiding interferences by policy makers and the world mass media. Only then can this particular science proceed in a scientifically healthy way. Only then can we discuss any global warming hypothesis as proponents and opponents (instead of as “believers” and “skeptics” or “deniers” in the religious sense), regardless of one side being in the majority or minority. In science, unlike in politics, a minority can be right.

Comments Off on Guest Weblog By Syun Akasofu Of International Arctic Research Center At The University of Alaska Fairbanks

Filed under Guest Weblogs

Remarkable Admission By James Annan On The Klotzbach Et Al (2009) Paper

There is a remarkable presentation of viewpoints on our Klotzbach et al (2009) paper by Michael Tobis (see) and James Annan (see). On his blog, Roger Pielke Jr. has already posted effectively in response to Michael Tobis’s admission of his lack of expertise on the topic of our paper (see). From the comment below that James Annan made on his weblog, it is clear he does not understand boundary layer physics either. He wrote

“For the record, I agree that land use cover change may impact on the climate. But unless Roger Pielke can find some way of arguing that this has changed the net average surface flux by the order of 1 Wm-2 at night, his whole theory is still a bust. And even if he did, it would not rescue his erroneous claims that the trends in temperature due to GHG or the other most significant forcings induce a significant change in the lapse rate in the boundary layer.”

His challenge to document a change in the net surface flux of 1 Wm-2 due to landscape change is a clear demonstration that he is poorly informed about boundary layer dynamics.

As one example of many, for urban areas relative to residential areas (which illustrates how the fluxes change as urbanization occurs), the paper

Soushi, K. and Y. Yamaguchi, 2007: Estimation of storage heat flux in an urban area using ASTER data. Remote Sensing of Environment, ISSN 0034-4257

summarizes their study in the abstract

“The urban heat island phenomenon occurs as a result of the mixed effects of anthropogenic heat discharge, increased use of artificial impervious surface materials, and decreased vegetation cover. These factors modify the heat balance at the land surface and eventually raise the atmospheric temperature. It is important to quantify the surface heat balance in order to estimate the contributions of these factors. The present authors propose the use of storage heat flux to represent the heat flux between the land surface and the inside of the canopy for the heat balance analysis based on satellite remote sensing data. Surface heat fluxes were estimated around the city of Nagoya, Japan using Terra ASTER data and meteorological data. Seasonal and day-night differences in heat balance were compared using ASTER data acquired in the daytime on July 10, 2000, and January 2, 2004 and in the nighttime on September 26, 2003. In the central business and commercial districts, the storage heat flux was higher than those in the surrounding residential areas. In particular, in winter, the storage heat flux in the central urban area was 240 to 290 W m-2, which was much larger than the storage heat fluxes in residential areas, which ranged from 180 to 220 W m-2. Moreover, the negative storage heat flux in the central urban area was greater at night. This tendency implies that the urban surface stores heat during the daytime and discharges it at night. Extremely large negative storage heat flux occurred primarily in the industrial areas for both daytime and nighttime as a result of the enormous energy consumption by factories.”

These values are much larger than the 1 Wm-2 threshold that James presented in his weblog.

On his statement rejecting the claim “that the trends in temperature due to GHG or the other most significant forcings induce a significant change in the lapse rate in the boundary layer”, as just one example (and there are many), the paper

Sun, J.-L., et al., 2003: Heat balance in the nocturnal boundary layer during CASES-99. J. Appl. Meteor., 42, 1649-1666

reported that a “radiative flux difference of more than 10 W m-2 over 46 m of height was observed under weak-wind and clear-sky conditions after hot days.”

The abstract reads

“A unique set of nocturnal longwave radiative and sensible heat flux divergences was obtained during the 1999 Cooperative Atmosphere-Surface Exchange Study (CASES-99). These divergences are based on upward and downward longwave radiation measurements at two levels and turbulent eddy correlation measurements at eight levels. In contrast to previous radiation divergence measurements obtained within 10 m above the ground, radiative flux divergence was measured within a deeper layer, between 2 and 48 m. Within the layer, the radiative flux divergence is, on average, comparable to or smaller than the sensible heat flux divergence. The horizontal and vertical temperature advection, derived as the residual in the heat balance using observed sensible heat and radiative fluxes, are found to be significant terms in the heat balance at night. The observations also indicate that the radiative flux divergence between 2 and 48 m was typically largest in the early evening. Its magnitude depends on how fast the ground cools and on how large the vertical temperature gradient is within the layer. A radiative flux difference of more than 10 W m-2 over 46 m of height was observed under weak-wind and clear-sky conditions after hot days. Wind speed variation can change not only the sensible heat transfer but also the surface longwave radiation because of variations of the area exposure of the warmer grass stems and soil surfaces versus the cooler grass blade tips, leading to fluctuations of the radiative flux divergence throughout the night.”

As the authors write, “Its magnitude depends on how fast the ground cools and on how large the vertical temperature gradient is within the layer…. Wind speed variation can change not only the sensible heat transfer but also the surface longwave radiation because of variations of the area exposure of the warmer grass stems and soil surfaces versus the cooler grass blade tips, leading to fluctuations of the radiative flux divergence throughout the night.”
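To put the quoted flux difference in perspective, a back-of-the-envelope conversion into a cooling rate is useful (the air density and specific heat below are standard near-surface values I have assumed, not numbers from the paper):

```python
# Convert the CASES-99 radiative flux difference quoted above into an
# implied layer-average cooling rate. rho and cp are standard near-surface
# values for air; they are assumptions, not numbers from Sun et al. (2003).
rho = 1.2          # air density, kg m^-3 (assumed)
cp = 1004.0        # specific heat of air at constant pressure, J kg^-1 K^-1 (assumed)

delta_flux = 10.0  # radiative flux difference, W m^-2 (Sun et al. 2003)
delta_z = 46.0     # depth of the layer, m (Sun et al. 2003)

# Heating rate from flux divergence: dT/dt = -(1 / (rho * cp)) * (dF/dz)
cooling_rate_per_second = delta_flux / delta_z / (rho * cp)  # K s^-1
print(f"implied cooling: {cooling_rate_per_second * 3600:.2f} K per hour")
```

A cooling on the order of 0.65 K per hour concentrated in the lowest tens of meters acts directly on the levels where minimum temperatures are measured, and is far from negligible against the 1 Wm-2 figure James proposed.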

All of us should be disappointed that both James Annan and Michael Tobis have elected not to engage in a proper scientific discussion of our findings. We look forward to a dialog with colleagues who do understand boundary layer dynamics.

Comments Off on Remarkable Admission By James Annan On The Klotzbach Et Al (2009) Paper

Filed under Climate Science Misconceptions

Our Paper “Impacts Of Land Use Land Cover On Temperature Trends Over The Continental United States: Assessment Using The North American Regional Reanalysis” By Fall Et Al 2009 Is Published

Our paper

Fall, S., D. Niyogi, A. Gluhovsky, R. A. Pielke Sr., E. Kalnay, and G. Rochon, 2009: Impacts of land use land cover on temperature trends over the continental United States: Assessment using the North American Regional Reanalysis. Int. J. Climatol., 10.1002/joc.1996

is now published.  We report in our paper

“As most of the warming trends that we identify can be explained on the basis of LULC changes, we suggest that in addition to considering the greenhouse gases-driven radiative forcings, multi-decadal and longer climate models simulations must further include LULC changes.”

The abstract reads

“We investigate the sensitivity of surface temperature trends to land use land cover change (LULC) over the conterminous United States (CONUS) using the observation minus reanalysis (OMR) approach. We estimated the OMR trends for the 1979-2003 period from the US Historical Climate Network (USHCN), and the NCEP-NCAR North American Regional Reanalysis (NARR). We used a new mean square differences (MSDs)-based assessment for the comparisons between temperature anomalies from observations and interpolated reanalysis data. Trends of monthly mean temperature anomalies show a strong agreement, especially between adjusted USHCN and NARR (r = 0.9 on average) and demonstrate that NARR captures the climate variability at different time scales. OMR trend results suggest that, unlike findings from studies based on the global reanalysis (NCEP/NCAR reanalysis), NARR often has a larger warming trend than adjusted observations (on average, 0.28 and 0.27 °C/decade respectively).

OMR trends were found to be sensitive to land cover types. We analysed decadal OMR trends as a function of land types using the Advanced Very High Resolution Radiometer (AVHRR) and new National Land Cover Database (NLCD) 1992-2001 Retrofit Land Cover Change. The magnitude of OMR trends obtained from the NLDC is larger than the one derived from the static AVHRR. Moreover, land use conversion often results in more warming than cooling.

Overall, our results confirm the robustness of the OMR method for detecting non-climatic changes at the station level, evaluating the impacts of adjustments performed on raw observations, and most importantly, providing a quantitative estimate of additional warming trends associated with LULC changes at local and regional scales. As most of the warming trends that we identify can be explained on the basis of LULC changes, we suggest that in addition to considering the greenhouse gases-driven radiative forcings, multi-decadal and longer climate models simulations must further include LULC changes.”
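For readers unfamiliar with the OMR approach, the sketch below illustrates the idea with synthetic data (the series and trend values are invented, not the USHCN/NARR data used in the paper). Since the reanalysis surface temperatures come from the model rather than from the station thermometers, the trend of the observation-minus-reanalysis residual isolates local, station-level influences such as LULC change:

```python
import numpy as np

# Minimal sketch of the observation-minus-reanalysis (OMR) idea using
# synthetic monthly anomalies for 1979-2003 (300 months). All trend values
# are invented for illustration.
rng = np.random.default_rng(0)
years = np.arange(300) / 12.0

large_scale = 0.020 * years   # trend captured by the reanalysis, K
local_effect = 0.008 * years  # near-surface (e.g. LULC) signal, K

obs = large_scale + local_effect + rng.normal(0.0, 0.3, 300)  # station anomalies
rnl = large_scale + rng.normal(0.0, 0.3, 300)                 # reanalysis anomalies

omr = obs - rnl                           # residual series
omr_trend = np.polyfit(years, omr, 1)[0]  # K per year
print(f"OMR trend: {omr_trend * 10:+.3f} K/decade (local signal ~ +0.08)")
```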

Comments Off on Our Paper “Impacts Of Land Use Land Cover On Temperature Trends Over The Continental United States: Assessment Using The North American Regional Reanalysis” By Fall Et Al 2009 Is Published

Filed under Research Papers

Comments On A New Post By James Annan titled “A Bizarre Rewriting Of History”

James Annan has written a new post entitled “A bizarre rewriting of history”.

In response, my son and I have e-mailed James; the communications are reproduced below. If he responds, I will post his reply as an update.

E-Mail from Pielke Jr.

James-
 
With respect to your last post, perhaps these points of clarification can be of some use:
 
1. Klotzbach et al. has nothing to do with the radiative forcing of greenhouse gases on climate.  Zero.  It just is not relevant to the argument.  The underlying warming trend that is present could be due to alien space rays as far as our argument is concerned.  As I have written, it is however consistent with a warming trend due to the radiative forcing effects of GHGs.
 
2. Klotzbach et al. discusses the role of GHGs on measured surface temperature trends through mechanisms different than TOA radiative forcing.  The rise in GHGs is one of several important factors in perturbing the boundary layer.
 
3. Mahmood et al. (in press) that you write about discusses the role of LULC on climate and future research priorities.  If Klotzbach et al. is correct then it suggests a different interpretation of the land surface record than conventionally ascribed, and surely this would have effects on research priorities as related to the land surface and climate.  Your overheated remarks seem to have forgotten the “and future research priorities”.  In any case LULC is one of the factors discussed in Klotzbach et al. as related to perturbations of the boundary layer.
 
4. As I understand it 1 and 2 above hold for PM05 as well (Pielke Sr can confirm, he is copied).
 
All of this seems quite obvious.  Anyway, what is the deal with all of your snark and playing to the chorus?  Why so angry?
 
Best regards from Boulder,
 
Roger

E-mail from Pielke Sr.

James

 I agree with Roger Jr. He has described the papers accurately.

I also agree; why are you so contemptuous in your tone? You may be reaffirming those who share your viewpoint, but you are turning off independent readers.

 As to your statement

“For the record, I agree that land use cover change may impact on the climate. But unless Roger Pielke can find some way of arguing that this has changed the net average surface flux by the order of 1 Wm-2 at night, his whole theory is still a bust”.

Do you really mean this? Changing just the value of z0 [the aerodynamic roughness] at the surface has this effect and more. Clouds and higher water vapor (and CO2) also alter the surface flux by values larger than 1 Watt per meter squared. I agree that added CO2 is less important in this regard as a direct radiative effect than clouds and water vapor, as we reported in the Eastman et al paper, but as we also showed in that paper, the biogeochemical effect of added CO2 on plant transpiration during daylight, and hence on subsequent nighttime water vapor concentrations (and thus on the radiative flux), is significant.

The P&M paper just looked at the issue of whether, even if the loss of heat at night out of the top of the boundary layer were the same, the vertical distribution of that heat loss would be uniform between light-wind and windy nights. It is not, and this effect is seen in the minimum surface air temperatures. In the real world, the difference is even larger, as the loss of heat from the boundary layer is not the same on windy and light-wind nights.

Regards

Roger Sr.
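To put a number on the z0 point in the e-mail above, here is a minimal sketch (assuming neutral stability and the standard bulk aerodynamic formula; every input value is my own illustrative choice, not a number from any of the papers discussed):

```python
import math

# Bulk aerodynamic estimate of the sensible heat flux under neutral conditions:
#   H = rho * cp * C_H * U * (Ts - Ta),  with C_H = [k / ln(z / z0)]^2
# All numbers below are illustrative assumptions.
rho, cp, k = 1.2, 1004.0, 0.4   # air density, specific heat, von Karman constant
z = 10.0                        # reference height, m
U = 3.0                         # wind speed at height z, m s^-1
dT = 2.0                        # surface minus air temperature, K

def sensible_heat_flux(z0):
    ch = (k / math.log(z / z0)) ** 2   # neutral bulk transfer coefficient
    return rho * cp * ch * U * dT      # W m^-2

for z0 in (0.01, 0.1, 0.5):            # smooth grass -> crops -> rough suburb
    print(f"z0 = {z0:4.2f} m  ->  H = {sensible_heat_flux(z0):6.1f} W m^-2")
```

Even the modest roughness changes in this example shift the flux by tens of W m-2, well beyond the 1 Wm-2 figure.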

Comments Off on Comments On A New Post By James Annan titled “A Bizarre Rewriting Of History”

Filed under Climate Science Misconceptions

The Santer Et Al (2000) View Of The Importance of Error In The Surface Temperature Record In Their Paper “Differential Temperature Trends At The Surface And In The Lower Troposphere”

In 2000 Ben Santer and colleagues published the paper [and thanks to Dick McNider for alerting us to it!]

B. D. Santer, T. M. L. Wigley, D. J. Gaffen, L. Bengtsson, C. Doutriaux, J. S. Boyle, M. Esch, J. J. Hnilo, P. D. Jones, G. A. Meehl, E. Roeckner, K. E. Taylor, and M. F. Wehner, 2000: Interpreting differential temperature trends at the surface and in the lower troposphere. Science, 287, 1227-1232, 18 February 2000 [DOI: 10.1126/science.287.5456.1227].

The abstract reads

“Estimated global-scale temperature trends at Earth’s surface (as recorded by thermometers) and in the lower troposphere (as monitored by satellites) diverge by up to 0.14°C per decade over the period 1979 to 1998. Accounting for differences in the spatial coverage of satellite and surface measurements reduces this differential, but still leaves a statistically significant residual of roughly 0.1°C per decade. Natural internal climate variability alone, as simulated in three state-of-the-art coupled atmosphere-ocean models, cannot completely explain this residual trend difference. A model forced by a combination of anthropogenic factors and volcanic aerosols yields surface-troposphere temperature trend differences closest to those observed.”

In their paper, they briefly mention the effect of a possible error in the surface temperature data which could explain the discrepancy between the temperature trends in the troposphere and at the surface. They write

“A nonsignificant trend differential would also occur if the surface warming had been overestimated by 0.05°C per decade in the IPCC data … The relative likelihood of such errors in the MSU and IPCC data is difficult to assess…”

The IPCC (2007) Summary for Policymakers states

“New analyses of balloon-borne and satellite measurements of lower- and mid-tropospheric temperature show warming rates that are similar to those of the surface temperature record and are consistent within their respective uncertainties, largely reconciling a discrepancy noted in the TAR.”

and

“Eleven of the last twelve years (1995–2006) rank among the 12 warmest years in the instrumental record of global surface temperature (since 1850). The updated 100-year linear trend (1906 to 2005) of 0.74°C [0.56°C to 0.92°C] is therefore larger than the corresponding trend for 1901 to 2000 given in the TAR of 0.6°C [0.4°C to 0.8°C]. The linear warming trend over the last 50 years (0.13°C [0.10°C to 0.16°C] per decade) is nearly twice that for the last 100 years. The total temperature increase from 1850–1899 to 2001–2005 is 0.76°C [0.57°C to 0.95°C].”

Our paper

Klotzbach, P.J., R.A. Pielke Sr., R.A. Pielke Jr., J.R. Christy, and R.T. McNider, 2009: An alternative explanation for differential temperature trends at the surface and in the lower troposphere. J. Geophys. Res., accepted

has clearly documented an estimated warm bias of about 30% in the IPCC-reported surface temperature trends. This bias also brings into question the claim that 11 of the 12 years in the period 1995 to 2006 were the warmest on record. Moreover, despite the claim in the IPCC (2007) report, the tropospheric and surface temperature trends have NOT been reconciled.

The lack of news coverage of this documented bias, which has appeared in the peer-reviewed literature in the Klotzbach et al (2009) paper, is another clear example of the failure of most of the journalism community to cover news that conflicts with the IPCC (2007) perspective.

Comments Off on The Santer Et Al (2000) View Of The Importance of Error In The Surface Temperature Record In Their Paper “Differential Temperature Trends At The Surface And In The Lower Troposphere”

Filed under Climate Change Metrics

My 1991 View of “Overlooked Scientific Issues In Assessing Hypothesized Greenhouse Gas Warming”

In 1991 I published a paper which presented my views on the issue of the GCM modeling of global warming. This weblog revisits the topics I raised at that time.

Pielke, R.A., 1991: Overlooked scientific issues in assessing hypothesized greenhouse gas warming. Environ. Software, 6, 100-107.

I summarized the focus of my article in the text

“Numerical models of the global atmosphere and ocean circulations (referred to as general circulation models - GCMs) have been used to investigate the impact on climate of an increase in these trace gases [which include carbon dioxide, methane, chlorofluorocarbons, and nitrous oxide]. The Environmental Protection Agency (EPA) concluded in 1983 based on these models, for example, that an increase of the average global temperatures of 5°C by the year 2100 with an increase of sea level up to around 2 meters will result because of the global enhancement of these gases. The World Meteorological Organization has concluded that greenhouse gases could cause a global warming of 1.5°C to 4°C by the middle of the next century.

The purpose of this paper is to discuss a number of serious shortcomings in the GCM model simulations which produced these conclusions regarding climate change. These limitations, which are either inadequately handled or not represented at all in GCMs, are summarized in this paper.”

The following are the issues that I raised, and what has been accomplished since the appearance of this paper:

1. INCREASED CARBON DIOXIDE CONSUMPTION RESULTING FROM INVIGORATED PLANT GROWTH ON LAND AND IN THE OCEAN

Biogeochemistry and biogeography are now recognized as first order climate effects [e.g. see NRC, 2005].

2. INABILITY OF GCM MODELS TO PROPERLY RESOLVE THE EVOLUTION OF EXTRATROPICAL AND TROPICAL CYCLONES DUE TO THEIR POOR SPATIAL RESOLUTION

Even though the models now have finer spatial resolution, the IPCC community still fails to recognize that they must test the ability of the models to faithfully simulate weather features (i.e. they need to be run in a numerical weather prediction mode). This is a necessary test in order to evaluate the dynamics and thermodynamics in the GCMs.

3. INABILITY OF GCM MODELS TO PROPERLY RESOLVE REGIONS OF OCEAN UPWELLING WHOSE COLD WATERS CAN ENHANCE THE OCEANIC UPTAKE OF CARBON DIOXIDE

This issue still requires further investigation. I would welcome URLs of peer-reviewed papers that have looked at this specific issue [which is directly related to the spatial resolution of the ocean part of the global climate models, as well as both the physical temperature effect and the biogeochemical (carbon assimilation) effect on ocean biomass].

4. OCCURRENCE OF GREATER GLOBAL CLOUD COVERAGE AS A RESULT OF COLLOIDALLY MORE STABLE CLOUDS DUE TO ANTHROPOGENIC INPUT OF AEROSOLS

This climate forcing is now recognized as a major effect on the climate system [NRC, 2005]. Its complexity, however, and the microphysics spatial scales in which this occurs, continue to challenge skillful modeling of this process.

5. MODIFICATION OF THE AMOUNT OF SOLAR RADIATION REFLECTED BACK INTO SPACE DUE TO MAN-CAUSED LANDSCAPE CHANGE

This effect is included in the 2007 IPCC report.

6. MODIFICATION IN THE AMOUNT OF EVAPORATION AND TRANSPIRATION TO THE ATMOSPHERE AS A RESULT OF MAN-CAUSED LANDSCAPE CHANGES

This has been one of my major research areas, and it has been elevated to a first order climate effect (e.g. see NRC, 2005), although the 2007 IPCC report failed to adequately discuss it.

7. CLOUDS WHICH CONTAIN SULPHATE PARTICLES, RESULTING FROM FOSSIL FUEL COMBUSTION, HAVE HIGHER ALBEDO THAN PRISTINE CLOUDS

As with #4, this climate forcing is now recognized as a major effect on the climate system [e.g. see NRC, 2005]. Cloud and precipitation processes are now seen, however, as an even more difficult modeling issue than in the early 1990s (e.g. see Table 2-2 in NRC, 2005).

8. GREATEST WARMING IS PREDICTED TO BE IN POLAR REGIONS, AND YET WARMING HAS NOT OCCURRED.

This warming has occurred in the Arctic but (while there is disagreement) the Antarctic region has not warmed to the same degree.

9. SINCE THE ATMOSPHERE IS A NONLINEARLY RESPONDING SYSTEM, EVEN WITH ALL RELEVANT PHYSICS FAITHFULLY REPRESENTED, THE GCMS COULD ONLY SIMULATE EXAMPLES OUT OF A SPECTRUM OF POSSIBLE ATMOSPHERIC RESPONSES TO INCREASED GREENHOUSE GASES

The 2007 IPCC continued to perpetuate the view that the models can skillfully predict the climate in the coming decades despite their own admission that the GCMs do not even have all of the first order climate forcings (see the caption to figure SPM.2).

I summarized my recommendations as follows

“Since climate change is a natural feature of the earth, we need to husband our resources even if there were no man-caused changes (e.g. Schneider). With respect to man’s potential influence on climate, the “path-of-least-regret” is that we should immediately adopt policies which mitigate man’s impact providing there are no deleterious economic, environmental, or political effects of these policies. Even better, of course, is if these policies result in positive benefits to mankind. Conservation of fossil fuel resources, for example, and utilization of renewable energy resources represent examples of beneficial activities which should be promoted by government policy makers regardless of the direction of climate change. Recommendations by Rosenfeld and Hafemeister represent definite steps which could be taken to achieve this goal. Policies which require significant hardship are, in this writer’s opinion, premature.”

Comments Off on My 1991 View of “Overlooked Scientific Issues In Assessing Hypothesized Greenhouse Gas Warming”

Filed under Climate Change Forcings & Feedbacks, RA Pielke Sr. Position Statements, Research Papers

New Research Paper “Theoretical Assessment Of Uncertainty In Regional Averages” By D. PaiMazumder And N. Mölders

Nicole Mölders has a very important new research paper that is in press. This paper illustrates the issue of what is an adequate spatial sampling of surface climate variables, including the 2m temperatures.

This is yet another illustration of the inadequacy of the use of 2m temperature trends over land, as applied by NCDC, GISS and CRU to construct a multi-decadal global average surface temperature trend. 

As we have shown in a number of peer-reviewed research papers, this temperature has a diverse set of biases and uncertainties which make it quantitatively misleading to use as a diagnostic of global warming, and even to monitor regionally averaged temperature anomalies (e.g. see, see, see and see).

The paper is

PaiMazumder, D., and N. Mölders, 2009: Theoretical assessment of uncertainty in regional averages due to network density and design. Journal of Applied Meteorology and Climatology, in press. [the paper will appear here as soon as the AMS posts it]

The abstract reads

“Weather Research and Forecasting (WRF) model simulations are performed over Russia for July and December 2005, 2006 and 2007 to create a “dataset” to assess the impact of network density and design on regional averages. Based on the values at all WRF grid-points, regional averages for various quantities are calculated for 2.8° x 2.8° areas as the “reference”. Regional averages determined based on 40 artificial networks and 411 “sites” that correspond to the locations of a real network are compared with the reference regional averages. The 40 networks encompass ten networks of 500, 400, 200, or 100 different randomly taken WRF grid-points as “sites”.

“The real network’s site distribution misrepresents the landscape. This misrepresentation leads to errors in regional averages that show geographical and temporal trends for most quantities: errors are lower over shores of large lakes than coasts and lowest over flatland followed by low and high mountain ranges; offsets in timing occur during frontal passages when several sites are passed at nearly the same time. Generally, the real network underestimates regional averages of sea-level pressure, wind-speed, and precipitation over Russia up to 4.8 hPa (4.8 hPa), 0.7 m/s (0.5 m/s), and 0.2 mm/d (0.5 mm/d), and overestimates regional averages of 2-m temperature, downward shortwave radiation and soil-temperature over Russia up to 1.9 K (1.4 K), 19 Wm-2 (14 Wm-2), and 1.5 K (1.8 K) in July (December). The low density of the ten 100-sites-networks causes difficulties for sea-level pressure. Regional averages obtained from the 30 networks with 200 or more randomly distributed sites represent the reference regional averages, trends and variability for all quantities well.”

The paper also writes

“In the natural landscape differences between the regional averages derived from the real network and the true regional averages may be even greater than in our theoretical study because the real network was designed for agricultural purposes, i.e. the real network represents the fertile soils within the 2.8° x 2.8° areas. Consequently, it may be even more biased to a soil-type than in the simplified WRF-created landscape assumed in this case study.”
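The experimental design is straightforward to mimic. The sketch below uses a synthetic temperature field (a smooth gradient plus noise) in place of the WRF output and compares a full-grid reference average with averages from randomly placed networks of the sizes used in the paper:

```python
import numpy as np

# Mimic the network-density experiment with a synthetic "2-m temperature"
# field in place of the WRF output used in the paper. The reference is the
# average over every grid point; sparse random "networks" approximate it.
rng = np.random.default_rng(1)
ny, nx = 100, 100
x, y = np.meshgrid(np.linspace(0, 1, nx), np.linspace(0, 1, ny))
field = 15.0 + 5.0 * x - 3.0 * y + rng.normal(0.0, 1.0, (ny, nx))

reference = field.mean()

for n_sites in (100, 200, 400, 500):   # network sizes used in the paper
    idx = rng.choice(field.size, size=n_sites, replace=False)
    network_avg = field.ravel()[idx].mean()
    print(f"{n_sites:3d} sites: error = {network_avg - reference:+6.3f} K")
```

With randomly placed sites the error shrinks as density grows, consistent with the abstract’s conclusion; a network biased toward particular land types, like the real agricultural network described above, adds a systematic error on top of this sampling error.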

Comments Off on New Research Paper “Theoretical Assessment Of Uncertainty In Regional Averages” By D.PaiMazumder And N. Mölders

Filed under Climate Change Metrics

Comment On News Article “U.S. Chamber of Commerce Seeks Trial On Global Warming”

There was a news article in the LA Times on August 25, 2009 by Jim Tankersley entitled “U.S. Chamber of Commerce seeks trial on global warming”.

The article has the text

“The U.S. Chamber of Commerce, trying to ward off potentially sweeping federal emissions regulations, is pushing the Environmental Protection Agency to hold a rare public hearing on the scientific evidence for man-made climate change.”

I do not know whether a “trial” would be effective; however, it is certainly clear that the EPA ruling is scientifically very flawed, as I wrote in a series of posts:

Republican Comment On EPA Endangerment Findings

Brief Overview Of Several Climate Science Research Findings

Comments On The EPA “Proposed Endangerment And Cause Or Contribute Findings For Greenhouse Gases Under The Clean Air Act”.

As I have written in the last weblog above

“In conclusion, the EPA Endangerment findings are the culmination of a several-year effort by a small group of climate scientists and others to use their positions as lead authors on the IPCC, CCSP and NRC reports to promote a political agenda.

Now that their efforts have reached the federal policy decision level, Climate Science urges that there be an independent commission of climate scientists who can evaluate the assessment process that led to the EPA findings as well as the climate science upon which it is constructed.”

The Chamber of Commerce statement further documents that independent assessments of the EPA findings are required.

Comments Off on Comment On News Article “U.S. Chamber of Commerce Seeks Trial On Global Warming”

Filed under Climate Change Regulations, Climate Science Reporting

The Issue That James Annan and Gavin Schmidt Should Focus On With Respect To The Klotzbach Et Al 2009 Paper

There has been quite a bit of discussion by James Annan (see and see) and Gavin Schmidt (see) on our paper

Klotzbach, P.J., R.A. Pielke Sr., R.A. Pielke Jr., J.R. Christy, and R.T. McNider, 2009: An alternative explanation for differential temperature trends at the surface and in the lower troposphere. J. Geophys. Res., in press.

Much of this discussion is quite peripheral to the conclusions of our paper. In our multi-authored paper

Mahmood, R., R.A. Pielke Sr., K.G. Hubbard, D. Niyogi, G. Bonan, P. Lawrence, B. Baker, R. McNider, C. McAlpine, A. Etter, S. Gameda, B. Qian, A. Carleton, A. Beltran-Przekurat, T. Chase, A.I. Quintanar, J.O. Adegoke, S. Vezhapparambu, G. Conner, S. Asefi, E. Sertel, D.R. Legates, Y. Wu, R. Hale, O.W. Frauenfeld, A. Watts, M. Shepherd, C. Mitra, V.G. Anantharaj, S. Fall, R. Lund, A. Nordfelt, P. Blanken, J. Du, H.-I. Chang, R. Leeper, U.S. Nair, S. Dobler, R. Deo, and J. Syktus, 2009: Impacts of land use land cover change on climate and future research priorities. Bull. Amer. Meteor. Soc., in press

We summarized the issue as follows

“The stable nocturnal boundary layer does not measure the heat content in a large part of the atmosphere where the greenhouse signal should be the largest (Lin et al. 2007; Pielke et al. 2007a). Because of nonlinearities in some parameters of the stable boundary layer (McNider et al. 1995), minimum temperature is highly sensitive to slight changes in cloud cover, greenhouse gases, and other radiative forcings. However, this sensitivity is reflective of a change in the turbulent state of the atmosphere and a redistribution of heat, not a change in the heat content of the atmosphere (Walters et al. 2007). Using the Lin et al. (2007) observational results, a conservative estimate of the warm bias resulting from measuring the temperature from a single level near the ground is around 0.21°C per decade (with the nighttime minimum temperature contributing a large part of this bias). Since land covers about 29% of the Earth’s surface, extrapolating this warm bias could explain about 30% of the IPCC estimate of global warming. In other words, consideration of the bias in temperature could reduce the IPCC trend to about 0.14°C per decade; still a warming, but not as large as indicated by the IPCC.”
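The arithmetic behind the 30% figure in the paragraph just quoted can be made explicit (a sketch using the numbers quoted above, with the IPCC global trend of roughly 0.2°C per decade taken as the assumed baseline):

```python
# The arithmetic behind the ~30% warm-bias figure quoted above.
land_bias = 0.21      # estimated warm bias over land, deg C/decade (Lin et al. 2007)
land_fraction = 0.29  # fraction of the Earth's surface that is land
ipcc_trend = 0.20     # assumed IPCC global trend, deg C/decade

global_bias = land_fraction * land_bias   # ~0.06 deg C/decade
adjusted = ipcc_trend - global_bias       # ~0.14 deg C/decade
print(f"extrapolated bias: {global_bias:.3f} C/decade "
      f"({global_bias / ipcc_trend:.0%} of the IPCC estimate)")
print(f"adjusted trend:    {adjusted:.2f} C/decade")
```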

So far, it appears that neither James nor Gavin is particularly knowledgeable about boundary layer physics. While they certainly have expertise in other areas of climate science, they have so far failed to comment on the topic in the above paragraph (which is what the Klotzbach et al (2009), Lin et al (2007) and Pielke and Matsui (2005) papers are about).

My current weblog is an invitation to them to comment on the above paragraph (either as guest weblogs here or on their own sites). If they ignore this request, it will further demonstrate that they are commenting outside of their expertise on the subject of our papers, and that their real goal is simply to malign papers they disagree with.

Comments Off on The Issue That James Annan and Gavin Schmidt Should Focus On With Respect To The Klotzbach Et Al 2009 Paper

Filed under Climate Change Metrics

Comments On An E-mail Exchange With James Annan

James Annan and I have been exchanging e-mails over the weekend, and while he clearly misunderstands the focus of the papers,

Pielke Sr., R.A., and T. Matsui, 2005: Should light wind and windy nights have the same temperature trends at individual levels even if the boundary layer averaged heat content change is the same? Geophys. Res. Letts., 32, No. 21, L21813, 10.1029/2005GL024407

Lin, X., R.A. Pielke Sr., K.G. Hubbard, K.C. Crawford, M. A. Shafer, and T. Matsui, 2007: An examination of 1997-2007 surface layer temperature trends at two heights in Oklahoma. Geophys. Res. Letts., 34, L24705, doi:10.1029/2007GL031652.

Klotzbach, P.J., R.A. Pielke Sr., R.A. Pielke Jr., J.R. Christy, and R.T. McNider, 2009: An alternative explanation for differential temperature trends at the surface and in the lower troposphere. J. Geophys. Res., in press.

the fact that he is engaging in more-or-less constructive debate is encouraging.

I have posted below my edited latest reply to James, as the information should be useful to those who have been misled by James’s and Gavin Schmidt’s posts on our paper [James still concludes this is only about the radiative forcing of CO2; his statement in the comments that “At least it now seems fairly clear from the recent distraction tactics (eg belatedly trying to convolute the effect of different atmospheric states with that of anthropogenic forcing) that he realises his error” further emphasizes his missing the point of our papers].

Here is my latest e-mail. With James’s permission, I would post his also.

E-Mail to James Annan August 23 2009

James

 I do not reject the figure by Woods. That figure presents the instantaneous radiative flux divergences for the specific vertical profiles used in that analysis. However, it does not include the time integration that would result in the development of a stable boundary layer near the surface.

 To more closely illustrate the actual issue in our papers, as one example, we compute vertical heating rates all of the time in our modeling. As an easily accessible sample, see Fig 8-6 in my modeling book [Mesoscale Meteorological Modeling, 2nd Ed. 2002] for a location in Australia. The rate of cooling is about 0.18 K per hour with the largest cooling near the surface. At 1.5 km, it is about an order of magnitude smaller. An alteration in the vertical distribution of this heating will necessarily alter the minimum temperature at 2m.

 What I see as the issue is that you are fixated on the radiative effect of doubled CO2. I agree with you that it is a much smaller effect than other influences on changing the vertical distribution of the heating. In the Eastman et al 2001 paper, see Figure 8. Those results present an integrated analysis of the effect of the vertical distribution of heating on minimum temperatures, where both radiative flux divergence and vertical divergence of turbulent heat fluxes are included.

 The radiative effect of CO2 on the minimum temperature is an inconsequential -0.017 C, but it does have an effect. The biogeochemical effect (which alters stomatal conductance and the growth of leaf area and roots during the period of the simulation) is +0.097 C and the land use change is +0.261 C. The latter two are significant. Both of the latter we attribute to the addition of water vapor into the atmosphere [and its effect on the vertical profile of the long wave radiative flux divergence] as a result of the greater leaf area.

 Thus the focus on the radiative effect of doubled CO2, which was presented in P&M as just one example of what could alter 2m temperatures, is a diversion from the focus of our paper. Anything which alters the vertical distribution of heating will alter the temperatures at 2m. If the alteration is systematic over years, it will result in a bias in the interpretation of the 2m temperature trends (anomalies) as moving in tandem with the trends (anomalies) higher up.

 I invite you to comment on the core of the three papers [P&M; Lin et al; and Klotzbach et al] instead of the peripheral discussion of TOA and surface radiative heating from the doubling of CO2. The core issue is

“Our results also indicate that the 1.5 or 2 m minimum long term temperature trends over land are not the same as the minimum long term temperatures at other heights within the surface boundary layer (e.g. 9 m), even over relatively flat landscapes such as Oklahoma. For landscapes with more terrain relief, this difference is expected to be even larger.

Therefore, the use of minimum temperatures at 1.5 or 2 m for interpreting climate system heat change is not appropriate. This means that the 1.5 to 2 m observations of minimum temperatures that are used as part of the analysis to assess climate system heat changes (e.g., such as used to construct Figure SPM-3 of Intergovernmental Panel on Climate Change [2007] and of Parker [2004, 2006] study) lead to a greater long term temperature trend than would be found if higher heights within the surface boundary layer were used.”

Your comments on the above conclusion should be the focus of your weblogs. If you disagree, discuss why.

 You are using the discussion of the role of the radiative effect of added CO2 in directly altering the surface fluxes as a way to divert attention from the actual conclusions of our paper. Indeed, if we accept your interpretation that the direct radiative effect of doubled CO2 is so small, while the other effects, such as land use change, are so much more important even over short time periods, we should take away the message that there is much more to climate change than just changes in the radiative top of the atmosphere forcing due to added CO2.
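To illustrate the redistribution argument in the e-mail above, here is a toy calculation (entirely illustrative; the profiles and magnitudes are invented, not taken from P&M): two nights lose exactly the same column-integrated heat from the boundary layer but distribute the cooling differently with height, and the 2-m thermometer records very different changes.

```python
import numpy as np

# Toy illustration: two nights lose the same column-integrated heat from the
# boundary layer, but distribute the cooling differently with height. All
# profiles and magnitudes are invented; this is not the P&M calculation.
z = np.linspace(0.0, 200.0, 2001)   # height, m
dz = z[1] - z[0]

def temperature_drop(scale_height, column_loss=200.0):
    """Exponential cooling profile scaled to a fixed column-integrated loss (K m)."""
    shape = np.exp(-z / scale_height)
    return column_loss * shape / (shape.sum() * dz)   # cooling in K at each height

calm = temperature_drop(20.0)     # light wind: cooling trapped near the ground
windy = temperature_drop(150.0)   # windy: cooling mixed through a deep layer

i2m = np.argmin(np.abs(z - 2.0))  # index of the 2-m level
print(f"light-wind night, 2-m cooling: {calm[i2m]:.1f} K")
print(f"windy night,      2-m cooling: {windy[i2m]:.1f} K")
```

Same heat loss from the boundary layer, very different 2-m readings. If greenhouse gases, clouds, aerosols, or land use systematically shift the shape of this profile over the years, the 2-m minimum temperature trend will diverge from the trend in boundary layer heat content, which is the core point of the three papers listed above.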

Comments Off on Comments On An E-mail Exchange With James Annan

Filed under Climate Change Metrics