Category Archives: Guest Weblogs

Review of Humlum Et Al 2012 “The Phase Relation Between Atmospheric Carbon Dioxide And Global Temperature” By Donald Rapp

Commentary By Donald Rapp on the paper: “The phase relation between atmospheric carbon dioxide and global temperature” by Ole Humlum, Kjell Stordahl, Jan-Erik Solheim, accepted for publication in: Global and Planetary Change.

This paper analyzed data on annual variations in carbon dioxide concentration, various measures of earth temperature, and rate of emissions of carbon dioxide for the period 1980 to 2011. They compared the rate of change of CO2 concentration with measures of the rate of change of global temperature. While both CO2 and temperature generally increased during this 31-year period, the rates of change varied significantly during the period. They showed that changes in CO2 correlated somewhat with changes in sea surface temperature (SST) but the CO2 change lagged the SST change by about 11-12 months. They concluded that “A main control on atmospheric CO2 appears to be the ocean surface temperature”. They mentioned possible connection to the giant 1998 El Niño but did not elaborate on the connection of the entire sequence of data to El Niño indices.

In this post I would like to make a few comments on the paper by Humlum et al. Of course, as the authors note, the common view is that rising CO2 produces an increase in the rate of warming, not vice versa. Their data suggest quite the opposite.

Consider the figure at the top of this post.

The uppermost curve shows the NINO3.4 index from 1980 to 2011. Peak El Niños are labeled with letters A to F.

The middle curve shows the change in CO2 concentration per year plotted on a monthly basis. The peaks in this curve are also subjectively labeled A to F. The average change in CO2 concentration per year can be interpreted either as a ramp or as a step function. Arbitrarily adopting the step function, the annual change in CO2 concentration varied from year to year around about 1.5 ppm/yr prior to the 1998 El Niño, and around about 2.0 ppm/yr after it. These levels are depicted as horizontal dashed lines x and y.

The lowermost curve shows the annual change in anthropogenic CO2 emissions plotted on a per month basis.

A rough rule of thumb is that each Gt of carbon (3.67 Gt of CO2) produces the equivalent of about 0.5 ppm of CO2 in the atmosphere if none of it is absorbed. The figure below shows that annual variations in global emissions of carbon are typically about 2 × 10⁴ metric tons per year, which, if unabsorbed, would produce annual changes in CO2 that are far too small to account for the observed variations in the average change in CO2 concentration per year.

The point made by Humlum et al. is that the average change in CO2 concentration per year lags the change in ocean temperature by about 11-12 months. As Tisdale showed in his book, El Niños leave behind them a pool of warm surface waters. As a result, the average change in CO2 concentration per year tends to lag the NINO3.4 index by a bit more than a year. This correlation is far from perfect but it seems to have some validity, particularly for the major El Niño that started toward the end of 1997. The data suggest that the ability of the oceans to absorb CO2 emitted by human activity responds to the state of the NINO3.4 index with a delay of a bit over a year.
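The lag that Humlum et al. identify can be estimated with a brute-force cross-correlation scan over candidate lags. The sketch below is my own illustration on synthetic data, not the authors' method or code; the function name and series are invented for the example:

```python
import numpy as np

def best_lag(driver, response, max_lag=24):
    """Return (lag, r): the lag in months at which `response` correlates
    most strongly with `driver`, scanning lags 0..max_lag."""
    best, best_r = 0, -np.inf
    for lag in range(max_lag + 1):
        a = driver[:len(driver) - lag]   # driver, shifted back by `lag`
        b = response[lag:]
        r = np.corrcoef(a, b)[0, 1]
        if r > best_r:
            best, best_r = lag, r
    return best, best_r

# Synthetic check: a response series that lags its driver by 12 months
rng = np.random.default_rng(0)
driver = rng.standard_normal(384)        # 32 years of monthly values
response = np.empty_like(driver)
response[12:] = driver[:-12]             # pure 12-month lag
response[:12] = rng.standard_normal(12)
lag, r = best_lag(driver, response)      # recovers lag == 12
```

On the real NINO3.4 and CO2-growth-rate series the correlation is of course much weaker than in this synthetic check, but the same scan recovers the roughly one-year lag the paper reports.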

Human activity is presently emitting roughly 8 Gt/yr of carbon, which, if unabsorbed, would be sufficient to increase the atmospheric concentration of CO2 by about 4 ppm per year. Over a period of years, (very) roughly half of that CO2 is absorbed by earth systems (oceans, biosphere, …) and the other half ends up in the atmosphere, raising the atmospheric concentration by about 2 ppm. However, on a year-by-year basis, the proportion of emitted CO2 that is absorbed by the earth systems varies considerably, mainly due to the presence of warm surface waters in the Pacific produced quasi-periodically by El Niños. According to the graphical data below, the annual increase in CO2 concentration can be as high as 3 ppm (following the 1998 El Niño) or as low as 1 ppm (between peaks B and C). During the most recent period, after the 1998 El Niño, the annual increase in CO2 concentration has varied roughly as 2 ± 0.5 ppm, or ±25%. These results suggest that while roughly half of emissions end up in the atmosphere over an extended period, annual variations in the distribution of emitted CO2 between the atmosphere and the earth system are significant, and strongly dependent on the prevalence of El Niños.
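The arithmetic in this paragraph is easy to verify. The sketch below assumes the standard rule of thumb of about 2.13 Gt of carbon per 1 ppm of atmospheric CO2; the emissions figure and airborne fraction are the rough values quoted in the text:

```python
# Back-of-envelope check of the numbers above.
GTC_PER_PPM = 2.13                         # ~2.13 Gt carbon per 1 ppm CO2

emissions_gtc_per_yr = 8.0                 # rough annual emissions, Gt carbon
ppm_if_unabsorbed = emissions_gtc_per_yr / GTC_PER_PPM   # ~3.8, i.e. "about 4"
airborne_fraction = 0.5                    # roughly half stays in the air
ppm_rise_per_yr = ppm_if_unabsorbed * airborne_fraction  # ~1.9, i.e. "about 2"
```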

Tisdale showed that from 1976 to about 2005, there was a pronounced prevalence of El Niños over La Niñas. He argued that this could account for all of the warming of the earth during that period without invoking the greenhouse effect. However, it seems likely that during this period a greater proportion of emitted CO2 ended up in the atmosphere due to the prevalence of El Niños, and this might have amplified the natural El Niño warming effect via greenhouse gas forcing. McLean et al. (2009) estimated that 70% of the warming was due to El Niños, while Foster et al. (2010) fell back on climate models that attribute only 15-30% of 20th-century temperature variation to variability of the El Niño index. As is usual in climate matters, one has only to glance at the authors to know in advance what spin the results are likely to show. The Foster paper included the crème de la crème of climategate characters, while the McLean paper was written by skeptics.

The proportion of global heating from 1976 to 2005 due to prevalence of El Niños over La Niñas vs. greenhouse gas forcing remains uncertain. Nevertheless, the state of the Pacific Ocean is clearly important, not only for its impact on the atmospheric temperature, but also because it regulates the annual rise in CO2 concentration.

Tisdale, Bob (2012) “Who turned on the heat?”, http://bobtisdale.wordpress.com/

McLean, J. D., C. R. de Freitas, and R. M. Carter (2009) “Influence of the Southern Oscillation on tropospheric temperature” Journal of Geophysical Research, 114, D14104.

Foster, G., J. D. Annan, P. D. Jones, M. E. Mann, J. Renwick, J. Salinger, G. A. Schmidt and K. E. Trenberth (2010) “Comment on “Influence of the Southern Oscillation on tropospheric temperature” by J. D. McLean, C. R. de Freitas, and R. M. Carter”, Journal of Geophysical Research, 115, D09110.


Filed under Guest Weblogs, Research Papers

Book Review By Donald Rapp Of “Who Turned on the Heat? – The Unsuspected Global Warming Culprit, El Niño-Southern Oscillation” By Bob Tisdale

Book Review By Donald Rapp [see here for the post on Donald’s book, The Climate Debate]

*************************************************

Bob Tisdale has produced an extraordinary new book:

Who Turned on the Heat? – The Unsuspected Global Warming Culprit, El Niño-Southern Oscillation is now on sale in pdf form for US$8.00 – Please click here to buy a copy.

This book is also subtitled “Everything you wanted to know about El Niño and La Niña” and that is quite accurate.

I didn’t realize how little I understood the El Niño and La Niña phenomena until I read Bob Tisdale’s book. I learned a great deal from it; it provides the reader with thorough but easily understandable explanations of El Niño and La Niña, enhanced by many wonderful cartoon-like illustrations. The book gives lucid descriptions of the various indices used to characterize these phenomena and provides a wealth of graphical data on El Niño and La Niña occurrences. While it deals predominantly with the last thirty years, it also covers the entire 20th century.

Perhaps the three most important facts that I had not previously fully appreciated were:

(1) While incident sunlight can penetrate several to many meters into oceans, incident IR penetrates only up to a few mm.

(2) After an El Niño (particularly a strong one) a pool of warm surface water stretches across the Pacific that continues to warm the atmosphere even after El Niño conditions have subsided. (This seems to have been particularly important for the great 1998 El Niño).

(3) A La Niña is not the opposite of an El Niño, but rather is an amplified version of normal conditions in the Pacific Ocean.

In addition, Tisdale emphasizes the immense size of the Pacific Ocean (about 1/3 of the earth’s surface) as well as the worldwide climatic effects of El Niños.

Tisdale uses Fact (1) to argue his view that the atmosphere, even if heated by greenhouse gases, does not warm the oceans; rather, sunlight warms the oceans. Conversely, he argues that warm ocean surface waters heat the atmosphere. So he disagrees with climate modelers as to which is the dog and which is the tail that wags.

Tisdale uses Facts (2) and (3) to argue that the warming of the atmosphere that began around 1976 commensurate with the beginning of an era of El Niño preponderance was due to the warm surface waters during and after El Niños. He backs up this argument with extensive data.

Looking at the full extent of the 20th century, Tisdale shows that the century can be divided into three periods: 1900-1940, 1940-1976, and 1976-2006. The first and last periods were preponderantly El Niño, while the middle period slightly favored La Niña. These are also the periods during which global temperatures rose sharply, dipped slightly, and then rose again. This relationship has been noted previously by many climatologists (see Sec. 4.9 of my book, “The Climate Debate”). Tisdale shows (as my book does also) that the integral of an El Niño index looks amazingly like the global temperature curve for the entire 20th century.
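Operationally, "the integral of an El Niño index" is just a running cumulative sum of the monthly index values times an arbitrary scale factor. A toy sketch (the index series and the scale constant are invented for illustration, not taken from Tisdale's book):

```python
import numpy as np

def integrated_enso(index_monthly, scale=0.01):
    """Running time-integral of an ENSO index: El Nino (positive) months
    push the curve up, La Nina (negative) months pull it down.  `scale`
    is a free fitting constant, not a physical parameter."""
    return scale * np.cumsum(index_monthly)

# Toy illustration: an El Nino-dominated era integrates to a rising curve,
# qualitatively like the warming periods described above.
months = np.arange(360)                              # 30 years, monthly
toy_index = np.where(months % 48 < 30, 0.5, -0.3)    # warm months outnumber cool
curve = integrated_enso(toy_index)                   # monotone-trending upward
```

With a real NINO3.4 series in place of `toy_index`, this is the curve being compared against the 20th-century temperature record.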

Tisdale’s book is so excellent in so many ways that it is difficult to find anything to criticize. The only thing I can harp on is that Tisdale is perhaps too sure of himself. He seems certain that prevalent El Niños caused essentially 100% of the warming from 1976 to 2006 and greenhouse gases contributed nothing. He said:

“The SST and ocean heat content data for the past 30 years show that the global oceans have warmed. There is no evidence, however, that the warming was caused by anthropogenic greenhouse gases in part or in whole; that is, the warming can be explained by natural ocean-atmosphere processes, primarily ENSO” (emphasis added).

Other skeptics have suggested that El Niños provided 70% of warming over the past 50 years (McLean et al., 2009) while some alarmists have suggested 15-30% (Foster et al., 2010).

There is evidence that the 300-year period from roughly 1600 to 1900 was characterized by relatively low global temperatures (the Little Ice Age), and we might surmise that El Niños were not preponderant during that period. Yet something changed shortly after 1900: there were two extended periods of preponderant El Niños commensurate with rising global temperatures. The change around 1976 was particularly noticeable and has been referred to in the literature: “sudden and decisive change in the circulation patterns and upwelling characteristics in the Pacific began around 1976”; “… the tendency for more El Niño and fewer La Niña events since the late 1970s is highly unusual and very likely to be accounted for solely by natural variability”; “Several studies have noted that the pattern of El Niño–Southern Oscillation (ENSO) variability changed in 1976, with warm (El Niño) events becoming more frequent and more intense”; “Particularly dramatic physical and biological excursions occurred during the 1976–77 change in the Pacific Decadal Oscillation”; “It is now widely accepted that a climatic regime shift transpired in the North Pacific Ocean in the winter of 1976–77. This regime shift has had far reaching consequences for the large marine ecosystems of the North Pacific” (references for these quotes are given in “The Climate Debate”).

During the 20th century, when earth temperatures were rising, the CO2 concentration was also rising. Several related questions are suggested. Can variability in the preponderance of El Niños explain most, if not all, of the 20th-century warming? If it can, is there some underlying reason why El Niños became preponderant in the 20th century, or was it merely a statistical quirk? Is there any connection between rising CO2 and the advent of preponderant El Niños? Tisdale says there is no evidence for this.

As for me, in matters of climate, I am hardly sure of anything.

Donald Rapp

*************************************************

Added info: for other posts by Donald, please see

An Analysis By Donald Rapp Of The Levitus Et Al 2012 Analysis

“The Climate Debate” by Donald Rapp 2012 – An Excellent Addition To The Literature On The Climate Issue

Brief Commentary on Two Recent Papers By Donald Rapp


Filed under Books, Guest Weblogs

Reply From Chris Rapley Regarding His Nature Article

I invited Chris Rapley to reply to my post

A Comment On The Nature Article – “Time To Raft Up- Climate Scientists Should Learn From The Naysayers And Pull Together To Get Their Message Across, Says Chris Rapley”

He has graciously responded and, with his permission, I am posting below.

Dear Roger,

I am thankfully somewhat recovered. I am still on overseas travel, though (and will be for a couple of weeks), and not able to access the references you cite in your blog, which I will read with interest on my return. In the meantime, I would make the following observations:

(i) I agree completely that human greenhouse gas emissions are only part of the climate change story, and that climate change is only a subset of the broader issue of human disturbance of the Earth system. The paper by Rockstrom et al, “A Safe Operating Space for Humanity”, attempts to explore these issues. Similarly, global average near-surface temperature (which is in any case a physically meaningless variable) is only one part of the story of the Earth’s energy imbalance. And so I agree that as a community, by seeking to simplify, and hence by focussing attention on CO2 emissions and surface temperatures at the expense of a more balanced narrative, we have (i) left ourselves in the awkward position, should policy makers address and deal with CO2 emissions, of then having to say “oh, and by the way, there are these other issues too”, which is understandably unwelcome and unlikely to engender confidence and trust; and (ii) opened up a flank to be exploited by those with ill-intent to play games over detail – about surface temperature issues or the un-amplified impact of CO2 emissions – to knowingly draw debate into fruitless cul-de-sacs.

(ii) You argue that this has hindered progress in persuading policy makers to define and execute policy measures. You may well be right (though see below). But in my view, the main reason that decarbonising humanity hasn’t progressed very well is that it is really hard to achieve! We essentially have a global civilisation of 7bn people supported by infrastructure and processes based (unwittingly) on a false assumption: that we can extract and burn fossil fuels limitlessly without consequence! So we have landed ourselves with 100 years of investment in what is turning out to be a stranded asset. That would be tough enough, though not impossible, to deal with. But layered on top, as an obstacle to even beginning a serious attempt to move forward, are the misunderstandings / rejections / dismissals / deliberate misleadings / ideological and political polarisations that I was writing about in my article. And my experience is that those barriers to progress are getting higher, as a result of a combination of the actions of a well-organised “dismissive” campaign and an inept response from the climate community. Hence my thoughts and rallying cry.

(iii) In my article I say: “We climate scientists – from disciplines both natural and social – need to align our purpose, re-establish our legitimacy, identify and understand our target audiences and decide how best to express our message.” In an earlier draft, I am pretty sure I also had “need to … agree our message …” – and in the light of your criticism, I regret that somehow that dropped out. I see this exchange as a key part of that process – addressing the questions “What is our message? What should we focus on? How best should we frame it?”

(iv) Finally, an anecdote. Nearly twenty years ago, when I was Executive Director of IGBP, I complained to a very senior official in the European Commission that framing the issue as “climate change” was too narrow, and that we should adopt “global change” as more appropriate. He had been very successful at engaging political interest and attention to the issue, and was alarmed at the prospect. He said: “It’s taken me years to get the politicians to respond to the phrase ‘climate change’ – but now they do. If you change it to ‘global change’ it will sow confusion and undermine all that good work.” The use of “global change” never took off!

Feel free to use extracts from the above (or all of it) on your blog if you think it would be helpful.

Best regards,

Chris



Filed under Guest Weblogs

Toby Carlson Op-Ed “The Everlasting Argument Over Climate Change”

I invited a colleague of Barry Lynn who was listed on our e-mail interaction which culminated in the guest post

Guest Post By Barry Lynn Of The Hebrew University of Jerusalem

to also write a guest post. I have known Toby Carlson for decades, and while we disagree on key issues that he discusses below, I respect Toby and want to give him this forum to present his views. His short biographical summary is below.

Dr. T. N. Carlson, Ph.D., Imperial College, University of London, is an emeritus Professor of Meteorology at Penn State University. Professor Carlson’s scientific contributions, over 90 papers published in refereed journals, reflect a wide range of interests: synoptic and dynamic meteorology, radiative transfer, severe local storms, plant-atmosphere interactions, aerosol transport and chemistry, remote sensing of land surface properties and surface energy processes, and, most recently, applications of remote sensing to the study of urban sprawl and small watershed runoff. In 1991 Professor Carlson published a widely used book on meteorology (Mid-Latitude Weather Systems). He created two new web products related to his current interest in land surface processes: an online land surface process model (“Simsphere”) and a database of impervious surface area and fractional vegetation cover determined from Landsat 5 digital imagery at 25 m resolution for all of Pennsylvania, 1985 and 2000. In addition he has helped create a web-based tool which allows one to assess the health (nutrient load) and surface runoff potential of a user-defined stream basin in Pennsylvania or in the Chesapeake Bay Basin.

Below is Toby’s Op-Ed, followed by a set of e-mail interchanges between Toby and me which are designed to expand on the Op-Ed [the e-mails were edited to focus on the Op-Ed issues].

The everlasting argument over climate change

Until about a decade or so ago, I was a global warming skeptic. Back in the 1990s, all sorts of claims were being made about climate change based on climate model simulations. At that time, the evidence was not clear and some of the research was underwhelming. I resolved to remain skeptical unless and until it could be demonstrated that these models were capable of simulating the indisputable increase in global temperature that occurred during the previous century, by initializing the models with atmospheric conditions one hundred years earlier. Only when these models showed that they were capable of predicting changes over a century up to the present would I begin to take them seriously.

This set of conditions has finally been satisfied. Climate model simulations have finally produced some convincing evidence of the effects of human activity on global climate change. Unlike previous types of computer simulations, the latest ones adopt the approach of predicting the present global temperature starting in the past, for example with the year 1890. I will now describe just one of several such climate simulations, albeit one of the first of its kind, made over a decade ago.

Two series of four computer simulations were made under the auspices of the National Center for Atmospheric Research (NCAR) and the Department of Energy using several different climate models, called General Circulation Models (GCMs). One series of computer runs included only the effects of volcanic eruptions and solar variations on the earth’s radiation budget; aerosol particles such as sulfates, and the notorious greenhouse gas carbon dioxide, were kept constant at their 1890 levels. Another series of simulations allowed the sulfates and greenhouse gases to vary according to their observed values. For convenience, I refer to the temperature trend simulated by the first set of runs as the ‘natural’ variation and the second set as the ‘total’ variation, as the latter contains both natural and anthropogenic effects. The difference between the two sets of simulations constitutes a measure of the human-induced effects on global climate. Unlike the unverifiable and more contentious predictions of future climate, these simulations are verifiable in that they can be compared with measured mean global temperature changes over the same period. In that sense the simulations can validate or invalidate themselves.
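The 'natural' vs. 'total' differencing just described is easy to sketch. Below is a toy illustration (synthetic series, not the NCAR output; all signal shapes, noise levels, and variable names are invented for the example):

```python
import numpy as np

# Attribution by ensemble differencing: anthropogenic signal estimated as
# mean("total" all-forcings runs) - mean("natural-forcings-only" runs).
rng = np.random.default_rng(1)
n_runs, n_years = 4, 111                         # four runs each, 1890-2000
t = np.linspace(0.0, 1.0, n_years)
natural_signal = 0.2 * np.sin(2.0 * np.pi * t)   # toy natural variability
anthro_signal = 0.6 * t ** 2                     # toy forced warming

natural_runs = natural_signal + 0.05 * rng.standard_normal((n_runs, n_years))
total_runs = (natural_signal + anthro_signal
              + 0.05 * rng.standard_normal((n_runs, n_years)))

# The difference of ensemble means recovers the forced signal to within
# the internal-variability noise of the ensembles.
anthro_estimate = total_runs.mean(axis=0) - natural_runs.mean(axis=0)
```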

Results of the computer runs were summarized in a letter from Richard Anthes, president of the University Corporation for Atmospheric Research, to Senator John McCain of Arizona. In that letter, Dr. Anthes emphasized the role of human activity in global warming and urged the senator to treat global warming as a serious issue.

I redrew the graph included by Anthes in the letter to Senator McCain, smoothing out the wiggles to show only the essential details of the two series of simulations. Zero on the temperature scale is an arbitrary reference corresponding to the average temperature between 1890 and 1919. I don’t show the observations because they fall almost exactly on the smoothed temperature line for the total simulations, which therefore assume a high degree of credibility.

An interesting aspect of this graph is that the warming trend from the 19th century until some time after 1960 can be accounted for by natural variability. Yet, I am impressed that one can reasonably ascribe about 1°F of the temperature rise during the past thirty years or so to human activity (the last point on each graph is extrapolated). According to Jerry Meehl, a scientist involved in these simulations at NCAR, carbon dioxide emissions have accelerated since 1960, raising global carbon dioxide concentrations from about 315 parts per million to 360 parts per million by the year 2000. (This is to be compared with about 275 parts per million in 1850.) As of the year 2012, carbon dioxide concentrations have exceeded 390 parts per million.

For me (a once-avowed skeptic of the global climate brouhaha), the graph was the first convincing evidence I had seen that global warming due to fossil fuel burning is significantly raising global temperature. Since then, the evidence for a human cause of global warming has become even more convincing: further simulations reproducing the original NCAR results, the accumulation of much more observational evidence, including temperatures showing an even steeper slope to the warming curve after the year 2000 than that shown in the figure, the 2007 IPCC report, sea level rise, Arctic ice disappearance, etc. In my opinion, further denials of the global warming evidence are likely to be based more on political than on scientific motives.

My Comment

Hi Toby

I will be asking questions on your views also, and you might like to add to your post based on this, which we can add. The first question is

“How do you reconcile your confidence in the model skill when the hindcast multi-decadal regional climate predictions are so poor, as I reported in my post

http://pielkeclimatesci.wordpress.com/2012/07/20/cmip5-climate-model-runs-a-scientifically-flawed-approach/

The second question is

“Which of the three hypotheses in

Pielke Sr., R., K. Beven, G. Brasseur, J. Calvert, M. Chahine, R. Dickerson, D. Entekhabi, E. Foufoula-Georgiou, H. Gupta, V. Gupta, W. Krajewski, E. Philip Krider, W. K.M. Lau, J. McDonnell,  W. Rossow,  J. Schaake, J. Smith, S. Sorooshian,  and E. Wood, 2009: Climate change: The need to consider human forcings besides greenhouse gases. Eos, Vol. 90, No. 45, 10 November 2009, 413. Copyright (2009) American Geophysical Union http://pielkeclimatesci.files.wordpress.com/2009/12/r-354.pdf

do you see as not being refuted?”

Best Regards

Roger

Toby’s Reply

Roger, I have not involved myself so deeply in the controversy as to address your questions. I am simply something more than a layperson but not a specialist. The graph I showed was an amalgam of four simulations combined for each of two conditions, with CO2 and without CO2. The simulations were made by a reputable group at NCAR, and they reproduced the observations exactly with CO2 and not without CO2. That was enough to convince me. I don’t know what you mean by poor predictions. The ones I have seen more recently seemed a good fit, though I have not studied the papers in detail.

My Comment

Hi Toby

Those questions are central to the issue of attributing all, most, some, or none of the observed warming to the added CO2. That added CO2 has a warming effect is not disputed by anyone. The ability of regional models to explain behavior and provide attribution for changes in drought frequency, heat waves, etc. is, in my view, the central issue. The global average surface temperature is almost irrelevant in this.

If you prefer just to focus on the correspondence between the CO2 increase and the global average temperature increase without any further discussion, we can still post your comments, but it may result in your being asked by readers to respond to the type of questions I asked. Do you still want to post given this might occur?

I would also like to post your reply below to my questions, which (as I would understand) you might not feel comfortable with. But let me know.

Roger

Toby’s Reply

Roger… I neglected to add a few more reasons why I changed my mind about human impact on global warming a decade ago, but these are well known and more conventional reasons: the IPCC 2007 report, the loss of sea ice in the Arctic, the rise in sea level, etc.

My Comment

Thanks Toby!

I will work with what we have. It will post sometime next week with our mails and your statement.

We differ significantly in our viewpoints, but your perspective should be presented. I will add a short bio on you, but please send me a paragraph so I can introduce you on the post.

Best Regards

Roger

Toby’s Comment

Roger, I am really just a bystander in the global warming debate. I don’t want to get drawn into a kind of biblical debate here, the kind that theologians tend to have amongst themselves.

I am certainly familiar with the issue and the physics and, to some extent, the models. But I have no ax to grind. I am simply posting my educated opinion and the reasons for my change of mind. I would not be able to handle detailed questions. For those, one should contact my PSU colleague, Michael Mann.

My arguments, besides those based on the NCAR models and the hockey stick graph, are as follows:

*  CO2 concentrations have increased almost 20% since 1960. This is unprecedented even in geological time. They are higher now than at any time in the past million years. If you want to argue that issue, see my other colleague, Richard Alley.

* Someone has made the calculation that the total amount of fossil fuel burned over some period of time corresponds roughly to the increase in CO2 during that period. This would be a relatively easy calculation to make if one had the time to do it. Therefore, the CO2 increase is almost all human-made.

* To say that an increase of 20% in CO2 would not make a difference means that the laws of radiative transfer must be discarded. It is no good to resort to Richard Lindzen’s arguments that feedbacks mitigate the effect (I understand that he has considered only negative feedbacks) or that the warming (which even he admits is occurring) will be no more than 0.5 C (even he would admit to a factor-of-two uncertainty in this sort of highly theoretical estimate).

Second, this is specious because his argument as to the unimportance of the increase is judgmental: that this is not an important increase. He made the same mistake that was made in a Wall Street Journal article saying that the increase in global temperature has only been 0.8 C over the past century; it’s really a bit more than that, but anyway…. Since the increase in global temperature since the Little Ice Age is ‘only’ about that amount, he is in effect saying that the rise in temperature is no more important than the difference in climate between the Little Ice Age and the present. I don’t think he meant to emphasize the importance of such a small increase. Such is the WSJ mindset.
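The "relatively easy calculation" mentioned in the second bullet above can be roughed out. The inputs below are round, order-of-magnitude values I am assuming for illustration (they are not from the post): roughly 350 Gt of carbon emitted from fossil fuels since the mid-1800s, and the standard rule of thumb of about 2.13 Gt of carbon per ppm of atmospheric CO2:

```python
# Rough comparison of cumulative fossil-fuel emissions with the observed
# atmospheric CO2 rise.  All inputs are coarse illustrative round numbers.
GTC_PER_PPM = 2.13                   # ~2.13 Gt carbon per 1 ppm of CO2

cumulative_emissions_gtc = 350.0     # rough fossil-fuel total, ~1850-2010
co2_start_ppm, co2_end_ppm = 275.0, 390.0

rise_gtc = (co2_end_ppm - co2_start_ppm) * GTC_PER_PPM  # carbon added to air
airborne_fraction = rise_gtc / cumulative_emissions_gtc
```

The atmospheric increase works out to the same order of magnitude as cumulative emissions (an implied airborne fraction of roughly 0.7 with these particular round numbers), which is the rough correspondence the bullet describes.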

Anyway, I doubt if my article will really provoke many to reply. It is simply an opinion article and the facts are not really in question in any case. I was simply indicating the issues that changed my mind on the subject.

Thanks for taking this so seriously and for including the essay on your web site.

Toby



Filed under Climate Science Op-Eds, Guest Weblogs

Repost Of “NOAA and GFDL CM2.1 Sea Surface Temperature Trends By Latitude” And “Sensitivity of the water vapor feedback to locations of SST trends” By Troy Ca

With permission, I am reposting two very interesting posts by Troy Ca on his weblog Troy’s Scratchpad which was motivated by Bob Tisdale and my post

Sea Surface Temperature Trends As A Function Of Latitude Bands By Roger A. Pielke Sr. and Bob Tisdale

Troy describes himself as

Yes, I am yet another software engineer with a climate science blog.  My hope here is to keep the focus technical and away from the drama we find on other climate sites in the blogosphere.  While that drama can be entertaining in its own right, there are plenty of other places to find it.

Besides — and I know I’m not the first person to say this — it’s a lot easier to publish a general comment attacking politics, motivations, or even a broad scientific topic, than it is to make a specific and coherent technical argument.

His excellent two posts are presented below:

Post #1

NOAA and GFDL CM2.1 sea surface temperature trends by latitude [July 17 2012]

I saw a post over at Roger Pielke Sr.’s a few days ago with some preliminary analysis of the sea surface temperature trends by latitude, the primary point of which seemed to be that the water vapor feedback may be overestimated, as the warmer waters in the tropics – which contribute more to evaporation – were actually heating at a slower rate than other regions. I was skeptical of a few aspects of this, particularly the analysis with the short period (2003-2011) and the fact that the width of the latitude bands showing the trends seemed to be arbitrary. Furthermore, what it showed was a general increase in temperature trend the further north it went over the period 1982-2011, which I seemed to recall was similar to the GFDL CM2.1 model run.

Thus, I decided to compare the monthly NOAA SST anomaly trends by latitude to several of the GFDL CM2.1 model-run trends by latitude over the same period (1982-2010 was the overlap here). I used the GFDL CM2.1 runs for CMIP5 (rather than the CMIP3 runs I had used before), so these should have used historical forcings up until 2005. I ultimately downloaded 5 of the 10 runs to get a decent sense of variability over that period. Intermediate data and code for this post are available here…you’ll see that I used 15-degree latitude bands to cut down on some of the noise.
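The band-trend computation described here can be sketched as follows. This is my own minimal reconstruction, not Troy's posted code (his data and script are linked in the post); the array names and the toy check are hypothetical:

```python
import numpy as np

def trends_by_band(sst, lats, band_width=15.0):
    """Least-squares linear trend of the cos(lat)-weighted band-mean of
    `sst` (shape: n_months x n_lats) for each latitude band.  Returns the
    band midpoints and the trend (per month) for each band."""
    t = np.arange(sst.shape[0])
    edges = np.arange(-90.0, 90.0 + band_width, band_width)
    mids, trends = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (lats >= lo) & (lats < hi)
        if not mask.any():
            continue
        w = np.cos(np.deg2rad(lats[mask]))           # area weighting
        band_mean = (sst[:, mask] * w).sum(axis=1) / w.sum()
        trends.append(np.polyfit(t, band_mean, 1)[0])
        mids.append(0.5 * (lo + hi))
    return np.array(mids), np.array(trends)

# Toy check: impose a known, stronger trend north of the equator
lats = np.linspace(-89.5, 89.5, 180)
months = np.arange(348)[:, None]                     # 29 years, monthly
per_lat_trend = np.where(lats >= 0, 2e-3, 1e-3)
field = months * per_lat_trend                       # synthetic (348, 180) SST
mids, tr = trends_by_band(field, lats)               # recovers both trends
```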

Anyhow, when it came to reproducing the absolute temperatures by latitude, GFDL CM2.1 did quite well, although perhaps this was to be expected:

noaa_gfdl_temperature_by_lat

However, I was somewhat surprised to see that the spatial pattern of the warming was quite different:

noaa_gfdl_sst_trends

While the model seems to overestimate the SST trends, this was perhaps common knowledge already.  Instead, what is more interesting is that despite the variability in the model runs, almost all of them seem to show the greatest rate of warming near the equator, thus producing the general shape we see in the blue line (the mean from the 5 runs).  Compare it to the NOAA SST, and there appears to be a discrepancy.

So, what effect will this have on the TOA radiation and the water vapor feedback?  I’m not sure.  My next step, if I find time, will be to compare a GFDL CMIP5 AMIP run (which has the atmospheric component tied to the actual SST observations up till 2008) to these coupled runs, perhaps using the GFDL water vapor kernels to convert the atmospheric specific humidity outputs in each into the global TOA radiative effect.  Or perhaps this work is already out there, and I need to search the existing publications.

Post #2

Sensitivity of the water vapor feedback to locations of SST trends [August 15 2012]

With a new computer that has the necessary disk space and memory, I was finally able to do some of the analysis I had talked about doing in the previous post, and the preliminary results seem quite interesting.  To recap, I wanted to see what effect the discrepancy in the rates of sea surface warming at different latitudes would have on the TOA radiation budget that could specifically be attributed to water vapor.  This was originally to check out Dr. Pielke Sr.’s  hypothesis that perhaps the different rates of evaporation in these areas would have an impact on the water vapor feedback.

Data and Methods

Here, I use the same 5 CMIP5 historical runs from the GFDL CM2.1 model as I did in that last post, in particular the specific humidity and surface air temperature values.  I estimated the TOA radiative impact specifically attributable to water vapor for each month by using the longwave water vapor radiative kernel derived from GFDL CM2.1.  I then compared this against the 3 GFDL HIRAM C180 AMIP runs available on the GFDL data portal, again converting specific humidity to the corresponding radiative anomaly.  Since the primary difference between the CMIP and AMIP runs is that the AMIP runs have the atmospheric model responding to actual sea surface temperature observations, if we assume that the atmospheric model in GFDL’s CM2.1 is very similar to GFDL’s HIRAM C180, we are essentially able to attribute the different water vapor responses specifically to the discrepancy in the location of SST trends.  The code and intermediate data for this post can be found here.
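The kernel step (converting specific-humidity output into a TOA radiative anomaly) can be sketched roughly as follows. This is a simplified, hypothetical version: real radiative kernels vary with month, longitude, and pressure level and are typically defined per unit change in ln(q) for a 1 K warming, whereas here I assume a pre-scaled kernel in W m^-2 per unit fractional humidity change, just to show the shape of the computation.

```python
import numpy as np

def wv_radiative_anomaly(q_anom_frac, kernel, lats):
    """Global-mean TOA LW anomaly attributable to water vapor (simplified).

    q_anom_frac : (n_levels, n_lats) fractional specific-humidity anomaly dq/q
    kernel      : (n_levels, n_lats) assumed pre-scaled kernel, in W m^-2 per
                  unit fractional humidity change per layer
    lats        : 1-D grid latitudes in degrees
    """
    per_lat = (q_anom_frac * kernel).sum(axis=0)   # vertical integral
    w = np.cos(np.deg2rad(lats))                   # area weights
    return float((per_lat * w).sum() / w.sum())
```

Applying this month by month to each run yields the radiative anomaly time series compared in the figure below.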

Results

wvRadAnomaly

In the figure above you can see the trend in the outgoing longwave radiation (OLR) over time in the coupled vs. atmosphere-only model runs.  The lower the OLR anomaly, the more heat that is “trapped” due to water vapor.  It is interesting to note that the AMIP runs, bounded as they are by actual SST observations, see a trend of much smaller magnitude than the fully coupled models (and, unsurprisingly, the runs show much less variation).  Clearly the discrepancy in SST trends has had a large impact on the TOA balance, at least with respect to water vapor.  However, if we’re estimating a global factor for the water vapor feedback, this in itself would not necessarily indicate an overestimation of the feedback (dR/dT) in the CM2.1 coupled model, since perhaps the denominator for the coupled model (the temperature trend) has been overestimated just as much as dR (one may argue that an overestimation in temperature trend could indirectly indicate an overestimate of some positive feedback, but it is by no means straightforward).

The figure below shows the temperature trend in the coupled runs vs. GISS.  I should mention that it is my understanding that the long term water vapor feedback will be primarily driven by ocean temperatures, rather than including land surface air temperatures.  However, given that feedbacks and sensitivity estimates are typically calculated with respect to surface air temperature, I’ve framed the results in this way to be able to make meaningful comparisons to other results.  As we can see, the trends tend to be higher in the coupled runs, but the difference in trends is not as drastic as in the OLR anomalies.

TemperatureAnomaly

Additionally, there are a few different ways we might calculate the feedback ratio, dR/dT.  Soden and Held (2006) take the mean from the last 10 years and subtract the mean from the first 10 years when calculating the feedback from models, but they had 110 years to work with; since we only have 28 years here, I’ve employed a similar method (method #2 in the figure below) using the mean from the last 5 years (2004-2008) and the first 5 years (1981-1985).  Another method I’ve employed (method #1), in order to use all years, is the ratio of linear trends in the wv-induced OLR changes and temperature changes, or (dR/dt) / (dT/dt) (such a method is used, for example, when determining trend amplification in land vs. ocean vs. lower troposphere).  These methods, I believe, are more likely to highlight the long-term (climate scale) radiative response relevant to climate sensitivity.

The final method (method #3) I show is regressing R_wv (yearly anomalies) directly against temperatures.  This method has been used in several papers where the satellite observations are limited (to ~10 years).  However, in this approach, you tend to be measuring the response to ENSO-induced interannual variability, unless the trends are strong enough to negate this…and as I’ve discussed before, there appears to be little evidence to suggest that the global radiative response to ENSO-induced temperature changes correlates well with the long-term response, and it certainly doesn’t in most models.
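The three estimators can be written compactly. A minimal sketch (my own function name; equal-length annual-mean series assumed):

```python
import numpy as np

def feedback_methods(years, R, T):
    """Three estimators of the feedback ratio dR/dT from annual series
    R (W m^-2) and T (K); years, R, T are equal-length numpy arrays."""
    # Method 1: ratio of linear trends, (dR/dt) / (dT/dt), using all years
    m1 = np.polyfit(years, R, 1)[0] / np.polyfit(years, T, 1)[0]
    # Method 2: difference of the last and first 5-year means
    # (a shortened analogue of the Soden & Held 2006 10-year means)
    m2 = (R[-5:].mean() - R[:5].mean()) / (T[-5:].mean() - T[:5].mean())
    # Method 3: direct regression of annual R anomalies on T
    m3 = np.polyfit(T, R, 1)[0]
    return m1, m2, m3
```

On a noise-free proportional series all three agree; with real interannual (e.g. ENSO-driven) variability, method #3 is the one most contaminated by the short-term response.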

And thus we have a boxplot of the estimated water vapor feedback:

FeedbackVals

Conclusions and Limitations

A simple glance would suggest that the water vapor feedback is significantly lower (~40%) in the runs bounded by SST observations than in the coupled runs, which would have a potentially large impact on the estimated sensitivity (for example, all else being equal and assuming an average water vapor feedback of 1.8 W/m^2/K in models, this sort of reduction in the WV feedback would lower the climate sensitivity estimate from 3.0 K to 1.9 K).  However, there are a couple of large caveats here, the first of which is whether a similar warming pattern of the sea surface (lower in the tropics) would be expected with the continual increase in CO2, or whether some sort of natural variability is impacting this pattern.  This goes for land vs. SST as well, where it appears the discrepancy between the SST and land rates is greater in observations than in the coupled model.  Additionally, it is quite possible that some of the lessening of the positive water vapor feedback may be counteracted by a decrease in the strength of the negative lapse rate feedback, which could mitigate the effect on overall sensitivity.
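The sensitivity arithmetic in the parenthetical can be checked directly, assuming the canonical 2xCO2 forcing of roughly 3.7 W/m^2 and that equilibrium sensitivity is the forcing divided by the net feedback parameter (all numbers below are illustrative assumptions, not values from the post):

```python
# Check of the quoted numbers (all values are assumptions for illustration):
F2x  = 3.7            # canonical 2xCO2 forcing, W m^-2
S0   = 3.0            # reference climate sensitivity, K
lam0 = F2x / S0       # implied net feedback parameter, ~1.23 W m^-2 K^-1
wv   = 1.8            # assumed model-average water vapor feedback, W m^-2 K^-1
dwv  = 0.4 * wv       # a ~40% reduction weakens the feedback by 0.72 W m^-2 K^-1
S1   = F2x / (lam0 + dwv)   # weaker amplification -> larger net damping
print(round(S1, 1))   # -> 1.9
```

So the quoted 3.0 K to 1.9 K reduction is internally consistent under these assumptions.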

Another thing that should raise some question marks is that all of the methods above, over this time period, seem to underestimate the 100-year water vapor feedback for GFDL CM2.1, at least according to Soden and Held (2006), where it is listed as 1.97 W/m^2/K.  Also, the all-sky radiative kernel should not be entirely relied upon, as it is based on models that do not accurately represent the cloud distribution (see errors in all-sky outgoing radiation).  One may also wonder why we don’t use available reanalyses.  Apart from the fact that we want to isolate the differences simply to water vapor concentrations resulting from observed sea surface temperatures, reanalysis products have also been shown to be inadequate for the task of reproducing long-term trends in water vapor [John et al., 2009; Serreze et al., 2012].

So, with regards to the original question about whether the discrepancy in the spatial distribution of sea surface temperature trends may affect the water vapor feedback specifically because of the different rates of evaporation, the last two posts at least provide some support for the hypothesis.  I could not say evaporation is the key here for sure, but neither do I see any evidence to dispute it.

I am a bit curious as to whether this method has been employed before (that is, combining AMIP models with radiative kernels for comparison against coupled runs for feedback analysis), as it seems straightforward and could provide a more extended period of analysis than is available in the satellite record.  It seems like it could be an interesting avenue to explore.

My Comment:  This is a very informative analysis, with issues that need exploration. It further documents, however, that the water cycle radiative feedback in the climate models is not accurately representing the real-world feedback, as can also be inferred from the paper

Sun, D.-Z., Y. Yu, and T. Zhang, 2009: Tropical Water Vapor and Cloud Feedbacks in Climate Models: A Further Assessment Using Coupled Simulations. J. Climate, 22, 1287-1304

that I posted on in

Tropical Water Vapor and Cloud Feedbacks in Climate Models: A Further Assessment Using Coupled Simulations by De-Zheng Sun, Yongqiang Yu, and Tao Zhang.

The Sun et al article concluded that

“The extended calculation using coupled runs confirms the earlier inference from the AMIP runs that underestimating the negative feedback from cloud albedo and overestimating the positive feedback from the greenhouse effect of water vapor over the tropical Pacific during ENSO is a prevalent problem of climate models.”

I hope Troy Ca pursues publishing his analyses.


Filed under Climate Change Forcings & Feedbacks, Guest Weblogs

Guest Post By Barry Lynn Of The Hebrew University of Jerusalem

At the request of a colleague of mine,  Barry Lynn, I have posted his guest contribution below.  While Barry and I disagree on some aspects of climate science, his constructive post provides a framework which we and others can discuss together.

Dr. Barry Lynn is a research scientist at the Hebrew University of Jerusalem. He has also worked as an associate research scientist at Columbia University and Carnegie Mellon University.  Dr. Lynn’s interests include studying the impacts of “greenhouse” gases on climate and the effect of aerosols on precipitation.  Many of his papers have been published by the AMS and JGR.  He is also the C.E.O of Weather It Is LTD (www.weather-it-is.com), a company that produces weather forecasts and climatological information, with an emphasis on deriving new economic applications from such products.

Guest Post By Barry Lynn

One often has memories of how things were.  For instance, I remember that a typical winter forecast was for 2-4 inches of snow.  Yet, in the last decade it seemed like it always snowed more than this, and that even weak storms brought 3-6 inches — even though it wasn’t necessarily that cold. Turning to the summer season, we looked forward to the few times we would sleep downstairs with the air conditioner on (some 30-odd years ago), and now air conditioner use is pretty standard almost *every* night.  One has the distinct impression that the seasonal temperatures (i.e. climate) have changed.

In fact, there is strong evidence that changes in surface temperatures (what we call our “climate”) are tied quite strongly to carbon dioxide (CO2) concentrations, and this goes back many, many thousands of years. See, for example,

http://www.agu.org/meetings/fm09/lectures/lecture_videos/A23A.shtml

Moreover, some skeptics of human-induced climate change have now changed their opinion, and now agree that elevated levels of carbon dioxide are indeed associated with surface warming.  The study by Richard Muller, incidentally funded in part by The Charles G. Koch Charitable Foundation — which has supported efforts opposing mainstream climate change science — supports the idea that emissions of CO2 are leading to higher surface temperatures.  You can read Professor Muller’s latest NY Times op-ed here:

http://www.nytimes.com/2012/07/30/opinion/the-conversion-of-a-climate-change-skeptic.html?pagewanted=all

A search of the internet also finds this note,

http://www.skepticalscience.com/satellite-measurements-warming-troposphere.htm

where we learn that changes to the processing of satellite data have led to changes in the calculated trends of satellite-inferred temperatures — now trending much closer to the observed surface measurements.

Yet, I do not profess to be an expert on whether or not climate change has happened or is happening — although I do believe that the occurrence of multiple (US) 100-degree days this summer ties in well with CO2-enhanced feedbacks between surface drought and the US synoptic pattern. Moreover, I don’t see any problem in questioning various conclusions that have been reached concerning surface temperature changes, so long as there is a scientific debate (see one of Professor Pielke’s recent posts on “Climate Science” for just such a debate).

The problem is that various newspaper editorial boards, for example, use these debates to question the validity of CO2 forcings on climate.  Reading the Wall Street Journal, for example, I can tell you that what the “layman” reads has a strong influence on his opinions (including mine).

For instance, I appreciate that moving our economy away from a full dependence on oil and coal (very polluting fuels) to renewables and other less polluting sources requires a real investment of money and resources. Yet, editorial-like “rants” are one of the reasons that a more rational formulation of policy to encourage the reduction of CO2 emissions remains quite difficult to enact.  And the costs of not doing anything may just be much higher than the cost of doing something:

http://dotearth.blogs.nytimes.com/2012/03/02/an-economist-rebuts-the-wall-street-journal-16-on-climate-risk/

A co-worker of Muller’s, Dr. Judith Curry, was recently quoted on Andrew Revkin’s Dot Earth, and her position summarizes the state of the debate quite well:

“No one that I listen to questions that adding CO2 will warm the earth, all other things being equal. The issue is whether anthropogenic activities or natural variability is dominating the climate variability. If the climate shifts hypothesis is correct (this is where I am placing my money), then this is a very difficult thing to untangle, and we will go through periods of rapid warming that are followed by a stagnant or even cooling period, and there are multiple time scales involved for both the external forcing and natural internal variability that conspire to produce unpredictable shifts.”

The evidence is in, and I think it is time to frame the debate along the consensus line, so as not to lose track of the goal: to minimize the potential harmful effects of CO2 on climate (at least in regard to us humans).

In my opinion, the responsible way to frame the debate would be to:

1) Accept the consensus that CO2 gas concentrations have played a critical role in modulating the earth’s temperatures.

2) The level of CO2 gases has risen and will continue to rise substantially because of human activities — unless steps are taken to reduce emissions from the “business as usual” approach.

3) There is a consensus that doubling CO2 gas levels will lead to an average 3 C change in surface temperatures (regional impacts could be more severe), with potentially detrimental changes in precipitation patterns (where the majority of people live).

4) There may be other factors, such as solar “dimming” or industrial aerosols, that may modulate the potential impact of CO2 on climate.

5) Understanding the potential impact of other factors on climate should be, and needs to be, the subject of intense research.

6) We should seek ways to mitigate the warming. For instance, were we able to artificially modulate the solar radiance at the surface (i.e., through encouraging the formation of reflective clouds), this might provide one approach to preserving the present climate in a more favorable state – even at elevated levels of CO2.

I recently read about an Israeli invention that can harvest CO2 from the air to produce hydrocarbon fuels (using solar power as the energy source). Is it at all feasible to think that CO2 harvesting can be done on an industrial scale?  We should find out. Our welfare and economic well-being probably depends upon it.


Filed under Guest Weblogs

Guest Post By Madhav Khandekar “Climate Catastrophe Or Media Hype?”

In response to Madhav Khandekar and Tom Harris’s interview in PJ Media titled

Climate Catastrophe or Media Hype?

which starts with

Subjected to a continual bombardment of catastrophism from climate activists, the public can be forgiven for assuming that recent extreme weather events, especially heat waves in North America, are unusual. Citizens would have little reason to suspect that most records for these phenomena were set many years ago.

But they were.

Madhav has provided us with the following summary:

“The current long, hot & dry summer in many parts of the US has prompted a number of climate scientists to suggest that this heat wave in the US and similar heat waves in Europe (in 2010 & 2003) are linked to human-added CO2 over the past several years. I believe it is important to analyse past heat waves in the US and elsewhere before linking the current heat wave to human-added CO2.

In the North American context, the decades of the 1920s and 1930s (known popularly as the Dust Bowl years) witnessed possibly the most anomalous climate of the 20th century, with recurring droughts and heat waves on the Great American Plains (Canada as well as the US). The decade of the 1930s saw long and recurring droughts on the Canadian-American Prairies. There were locations (in Canada and the US) where very little rain fell for an entire year or more in the 1930s! The highest temperature (at 45C) ever recorded in Canada was in a town in Saskatchewan in July 1937. Toronto, the largest city in Canada, recorded its highest temperature (at 41C) on three days in July 1936. Several locations in the US Midwest and the Canadian Prairies recorded unusually high temperatures during the 1920s and 1930s. Meteorologists and climate scientists still do not fully understand why the North American climate was so anomalous! Human-added CO2 was certainly NOT a factor in such long and recurring droughts (and associated heat waves) then!

The European heat waves of 2003 and 2010 have now been attributed to large-scale atmospheric patterns (with a blocking development) and NOT to increasing concentrations of atmospheric CO2. In the monsoonal climate of India (and South Asia), a heat wave of a few days to a week or longer often develops during the premonsoon months of April-June. Such heat waves (with maximum temperatures at 43C and above) are associated with mid-tropospheric flow and a possible delay in the arrival of monsoon rains. Heat waves in Australia are generally linked to the ENSO phase producing less rainfall (reduced cloud cover) and dry soil conditions. Heat waves in central Africa, in particular, are associated with the movement of the ITCZ and associated rainfall patterns.

There is a definite need to understand the mechanics of heat waves in different parts of the world before linking recent heat waves in Europe and the US to human-added CO2.”



Filed under Climate Science Reporting, Guest Weblogs

An Analysis By Donald Rapp Of The Levitus Et Al 2012 Analysis

Donald Rapp (see the post on his book – The Climate Debate) sent me the two figures above with the note below. It is an interesting presentation of the Levitus et al (2012) analysis data, which I felt was informative. I am presenting it here with his approval.  His e-mail transmitting the information follows. The Levitus et al 2012 paper is

Levitus, S., et al. (2012), World ocean heat content and thermosteric sea level change (0-2000 m), 1955-2010, Geophys. Res. Lett., doi:10.1029/2012GL051106, in press

which I have posted on several times; e.g. see

Comment On Ocean Heat Content “World Ocean Heat Content And Thermosteric Sea Level Change (0-2000), 1955-2010″ By Levitus Et Al 2012

The Overstatement Of Certainty In The Levitus Et Al 2012 Paper

Dear Roger:

You might find this interesting. If you take the integrated data of Levitus et al (2012) and differentiate it by painstakingly fitting straight lines to adjacent years, you get the attached curve (the vertical scale is in 10^22 J/yr). The average was 0.5 x 10^22 J/yr over this time period, and since the early 1990s, when there was 100% coverage, there has been no sign of any acceleration.

Don Rapp
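Rapp’s “fitting straight lines to adjacent years” is, in effect, a numerical differentiation of the annual ocean heat content series. A minimal sketch with toy data (my own function name; np.gradient gives centered slopes in the interior, one-sided slopes at the ends):

```python
import numpy as np

def ohc_rate(years, ohc):
    """Rate of change of ocean heat content (units of 10^22 J/yr) from an
    annual series, via local straight-line slopes (centered differences
    in the interior, one-sided at the ends)."""
    return np.gradient(ohc, years)

# Toy series rising steadily at 0.5 x 10^22 J/yr
yrs = np.arange(1955, 2011)
ohc = 0.5 * (yrs - 1955)
rate = ohc_rate(yrs, ohc)
```

Applied to the real Levitus series, an accelerating ocean heat uptake would show up as an upward trend in this rate curve; the note above reports no such sign since the early 1990s.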


Filed under Climate Change Regulations, Guest Weblogs

Guest Post “Modeled European Precipitation Change Smaller Than Observed” By Ronald van Haren, Geert Jan van Oldenborgh, Geert Lenderink and Wilco Hazeleger

Modeled European precipitation change smaller than observed

by  Ronald van Haren, Geert Jan van Oldenborgh, Geert Lenderink, Wilco Hazeleger of the Royal Dutch Meteorological Institute (KNMI)

Introduction

Now is an exciting time to do climate research. In many areas of the world climate change is emerging from the noise of natural variability. This opens the opportunity to compare the observed changes to the changes that are simulated by climate models. Climate models are mathematical representations of the climate system and should in principle give a physics-based response to increased concentrations of CO2 and other greenhouse gases, different types of aerosols, and solar and volcanic forcings. However, many processes are too small-scale or complex to be physically represented in the model and are parameterized: the average or expected effect of such processes is specified. Examples are clouds, thunderstorms, fog, and ocean mixing. The necessity to parameterize these processes adds model uncertainty into the simulations. Projections of the climate are also dependent on uncertainties in the forcings. Aerosol emissions and concentrations in the past are poorly known, and future social-economic developments that affect emissions of greenhouse gases, aerosols and land use change are uncertain. Finally, we should always keep in mind that the climate system also shows natural variations on different timescales.

To deal with these uncertainties, use is often made of multiple climate models: a multi-model ensemble. The spread between the model results of such an ensemble is a combination of model uncertainty and natural climate variability. Note that even when natural variability is low, the model uncertainty is not equal to the spread of the ensemble. It can be either larger (if all models fail to represent an essential process) or smaller (if the ensemble contains models of lower quality). For some models multiple realizations are available that allow an estimation of the natural variability from the spread within the model.
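The distinction between within-model spread (natural variability) and the spread of model means can be illustrated with a simple pooled-variance decomposition. This is a hypothetical sketch under the stated assumption that realizations of a single model differ only through internal variability:

```python
import numpy as np

def spread_decomposition(trends_by_model):
    """Split ensemble spread into within-model and between-model parts.

    trends_by_model : dict mapping model name -> array of trends from that
    model's realizations.  Assuming realizations of one model differ only
    through internal variability, the pooled within-model standard deviation
    estimates natural variability, while the spread of the model means
    reflects model uncertainty (plus residual sampling noise).
    """
    within = [np.var(v, ddof=1) for v in trends_by_model.values() if len(v) > 1]
    natural_sd = float(np.sqrt(np.mean(within)))          # pooled within-model SD
    means = [np.mean(v) for v in trends_by_model.values()]
    between_sd = float(np.std(means, ddof=1))             # spread of model means
    return natural_sd, between_sd
```

With few realizations per model, both estimates are of course noisy, which is one reason the ensemble spread is only a rough proxy for model uncertainty.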

To come back to our goal: to have confidence in future climate projections, a correct representation of trends in the past is necessary (but not sufficient). In a recent article (van Haren et al, Clim.Dyn., 2012) we investigated if modeled changes in precipitation over Europe are in agreement with the observed changes.

Results & Discussion

Clear precipitation trends have been observed in Europe over the past century. In winter (October – March), precipitation has increased in north-western Europe. In summer (April – September), there has been an increase along many coasts in the same area. Over the second half of the past century precipitation also decreased in southern Europe in winter (figures 1a and 1d). By comparing different analyses of precipitation, we checked that the differences between modeled and observed precipitation changes discussed in this article are much larger than the analysis uncertainty in the observations, except for some countries in eastern Europe that do not share much data. These analyses are partly based on the same station observations, but agreement between precipitation changes calculated over the second half of the past century and the complete past century gives further confidence that the observed changes are physical and not artifacts of changes in the observational methods.

An investigation of precipitation trends in an ensemble of regional climate models (RCMs) from the ENSEMBLES project shows that these models fail to reproduce the observed trends (figures 1b and 1e). In many regions the observed trend is larger than in any of the models. Similar results are obtained for the entire last century in a comparison of the observed trends with trends in global climate models (GCMs) from the CMIP3 co-ordinated modeling experiment. The models should cover the full range of natural variability, so the finding that the observed trend lies outside the ensemble implies that either the natural variability or the trend itself is underestimated. We compared the natural variability over the last century between the models and observations. The GCMs were indeed found to underestimate the variability somewhat, but the RCMs actually overestimate natural variability on the interannual time scale. In Europe, there is very little evidence of low-frequency variability over land beyond the integrated effects of interannual variability: both the observations and the models are compatible with white noise once the trend has been subtracted.
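The basic consistency check described here, namely whether the observed trend falls inside the range spanned by the ensemble members, can be sketched as follows (hypothetical function names, synthetic series):

```python
import numpy as np

def linear_trend(series, years):
    """Least-squares slope of a series against time."""
    return np.polyfit(years, series, 1)[0]

def observed_outside_ensemble(obs, members, years):
    """True if the observed trend lies outside the range of member trends,
    suggesting the ensemble underestimates the trend or the variability."""
    t_obs = linear_trend(obs, years)
    t_mem = np.array([linear_trend(m, years) for m in members])
    return bool(t_obs < t_mem.min() or t_obs > t_mem.max())
```

A min-max range check is the crudest version; with enough members one would instead ask whether the observed trend falls in the tail of the distribution of member trends.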

We also have available from ENSEMBLES regional climate model experiments in which the large scale circulation and sea surface temperatures are prescribed from reanalysis data, which are close to the observations. These simulations reproduce the observed precipitation trends much better (figures 1c and 1f). The observed trends are largely compatible with the (smaller) range of uncertainties spanned by the ensemble, indicating that the prescribed factors in regional climate models, large scale circulation and sea surface temperatures, are responsible for large parts of the trend biases in the GCM-forced ensemble and the GCMs themselves.

Figure 1: Comparison of observed and modeled precipitation trends over 1961-2000 [%/century]. (a) Relative trends in observed summer precipitation. (b) Mean relative trends of summer precipitation of the GCM-forced RCM ensemble. (c) Mean relative trends of summer precipitation of the RCM ensemble forced by reanalysis data. (d-f) As (a-c), but for winter precipitation.

Using a simple statistical model we next investigated the relative importance of these two prescribed factors. We find that the main factor setting the trend in winter is the large-scale atmospheric circulation (as we found earlier for the winter temperature trends). The air pressure over the Mediterranean area has increased much more strongly in the observations than in the models. In the summer season, sea surface temperature (SST) changes are important in setting precipitation trends along the North Sea and Atlantic coasts. Climate models underestimate the SST trends along the Atlantic coast, the North Sea and other coastal areas (if these are represented at all). This leads to lower evaporation trends and reduced trends in coastal precipitation.
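A simple statistical model of the kind described, regressing precipitation on the two prescribed factors, might look like the sketch below. It is a hypothetical illustration only: the authors' actual model is not specified in this summary, and the function name and synthetic series are mine.

```python
import numpy as np

def attribute_trend(precip, circ, sst):
    """Fit precip ~ a*circ + b*sst + c by ordinary least squares and return
    the sensitivities (a, b) to the two prescribed factors; the contribution
    of each factor to the precipitation trend is then its sensitivity times
    that factor's own trend."""
    X = np.column_stack([circ, sst, np.ones_like(circ)])
    coef, _, _, _ = np.linalg.lstsq(X, precip, rcond=None)
    return coef[:2]
```

If circulation and SST are strongly correlated over the fitting period, such a regression cannot cleanly separate their contributions, which is a standard caveat for this kind of attribution.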

Conclusions

The results of this study show that climate models are only partly capable of reproducing the details in observed precipitation changes: the local observed trends are often much larger than modeled in Europe. Because it is not clear (yet) whether the trend biases in SST and large scale circulation are due to greenhouse warming, their importance for future climate projections needs to be determined. Processes that give rise to the observed trends may very well be relatively unimportant for climate projection for the end of the century. Therefore, a straightforward extrapolation of observed trends to the future is not possible. A quantitative understanding of the causes of these trends is needed so that climate model based projections of future climate can be corrected for these trend biases.

References:

- Ronald van Haren, Geert Jan van Oldenborgh, Geert Lenderink, Matthew Collins and Wilco Hazeleger, SST and circulation trend biases cause an underestimation of European precipitation trends, Clim. Dyn. (2012), doi:10.1007/s00382-012-1401-5. preprint

- G. J. van Oldenborgh, S. Drijfhout, A. van Ulden, R. Haarsma, A. Sterl, C. Severijns, W. Hazeleger, and H. Dijkstra, Western Europe is warming much faster than expected, Clim. Past, 5, 1-12, 2009, doi:10.5194/cp-5-1-2009. full text

- van der Linden, P. and Mitchell, J. F. B. (Eds), ENSEMBLES: Climate Change and its Impacts: Summary of research and results from the ENSEMBLES project. Met Office Hadley Centre, 2009. book

- Meehl, Gerald A., Curt Covey, Karl E. Taylor, Thomas Delworth, Ronald J. Stouffer, Mojib Latif, Bryant McAvaney, John F. B. Mitchell, 2007: The WCRP CMIP3 Multimodel Dataset: A New Era in Climate Change Research. Bull. Amer. Meteor. Soc., 88, 1383-1394, doi:10.1175/BAMS-88-9-1383. Full text


Filed under Climate Change Metrics, Climate Models, Guest Weblogs

Guest Post By Richard McNider On The New JGR – Atmosphere Article “Response And Sensitivity Of The Nocturnal Boundary Layer Over Land To Added Longwave Radiative Forcing”

Guest Blog – Richard McNider, University of Alabama in Huntsville

We have just had a paper published in JGR entitled

McNider, R. T., G.J. Steeneveld, B. Holtslag, R. Pielke Sr, S. Mackaro, A. Pour Biazar, J. T. Walters, U. S. Nair, and J. R. Christy (2012). Response and sensitivity of the nocturnal boundary layer over land to added longwave radiative forcing, J. Geophys. Res., doi:10.1029/2012JD017578, in press. [for the complete paper, click here]

The paper addresses the diurnal asymmetry in warming that has occurred in the observed temperature trends of the last century, in which minimum temperatures have warmed at a substantially greater rate than maximum temperatures.  While the paper goes into considerable detail on the response of the stable boundary layer to radiative forcing that perhaps only a stable boundary layer junkie can appreciate, the implications of the paper, I believe, are critical to interpreting both the historical temperature data set and global modeling over the last century.  For those who do not want to be overwhelmed with details, I believe the introduction and conclusions are tractable for non-boundary layer specialists.

Here let me summarize and, at the end, editorialize on the key points of the paper.  In the last century minimum temperatures have warmed nearly three times more than maximum temperatures as captured by the NOAA Global Historical Climate Network. In fact this asymmetry is one of the most significant signals in the climate record and has been the subject of many papers.  Our paper shows that the CMIP3 climate models only capture about 20% of this trend difference. This is consistent with other studies. Because climate models have not captured this asymmetry, many investigators have looked to forcings or processes that models have not included, such as jet contrails, cloud trends, aerosols, and land use change, to explain the lack of fidelity of the models.  However, our paper takes an alternative approach that explores the role of the nonlinear dynamics of the stable nocturnal boundary layer, which may provide a general explanation of the asymmetry. This was first postulated in a nonlinear analysis of a simple two-layer model we carried out a few years ago (Walters et al. 2007), which indicated that slight changes in incoming longwave radiation from greenhouse gases might result in large changes in the near-surface temperature as the boundary layer is destabilized slightly by the added downward radiation. This produced a mixing of warmer temperatures from aloft to the surface as the turbulent mixing was enhanced, just as an increase in wind speed can destabilize the nighttime boundary layer and mix warm air from aloft to the surface.

The purpose of the present paper was to see whether this behavior in the simple two-layer model was retained in a more complete multi-layer column model of the stable boundary layer.  Basically, we subjected a nocturnal boundary layer to an added increment of downward radiation (4.8 W m-2) and then examined the difference from the model solution without this forcing.  We also carried out detailed budget calculations to see where the added energy ended up being deposited. Increased downward radiation from CO2 and water vapor feedbacks has been part of the global forcing in climate studies. However, aerosols can also add downward longwave radiation (Nair et al. 2011).

The results of these experiments showed that the stable boundary layer indeed grew slightly and became less stable due to the added longwave radiation. As the boundary layer grew and destabilized, warm air was entrained from aloft down to the surface by the added turbulence. The model showed that the 1.5 m air temperature (near the standard shelter height) warmed substantially due to this destabilization. Moreover, the budget calculations showed that only about 20% of the warming was due to the added longwave energy; most of the warming at shelter height was due to the warm air mixed down from aloft. This is illustrated in Figure 10 in the paper. Thus, this process is a highly sensitive positive feedback to surface warming.

Figure 10:  Difference in potential temperature profile between the case with added GHG energy and the base case for a geostrophic wind of 8 m s-1 (top), with an expanded view of the profile difference (bottom).

Our budget calculations in the paper also showed that the ultimate fate of the added input of longwave energy was highly sensitive to boundary layer parameters and turbulence parameterizations. In our simple model, the added radiation could go to heating the atmosphere, heating the near-surface ground, heating the deep ground, or be lost to radiative emission from the skin surface. The model showed that at light winds (with weak turbulence) the atmosphere was not able to effectively lift this energy off the surface and into the atmosphere; thus, more radiation was emitted from the surface. If soil conductivity and/or heat capacity were large, more of the energy would go to heating the ground. When we tested boundary layer parameterizations of the type employed in large-scale models, we found they generally added much more sensible heat to the atmosphere as opposed to losing it by radiation or to the ground.

To capture the type of sensitivity we found in our model, climate models would need very fine vertical resolution and also stable boundary layer parameterizations that do not add the large background mixing often imposed in large-scale models with coarse resolution. Our paper also showed that the stable nocturnal boundary layer was very sensitive to the turbulence parameterization and to surface characteristics such as roughness and land surface heat capacity and conductivity. In fact, because current coarse-resolution global models do not capture the asymmetry in warming of minimum temperatures and likely do not represent the stable boundary layer very well, we further suggested that faithful replication of the night-time warming may be out of reach of current models.

Thus, when current climate models are tested against past climates or used to project future global warming, it may be better to use only maximum temperatures rather than the current metric of mean daily temperature, which contains the minimum temperature. Of course, changes in night-time temperatures represent real changes and possible impacts to the climate system (e.g., melting ice), to society (agricultural productivity), and to ecosystems. Thus, ultimately we need to develop climate models that do have the resolution and sensitivity to capture changes in minimum temperatures.

In this blog I would like now to editorialize on the implications of this work, which were not explicitly stated in the peer-reviewed paper. While the asymmetrical warming of the nighttime temperatures, and the lack of fidelity of models in capturing the asymmetry, has also been the subject of other papers, it seems that no one has looked at the implications for the general ability of models to forecast climate change. But consider the following thought experiment.

Model credibility in the IPCC has been based on the ability to replicate the last 130 years of the global instrumental temperature record with anthropogenic forcing. But remember that the global temperature record in such comparisons is based on the daily Tmean (the average of Tmax and Tmin). If models are replicating Tmean but are not capturing the trend in Tmin, then the model Tmax must be warming faster than the actual Tmax.  Also, if most of the warming in the instrumental record is warming in the nighttime boundary layer, then by its very nature this is warming of a very thin layer, of order 200 m or so. In fact, if our results are correct, it is only the lowest part of the nighttime boundary layer that is being warmed, a thin layer of no more than 20-50 meters. Maximum temperature observations, made in daytime boundary layers that are 1-2 km in depth, reflect the temperature of a much deeper layer.  Thus, the instrumental observational data, when viewed in light of boundary layer theory, show that most of the warming is occurring in a very thin layer, and that the deeper atmosphere, as captured by Tmax, is not warming as much as models indicate.
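The arithmetic of this thought experiment can be made explicit. In the sketch below the absolute trend magnitudes are illustrative placeholders; only the roughly 3x Tmin/Tmax ratio and the roughly 20% model fraction are taken from the discussion above:

```python
# Back-of-envelope version of the thought experiment. Units are
# arbitrary (think degrees per century); only the ratios matter.

# Assumed observed trends: minimum warming ~3x maximum (GHCN result).
obs_tmax = 1.0
obs_tmin = 3.0 * obs_tmax
obs_tmean = (obs_tmax + obs_tmin) / 2          # = 2.0

# Models are taken to reproduce the Tmean trend but capture only ~20%
# of the Tmin - Tmax trend difference (the CMIP3 result cited above).
model_diff = 0.2 * (obs_tmin - obs_tmax)       # = 0.4
model_tmean = obs_tmean                        # replicates the record
model_tmax = model_tmean - model_diff / 2      # = 1.8
model_tmin = model_tmean + model_diff / 2      # = 2.2

# Conclusion of the thought experiment: the model Tmax trend (1.8)
# exceeds the observed Tmax trend (1.0) even though Tmean matches.
```

In these units the model maximum-temperature trend comes out nearly double the observed one, which is the sense in which matching Tmean while missing the Tmin trend forces the model Tmax to warm too fast.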

However, one of the largest positive feedbacks in climate simulations is the accumulation of additional water vapor as the deep atmosphere warms, which adds an additional greenhouse effect. In fact, the water vapor effect depends on a deep layer of added water vapor. If the deep atmosphere is not warming, then this water vapor feedback will not be nearly as strong; thus, models may be overstating the water vapor feedback. This in turn invites revisiting the continued controversy over the difference between the warming of the lower troposphere as measured by satellite and balloon-borne data and the warming predicted by models since 1979.

Because of the disagreement between the satellite and balloon data and models, the consensus of most of the climate change community has been that the warming in the surface data is consistent with models, so there must be problems of method or sampling in the satellite or balloon data sets.  However, if you discount Tmin as a measure of the heat content of the deeper atmosphere, it is likely that trends in observed Tmax may be more consistent with the satellite and balloon data. Of course, here we are talking about land processes; the total warming over both land and ocean would also have to be considered. However, I believe this deserves further detailed investigation by both the observational and modeling communities to determine whether this thought experiment is valid.

With regard to the oceans (since I started my career as an ocean modeler), I think we should also be careful about similar turbulent processes connecting the atmosphere and the ocean surface. Just as for the land surface, the ultimate fate of added energy may be tied to the details of how efficiently and quickly turbulence in the atmosphere and the ocean can remove this added energy from the skin surface. Any errors in this near-surface turbulence will affect the fate of the added energy. I am not at all certain that coupled ocean-atmosphere models get these details right.

References

Nair, U. S., R. McNider, F. Patadia, S. A. Christopher, and K. Fuller (2011), Sensitivity of nocturnal boundary layer temperature to tropospheric aerosol surface radiative forcing under clear‐sky conditions, J. Geophys. Res., 116, D02205, doi:10.1029/2010JD014068.

Walters, J. T., R. T. McNider, X. Shi, and W. B. Norris (2007), Positive surface temperature feedback in the stable nocturnal boundary layer, Geophys. Res. Lett., 34, L12709, doi:10.1029/2007GL029505.


