Monthly Archives: April 2012

Comment On Ocean Heat Content “World Ocean Heat Content And Thermosteric Sea Level Change (0-2000), 1955-2010″ By Levitus Et Al 2012

I was alerted by Marc Morano and others today to a new paper on changes in upper ocean heat content between 1955 and 2010. The new paper is

Levitus, S., et al. (2012), World ocean heat content and thermosteric sea level change (0-2000 m), 1955-2010, Geophys. Res. Lett., doi:10.1029/2012GL051106, in press

with the abstract [highlight added]

We provide updated estimates of the change of heat content and the thermosteric component of sea level change of the 0-700 and 0-2000 m layers of the world ocean for 1955-2010. Our estimates are based on historical data not previously available, additional modern data, correcting for instrumental biases of bathythermograph data, and correcting or excluding some Argo float data. The heat content of the world ocean for the 0-2000 m layer increased by 24.0×10²² J corresponding to a rate of 0.39 W m⁻² (per unit area of the world ocean) and a volume mean warming of 0.09°C. This warming rate corresponds to a rate of 0.27 W m⁻² per unit area of earth’s surface. The heat content of the world ocean for the 0-700 m layer increased by 16.7×10²² J corresponding to a rate of 0.27 W m⁻² (per unit area of the world ocean) and a volume mean warming of 0.18°C. The world ocean accounts for approximately 90% of the warming of the earth system that has occurred since 1955. The thermosteric component of sea level trend is 0.54 mm yr⁻¹ for the 0-2000 m layer and 0.41 mm yr⁻¹ for the 0-700 m layer of the world ocean for 1955-2010.

They list their key points as

  • A strong positive linear trend exists in world ocean heat content since 1955
  • One third of the observed warming occurs in the 700-2000 m layer of the ocean
  • The warming can only be explained by the increase in atmospheric GHGs

The third bullet they list is just a regurgitation of the IPCC claim that only increases in atmospheric GHGs can cause warming in the oceans. This is a model-based claim and seems to be standard for the Levitus papers. Nonetheless, the observational part of the paper is quite informative.

The value of using ocean heat content as the metric to diagnose global warming and cooling was discussed in the papers

Ellis et al. 1978: The annual variation in the global heat balance of the Earth. J. Geophys. Res., 83, 1958-1962

Pielke Sr., R.A., 2003: Heat storage within the Earth system. Bull. Amer.  Meteor. Soc., 84, 331-335.

Pielke Sr., R.A., 2008: A broader view of the role of humans in the climate system. Physics Today, 61, No. 11, 54-55.

First, if 1/3 of the heating really is at depths below 700 m, this further discounts the value of using the annual average global surface temperature trend as the diagnostic to monitor global warming, as was discussed in the post

Torpedoing Of The Use Of The Global Average Surface Temperature Trend As The Diagnostic For Global Warming

Second, in the Levitus et al figure, reproduced below, there is a more-or-less monotonic increase in heat since about 1990. Over the surface-to-2000 m layer, this accumulation of about 14×10²² Joules corresponds to a rate of heating of ~0.41 Watts per meter squared. As they also note in their paper, about 1/3 of this heating is at levels below 700 m.
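The conversion from an accumulated heat change to an equivalent heating rate is simple arithmetic. The short sketch below (with assumed round values for the ocean and Earth surface areas, which are not taken from the paper) reproduces the rates quoted in the Levitus et al abstract, and the ~0.41 W m⁻² figure above follows if the ~14×10²² J is spread over roughly two decades and expressed per unit area of the Earth’s surface.

```python
# Rough check of the heat-content-to-flux conversions discussed above.
# Assumed constants (not from the paper): ocean and Earth surface areas, seconds per year.
OCEAN_AREA_M2 = 3.6e14     # approximate world ocean surface area
EARTH_AREA_M2 = 5.1e14     # approximate total Earth surface area
SEC_PER_YEAR = 3.156e7

def heating_rate_w_per_m2(delta_joules, years, area_m2):
    """Convert an accumulated heat change (J) over a period (years) to W m^-2."""
    return delta_joules / (years * SEC_PER_YEAR * area_m2)

# Levitus et al 2012: 24.0e22 J over 1955-2010 for the 0-2000 m layer
print(heating_rate_w_per_m2(24.0e22, 55, OCEAN_AREA_M2))  # ~0.38, cf. 0.39 W m^-2 in the abstract
print(heating_rate_w_per_m2(24.0e22, 55, EARTH_AREA_M2))  # ~0.27 W m^-2 per unit area of Earth

# ~14e22 J accumulated since about 1990, read off their figure
print(heating_rate_w_per_m2(14.0e22, 21, EARTH_AREA_M2))  # ~0.41 W m^-2
```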

There is, therefore, a question about their analysis of heating since the Argo network became dense enough, in 2003, to provide a more homogeneous analysis of upper ocean temperatures. NOAA’s Pacific Marine Environmental Laboratory presents the analysis below, which shows a muted warming in the upper 700 m since 2003 as compared with the Levitus et al data.

In any case, let’s just use the Levitus et al 2012 analysis to compare with the prediction made by Jim Hansen. In Jim’s comment in 2005,  he wrote

The Willis et al. measured heat storage of 0.62 W/m2 refers to the decadal mean for the upper 750 m of the ocean. Our simulated 1993-2003 heat storage rate was 0.6 W/m2 in the upper 750 m of the ocean. The decadal mean planetary energy imbalance, 0.75 W/m2, includes heat storage in the deeper ocean and energy used to melt ice and warm the air and land. 0.85 W/m2 is the imbalance at the end of the decade [note added: he is referring to the 1990s].

Thus, using either the 1955 to 2010 time period or the shorter time period from 1990 to 2010 in the Levitus et al 2012 paper, the diagnosed magnitudes of ocean warming and global warming are significantly less than claimed by Jim Hansen in 2005. This discrepancy is even larger if we use NOAA’s Pacific Marine Environmental Laboratory data.

source of image at top

Comments Off

Filed under Climate Change Metrics, Research Papers

Sea Ice Prediction – Update To 2012

Update #7

I agree with Grant Foster and dana1981 that it is a bad idea to make claims based on short trends, because the error bars in mapping them onto longer term trends are going to be large. I should have presented my reasoning for doing this more clearly.

In examining the Vinnikov et al 1999 paper, I did not explain that my use of their trend values was meant to show that short term assessments have value for quantities which involve inertia (mass), such as heat and ice. If the sea ice were to recover to its original area and thickness (for whatever reason), for example, it does not matter what its long term trend was. The long term trend (if there is one) would be reset. I have made this point often with respect to ocean heat content (e.g. see). It also applies to sea ice (although area is only one part of it).

Trend analysis, as interpreted by Grant Foster and dana1981, fundamentally assumes that there is an overarching control over the long time period. My perspective is to see if there are change points which interrupt that trend (e.g. 2006). I look forward to seeing what they obtain for sea ice area and for insolation-weighted sea ice using the same approach they used for sea ice extent.

Update #6 Skeptical Science has joined with Grant Foster to dismiss the claim in my posts that the Vinnikov et al 1999 model prediction overpredicted Arctic sea ice loss. Indeed, they conclude the opposite. The Skeptical Science post is Lessons from Past Predictions: Vinnikov on Arctic Sea Ice. They both criticize my post on the grounds that I visually extracted the trend and also cherrypicked the start of the time period.

The visual examination of a time series to look for patterns is a classic approach that has been used in developing atmospheric boundary layer formulas for use in models, as I learned from Hans Panofsky years ago. The selection of a starting time (2006) was based on this figure, and it does show a clear breakpoint. David Douglass has used this concept of breakpoints (although for different years) with respect to ocean heat content in his paper

D.H. Douglass, R.S. Knox, 2012: Ocean heat content and Earth’s radiation imbalance. II. Relation to climate shifts. Physics Letters A. http://dx.doi.org/10.1016/j.physleta.2012.02.027

The Skeptical Science weblog post concluded that “the Arctic sea ice decline is about 27 years ahead of the Vinnikov model predictions.”

I did make an error in my post by assuming that anomalies in sea ice extent and sea ice area would be the same. I am convinced from the analyses by Grant Foster and dana1981 that this is incorrect. I have always focused on sea ice area since this is the more accurate way to diagnose the albedo radiative feedback.

Since Grant Foster and dana1981 at Skeptical Science have the statistical tools ready to analyze the data, I request, in order to convincingly show that the Arctic sea ice decline is as advanced as they state, that they

i) perform the same analysis for sea ice area that they have done for sea ice extent

and

ii) perform the analysis of insolation-weighted sea ice trends; e.g. see

Pielke Sr., R.A., G.E. Liston, and A. Robock, 2000: Insolation-weighted assessment of Northern Hemisphere snow-cover and sea-ice variability. Geophys. Res. Lett., 27, 3061-3064

and

Pielke Sr., R.A., G.E. Liston, W.L. Chapman, and D.A. Robinson, 2004: Actual and insolation-weighted Northern Hemisphere snow cover and sea ice — 1974-2002. Climate Dynamics, 22, 591-595, DOI: 10.1007/s00382-004-0401-5.

With these two other analyses, we will see if the Arctic sea ice is actually in the “death spiral” they report based on their analysis. They may be correct, but from the figure below using sea ice area from The Cryosphere Today, it certainly looks like a hiatus in its demise since 2006.
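For readers unfamiliar with the insolation-weighting idea in the two papers listed above, the toy sketch below illustrates the general notion only (it is not the method, data, or code of those papers): each month’s sea ice anomaly is weighted by roughly how much sunlight is available, so ice changes during the dark polar winter contribute little to the implied albedo feedback.

```python
import numpy as np

# Toy illustration of insolation weighting; the monthly anomalies and the
# insolation values are placeholders, not the data used in the papers above.
rng = np.random.default_rng(0)
anom = rng.normal(0.0, 0.5, size=(30, 12))   # 30 years x 12 months of NH sea ice area anomalies (10^6 km^2)

# Rough monthly-mean top-of-atmosphere insolation for the Arctic cap (W m^-2); illustrative values only
insolation = np.array([0, 5, 60, 180, 320, 420, 380, 250, 110, 25, 2, 0], dtype=float)
weights = insolation / insolation.sum()

weighted = (anom * weights).sum(axis=1)      # one insolation-weighted value per year
unweighted = anom.mean(axis=1)
print(weighted[:3], unweighted[:3])          # the two series can differ noticeably
```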

Update #5   Some of the readers of my weblog are wondering who Tamino is.  The scientific debate on the sea ice issue, of course, is independent of who makes comments, but professional courtesy requires that an individual identify themselves. Tamino is Grant Foster of Tempo Analytics in Garland,  Maine [h/t Anthony Watts]. He has a peer-reviewed publication with Stefan Rahmstorf

Grant Foster and Stefan Rahmstorf 2011: Global temperature evolution 1979–2010. Environ. Res. Lett. 6 044022 doi:10.1088/1748-9326/6/4/044022.

and one with Mike Mann, James Annan, and Gavin Schmidt

Foster, G., J. D. Annan, G. A. Schmidt, and M. E. Mann (2008), Comment on ‘‘Heat capacity, time constant, and sensitivity of Earth’s climate system’’ J. Geophys. Res., 113, D15102, doi:10.1029/2007JD009373.

His expertise clearly is in statistics, and he wants to apply this skill to climate analysis.

He also has a quite good study, published February 12 2012, titled

Pine Beetle Infestation and Fire Risk in the Black Hills

where he writes as part of his conclusion

Surely, excessive rhetoric about the urgent fire danger posed by pine beetle infestation, sometimes to the point of hysteria, does not serve the public interest.

[he should apply the same advice to his treatment of climate science]

His real world identity and measured scientific approach, however, conflict with his handling of his posts and those of his commenters on Tamino. Tamino has a number of the same disparaging commenters that appear on Skeptical Science. The failure to identify oneself when discussing a scientific issue shows a lack of professional courtesy.

Now, with respect to the sea ice discussion,  I have asked the question of Grant Foster

What criteria in the observations would have to occur, before you would reject the model predictions of sea ice coverage?

Provide a benchmark criterion to assess for the coming seasons. From my perspective, the models are not refuted if the anomalies in sea ice areal coverage fall at close to, or greater than, a rate on the order of, say, 100,000 square kilometers per decade in the next few years.

Update #4 April 26 2012 In the latest comment in the Tamino post Do the Math, I am criticized for using sea ice area and not sea ice extent (along with the now-to-be-expected personal insults by a number of commenters on Tamino). However, in my view, since it is area, not extent, that better maps to the radiative feedback, the Cryosphere Today presentation is preferred. Perhaps Tamino should show the Vinnikov et al prediction (if he can obtain it) for the years up to 2012 and beyond in terms of sea ice area.

In terms of the Vinnikov et al plot, there clearly is an ambiguity, as the Hadley and the GFDL results for sea ice extent are so different. To repeat, the use of anomalies provides a way to avoid the absolute value of sea ice extent or area, as both are quite dependent on the precise definition that is used to define them (as illustrated by the Hadley and GFDL analyses). Tamino would have to show that the anomalies for the extent and the area coverage were distinctly different, and then I would agree that the two should not be interpreted as having the same anomalies.

There is one valid point, however, that is raised in the comments. Is the break-point visible in 2006 statistically significant? We are pursuing this analysis and will report on this weblog when we have results.
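One simple way such a break-point test could be framed (an illustration only, not the analysis we are pursuing) is to compare a single linear fit against a two-segment fit split at 2006 using an F-test. Note that daily anomalies are strongly autocorrelated, so the naive degrees of freedom used below overstate significance.

```python
import numpy as np

def rss(y, yhat):
    """Residual sum of squares."""
    return float(np.sum((y - yhat) ** 2))

def fit_line(t, y):
    """Ordinary least-squares straight-line fit, returned as fitted values."""
    return np.polyval(np.polyfit(t, y, 1), t)

def break_test(t, y, t_break):
    """F statistic comparing one line (2 parameters) with two independent segments (4 parameters)."""
    full = rss(y, fit_line(t, y))
    left, right = t < t_break, t >= t_break
    split = (rss(y[left], fit_line(t[left], y[left]))
             + rss(y[right], fit_line(t[right], y[right])))
    df1, df2 = 2, len(t) - 4
    return ((full - split) / df1) / (split / df2)

# Placeholder annual-mean anomalies; substitute the actual Cryosphere Today series.
t = np.arange(1979, 2013, dtype=float)
y = -0.05 * (t - 1979) + np.random.default_rng(1).normal(0, 0.2, t.size)
print(break_test(t, y, 2006.0))
```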

Finally, if Tamino wants to have a constructive discussion, I recommend he also examine the magnitude of the insolation-weighted radiative feedback from sea ice area coverage, as we presented in our papers

Pielke Sr., R.A., G.E. Liston, and A. Robock, 2000: Insolation-weighted assessment of Northern Hemisphere snow-cover and sea-ice variability. Geophys. Res. Lett., 27, 3061-3064

and

Pielke Sr., R.A., G.E. Liston, W.L. Chapman, and D.A. Robinson, 2004: Actual and insolation-weighted Northern Hemisphere snow cover and sea ice — 1974-2002. Climate Dynamics, 22, 591-595, DOI: 10.1007/s00382-004-0401-5

Of course, this discussion of a break point becomes moot if the sea ice decline prior to 2006 resumes. However, until and unless that happens, Tamino (and a number of the commenters) are not, in my view, following the scientific method, which is to seek to refute hypotheses (in this case the Vinnikov et al or other model predictions) rather than to defend the models. Tamino asks the question

“Could it be that Roger Pielke is actually aware of that, but that he really doesn’t care about portraying sea ice changes correctly, he only cares about discrediting global warming science?”

This question is absurd and insulting, but it does show Tamino’s mindset. I am not discussing global warming science at all in the sea ice post. I am seeking to refute (test) the Vinnikov et al prediction. That is the scientific method. If you are convinced, and present evidence, as you have, that I have been unsuccessful so far, that is an appropriate scientific debate. I can then counter with other information. However, to misrepresent my views on climate science, when I specifically referred you to a summary article on my views,

Pielke Sr., R., K.  Beven, G. Brasseur, J. Calvert, M. Chahine, R. Dickerson, D.  Entekhabi, E. Foufoula-Georgiou, H. Gupta, V. Gupta, W. Krajewski, E.  Philip Krider, W. K.M. Lau, J. McDonnell,  W. Rossow,  J. Schaake, J.  Smith, S. Sorooshian,  and E. Wood, 2009: Climate change: The need to consider human forcings besides greenhouse gases.   Eos, Vol. 90, No. 45, 10 November 2009, 413. Copyright (2009) American   Geophysical Union

is dishonest.

Update #3 April 25 2012 The comments on Tamino on their post Do the Math raised the issue of why I chose to use the long-term analysis of near-daily anomalies of sea ice area rather than yearly averages of sea ice extent for my comparison with the Vinnikov et al result. The answer is that there is more information in the anomalies. For example, see

The absolute values on the left side of the Vinnikov et al plots are also misleading, as the real world annual average of sea ice extent shown in the above figure from The Cryosphere Today is clearly not even close to the values plotted for the GFDL model, or to the Chapman and Walsh, or Parkinson et al, observed data reported in their paper. The reason for the disagreement is likely due to different definitions of sea ice extent. Showing anomalies from the current data on The Cryosphere Today is a way to focus just on the departures and their trend over time, and to avoid dealing with the absolute values themselves.
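For concreteness, a minimal sketch of this anomaly construction follows (the baseline handling is illustrative, not necessarily that used by The Cryosphere Today):

```python
import numpy as np

def daily_anomalies(values, day_of_year, in_baseline):
    """Subtract a day-of-year climatology computed over a baseline period.

    values: daily series; day_of_year: integers 1..366; in_baseline: boolean mask of baseline days.
    """
    values = np.asarray(values, dtype=float)
    day_of_year = np.asarray(day_of_year)
    in_baseline = np.asarray(in_baseline, dtype=bool)
    anomalies = np.empty_like(values)
    for d in np.unique(day_of_year):
        sel = day_of_year == d
        clim = values[sel & in_baseline].mean()   # climatology for this calendar day
        anomalies[sel] = values[sel] - clim
    return anomalies
```

Because the climatology is subtracted, two series built with different absolute definitions of extent or area can still be compared through their departures and trends.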

The comments on Tamino disparage the use of visual information to interpret the behavior of the sea ice. Instead they focus on linear (or near-linear) quantitative trend analyses. However, if the system is not behaving in a near-linear (monotonic) way (such as shown in the Levitus et al figure 1, right), their analysis will miss obvious changes in the behavior of the data. Everyone, if they are being objective, will see a break in the slope of the anomaly plots since 2006.

Finally, reading the comments on Tamino illustrates the lack of the professional courtesy that should be the benchmark of science. I am not presenting my comments on Tamino for that reason. Most of the commenters also hide behind anonymity to insult rather than actually debate a scientific issue. The Tamino weblog approach illustrates the unfortunate state of scientific debate with respect to climate science. They are defending the model predictions, rather than following the scientific method to determine if there are real world observations that refute the models (which are, after all, hypotheses). If all attempts to reject the model predictions fail, then they are accepted as a robust prediction and attribution tool. A simple question for Tamino would be: what criteria in the observations would have to occur before he would reject the model predictions of sea ice coverage?

Update #2 April 24 2012 There is a comment by “Ned” at Tamino that presents the plot of “annual mean sea ice extent” and shows that it has decreased faster than the Vinnikov et al model results. I disagree that this is the proper metric to show; anomalies provide the more appropriate comparison, as this avoids the confusion as to which value is actually plotted in the Vinnikov et al paper with respect to the models. Moreover, the plot of anomalies shows why using linear (or quasi-linear) trends is misleading. The sea ice data had a significant change in its trend in 2006. Tamino and his commenters appear unable or uninterested in actually constructively discussing this issue. They have also chosen to ignore the inconvenient behavior of the Antarctic sea ice coverage.

Update April 24 2012 – Tamino has a post titled

Do the Math

where he disagrees with my post. Unfortunately, he does not present an honest view of what I wrote. He writes

I refer the reader to our advice on “Defense Against the Dark Arts.” His misdirection is revealed by Step 3: look at more than they show you, and be especially wary of time spans that are too brief and areas that are too small.  In this case the “time spans that are too short” alarm is flashing red — not only has Pielke cherry-picked his starting point, he’s comparing a predicted long-term trend to an observed time span of far less than a decade.  That’s foolish of him, and misleading to his readers.

Tamino also writes

Note also that when you start “since 2006″ or later, the error bars on the estimated rates are rather large.  The trend for such short time spans is so uncertain, it really doesn’t give much information.

In other words, Tamino, for some reason, claims the “error bars” are too large, which is absurd, as the sea ice coverage data analysis is quite robust, and 2006 was chosen because it is clearly a break-point in the sea ice data.

He estimates the trend not just from 2006 to the present, but from every starting year since 1999 to the present, and obtains, for example, a trend from 2009 to the present of -2 million km² per decade. This is obviously quite different from what is shown below from The Cryosphere Today. Perhaps Tamino should take his own advice and “do the math“.

He then concludes by imputing a motive

Could it be that Roger Pielke is actually aware of that, but that he really doesn’t care about portraying sea ice changes correctly, he only cares about discrediting global warming science

Tamino is quite disingenuous in his post. I do not disagree that the Arctic sea ice has been decreasing. My post was to compare the sea ice anomaly trends that were presented in the Vinnikov et al paper to real world observations updated to 2012. The figure in the Vinnikov et al 1999 paper shows a rather monotonic (and accelerating over time) decrease in Arctic sea ice extent with time. Tamino ignores what is obvious in even a visual comparison between the Vinnikov et al plot and the real world observations: that the decline has stopped, at least for now.

As to motive, I did not even discuss global warming in the post. His ad hominem end statement that I only care about “discrediting global warming science” is an example of him seeking to discredit anyone who introduces a counter viewpoint. If Tamino were actually honest in his post, he would present the same type of figure from the models as done by Vinnikov et al 1999. In addition, he would report honestly on my views on climate science and global warming, which we summarized in our paper

Pielke Sr., R., K.  Beven, G. Brasseur, J. Calvert, M. Chahine, R. Dickerson, D.  Entekhabi, E. Foufoula-Georgiou, H. Gupta, V. Gupta, W. Krajewski, E.  Philip Krider, W. K.M. Lau, J. McDonnell,  W. Rossow,  J. Schaake, J.  Smith, S. Sorooshian,  and E. Wood, 2009: Climate change: The need to consider human forcings besides greenhouse gases.   Eos, Vol. 90, No. 45, 10 November 2009, 413. Copyright (2009) American   Geophysical Union.

**********original post******************

In 2009 I posted the following

A Comment On A 1999 Paper “Global Warming And Northern Hemisphere Sea Ice Extent” By Vinnikov Et Al

I have edited the post below using crossouts and underlining, to show that the models are even more out of sync than they were in 2009. There are also very recent discussions on the Arctic sea ice anomalies at WUWT and Real Climate.

Ten  Thirteen years ago, the following paper was published.

Vinnikov et al., 1999: Global warming and northern hemisphere sea ice extent. Science. 286, 1934-1937.

In this paper, there is a presentation of the model predictions of sea ice extent along with observations up to 1998. This weblog post introduces the subject of how well the model predictions have done.

Their abstract includes the statement (referring to the GFDL and Hadley global climate models)

“Both models used here project continued decreases in sea ice thickness and extent throughout the next century.”

In the conclusion to their paper, they write

“Both climate models realistically reproduce the observed annual trends in NH sea ice extent. This suggests that these models can be used with some confidence to predict future changes in sea ice extent in response to increasing greenhouse gases in the atmosphere. Both models predict continued substantial sea ice extent and thickness decreases in the next century.”

In their paper (in Table 1) they have model predictions (in units of linear trend in 10⁶ square kilometers per decade) listed for the GFDL climate model from 1978-1998 of -0.34 (and -0.19 using a “smoothed model output“) and for the Hadley Centre climate model -0.18 (and -0.16 using a “smoothed model output“).

A value of -0.18, for example, corresponds to a loss of sea ice extent of 180,000 square kilometers per decade.

The first figure below is from the Vinnikov et al 1999 paper with respect to the model predictions, while the second and third figures are the sea ice areal coverage for the Northern Hemisphere up to the present (April 20 2012) and the Antarctic sea ice areal anomaly from The Cryosphere Today.

Until late 2007, the sea ice areal extent continued to decrease in a manner which, at least visually, is consistent with the Vinnikov et al 1999 predictions (although the actual values of areal coverage differ substantially between the observations and the predictions, perhaps as a result of their formulation to compute areal coverage).

However, since 2006, the reduction has stopped and even reversed. Perhaps this is a short term event and the reduction of sea ice extent will resume. Nonetheless, the reason for the turnaround, even if short term, as well as the long term increase in Antarctic sea ice coverage, needs an explanation. Moreover, these data provide a valuable climate metric to assess whether the multi-decadal global models have the predictive skill concluded in the Vinnikov et al 1999 paper.

It has been claimed that most of the recent sea ice is thin and thus will melt quickly this spring. Perhaps so. However, in terms of the albedo feedback into the atmosphere that we discuss in our papers

Pielke Sr., R.A., G.E. Liston, and A. Robock, 2000: Insolation-weighted assessment of Northern Hemisphere snow-cover and sea-ice variability. Geophys. Res. Lett., 27, 3061-3064

and

Pielke Sr., R.A., G.E. Liston, W.L. Chapman, and D.A. Robinson, 2004: Actual and insolation-weighted Northern Hemisphere snow cover and sea ice — 1974-2002. Climate Dynamics, 22, 591-595, DOI: 10.1007/s00382-004-0401-5

the albedo feedback would be muted, and could even be negative, if the positive global sea ice coverage anomaly continues.

Comments Off

Filed under Climate Change Metrics

John Christy’s Comment On “If You Want To Roll The Climate Dice, You Should Know The Odds”

In response to the post yesterday

Debate On The “Climate Dice” Issue

with respect to the post on The Conversation

If you want to roll the climate dice, you should know the odds

John R. Christy, Distinguished Professor of Atmospheric Science and Director of the Earth System Science Center of the University of Alabama in Huntsville, has provided us with his perspective on the Conversation post.

Following is John’s insightful comment.

To make an apples to apples comparison between the 1981 paper by Hansen and observations since 1979, a couple of adjustments need to be made. Without these adjustments, the comparisons are apples to oranges and conclusions misleading.

Eyeballing the forecast curve gives a model projected surface trend of about +0.15 C/decade since 1979.  I suspect the rise in CO2 was actually faster, and thus this curve underestimates what the model would have shown had it used better emissions numbers.

This +0.15 C/decade is a surface trend, so to compare with GISS model troposphere, the scaling factor is 1.25, giving a 1981 model trend of +0.19 C/decade.  That’s one adjustment.

The second adjustment is to account for the volcanic cooling in the first part of the 1979-2012 observations which tilts the observed trend to be more positive than otherwise by about +0.04 C/decade.  So had the 1981 model included real volcanoes which cooled the early portion, its tropospheric trend would be tilted upward to about +0.23 C/decade.  This compares with observations of UAH and RSS of +0.13 C/decade.  So, the apples to apples comparison with tropospheric temperatures since 1979 would be, model: +0.23 C/decade, observations +0.13 C/decade.

Now if we start after the effects of Mt. Pinatubo (say around 1996) to avoid having to calculate the volcanic impact, we have the following. The model goes from +0.2 to +0.5 in 16 years (i.e. trend of about +0.19 at the surface or +0.23 C/decade in the troposphere) while observations show UAH +0.11 C/decade and RSS +0.04 C/decade.  Quite a difference between models and observations!

As an interesting footnote, the average climate model for the most recent CMIP-5 RCP4.5 result has a calculated tropospheric trend of +0.22 C/decade for 1979-2012 and +0.29 C/decade for 1996-2012. So, not much has changed in 30 years as far as model skill in replicating tropospheric trends, it seems.

John C.
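The arithmetic behind John’s apples-to-apples adjustments can be checked in a few lines; the values below are simply transcribed from his comment above.

```python
# Values transcribed from John Christy's comment; nothing new is introduced here.
surface_trend_1981_model = 0.15     # deg C/decade, eyeballed from the 1981 forecast curve
amplification = 1.25                # surface-to-troposphere scaling factor for the GISS model
volcanic_tilt = 0.04                # deg C/decade effect of early-period volcanic cooling

tropo_model = surface_trend_1981_model * amplification        # ~0.19 deg C/decade
tropo_model_adjusted = tropo_model + volcanic_tilt             # ~0.23, to compare with UAH/RSS +0.13
print(round(tropo_model, 2), round(tropo_model_adjusted, 2))

# Post-1996 case: the model rises from +0.2 to +0.5 deg C in 16 years
surface_trend_post96 = (0.5 - 0.2) / 1.6                       # ~0.19 deg C/decade at the surface
print(round(surface_trend_post96 * amplification, 2))          # ~0.23; observations: UAH +0.11, RSS +0.04
```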

source of image

Comments Off

Filed under Climate Change Forcings & Feedbacks, Climate Change Metrics, Guest Weblogs

Debate On The “Climate Dice” Issue

source of image from Revkin.net

There is an informative discussion ongoing at the weblog

The Conversation

regarding the “climate dice” issue that Jim Hansen has introduced. As written on The Conversation in their post of April 17 2012 by Ben Newell, who is an Associate Investigator at the ARC Centre of Excellence in Climate Systems Science:

If you want to roll the climate dice, you should know the odds

This “climate dice” analogy has been used in a recent paper by James Hansen and colleagues to demonstrate how over the past 30 years the dice have become “progressively loaded”. There is no longer equal chances of warm, cool, or average seasons.
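As an aside, the “loaded dice” idea is easy to illustrate with a simple calculation: if seasonal temperature anomalies are roughly Gaussian and the mean shifts warm, the odds of landing in the baseline climate’s “hot” tercile rise well above one in three. The shifts used below are arbitrary illustrative values, not Hansen’s numbers.

```python
from math import erf, sqrt

def normal_cdf(x):
    """Cumulative distribution function of the standard normal."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

upper_tercile = 0.4307                 # z-value with one third of a standard normal above it
for shift in (0.0, 0.5, 1.0):          # warming expressed in baseline standard deviations
    p_hot = 1.0 - normal_cdf(upper_tercile - shift)
    print(shift, round(p_hot, 2))      # roughly 0.33, 0.53, 0.72
```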

I was asked on this weblog to respond to the comment in the following e-mail

 Dear Dr Pielke and Dr Hansen,  I would be interested in comments from both of you for public posting  concerning the following comments made on an Australian website called The  Conversation.

See comments at  https://theconversation.edu.au/if-you-want-to-roll-the-climate-dice-you-should-know-the-odds-6462#comment_32628

Specifically Dr Hansen’s 1981 paper was mentioned to provide credence to  the suggestion that climate models are performing skillfully. I note this  paper has received some attention in the press recently and resulted in a  blog post by Dr Pielke Snr pointing out the following about the model  outputs:

“If the observed surface temperature data used in the figure in which this  claim is made is correct, but also so is the measurement of lower  tropospheric temperatures (such as from MSU RSS and MSU UAH), than Hansen’s  forecast for the surface temperatures would be correct, but for the wrong  reason. If the warming were due to added CO2 and other greenhouse gases,  the lower tropospheric temperatures would have warmed at least as much.

However, the latest available global average lower tropospheric temperature  anomaly (see) is only +0.11 C above the 30 year average. Over this time  period, the Hansen figure shows an expected change anomaly of ~+0.5c.

The trend has also been essentially flat since 2002. The Hansen figure  indicates the current change since 2002 should be almost +0.2C.”

For the rest of the article from which the above is derived see…

http://pielkeclimatesci.wordpress.com/2012/04/13/cherrypicking-a-comment-on-the-atlantic-article-now-this-is-interesting-a-climate-prediction-from-1981-by-james-fallows/

The following response [by Nick Kermode] to this was elicited at The Conversation:

Marc, Mr. Pielke Snr. would have done well to actually read Hansens paper  rather than just a newspaper article. Making a statement like “If the  warming were due to added CO2 and other greenhouse gases, the lower  tropospheric temperatures would have warmed at least as much” shows he  clearly hasn’t. In his paper Hansen specifically says the troposphere  should not have responded yet. So his temp. predictions are pretty close  and he is ALSO right about the troposphere. Have a read, amazing it was  written 30 years ago!  https://theconversation.edu.au/if-you-want-to-roll-the-climate-dice-you-should-know-the-odds-6462#comment_32633

Can either of you comment. Dr Hansen I cannot find anywhere in your paper  reference to  a statement that suggest “the troposphere should not have  responded yet“. Indeed it seems this would go against the basic science for  a transient response to increased CO2.

Regards  Marc Hendrickx

My reply, posted on The Conversation, was

Hi Mr. Hendrickx

Thank you for the opportunity to add to the discussion. The troposphere clearly should not have a lag in the heating, as we discuss, for instance in our papers

Klotzbach, P.J., R.A. Pielke Sr., R.A. Pielke Jr., J.R. Christy, and R.T. McNider, 2009: An alternative explanation for differential temperature trends at the surface and in the lower troposphere. J. Geophys. Res., 114, D21102, doi:10.1029/2009JD011841. http://pielkeclimatesci.files.wordpress.com/2009/11/r-345.pdf

Klotzbach, P.J., R.A. Pielke Sr., R.A. Pielke Jr., J.R. Christy, and R.T. McNider, 2010: Correction to: “An alternative explanation for differential temperature trends at the surface and in the lower troposphere. J. Geophys. Res., 114, D21102, doi:10.1029/2009JD011841″, J. Geophys. Res., 115, D1, doi:10.1029/2009JD013655. http://pielkeclimatesci.files.wordpress.com/2010/03/r-345a.pdf

As we report in the second paper, Gavin Schmidt, who works with Jim, has provided online model results of an amplification of 1.1 over land and 1.6 over the oceans. The surface and lower troposphere are clearly connected in terms of warming in the model results.

Jim also wrote in

http://pielkeclimatesci.files.wordpress.com/2009/09/1116592hansen.pdf

in a response to a comment by John Christy and I in 2005 [http://pielkeclimatesci.files.wordpress.com/2009/09/hansen-science.pdf]

that

“The Willis et al. measured heat storage of 0.62 W/m2 refers to the decadal mean for the upper 750 m of the ocean. Our simulated 1993-2003 heat storage rate was 0.6 W/m2 in the upper 750 m of the ocean. The decadal mean planetary energy imbalance, 0.75 W/m2, includes heat storage in the deeper ocean and energy used to melt ice and warm the air and land. 0.85 W/m2 is the imbalance at the end of the decade.”

The end of the decade refers to the 1990s. If this finding were robust, we would expect heating to occur in the lower troposphere in recent years.

Please let me know if you would like further input on this question.

Best Regards

I look forward to seeing how Jim Hansen (or Gavin Schmidt) responds to Marc’s inquiry, and will post if they do.

Comments Off

Filed under Climate Change Forcings & Feedbacks, Climate Change Metrics, Climate Models

Presentation By Ryan Maue Titled “New Normal? Historical Context Of Recent Global Tropical Cyclone Inactivity”

 

On his excellent weblog Policlimate, Ryan Maue has posted the abstract and PowerPoint slides from his talk at the 30th Conference on Hurricanes and Tropical Meteorology. It is reposted below.

Monday, 16 April 2012: 12:00 PM
New normal? Historical context of recent global tropical cyclone inactivity by Ryan Maue.

The abstract reads [highlight added]

Since 2007, overall global tropical cyclone (TC) activity has decreased dramatically using either frequency or accumulated energy metrics. The time series of global TCs of tropical storm, hurricane, and major hurricane strength has not exhibited a significant trend during the past several decades but has demonstrated considerable interannual and interdecadal variability mainly associated with the El Niño Southern Oscillation and Pacific Decadal Oscillation. Although our current historical best-tracks are incomplete prior to the satellite era over sparsely observed ocean basins, at least 40-years of reliable data is available to contextualize the recent historical downturn in TC activity continuing to the present. Indeed in 2011, the number of global hurricane-force tropical cyclone landfalls numbered only 9, a multi-decades low. Acknowledging the marked TC activity decrease, it is critical to address previous research findings which have prominently publicized increasing trends in TC intensity and frequency historical best-track datasets due to anthropogenic influences. On a global scale, it is clear that such conclusions should be met with considerably more skepticism today.

Upon recognition of the very strong influence of natural climate variability on overall global TC behavior, the past 5-years provides a sobering reminder that linear correlations between relatively short TC activity time series and sea surface temperature measures may be inadequate to characterize historical variability in either quantity. Furthermore, as researchers reassess the role of tropical cyclones in the climate system, we must ask if the current lower-level of global TC activity reminiscent of the 1980s will be a permanent feature of the upcoming decade.
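The “accumulated energy” metric Ryan refers to is presumably his accumulated cyclone energy (ACE) diagnostic; a minimal sketch of how such a metric is computed from 6-hourly best-track maximum sustained winds follows (the storm records are made up for illustration).

```python
def ace(six_hourly_winds_kt, threshold_kt=35.0):
    """ACE = 1e-4 * sum of squared 6-hourly max winds (knots) at tropical-storm strength or above."""
    return 1e-4 * sum(v ** 2 for v in six_hourly_winds_kt if v >= threshold_kt)

storm_a = [30, 40, 55, 70, 90, 85, 60, 45, 30]   # hypothetical 6-hourly track
storm_b = [35, 45, 50, 45, 35]
print(round(ace(storm_a) + ace(storm_b), 2))     # seasonal total, ~3.95 in units of 10^4 kt^2
```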

His PowerPoint slides are at PowerPoint Presentation.

source of image

Comments Off

Filed under Climate Change Metrics

Comments On The New Paper “An Improved Dynamical Downscaling Method With GCM Bias Corrections And Its Validation With 30 years Of Climate Simulations” By Xu and Yang 2012

There is a new paper, sent to me by Zong-Liang Yang, which examines the skill of multi-decadal global climate models to predict climate, and presents a method to correct systematic biases that exist in the parent global model. The paper is

Xu, Zhongfeng and Zong-Liang Yang, 2012: An improved dynamical downscaling method with GCM bias corrections and its validation with 30 years of climate simulations. Journal of Climate 2012 doi: http://dx.doi.org/10.1175/JCLI-D-12-00005.1

The abstract reads [highlight added]

An improved dynamical downscaling method (IDD) with general circulation model (GCM) bias corrections is developed and assessed over North America. A set of regional climate simulations are performed with the Weather Research and Forecasting (WRF) model version 3.3 embedded in the National Center for Atmospheric Research’s (NCAR’s) Community Atmosphere Model (CAM). The GCM climatological means and the amplitudes of interannual variations are adjusted based on the National Centers for Environmental Prediction (NCEP)-NCAR global reanalysis products (NNRP) before using them to drive WRF. In this study, the WRF downscaling experiments are identical except the initial and lateral boundary conditions derived from the NNRP, original GCM output, and bias corrected GCM output, respectively.

The analysis finds that the IDD greatly improves the downscaled climate in both climatological means and extreme events relative to traditional dynamical downscaling approach (TDD). The errors of downscaled climatological mean air temperature, geopotential height, wind vector, moisture, and precipitation are greatly reduced when the GCM bias corrections are applied. In the meantime, IDD also improves the downscaled extreme events characterized by the reduced errors in 2-year return levels of surface air temperature and precipitation. In comparison with TDD, IDD is also able to produce a more realistic probability distribution in summer daily maximum temperature over the central United States-Canada region as well as in summer and winter daily precipitation over the middle and eastern United States.
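For readers unfamiliar with what a GCM bias correction of this kind involves, the sketch below illustrates the general idea described in the abstract (adjusting the GCM climatological mean and interannual variability toward a reanalysis before driving the regional model). It is not the Xu and Yang code, and their actual procedure may differ in detail.

```python
import numpy as np

def bias_correct(gcm, reanalysis):
    """Shift a GCM field to the reanalysis climatological mean and rescale its interannual anomalies.

    gcm, reanalysis: arrays of shape (years, ...) for one variable and calendar month.
    """
    gcm = np.asarray(gcm, dtype=float)
    ra = np.asarray(reanalysis, dtype=float)
    gcm_mean, ra_mean = gcm.mean(axis=0), ra.mean(axis=0)
    gcm_std, ra_std = gcm.std(axis=0), ra.std(axis=0)
    scale = ra_std / np.where(gcm_std == 0, 1.0, gcm_std)
    return ra_mean + (gcm - gcm_mean) * scale   # corrected fields then drive the regional model
```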

This is a clearly written, very important paper, but for reasons the authors did not emphasize. First, however, I disagree with their first sentence in the paper where they write

An accurate regional projection of future climate and its impacts on society and environment have become crucial for public policy and decision-making.

Policymakers do not require accurate regional projections to make intelligent decisions, as discussed in my son’s book The Climate Fix. Indeed, claiming skill that does not actually exist misleads policymakers.

The conclusion of the Xu and Yang 2012 paper contains the text

The most significant improvement of summer precipitation appears in the central United States-Canada region where the TDD overestimates precipitation by 0.5-1.5 mm d⁻¹. The overestimated precipitation over the central United States-Canada region in TDD leads to a higher moisture content and enhanced evaporation, which in turn leads to a cold bias of surface air temperature. These significant errors in precipitation and surface temperature are largely removed in the IDD due to the GCM bias corrections.

The 2-year return level of summer daily maximum temperature simulated by the TDD is underestimated by 2-6°C over the central United States-Canada region. In contrast the bias is generally less than ±1°C in the IDD experiment.

which illustrates the level of error in the parent global model. Errors of this magnitude would be expected outside of the regional climate model domain and would enter as data are inserted into the regional model through the lateral boundary conditions (or through spectral nudging). The IDD makes an improvement in the regional domain since the model results are trained by real world data (the reanalysis). Such a real world constraint, of course, is not available in predictions for the coming decades. The authors did not report on this limitation.

The bias correction, rather than providing a solution to improving multi-decadal climate model predictions, actually shows how poorly the models are doing. Providing results of these model predictions to policymakers as skillful projections is not appropriate.

I have e-mailed Liang (who is an internationally very well-respected colleague with whom I have had the privilege to publish; see) and wrote the following

Hi Liang

I have read the paper and it is a very valuable new addition to the literature on dynamic regional downscaling. It involves the use of type 2/type 3 to assess one of the questions regarding the value of type 4, and shows, in my view, that as a type 4 application the GCMs are inadequate. A bias correction cannot remedy this deficiency for regional projections in the coming decades.

While you did not discuss this in the paper, your results, therefore, are in support of the findings that Rob and I reported on in our article in EOS.

You write, for example,

“A new dynamical downscaling method with GCM bias corrections for the regional projection of further [future] climate was developed and validated by comparing the GCM-driven WRF simulations to the NNRP-driven WRF simulation.”

which shows that the GCMs have systematic biases. These biases certainly influence the physics in the parent model (and show that these physics have serious problems in faithfully replicating the real climate system). For future predictions, there is no reanalysis data to bias correct towards reality. Thus this approach should not be used to claim skillful future projections.

For impact studies for the historical record, of course, one would just use the reanalyses. The GCM-regional downscaling would be informative for sensitivity model runs (e.g. landscape change effects, aerosols) by running with and without certain forcings, where the reanalysis is the control.

The future projections also have another requirement to overcome: even IF they could recreate the historical climate statistics without bias correction, they must also be able (in comparison to the real world data) to skillfully predict CHANGES in the regional climate statistics.

I would like to also post the announcement of the paper on my weblog, and discuss; let me know if you would want to first prepare a guest weblog post…..

Best Regards

Roger

I am pleased Liang shared this paper with me, and look forward to his response to my comments.

source of image

Comments Off

Filed under Climate Models, Research Papers

Comment On The 2012 Draft AMS Statement On “Climate Change”

The American Meteorological Society is in the process of finalizing an updated statement on Climate Change. The ability to read the statement is limited to AMS members:

Draft Statement Open for Member Comment: Climate Change

The stated goal of the Statement is

The following is an AMS Information Statement intended to provide a trustworthy, objective, and scientifically up-to-date explanation of scientific issues of concern to the public at large.

The process of providing input on the draft, the lack of identification of who drafted the statement, and the inability to see what comments others have provided and the drafters’ responses clearly show the very top-down control of this professional society. It is unfortunate, as the AMS could be a neutral open forum for debate among the members as to what to include in the Statement. However, as currently constituted, it is just a presentation by a small group of individuals, whose only hurdle is the AMS Council, whose members are themselves selected by a small committee. The AMS members only get to vote for the selected slate of candidates.

This is hardly a process to advance the public’s knowledge about the diversity of perspectives on climate change by the AMS membership.

The last paragraph of the Statement shows the intent of the Statement’s authors

Technological, economic, and policy choices in the near future will determine the extent of future impacts of climate change. Policy decisions are seldom made in a context of absolute certainty. The policy debate should include consideration of the best ways to both adapt to and mitigate climate change. Mitigation will reduce the amount of future climate change and the risk of impacts that are potentially large and dangerous. At the same time, some continued climate change is inevitable, and policy responses should include adaptation to climate change. Prudence dictates extreme care in managing our relationship with the only planet known to be capable of sustaining human life.

The following is what I submitted as a Comment. I will have more to say on this Statement after it is officially accepted by the AMS.

The process of providing input on the draft, the lack of identifying who drafted the statement, and an ability to see what comments others have provided and the drafters’ response clearly show the very top-down control of our professional society. It is unfortunate, as the AMS could be a neutral open forum for debate among the members as to what to include in the Statement. However, as currently constituted, the Statements are just presentations of a viewpoint by a small group of individuals, whose only hurdle are members of the AMS Council, who themselves are selected by a small committee. As AMS members we only get to vote for the selected slate of candidates.

This is hardly a process to advance the public’s knowledge about the diversity of perspectives on climate change by the AMS membership.

With specific respect to the draft statement, there are a number of inaccuracies.

One of the most blatant is the statement that

 “it is widely accepted that the dominant cause of the rapid change in climate of the past half century is human-induced increases in the amount of atmospheric greenhouse gases, including carbon dioxide”.

It is actually quite straightforward to refute this claim as we summarized in the article

Pielke Sr., R., K. Beven, G. Brasseur, J. Calvert, M. Chahine, R. Dickerson, D. Entekhabi, E. Foufoula-Georgiou, H. Gupta, V. Gupta, W. Krajewski, E. Philip Krider, W. K.M. Lau, J. McDonnell, W. Rossow, J. Schaake, J. Smith, S. Sorooshian, and E. Wood, 2009: Climate change: The need to consider human forcings besides greenhouse gases. Eos, Vol. 90, No. 45, 10 November 2009, 413. Copyright (2009) American Geophysical Union

“In addition to greenhouse gas emissions, other first-order human climate forcings are important to understanding the future behavior of Earth’s climate. These forcings are spatially heterogeneous and include the effect of aerosols on clouds and associated precipitation [e.g., Rosenfeld et al., 2008], the influence of aerosol deposition (e.g., black carbon (soot) [Flanner et al. 2007] and reactive nitrogen [Galloway et al., 2004]), and the role of changes in land use/land cover [e.g., Takata et al., 2009]. Among their effects is their role in altering atmospheric and ocean circulation features away from what they would be in the natural climate system [NRC, 2005]. As with CO2, the lengths of time that they affect the climate are estimated to be on multidecadal time scales and longer.”

and

“The evidence predominantly suggests that humans are significantly altering the global environment, and thus climate, in a variety of diverse ways beyond the effects of human emissions of greenhouse gases, including CO2″.

The Statement should recognize and report that there is disagreement among members of our professional society on this issue (the dominance of the radiative forcing of CO2).

There are also issues with the summary of observations. As one example, the absence of warming in the upper oceans and lower troposphere for the last 10 years is not mentioned. Nor is the absence of increases in atmospheric water vapor over the same time period.

The claims as to what models are capable of in terms of projections are also misleading. The Statement should report that no regional skill in predicting changes in climate statistics on multi-decadal time scales has been shown. We report on this in

Pielke Sr., R.A., R. Wilby, D. Niyogi, F. Hossain, K. Dairuku, J. Adegoke, G. Kallos, T. Seastedt, and K. Suding, 2012: Dealing with complexity and extreme events using a bottom-up, resource-based vulnerability perspective. AGU Monograph on Complexity and Extreme Events in Geosciences, in press.

and

Pielke Sr., R.A., and R.L. Wilby, 2012: Regional climate downscaling – what’s the point? Eos Forum, 93, No. 5, 52-53, doi:10.1029/2012EO050008.

Either refute our findings or report on them.

This Statement needs much more vetting if it is going to be read as

 “….a trustworthy, objective, and scientifically up-to-date explanation of scientific issues of concern to the public at large.”

As currently written, it perpetuates the myth that there is a broad peer-reviewed literature agreement with all of the claims of findings that are in the report. This is not the case. If the Statement is accepted as written, it will not only be easy to refute significant parts of it, but it will present the American Meteorological Society as an advocacy group rather than an objective professional organization.

source of image

Comments Off

Filed under Advocacy Masking As Science, Climate Science Reporting

Guest Post By Ben Herman On Regional Climate Modeling

Guest Post By Professor Ben Herman of the University of Arizona. As written on the University’s website

Dr. Herman is primarily concerned with the optics of atmospheric aerosols, polarization and scattering, and the application of inversion techniques to analyze remote sensing data obtained from aircraft and satellites. Currently, he is working on several satellite based remote sensing projects to monitor ozone, temperature, water vapor and aerosols from space.

Following is Ben’s guest post on regional climate modelling

I have had several discussions here with various people concerning the problem of regional climate prediction using climate models to set up boundary conditions for a smaller, regional area in which a much smaller grid size is used. The problem is that if the boundary conditions are incorrect, this will obviously degrade any predictions made using those boundaries. Now with that said, there seems to be quite a lot of effort being put into regional prediction using global climate models. The climate models, at this time, use a much larger grid than required for regional prediction, so this has led to the climate of the smaller regional areas of interest being solved for separately, with the results of the global model used as boundary conditions. I have compared it to having a set of numbers accurate to one decimal point, using those measurements in a series of mathematical operations, and providing answers to, say, three decimal points. You can do that, but, of course, the additional decimal points have no practical use.

Another way to look at this is to imagine that the global solution for a given time has been broken down into a Fourier series. This series will contain frequencies that are limited by the grid spacing of the global model. Any frequencies beyond this range are not present and thus cannot have an impact within the smaller region, even though higher frequencies within this region may be solved for due to the higher resolution there. But, since these higher frequencies cannot be influenced by similar frequencies outside of the region in question, they will generally be in error. The only way that this error could be avoided would be if none of these higher frequencies were present during the valid time of the forecast, generally an unlikely occurrence. It thus appears to me that under most conditions the regional forecasts, using present techniques, are operating with an incomplete set of initial conditions, which will certainly limit their accuracy.
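Ben’s point about the missing high frequencies can be illustrated numerically. In the sketch below (with arbitrary illustrative grid spacings and wavelengths), a wave shorter than twice the coarse grid spacing is essentially absent from what the coarse grid can hand to the finer grid.

```python
import numpy as np

L = 4000.0                                  # domain length, km
x_fine = np.arange(0.0, L, 10.0)            # 10 km "regional" grid
x_coarse = np.arange(0.0, L, 200.0)         # 200 km "global" grid

def long_wave(x):                           # resolvable on both grids (1000 km wavelength)
    return np.sin(2 * np.pi * x / 1000.0)

def short_wave(x):                          # shorter than the coarse grid's 400 km Nyquist wavelength
    return 0.5 * np.sin(2 * np.pi * x / 160.0)

coarse_values = long_wave(x_coarse) + short_wave(x_coarse)
passed_on = np.interp(x_fine, x_coarse, coarse_values)   # what the coarse grid can hand to the fine grid

residual = passed_on - long_wave(x_fine)                 # whatever "short-wave" signal survives the transfer
print(round(float(np.corrcoef(residual, short_wave(x_fine))[0, 1]), 2))  # ~0: the 160 km wave is not recovered
```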

source of image

Comments Off

Filed under Uncategorized

Cherrypicking – A Comment On The Atlantic Article “Now This Is Interesting: A Climate Prediction From 1981″ By James Fallows

This week, I received an e-mail from a respected colleague who wrote

Hi all, Simply put — a more convincing case could not be made. This should be front page news.

http://www.theatlantic.com/technology/archive/2012/04/now-this-is-interesting-a-climate-prediction-from-1981/255658/

There is always room for doubt, and alternate hypotheses, and science allows one to be wrong — but this case is made with one simple figure.

The article referred to is

Now This Is Interesting: A Climate Prediction From 1981

by James Fallows of the Atlantic and the figure is at the top of this post.

If the observed surface temperature data used in the figure in which this claim is made are correct, but so also are the measurements of lower tropospheric temperatures (such as from MSU RSS and MSU UAH), then Hansen’s forecast for the surface temperatures would be correct, but for the wrong reason. If the warming were due to added CO2 and other greenhouse gases, the lower tropospheric temperatures would have warmed at least as much.

However, the latest available global average lower tropospheric temperature anomaly (see) is only +0.11 C above the 30 year average. Over this time period, the Hansen figure shows an expected change anomaly of ~+0.5 C.

The trend has also been essentially flat since 2002. The Hansen figure indicates the change since 2002 should be almost +0.2 C.

These discrepancies clearly show the Atlantic article did not objectively look into the Hansen prediction.

The lower tropospheric temperature anomaly analyses, therefore, need to also be compared with the Hansen model predictions. That was not done for the article. These two real-world analyses are reproduced below for the period of record.

[source of image]

source of image -  figure 7 top

My reply to my colleague on this article is

If one wants a convincing figure to show that there should be concern about adding CO2 into the atmosphere, one only needs the Mauna Loa Observatory data.

However, presenting the Hansen plot as convincingly showing that the response to this added CO2 is more-or-less global warming is disingenuous. The surface temperature data have a warm bias, as we documented in our papers

Klotzbach, P.J., R.A. Pielke Sr., R.A. Pielke Jr., J.R. Christy, and R.T. McNider, 2009: An alternative explanation for differential temperature trends at the surface and in the lower troposphere. J. Geophys. Res., 114, D21102, doi:10.1029/2009JD011841.

Klotzbach, P.J., R.A. Pielke Sr., R.A. Pielke Jr., J.R. Christy, and R.T. McNider, 2010: Correction to: “An alternative explanation for differential temperature trends at the surface and in the lower troposphere. J. Geophys. Res., 114, D21102, doi:10.1029/2009JD011841″, J. Geophys. Res., 115, D1, doi:10.1029/2009JD013655.

Even more important, the lower troposphere does not show the same trend, particularly in the last 10 years.

[see Figure 7 top in http://www.ssmi.com/msu/msu_data_description.html]

Neither do the upper ocean heat content changes; e.g. see

D.H. Douglass, R.S. Knox, 2012: Ocean heat content and Earth’s radiation imbalance. II. Relation to climate shifts. Physics Letters A. http://dx.doi.org/10.1016/j.physleta.2012.02.027

R. S. Knox, David H. Douglass, 2010: Recent energy balance of Earth. International Journal of Geosciences, 2010, vol. 1, no. 3 (November). doi:10.4236/ijg2010.00000.

The latest upper ocean anomaly analysis can be seen in the figure below from NOAA PMEL;

In terms of sea surface temperatures, look at http://www.osdpd.noaa.gov/data/sst/anomaly/2012/anomnight.4.9.2012.gif

which shows a complex pattern of both warm and cool anomalies, but little long-term trend.

The Atlantic article is quite deficient in presenting all of the real world data.

The conclusion is that the Atlantic article by James Fallows is an incomplete and biased news report.

source of image at top of post

Comments Off

Filed under Bias In News Media Reports

New Physics Today Article “The Triggering And Persistence Of The Little Ice Age” By Bertram M. Schwarzschild

(image from Schwarzschild 2012)

UPDATE April 13 2012: Anthony Watts alerted us to the guest post at WUWT by Willis Eschenbach titled Dronning Maud Meets the Little Ice Age. Willis raises very substantive issues with the paper. I urge him to submit his analysis for peer review.

*****************************************************************************

Every once in a while, a nugget of new research insight appears that adds to our understanding of the climate system and its complexity. One article of this type has appeared:

Miller, G. H., et al. (2012), Abrupt onset of the Little Ice Age triggered by volcanism and sustained by sea-ice/ocean feedbacks, Geophys. Res. Lett., 39, L02708, doi:10.1029/2011GL050168

with the abstract [highlight added]

Northern Hemisphere summer temperatures over the past 8000 years have been paced by the slow decrease in summer insolation resulting from the precession of the equinoxes. However, the causes of superposed century-scale cold summer anomalies, of which the Little Ice Age (LIA) is the most extreme, remain debated, largely because the natural forcings are either weak or, in the case of volcanism, short lived. Here we present precisely dated records of ice-cap growth from Arctic Canada and Iceland showing that LIA summer cold and ice growth began abruptly between 1275 and 1300 AD, followed by a substantial intensification 1430–1455 AD. Intervals of sudden ice growth coincide with two of the most volcanically perturbed half centuries of the past millennium. A transient climate model simulation shows that explosive volcanism produces abrupt summer cooling at these times, and that cold summers can be maintained by sea-ice/ocean feedbacks long after volcanic aerosols are removed. Our results suggest that the onset of the LIA can be linked to an unusual 50-year-long episode with four large sulfur-rich explosive eruptions, each with global sulfate loading >60 Tg. The persistence of cold summers is best explained by consequent sea-ice/ocean feedbacks during a hemispheric summer insolation minimum; large changes in solar irradiance are not required.

The Key Points listed in the Miller et al 2012 paper are

  • Little Ice Age began abruptly in two steps
  • Decadally paced explosive volcanism can explain the onset
  • A sea-ice/ocean feedback can sustain the abrupt cooling

The Miller et al article is summarized in

Schwarzschild, Bertram M., 2012: The triggering and persistence of the Little Ice Age. Physics Today, April 2012, page 15. http://dx.doi.org/10.1063/PT.3.1506

The abstract of the Physics Today article reads

“A mere half century of volcanism seems to have initiated a chill lasting half a millennium”.

Extracts from the article are

For more than 500 years until the middle of the 19th century, much of the Northern Hemisphere experienced the “Little Ice Age,” the most extended period of anomalous cold—winter and summer—in 8000 years. Picturesque aspects of the LIA are familiar from paintings of winter scenes in northern Europe. But more somber manifestations include numerous famines in Europe and Asia and the extinction of the Norse settlements in southern Greenland.

The LIA’s start and finish dates, as well as its cause, have long been subjects of debate and puzzlement. Variations in solar irradiation and volcanic eruptions have been invoked as possible causes. But the one seems too weak and the other too ephemeral.

Unlikely or not, four major tropical eruptions in the late 13th century are identified as the LIA’s triggering mechanism in a recent paper, by Gifford Miller (University of Colorado at Boulder) and coworkers in the US and Iceland. They present precise new carbon-14 dating results and a model simulation of reinforcing feedbacks to pinpoint the LIA’s abrupt onset and understand its duration.

An important manifestation of the LIA was the expansion of glaciers and year-round icecaps at high latitudes and elevations. Expansion chronology is often measured by ¹⁴C dating of biological debris swept into glacial moraines. But much of the vegetation in those dumping grounds of glacial scouring was killed decades or centuries after the initial shift to colder summers and the consequent advance of perennial ice cover. So Miller and company chose to concentrate on small, localized icecaps that would have reacted much faster and preserved rooted vegetation precisely where it was killed by the expanding ice.

At several dozen such highland sites across Baffin Island in the Canadian Arctic where the surface has only now been exposed after centuries of icy entombment, the team obtained precise ¹⁴C kill dates for more than 100 clusters of freshly exposed moss (see figure 1). Recalibrating raw ¹⁴C ages for known temporal variations of the atmosphere’s ¹⁴C concentration, the team was able to reveal and date prominent kill-rate peaks with almost decadal resolution…

The team concludes that the peak near 1300 AD marks the LIA’s sudden onset, and that it was triggered by the four major volcanic explosions in the previous half century, shown in figure 2b. The figure estimates the global stratospheric load of sulfate aerosol based on dateable sulfate concentrations in Arctic and Antarctic ice cores. Though each eruption would have produced just a few cold summers before its aerosol precipitated out, the millennium’s greatest eruption, followed in quick succession by three more, seems to have initiated the very long chill…”

Long stretches with low kill rates on Baffin Island—for example, the three centuries preceding the 1300 peak—might reflect either a warm period with receding perennial ice or unrelenting cold with continuous maximal ice cover. To resolve such ambiguities, the team dated and measured thicknesses of annual sediment layers under a glacier-fed lake in central Iceland. The more massive the icecap that drives the glacier, the greater (on average) is the thickness of the debris layer it deposits each summer. In that way the team was able to conclude that the pre-1300 trough was indeed a warm spell and that the falloff after the great kill peak around 1450 marks the onset of continuous maximal ice coverage through the end of the 19th century. The 1450 peak is thought to be associated with the well-documented 1452 Kuwae eruption on an island east of New Guinea.

”The problem with volcanic explanations for the LIA has always been that the sulfate aerosol is gone after three years,“ says Miller. “But the sea-ice data have impelled us to undertake a model simulation to see if decadal volcanic triggering, reinforced by interaction between expanding sea ice and ocean currents, might do the trick.”

If eruptions recur faster than surface water temperatures can recover, the cumulative ocean cooling could be much greater than from any single eruption. In the Arctic Ocean, the anomalously cold surface water might inhibit summer ice melt enough to cause an extended southward expansion of sea ice into the Atlantic.

”Model simulation is always tenuous,” cautions Miller, “but this one does suggest a plausible mechanism for the centuries-long persistence of a decade-scale perturbation. I’m particularly pleased that the empirical evidence for the sudden accumulation of ice off Iceland’s north coast around 1300 supports the simulation’s massive export of arctic ice into the North Atlantic and the consequent disruption of warming ocean circulation.”

This study provides evidence that the climate system responds to perturbations (in this case from volcanic eruptions) in a quite nonlinear way as a result of atmosphere-ocean-land interactions. It also illustrates a challenge to skillful modeling on decadal and longer time scales, as such volcanic eruptions cannot be predicted.
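To make the persistence mechanism quoted above concrete, the toy energy-balance sketch below (not the Miller et al model; all numbers are illustrative) shows how eruptions that recur faster than the upper ocean’s recovery time produce a cumulative cooling that is larger and lingers longer than that from a single eruption. The sea-ice/ocean feedback that sustains the cooling for centuries is not represented here.

```python
import numpy as np

TAU_YEARS = 10.0              # assumed recovery time scale of the upper ocean
PULSE = -0.5                  # assumed cooling pulse per eruption (deg C)

def run(eruption_years, n_years=200):
    """Yearly temperature anomaly for a mixed layer relaxing back toward zero."""
    temp = np.zeros(n_years)
    for yr in range(1, n_years):
        forcing = PULSE if yr in eruption_years else 0.0
        temp[yr] = temp[yr - 1] - temp[yr - 1] / TAU_YEARS + forcing
    return temp

single = run({20})
clustered = run({20, 28, 35, 43})                         # four eruptions within ~25 years
print(round(single.min(), 2), round(clustered.min(), 2))  # the clustered case cools substantially more
print(round(single[60], 2), round(clustered[60], 2))      # and the cooling lingers far longer
```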

There is also an interesting observation in figure 2 of the Schwarzschild 2012 article. I reproduced this figure 2 at the beginning of this weblog post. In this figure, cool temperature anomalies are plotted upward on the left-hand axis. It clearly shows the Little Ice Age effect that is discussed in the Schwarzschild and Miller et al articles (with the coldest period actually ~1850-1900!). However, it also shows that the warmest period was in the mid 20th century and that it has actually cooled since then.

Comments Off

Filed under Climate Change Forcings & Feedbacks, Research Papers