I was alerted to an article and news releases on the prediction of cold outbreaks decades from now [h/t Ned Niklov]. The researchers are affiliated with Oak Ridge National Laboratory. That is relevant to me because I was on a science review panel at Oak Ridge several years ago, where one of our major recommendations was that they assess the predictability of climate forecasts framed as an initial value problem. That would have been a robust scientific approach, since observations can then be used to test the skill of the multi-decadal predictions.
However, this article (and the climate modeling research program at Oak Ridge National Laboratory, if this paper is typical) has been derailed from a proper assessment of skill in climate prediction.
Instead, as illustrated in the paper below, they have adopted the scientifically flawed approach of making regional climate forecasts decades into the future. The journal, Geophysical Research Letters, by accepting such a prediction paper, is similarly compromising robust science.
I have discussed the failure of the scientific method with such studies in past posts; e.g. see
The article, which fails as robust science, is
Kodra, E., K. Steinhaeuser, and A. R. Ganguly (2011), Persisting cold extremes under 21st-century warming scenarios, Geophys. Res. Lett., doi:10.1029/2011GL047103, in press.
The abstract reads [highlight added]
“Analyses of climate model simulations and observations reveal that extreme cold events are likely to persist across each land-continent even under 21st-century warming scenarios. The grid based intensity, duration and frequency of cold extreme events are calculated annually through three indices: the coldest annual consecutive three-day average of daily maximum temperature, the annual maximum of consecutive frost days, and the total number of frost days. Nine global climate models forced with a moderate greenhouse-gas emissions scenario compares the indices over 2091-2100 versus 1991-2000. The credibility of model-simulated cold extremes is evaluated through both bias scores relative to reanalysis data in the past and multi-model agreement in the future. The number of times the value of each annual index in 2091-2100 exceeds the decadal average of the corresponding index in 1991-2000 is counted. The results indicate that intensity and duration of grid-based cold extremes, when viewed as a global total, will often be as severe as current typical conditions in many regions, but the corresponding frequency does not show this persistence. While the models agree on the projected persistence of cold extremes in terms of global counts, regionally, inter-model variability and disparity in model performance tends to dominate. Our findings suggest that, despite a general warming trend, regional preparedness for extreme cold events cannot be compromised even towards the end of the century.”
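To make the abstract's three indices concrete, here is a minimal sketch of how they could be computed for a single grid cell from one year of daily maximum temperatures. The threshold and definitions below are illustrative assumptions (e.g., a frost day is taken as daily maximum below 0°C), not the authors' actual code.

```python
import numpy as np

def cold_extreme_indices(tmax, frost_threshold=0.0):
    """Compute three annual cold-extreme indices from one year of daily
    maximum temperatures (deg C), loosely following the paper's abstract.
    The definitions here are illustrative assumptions only."""
    tmax = np.asarray(tmax, dtype=float)

    # 1) Intensity: coldest consecutive three-day average of daily Tmax.
    three_day_means = np.convolve(tmax, np.ones(3) / 3.0, mode="valid")
    coldest_3day = float(three_day_means.min())

    # 2) Duration: annual maximum run of consecutive frost days.
    frost = tmax < frost_threshold
    longest_run = run = 0
    for is_frost in frost:
        run = run + 1 if is_frost else 0
        longest_run = max(longest_run, run)

    # 3) Frequency: total number of frost days in the year.
    n_frost = int(frost.sum())

    return coldest_3day, longest_run, n_frost

# Synthetic example: a mild year with one five-day cold snap.
year = np.full(365, 10.0)
year[50:55] = -4.0  # five consecutive frost days
intensity, duration, frequency = cold_extreme_indices(year)
```

The paper's persistence test then amounts to counting how often each annual index in 2091-2100 is at least as severe as the 1991-2000 decadal average of that index, per grid cell.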
An excerpt reads [boldface added].
“We find evidence from nine climate models that intensity and duration of cold extremes may occasionally, or in some cases quite often, persist at end-of-20th-century levels late into the 21st century in many regions. This is expected despite unanimous projections of relatively significant mean warming trends.”
The use of the term “evidence” with respect to climate model output illustrates that this study incorrectly assumes that models can be used as evidence of how the real world behaves.
Moreover, they write “[t]he credibility of model-simulated cold extremes is evaluated through both bias scores relative to reanalysis data in the past and multi-model agreement in the future.” Testing against reanalysis data for the period 1991-2000 is robust science. However, using “multi-model agreement in the future” as a measure of credibility is a fundamentally incorrect approach; agreement among models is not a test against the real world.
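The distinction matters because only the first check has an observational counterpart. A past-period bias score can be computed against reanalysis fields; the sketch below uses a hypothetical mean-difference definition, not the paper's exact metric.

```python
import numpy as np

def mean_bias(model_field, reanalysis_field):
    """Hypothetical bias score: area-unweighted mean of model minus
    reanalysis over a grid, for a past period where observations exist.
    Illustrative definition only, not the paper's actual metric."""
    model_field = np.asarray(model_field, dtype=float)
    reanalysis_field = np.asarray(reanalysis_field, dtype=float)
    return float(np.mean(model_field - reanalysis_field))

# Toy 2x2 grids of a decadal-mean index (hypothetical values).
model = np.array([[1.0, 2.0], [3.0, 4.0]])
reanalysis = np.array([[0.5, 2.0], [2.5, 4.5]])
bias = mean_bias(model, reanalysis)
```

No analogous score exists for “multi-model agreement in the future,” since there is no observed future field to difference against; that is the core of the objection.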
Models are hypotheses and need to be tested against real data. However, the climate models have not demonstrated skill at predicting how the statistics of cold waves will change in response to human climate forcings during the 21st century. Indeed, there is no way to perform this test until those decades occur.