A New Paper On The Statistics Of Record-breaking Temperatures

There is a recent paper on the statistics of record-breaking temperatures (thanks to Peter Schuck for alerting Climate Science to this paper!). The paper is

Redner, S. and M.R. Petersen, 2006: “Role of global warming on the statistics of record-breaking temperatures.” Physical Review E, 74, 061114.

The abstract reads,

“We theoretically study the statistics of record-breaking daily temperatures and validate these predictions using both Monte Carlo simulations and 126 years of available data from the city of Philadelphia. Using extreme statistics, we derive the number and the magnitude of record temperature events, based on the observed Gaussian daily temperature distribution in Philadelphia, as a function of the number of years of observation. We then consider the case of global warming, where the mean temperature systematically increases with time. Over the 126-year time range of observations, we argue that the current warming rate is insufficient to measurably influence the frequency of record temperature events, a conclusion that is supported by numerical simulations and by the Philadelphia data. We also study the role of correlations between temperatures on successive days and find that they do not affect the frequency or magnitude of record temperature events.”

An excerpt reads,

“Our primary result is that we cannot yet distinguish between the effects of random fluctuations and long-term systematic trends on the frequency of record-breaking temperatures with 126 years of data. For example, in the 100th year of observation, there should be 365/100=3.65 record-high temperature events in a stationary climate, while our simulations give 4.74 such events in a climate that is warming at a rate of 0.6 °C per 100 years. However, the variation from year to year in the frequency of record events after 100 years is larger than the difference of 4.74–3.65, which should be expected because of global warming … After 200 years, this random variation in the frequency of record events is still larger than the effect of global warming. On the other hand, global warming already does affect the frequency of extreme temperature events that are defined by exceeding a fixed threshold.”
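The 365/100 = 3.65 figure follows from a simple argument: in a stationary climate where each calendar day's temperatures are independent draws from a fixed distribution, the probability that year n sets a new record for a given day is 1/n, so the expected number of record days in year n is 365/n. The comparison with a warming climate can be checked with a quick Monte Carlo, sketched below. This is not the authors' code; the day-to-day standard deviation (5 °C) and the independence of calendar days are illustrative assumptions.

```python
import random

def expected_records_stationary(n_years, n_days=365):
    # Stationary i.i.d. climate: P(year n sets a record on a given day) = 1/n,
    # so the expected number of record days in year n is n_days / n.
    return n_days / n_years

def simulate_record_counts(n_years, n_days=365, trend_per_year=0.0,
                           sigma=5.0, seed=0):
    """Count how many calendar days set a new record high in the final year.

    Each calendar day is treated as an independent Gaussian series; a linear
    warming trend (deg C per year) shifts the mean over time. The value of
    sigma is an assumed, illustrative day-level standard deviation.
    """
    rng = random.Random(seed)
    records_in_last_year = 0
    for _ in range(n_days):
        running_max = float("-inf")
        last_record_year = 0
        for year in range(n_years):
            temp = rng.gauss(year * trend_per_year, sigma)
            if temp > running_max:
                running_max = temp
                last_record_year = year
        if last_record_year == n_years - 1:
            records_in_last_year += 1
    return records_in_last_year

# Stationary case: expectation is 365/100 = 3.65 records in year 100.
# Warming case (0.6 deg C per century): pass trend_per_year=0.006.
```

Averaging the stationary simulation over many seeds converges toward 3.65, while single runs scatter widely around it, which is exactly the point of the excerpt: the year-to-year noise swamps the warming signal at this record horizon.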

This is an interesting study that appears in a journal not usually read by climate scientists. One comment, of course, is that the term “global warming” is not correct here. The statistics really concern a long-term change in the temperatures at one location; a global average temperature change is not the appropriate climate metric to use.

Their study raises the issue of what constitutes an extreme temperature (an “abnormal” value), as opposed to a value that is above average but still “normal” (i.e., falls within a certain number of standard deviations of the mean). We completed such an analysis in 1987:

Pielke, R.A. and N. Waage, 1987: A definition of normal weather. Natl. Wea. Dig., 12, 20-22,

with the abstract

“This paper clarifies the distinction between abnormal weather, and above and below average weather, using standard statistical analyses. Abnormal maximum and minimum temperatures are defined as requiring at least two standard deviations from the mean; otherwise even though they could be above or below average, the weather is still “normal.” July and January maximum and minimum temperatures for Denver, New York, Los Angeles, Miami, and Bismarck are presented as examples of this analysis.”
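The two-standard-deviation criterion described in the abstract reduces to a one-line test. A minimal sketch is below; the function name and the example July statistics are illustrative, not values from the 1987 paper.

```python
def classify_temperature(temp, mean, std_dev):
    """Classify a temperature per the two-sigma definition of abnormality:
    'abnormal' if it lies at least two standard deviations from the
    long-term mean for that month and station; otherwise 'normal',
    even when it is above or below average."""
    if abs(temp - mean) >= 2.0 * std_dev:
        return "abnormal"
    return "normal"

# Illustrative numbers for a July maximum (mean 88 F, std dev 4 F):
# 97 F is >= 2 sigma above the mean -> "abnormal"
# 94 F is above average but within 2 sigma -> "normal"
```

The point of the distinction is that a reading can be well above average yet still fall inside the expected statistical spread for that location and month.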

Such a distinction between what is above average and what is abnormal should be part of any assessment of multi-decadal temperature trends.
