Category Archives: Climate Models

New Paper “Parameterization Of Instantaneous Global Horizontal Irradiance At The Surface. Part II: Cloudy-Sky Component” By Sun Et Al 2012

There is yet another paper that documents the limited ability of multi-decadal global climate models to skillfully predict climate conditions in the coming years. This paper involves the question of the accuracy lost when radiation parameterizations are called at time intervals that are long compared to other physical processes in the models. The paper is

Sun, Z., J. Liu, X. Zeng, and H. Liang (2012), Parameterization of instantaneous global horizontal irradiance at the surface. Part II: Cloudy-sky component, J. Geophys. Res., doi:10.1029/2012JD017557, in press. [the full paper is available at the JGR site by clicking PIP PDF - h/t Victor Venema]

The abstract reads [highlight added]

Radiation calculations in global numerical weather prediction (NWP) and climate models are usually performed in 3-hourly time intervals in order to reduce the computational cost. This treatment can lead to an incorrect Global Horizontal Irradiance (GHI) at the Earth’s surface, which could be one of the error sources in modelled convection and precipitation. In order to improve the simulation of the diurnal cycle of GHI at the surface a fast scheme has been developed in this study and it can be used to determine the GHI at the Earth’s surface more frequently with affordable costs. The scheme is divided into components for clear-sky and cloudy-sky conditions. The clear-sky component has been described in part I. The cloudy-sky component is introduced in this paper. The scheme has been tested using observations obtained from three Atmospheric Radiation Measurements (ARM) stations established by the U. S. Department of Energy. The results show that a half hourly mean relative error of GHI under all-sky conditions is less than 7%. An important application of the scheme is in global climate models. The radiation sampling error due to infrequent radiation calculations is investigated using this scheme and ARM observations. It is found that these errors are very large, exceeding 800 W m-2 at many non-radiation time steps due to ignoring the effects of clouds. Use of the current scheme can reduce these errors to less than 50 W m-2.

These errors are clearly larger than the few W m-2 due to human climate forcings, and large even relative to natural variations in radiative fluxes. This is yet another example of why the IPCC models are not robust tools for predicting changes in global, regional and local climate statistics.
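The size of this sampling error can be illustrated with a toy calculation (not the Sun et al scheme; the solar geometry, cloud statistics and time steps below are invented for illustration): hold the surface irradiance fixed between 3-hourly radiation calls while cloud transmittance varies every model time step, and compare against recomputing it at every step.

```python
import math
import random

random.seed(42)
DT = 0.5            # model time step [h]
RAD_EVERY = 6       # radiation called every 6 steps, i.e. 3-hourly
S0 = 1000.0         # toy peak clear-sky GHI [W/m^2]

def clear_sky(t):
    """Toy clear-sky GHI: half-sine between 06:00 and 18:00 local time."""
    return S0 * max(0.0, math.sin(math.pi * (t - 6.0) / 12.0))

# "True" all-sky GHI: clear sky modulated by rapidly varying cloud transmittance
times = [i * DT for i in range(int(24 / DT))]
transmit = [random.uniform(0.2, 1.0) for _ in times]
true_ghi = [clear_sky(t) * tr for t, tr in zip(times, transmit)]

# GHI as seen by a model that calls radiation only every 3 h and holds
# the result constant in between -- the source of the sampling error
held_ghi = []
for i, g in enumerate(true_ghi):
    if i % RAD_EVERY == 0:
        last = g
    held_ghi.append(last)

errors = [abs(h, ) if False else abs(h - g) for h, g in zip(held_ghi, true_ghi)]
print(f"max sampling error: {max(errors):.0f} W/m^2")
```

Even this crude setup readily produces instantaneous errors of hundreds of W m-2 at non-radiation time steps, which is the effect the paper quantifies with real ARM observations.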

source of image

Comments Off

Filed under Climate Models, Research Papers

Guest Post “Modeled European Precipitation Change Smaller Than Observed” By Ronald van Haren, Geert Jan van Oldenborgh, Geert Lenderink and Wilco Hazeleger

 

Modeled European precipitation change smaller than observed

by  Ronald van Haren, Geert Jan van Oldenborgh, Geert Lenderink, Wilco Hazeleger of the Royal Dutch Meteorological Institute (KNMI)

Introduction

Now is an exciting time to do climate research. In many areas of the world climate change is emerging from the noise of natural variability. This opens the opportunity to compare the observed changes to the changes that are simulated by climate models. Climate models are mathematical representations of the climate system and should in principle give a physics-based response to increased concentrations of CO2 and other greenhouse gases, different types of aerosols, and solar and volcanic forcings. However, many processes are too small-scale or complex to be physically represented in the model and are parameterized: the average or expected effect of such processes is specified. Examples are clouds, thunderstorms, fog, and ocean mixing. The necessity to parameterize these processes adds model uncertainty to the simulations. Projections of the climate are also dependent on uncertainties in the forcings. Aerosol emissions and concentrations in the past are poorly known, and future socio-economic developments that affect emissions of greenhouse gases, aerosols and land use change are uncertain. Finally, we should always keep in mind that the climate system also shows natural variations on different timescales.

To deal with these uncertainties, use is often made of multiple climate models: a multi-model ensemble. The spread between the model results of such an ensemble is a combination of model uncertainty and natural climate variability. Note that even when natural variability is low, the model uncertainty is not equal to the spread of the ensemble. It can be larger (if none of the models represents an essential process) or smaller (if the ensemble contains models of lower quality). For some models multiple realizations are available, which allow an estimation of the natural variability from the spread within that model.
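The distinction drawn here between the spread of the full ensemble and the spread within one model's realizations can be sketched with synthetic numbers (purely illustrative; no real model output is used):

```python
import random
import statistics

random.seed(0)

# Toy multi-model ensemble of precipitation trends [%/century]; every
# number here is invented for illustration, not taken from the paper
n_models, n_realizations = 12, 50
forced = [random.gauss(5.0, 4.0) for _ in range(n_models)]  # per-model forced trend
internal_sd = 3.0                                           # natural variability

# Multiple realizations per model: same forced trend, different internal noise
ensemble = [[f + random.gauss(0.0, internal_sd) for _ in range(n_realizations)]
            for f in forced]

# Spread of the pooled ensemble mixes model uncertainty and natural variability
all_trends = [t for runs in ensemble for t in runs]
ensemble_sd = statistics.stdev(all_trends)

# Natural variability alone: average spread *within* each model's realizations
natural_sd = statistics.mean(statistics.stdev(runs) for runs in ensemble)

print(f"full-ensemble spread {ensemble_sd:.1f} vs within-model spread {natural_sd:.1f}")
```

The pooled spread exceeds the within-model spread, and the latter isolates natural variability; this is the sense in which multiple realizations of a single model can be used as described above.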

To come back to our goal: to have confidence in future climate projections, a correct representation of trends in the past is necessary (but not sufficient). In a recent article (van Haren et al, Clim.Dyn., 2012) we investigated if modeled changes in precipitation over Europe are in agreement with the observed changes.

Results & Discussion

Clear precipitation trends have been observed in Europe over the past century. In winter (October – March), precipitation has increased in north-western Europe. In summer (April – September), there has been an increase along many coasts in the same area. Over the second half of the past century precipitation also decreased in southern Europe in winter (figures 1a and 1d). By comparing different analyses of precipitation, we checked that the differences between modeled and observed precipitation changes discussed in this article are much larger than the analysis uncertainty in the observations, except for some countries in eastern Europe that do not share much data. These analyses are partly based on the same station observations, but agreement between precipitation changes calculated over the second half of the past century and over the complete past century gives further confidence that the observed changes are physical and not artifacts of changes in the observational methods.

An investigation of precipitation trends in an ensemble of regional climate models (RCMs) of the ENSEMBLES project shows that these models fail to reproduce the observed trends (figures 1b and 1e). In many regions the observed trend is larger than in any of the models. Similar results are obtained for the entire last century in a comparison of the observed trends with trends in global climate models (GCMs) from the CMIP3 co-ordinated modeling experiment. The models should cover the full range of natural variability, so the finding that the observed trend lies outside the ensemble implies that either the natural variability or the trend itself is underestimated. We compared the natural variability over the last century between the models and observations. The GCMs were indeed found to underestimate the variability somewhat, but the RCMs actually overestimate natural variability on the interannual time scale. In Europe, there is very little evidence of low-frequency variability over land beyond the integrated effects of interannual variability: both the observations and the models are compatible with white noise once the trend has been subtracted.

We also have available from ENSEMBLES regional climate model experiments in which the large scale circulation and sea surface temperatures are prescribed from reanalysis data, which are close to the observations. These simulations reproduce the observed precipitation trends much better (figures 1c and 1f). The observed trends are largely compatible with the (smaller) range of uncertainties spanned by the ensemble, indicating that the prescribed factors in regional climate models, large scale circulation and sea surface temperatures, are responsible for large parts of the trend biases in the GCM-forced ensemble and the GCMs themselves.

Figure 1: Comparison of observed and modeled precipitation trends over 1961-2000 [%/century]. (a) Relative trends in observed summer precipitation. (b) Mean relative trends of summer precipitation of the GCM forced RCM ensemble. (c) Mean relative trends of summer precipitation of the RCM ensemble forced by reanalysis data. (d-f)

Using a simple statistical model we next investigated the relative importance of these two prescribed factors. We find that the main factor in setting the trend in winter is the large scale atmospheric circulation (as we found earlier for the winter temperature trends). The air pressure over the Mediterranean area has increased much more strongly in the observations than in the models. In the summer season, sea surface temperature (SST) changes are important in setting precipitation trends along the North Sea and Atlantic coasts. Climate models underestimate the SST trends along the Atlantic coast, the North Sea and other coastal areas (if represented at all). This leads to lower evaporation trends and reduced trends in coastal precipitation.

Conclusions

The results of this study show that climate models are only partly capable of reproducing the details in observed precipitation changes: the local observed trends are often much larger than modeled in Europe. Because it is not clear (yet) whether the trend biases in SST and large scale circulation are due to greenhouse warming, their importance for future climate projections needs to be determined. Processes that give rise to the observed trends may very well be relatively unimportant for climate projection for the end of the century. Therefore, a straightforward extrapolation of observed trends to the future is not possible. A quantitative understanding of the causes of these trends is needed so that climate model based projections of future climate can be corrected for these trend biases.

References:

- Ronald van Haren, Geert Jan van Oldenborgh, Geert Lenderink, Matthew Collins and Wilco Hazeleger, SST and circulation trend biases cause an underestimation of European precipitation trends, Clim. Dyn. (2012), doi:10.1007/s00382-012-1401-5. preprint

- G. J. van Oldenborgh, S. Drijfhout, A. van Ulden, R. Haarsma, A. Sterl, C. Severijns, W. Hazeleger, and H. Dijkstra, Western Europe is warming much faster than expected, Clim. Past, 5, 1-12, 2009, doi:10.5194/cp-5-1-2009. full text

- van der Linden, P. and Mitchell, J. F. B. (Eds), ENSEMBLES: Climate Change and its Impacts: Summary of research and results from the ENSEMBLES project. Met Office Hadley Centre, 2009. book

- Meehl, Gerald A., Curt Covey, Karl E. Taylor, Thomas Delworth, Ronald J. Stouffer, Mojib Latif, Bryant McAvaney, John F. B. Mitchell, 2007: The WCRP CMIP3 Multimodel Dataset: A New Era in Climate Change Research. Bull. Amer. Meteor. Soc., 88, 1383–1394, doi:10.1175/BAMS-88-9-1383. Full text


Filed under Climate Change Metrics, Climate Models, Guest Weblogs

Comments On The Paper “Evaluating Explanatory Models Of The Spatial Pattern of Surface Climate Trends Using Model Selection And Bayesian Averaging Methods” By McKitrick and Tole 2012

 

There is a new paper which documents further the lack of skill of multi-decadal climate model predictions. This paper has also been commented on by Judy Curry in the post

Three new papers on interpreting temperature trends

and by Anthony Watts at

New modeling analysis paper by Ross McKitrick.

As I summarized in my post

Kevin Trenberth Was Correct – “We Do Not Have Reliable Or Regional Predictions Of Climate”

these climate model predictions are failing to accurately simulate fundamental aspects of the climate system.

The paper is

McKitrick, Ross R. and Lise Tole (2012) “Evaluating Explanatory Models of the Spatial Pattern of Surface Climate Trends using Model Selection and Bayesian Averaging Methods” Climate Dynamics, 2012, DOI: 10.1007/s00382-012-1418-9

with the abstract [highlight added]

We evaluate three categories of variables for explaining the spatial pattern of warming and cooling trends over land: predictions of general circulation models (GCMs) in response to observed forcings; geographical factors like latitude and pressure; and socioeconomic influences on the land surface and data quality. Spatial autocorrelation (SAC) in the observed trend pattern is removed from the residuals by a well-specified explanatory model. Encompassing tests show that none of the three classes of variables account for the contributions of the other two, though 20 of 22 GCMs individually contribute either no significant explanatory power or yield a trend pattern negatively correlated with observations. Non-nested testing rejects the null hypothesis that socioeconomic variables have no explanatory power. We apply a Bayesian Model Averaging (BMA) method to search over all possible linear combinations of explanatory variables and generate posterior coefficient distributions robust to model selection. These results, confirmed by classical encompassing tests, indicate that the geographical variables plus three of the 22 GCMs and three socioeconomic variables provide all the explanatory power in the data set. We conclude that the most valid model of the spatial pattern of trends in land surface temperature records over 1979-2002 requires a combination of the processes represented in some GCMs and certain socioeconomic measures that capture data quality variations and changes to the land surface.

The text starts off with

General Circulation Models (GCMs) are the basis for modern studies of the effects of greenhouse gases and projections of future global warming. Reliable trend projections at the regional level are essential for policy guidance, yet formal statistical testing of the ability of GCMs to simulate the spatial pattern of climatic trends has been very limited. This paper applies classical regression and Bayesian Model Averaging methods to test this aspect of GCM performance against rival explanatory variables that do not contain any GCM-generated information and can therefore serve as a benchmark.
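As a rough sketch of the Bayesian Model Averaging idea the abstract describes — not the authors' actual implementation — one can enumerate all predictor subsets, fit each by ordinary least squares, and weight the fits by BIC (synthetic data throughout):

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "grid cells": a trend field actually driven by two of four
# candidate explanatory variables (all numbers invented for illustration)
n, n_pred = 200, 4
X = rng.normal(size=(n, n_pred))
y = 1.5 * X[:, 0] - 2.0 * X[:, 2] + rng.normal(scale=0.5, size=n)

results = []
# Enumerate every non-empty subset of predictors, fit OLS, score with BIC
for k in range(1, n_pred + 1):
    for subset in itertools.combinations(range(n_pred), k):
        Xs = X[:, list(subset)]
        beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
        rss = float(np.sum((y - Xs @ beta) ** 2))
        bic = n * np.log(rss / n) + len(subset) * np.log(n)
        results.append((subset, bic))

# Convert BIC scores to approximate posterior model weights
bics = np.array([b for _, b in results])
weights = np.exp(-0.5 * (bics - bics.min()))
weights /= weights.sum()

# Posterior inclusion probability of each predictor
incl = np.zeros(n_pred)
for (subset, _), w in zip(results, weights):
    incl[list(subset)] += w
print("inclusion probabilities:", incl.round(2))
```

With a strong signal, the inclusion probabilities concentrate on the predictors that actually drive the response; this is the sense in which such a search generates posterior results that are robust to model selection.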

This paper  supports the viewpoint of the papers

Klotzbach, P.J., R.A. Pielke Sr., R.A. Pielke Jr.,  J.R. Christy, and R.T. McNider, 2009: An alternative explanation for differential temperature trends at the  surface and in the lower troposphere. J. Geophys. Res., 114, D21102, doi:10.1029/2009JD011841.

Klotzbach, P.J., R.A. Pielke Sr., R.A. Pielke Jr., J.R. Christy, and R.T. McNider, 2010: Correction to: “An alternative explanation for differential temperature trends at the surface and in the lower troposphere. J. Geophys. Res., 114, D21102, doi:10.1029/2009JD011841”, J. Geophys. Res., 115, D1, doi:10.1029/2009JD013655.

where we showed that the multi-decadal trends in surface and lower tropospheric temperatures are diverging from one another, with much greater differences over land areas than over ocean areas. The socioeconomic influences on the land surface and the data quality issues identified in the McKitrick and Tole 2012 paper are reasons such a divergence should be expected.

In a paper in press (on which I am a co-author) on the subject of surface temperature trends, we document in depth why there is a warm bias in the minimum temperature trends that are used to construct annual, global-average multi-decadal temperature trends. I will be posting on this paper as soon as it is posted on the journal website. It provides even more support for the findings of McKitrick and Tole 2012 on the importance of socioeconomic influences on the land surface and data quality as factors in long-term temperature trends.


Filed under Climate Models

New Paper “Climate Physics, Feedbacks, And Reductionism (And When Does Reductionism go Too Far?)” By Dick Lindzen

I was alerted to an important, informative new paper by Dick Lindzen (h/t to Anthony Watts) on the issue of climate. The paper is

R.S. Lindzen, 2012: Climate physics, feedbacks, and reductionism (and when does reductionism go too far?). Eur. Phys. J. Plus (2012) 127: 52, DOI 10.1140/epjp/i2012-12052-8.

The introduction reads (there is no abstract) [highlight added]

The public perception of the climate problem is somewhat schizophrenic. On the one hand, the problem is perceived to be so complex that it cannot be approached without massive computer programs. On the other hand, the physics is claimed to be so basic that the dire conclusions commonly presented are considered to be self-evident. Consistent with this situation, climate has become a field where there is a distinct separation of theory and modeling. Commonly, in traditional areas like fluid mechanics, theory provides useful constraints and tests when applied to modeling results. This has been notably absent in current work on climate. In principle, climate modeling should be closely associated with basic physical theory. In practice, it has come to consist in the almost blind use of obviously inadequate models.

In this paper, I would like to sketch some examples of potentially useful interaction with specific reference to the issue of climate sensitivity. It should be noted that the above situation is not strictly the fault of modelers. Theory, itself, has been increasingly idealized and esoteric with little attempt at real interaction. Also, theory in atmospheric and oceanic dynamics consists in conceptual frameworks that are generally not mathematically rigorous. Perhaps, we should refer to it as physical or conceptual reasoning instead. As we shall see, when reductionism goes beyond the constraints imposed by these frameworks, it is probably going too far though reductionism remains an essential tool of analysis.

The concluding remarks read

This paper considers approaches to estimating climate sensitivity involving the basic physics of the feedback processes rather than attempting to estimate climate sensitivity from time series of temperature. The latter have to assume a perfect knowledge of all sources of climate variability —something generally absent. The results of a variety of independent approaches all point to relatively low sensitivities. We also note that when climate change is due to regional and seasonal forcing, the concept of one dimensional climate sensitivity may, in fact, be inappropriate. Finally, it should be noted that I have not followed the common practice of considering the feedback factor to be the sum of separate feedback factors from water vapor, clouds, etc. The reason for this is that these feedback factors are not really independent. For example, in fig. 2, we refer to a characteristic emission level that is one optical depth into the atmosphere. For regions with upper level cirrus, this level is strongly related to the cloud optical depth (in the infrared), while for cloud-free regions the level is determined by water vapor. However, as shown by Rondanelli and Lindzen [30], and Horvath and Soden [31], the area covered by upper level cirrus is both highly variable and temperature dependent. The water vapor feedback is dependent not only on changes in water vapor but also on the area of cloud-free regions. It, therefore, cannot readily be disentangled from the cloud feedback.

One interesting statement in the paper is that, with respect to regional climate features,

“……current models do not simulate the PDO [Pacific Decadal Oscillation]. We are currently beginning such a study.”

The entire article is an important new contribution to the climate science discussion by a well-respected colleague.  I recommend reading the entire article.

My one substantive comment is the use of the terminology “climate sensitivity“.  I recognize that so much of the literature is focusing on the response of the global, annual averaged surface temperature to an imposed global averaged forcing (such as the radiative effect of added CO2) and calling this “climate sensitivity“.   However, this is but a very small part of true climate sensitivity. While I completely agree with Dick that there is a fundamental problem with “one-dimensional thinking” as he discussed in section 4 of his paper, it is an even higher dimensional (and more complex) issue than presented in the paper.

As I have often presented on my weblog, the climate system can be schematically illustrated below from NRC (2005).

The real world climate sensitivity is the influence of natural and human climate forcings on each of the components of the climate system.  Research is only just beginning to examine this issue, which needs to be completed using the bottom-up, contextual vulnerability approach that we discuss in our paper

Pielke Sr., R.A., R. Wilby, D. Niyogi, F. Hossain, K. Dairuku, J. Adegoke, G. Kallos, T. Seastedt, and K. Suding, 2011: Dealing  with complexity and extreme events using a bottom-up, resource-based  vulnerability perspective. AGU Monograph on Complexity and  Extreme Events in Geosciences, in press.

source of image at top of post


Filed under Climate Models, Research Papers

New Paper “The Impact Of Spring Subsurface Soil Temperature Anomaly In The Western U.S. On North American Summer Precipitation: A Case Study Using Regional Climate Model Downscaling” By Xue Et al 2012

There is an important new regional climate model paper, using type 3 downscaling as defined in

Pielke Sr., R.A., and R.L. Wilby, 2012: Regional climate downscaling – what’s the point? Eos Forum, 93, No. 5, 52-53, doi:10.1029/2012EO050008.

The new paper is

Xue, Y., R. Vasic, Z. Janjic, Y. M. Liu, and P. C. Chu (2012), The impact of spring subsurface soil temperature anomaly in the western U.S. on North American summer precipitation: A case study using regional climate model downscaling, J. Geophys. Res., 117, D11103, doi:10.1029/2012JD017692.

The abstract reads [highlight added]

This study explores the impact of spring subsurface soil temperature (SUBT) anomaly in the western U.S. on North American summer precipitation, mainly southeastern U.S., and possible mechanisms using a regional climate Eta model and a general circulation model (GCM). The GCM produces the lateral boundary condition (LBC) for the Eta model. Two initial SUBT conditions (one cold and another warm) on May 1st were assigned for the GCM runs and the corresponding Eta runs. The results suggest that antecedent May 1st warm initial SUBT in the western U.S. contributes positive June precipitation over the southern U.S. and less precipitation to the north, consistent with the observed anomalies between a year with a warm spring and a year with a cold spring in the western U.S. The anomalous cyclone induced by the surface heating due to SUBT anomaly propagated eastward through Rossby waves in westerly mean flow. In addition, the steering flow also contributed to the dissipation of perturbation in the northeastern U.S. and its enhancement in southeastern U.S. However, these results were obtained only when the Eta model run was driven by the corresponding GCM run. When the same reanalysis data were applied for both (cold and warm initial SUBT) Eta runs’ LBCs, the precipitation anomalies could not be properly produced, indicating the intimate dependence of the regional climate sensitivity downscaling on the imposed global climate forcing, especially when the impact was through wave propagation in the large-scale atmospheric flow.

Excerpts from the conclusion read

This study demonstrates that although GFS runs with large internal variability and coarse resolutions were unable to produce adequate precipitation difference patterns, the downscaling of GFS precipitation output using Eta did yield significant results consistent with observations.

The Eta results were obtained only when we used GFS outputs for the corresponding Eta runs’ LBCs. When we applied the same reanalysis data for both (control and sensitivity) Eta runs’ LBCs, the Rossby wave propagation was suppressed and observed precipitation anomalies were not properly produced. Because large scale circulation and low-level moisture transfer played crucial roles in proper simulations of the U.S. summer precipitation, maintaining the same LBC produced similar large-scale patterns, causing severe limitations in this sensitivity study. Downscaled regional climate is closely linked to the imposed global climate forcing. Therefore, for climate sensitivity studies using RCMs, consistent lateral boundary forcing may be crucial, especially when the impact is produced through wave transference in the atmosphere.

This is the first modeling study to explore the western U.S. SUBT impact and its teleconnections with Eastern U.S. precipitation. The results suggest that SUBT may be able to provide an extended element of memory, which would enhance predictability. However, there are many issues which require more investigations…

The Xue et al 2012 paper also confirms what we presented in

Pielke Sr., R.A., G.E. Liston, J.L. Eastman, L. Lu, and M. Coughenour,  1999: Seasonal weather prediction as an initial value problem. J. Geophys.  Res., 104, 19463-19479.

The abstract of our paper reads

Using a climate version of a regional atmospheric model, we show that the seasonal evolution of weather is dependent on the initial soil moisture and landscape specification. Coupling this model to a land-surface model, the soil moisture distribution and landscape are shown to cause a significant nonlinear interaction between vegetation growth and precipitation. These results demonstrate that seasonal weather prediction is an initial value problem. Moreover, on seasonal and longer timescales the surface characteristics such as soil moisture, leaf area index, and landcover type must be treated as dynamically evolving dependent variables, instead of prescribed parameters.

source of image


Filed under Assessment of climate predictability, Climate Change Forcings & Feedbacks, Climate Models

Comments On “Human-induced Global Ocean Warming On Multidecadal Timescales” By Gleckler Et Al 2012

source of figure from Gleckler et al 2012

There is a new paper

P. J. Gleckler, B. D. Santer, C. M. Domingues, D. W. Pierce, T. P. Barnett, J. A. Church, K. E. Taylor, K. M. AchutaRao, T. P. Boyer, M. Ishii and P. M. Caldwell, 2012: Human-induced global ocean warming on multidecadal timescales. Nature Climate Change, doi:10.1038/nclimate1553

which has been receiving media attention; e.g. see from the ABC Radio Australia

Research shows humans main cause of global warming.

Judy Curry has a post on June 12 2012 worth reading on this paper and other papers on this subject on her weblog in

Causes(?) of ocean warming

The news article contains the text [highlight added]

Scientists say this is the most comprehensive study to date on global ocean warming.

The research has been published in the journal Nature Climate Change.

The team looked at rising ocean temperatures over the past 50 years, and a dozen models projecting climate change patterns.

Australian based co-author, Dr John Church from Australia’s island state of Tasmania says there’s no way all of the world’s oceans could’ve warmed by one tenth of a degree Celsius without human impact.

He says nature only accounts for 10 per cent of the increase.

Dr Church says researchers from America, Australia, Japan and India examined a dozen different models used to project climate change, past studies have only looked at a couple at a time.

“And this has allowed the group to rule out that the changes are related to natural variability in the climate system,” he said.

Leading climate change and oceanography expert, Professor Nathan Bindoff says scientists are now certain man-made greenhouse gases are the primary cause.

“The evidence is unequivocal for global warming,” he said.

He says the new research balances the man-made impacts of warming greenhouse gases and cooling pollution in the troposphere, against natural changes in the ocean’s temperature and volcanic eruptions.

“This paper is  important because for the first time we can actually say that we’re virtually certain that the oceans have warmed, and that warming is caused not by natural processes but by rising greenhouse gases primarily,” he said

The Nature Climate Change article has the abstract

Large-scale increases in upper-ocean temperatures are evident in observational records. Several studies have used well-established detection and attribution methods to demonstrate that the observed basin-scale temperature changes are consistent with model responses to anthropogenic forcing and inconsistent with model-based estimates of natural variability. These studies relied on a single observational data set and employed results from only one or two models. Recent identification of systematic instrumental biases in expendable bathythermograph data has led to improved estimates of ocean temperature variability and trends and provide motivation to revisit earlier detection and attribution studies. We examine the causes of ocean warming using these improved observational estimates, together with results from a large multimodel archive of externally forced and unforced simulations. The time evolution of upper ocean temperature changes in the newer observational estimates is similar to that of the multimodel average of simulations that include the effects of volcanic eruptions. Our detection and attribution analysis systematically examines the sensitivity of results to a variety of model and data-processing choices. When global mean changes are included, we consistently obtain a positive identification (at the 1% significance level) of an anthropogenic fingerprint in observed upper-ocean temperature changes, thereby substantially strengthening existing detection and attribution evidence.

Their text includes the summary

“We have identified a human-induced fingerprint in observed estimates of upper-ocean warming on multidecadal timescales, confirming the results of previous D&A work [refs. 2-5]. Our results are robust to the use of multiple bias-corrected observational data sets, to use of infilled or subsampled data, to model signal and noise uncertainties and to different technical choices in simulation drift removal and in the application of our D&A method. There is evidence from our variability comparisons that the models used here may underestimate observed decadal scale variability of basin-average upper-ocean temperatures. However, this variability underestimate would have to be smaller than observed by a factor of more than two to negate our positive identification of an anthropogenic fingerprint in the observed multidecadal warming of the upper 700 m of the oceans. Our analysis provides no evidence of a noise error of this magnitude.”
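The fingerprint detection-and-attribution logic in the quoted summary can be illustrated with a toy calculation (synthetic fingerprint, basin count and noise levels; not the paper's method or data): project the "observations" onto a fingerprint pattern, fit a trend to the projection, and compare that trend with trends obtained from noise-only control runs.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy setup: the "fingerprint" is a fixed warming pattern across basins;
# observations = fingerprint * forced warming + internal noise
n_basins, n_years = 6, 50
fingerprint = rng.uniform(0.5, 1.5, size=n_basins)
years = np.arange(n_years)

signal = np.outer(fingerprint, 0.01 * years)               # forced warming
noise = rng.normal(scale=0.1, size=(n_basins, n_years))    # internal variability
obs = signal + noise

# Project the observations onto the fingerprint, then fit a trend
proj = fingerprint @ obs / (fingerprint @ fingerprint)
trend = np.polyfit(years, proj, 1)[0]

# Null distribution: trends of projected noise-only "control runs"
null_trends = []
for _ in range(500):
    ctl = rng.normal(scale=0.1, size=(n_basins, n_years))
    p = fingerprint @ ctl / (fingerprint @ fingerprint)
    null_trends.append(np.polyfit(years, p, 1)[0])
null_sd = float(np.std(null_trends))

print(f"signal-to-noise ratio: {trend / null_sd:.1f}")
```

A detection is declared when the trend in the projected observations lies far outside the control-run distribution; the robustness checks in the paper amount to repeating this comparison under many data and model choices.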

This post is to comment on several aspects of that paper:

1. The paper ended their analysis in 2008 [see figure at the top of this post]. While this may not have changed their trend analysis, it is an important issue: unlike a temperature at a single level, where there is a lag between heat imposed and the temperature response, their temperatures are globally and 0-700m averaged values. Since this layer contains the large majority of the heat changes in the climate system, any time lag with respect to an imposed heating or cooling would be small. This mass-weighted temperature can be directly converted to Joules, as Levitus et al did in their paper, which I posted on, for example, in

Comment On Ocean Heat Content “World Ocean Heat Content And Thermosteric Sea Level Change (0-2000), 1955-2010″ By Levitus Et Al 2012

Levitus et al 2012 wrote for the period 1955-2010 that

The heat content of the world ocean for the 0-2000 m layer increased by 24.0×10^22 J, corresponding to a rate of 0.39 W m-2 (per unit area of the world ocean) and a volume mean warming of 0.09ºC. This warming rate corresponds to a rate of 0.27 W m-2 per unit area of earth’s surface. The heat content of the world ocean for the 0-700 m layer increased by 16.7×10^22 J, corresponding to a rate of 0.27 W m-2 (per unit area of the world ocean) and a volume mean warming of 0.18ºC. The world ocean accounts for approximately 90% of the warming of the earth system that has occurred since 1955.
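The quoted rates can be checked with a few lines of arithmetic; the ocean and Earth surface areas below are standard approximate values, not taken from the paper:

```python
# Convert the Levitus et al heat-content changes over 1955-2010 into
# average fluxes. Areas are standard approximate values (assumptions).
OCEAN_AREA = 3.6e14               # m^2, world ocean (approximate)
EARTH_AREA = 5.1e14               # m^2, Earth's surface (approximate)
SECONDS = 55 * 365.25 * 86400.0   # 55 years, 1955-2010

dQ_2000m = 24.0e22                # J, 0-2000 m layer
dQ_700m = 16.7e22                 # J, 0-700 m layer

flux_ocean_2000 = dQ_2000m / (OCEAN_AREA * SECONDS)
flux_earth_2000 = dQ_2000m / (EARTH_AREA * SECONDS)
flux_ocean_700 = dQ_700m / (OCEAN_AREA * SECONDS)

print(f"0-2000 m: {flux_ocean_2000:.2f} W/m^2 (ocean), "
      f"{flux_earth_2000:.2f} W/m^2 (Earth)")
print(f"0-700 m:  {flux_ocean_700:.2f} W/m^2 (ocean)")
```

This reproduces the quoted ~0.39, ~0.27 and ~0.27 W m-2 rates to within about 0.01 W m-2, the residual coming from rounding of the assumed areas.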

As a result of their finding, I concluded in my post that

Thus either using the 1955 to 2010 time period, or the shorter time period from 1990 to 2010 in the Levitus et al 2012 paper, the diagnosed magnitudes of ocean warming and global warming are significantly less than claimed by Jim Hansen in 2005. This discrepancy is even larger if we use the NOAA’s Pacific Marine Environmental Laboratory data.

Gleckler et al 2012 neglect to comment on the radiative imbalance diagnosed from the Levitus et al 2012 paper. Even though both were published in 2012, Gleckler and colleagues presumably had an opportunity to update their paper with the new Levitus et al 2012 results.

2.  The Gleckler et al 2012 paper also seems to ignore some non-CO2 positive human-caused radiative forcings, including black carbon. For example, the paper

Jacobson, M. Z. (2010), Short‐term effects of controlling fossil‐fuel soot, biofuel soot and gases, and methane on climate, Arctic ice, and air pollution health, J. Geophys. Res., 115, D14209, doi:10.1029/2009JD013795

that I posted on in

Soot and Climate Change – A New Article By Jacobson 2010

Jacobson (2010) concluded that

fossil‐fuel soot, solid‐biofuel soot and gases, and CH4  may be the second leading cause of warming after CO2

and that eliminating them would

reduce global surface air temperatures by a statistically significant 0.3–0.5 K, 0.4–0.7 K, and 0.2–0.4 K, respectively, averaged over 15 years.

This conclusion conflicts with the statement in the news article by Nathan Bindoff that

man-made greenhouse gases are the primary cause

of the warming in the oceans. There would still be a discernible human influence, but it comes from more than just greenhouse gases, contrary to what Nathan Bindoff claims.

The report

National Research Council, 2005: Radiative forcing of climate change: Expanding the concept and addressing uncertainties. Committee on Radiative Forcing Effects on Climate Change, Climate Research Committee, Board on Atmospheric Sciences and Climate, Division on Earth and Life Studies, The National Academies Press, Washington, D.C., 208 pp.

adds to the uncertainty in terms of what we know about positive (and negative) radiative forcings from aerosols. See Table 2-2 in that report where, with respect to aerosol indirect effects on clouds, the “semidirect effect” and the “glaciation indirect effect” have uncertain positive forcing and the “thermodynamic effect” is of unknown magnitude.

Clearly, our understanding of the relative contributions of different human radiative forcings is still uncertain.

3. The Gleckler et al article also still may underestimate natural variations in upper-ocean heat content, as Judy Curry documents in her weblog post Causes(?) of ocean warming. This has also been documented in the Ph.D. dissertation by Marcia Wyatt, recently reported on in the post

Marcia Wyatt’s University of Colorado at Boulder Ph.d Dissertation “A Multidecadal Climate Signal Propagating Across the Northern Hemisphere”

where her third paper (chapter in her dissertation) found that

A network of simulated climate indices, reconstructed from a data set generated by models of the third Coupled Intercomparison Project (CMIP3 (Meehl et al. 2007)), is analyzed. None of the sixty analyses performed on these networks succeeded in reproducing a propagating signal. While model results varied from one another in the climate footprints simulated, their results were far more similar to one another than they were to observations found in the instrumental and proxy networks, implying physical mechanisms relevant to signal propagation may be missing from this suite of general circulation models.

Such failure at modeling circulation features certainly would influence regional ocean heating patterns.

Among the climate model failings are those reported in my post

Kevin Trenberth Was Correct – “We Do Not Have Reliable Or Regional Predictions Of Climate”

These include

Fyfe, J. C., W. J. Merryfield, V. Kharin, G. J. Boer, W.-S. Lee, and K. von Salzen (2011), Skillful predictions of decadal trends in global mean surface temperature,  Geophys. Res. Lett.,38, L22801, doi:10.1029/2011GL049508

who concluded that

“….for longer term decadal hindcasts a linear trend correction may be required if the model does not reproduce long-term trends. For this reason, we correct for systematic long-term trend biases.”

and

Stephens, G. L., T. L’Ecuyer, R. Forbes, A. Gettlemen, J.‐C. Golaz, A. Bodas‐Salcedo, K. Suzuki, P. Gabriel, and J. Haynes (2010), Dreary state of precipitation in global models, J. Geophys. Res., 115, D24211, doi:10.1029/2010JD014532.

who wrote

“models produce precipitation approximately twice as often as that observed and make rainfall far too lightly…..The differences in the character of model precipitation are systemic and have a number of important implications for modeling the coupled Earth system …….little skill in precipitation [is] calculated at individual grid points, and thus applications involving downscaling of grid point precipitation to yet even finer‐scale resolution has little foundation and relevance to the real Earth system.”

See also

Sun, De-Zheng, Yongqiang Yu, Tao Zhang, 2009: Tropical Water Vapor and Cloud Feedbacks in Climate Models: A Further Assessment Using Coupled Simulations. J. Climate, 22, 1287–1304.

who wrote

“…….extended calculation using coupled runs confirms the earlier inference from the AMIP runs that underestimating the negative feedback from cloud albedo and overestimating the positive feedback from the greenhouse effect of water vapor over the tropical Pacific during ENSO is a prevalent problem of climate models.”

De-Zheng also wrote in an EOS article

Climate Dynamics: Why Does Climate Vary?

“….even without any external forcing from human activity, the state of the climate system varies substantially.

“….one thing this book emphasizes is that, at least for interannual and decadal time scales, the climate is capable of varying in a substantial way in the complete absence of any external forces.”

Thus, the real-world natural variations in upper-ocean heat content may not be captured by the set of models used by Gleckler et al 2012, despite their claim to the contrary.  To their credit, they do examine observed ocean trends in their analysis, but as they also state, there remain limitations in the data, particularly prior to the full deployment of the Argo network.

4. The Gleckler et al 2012 paper also persists in using trends rather than time slices to assess change. A snapshot of the upper-ocean heat content today compared with a snapshot for another year, as long as the data are spatially and temporally dense enough, provides a quantitative measure of the ocean warming over that time period. This change in heat can then be used to diagnose the global average radiative imbalance, as Levitus et al 2012 did, and as I proposed in the paper

Pielke Sr., R.A., 2003: Heat storage within the Earth system. Bull. Amer.  Meteor. Soc., 84, 331-335.
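
The time-slice diagnosis can be sketched in a few lines. The snapshot values below are hypothetical, chosen only to illustrate the calculation; the Earth-surface area and seconds-per-year constants are my assumed round numbers.

```python
# Diagnose a global-average radiative imbalance from two ocean heat
# content snapshots, in the spirit of the time-slice approach above.
# The snapshot values are hypothetical, for illustration only.
EARTH_AREA_M2 = 5.1e14
SECONDS_PER_YEAR = 3.156e7

def radiative_imbalance(ohc_start_j, ohc_end_j, years):
    """Mean radiative imbalance (W m-2, averaged over the Earth's
    surface) implied by the change in ocean heat content."""
    return (ohc_end_j - ohc_start_j) / (years * SECONDS_PER_YEAR * EARTH_AREA_M2)

# Hypothetical 0-700 m snapshots taken a decade apart:
imbalance = radiative_imbalance(10.0e22, 14.0e22, years=10)
print(f"{imbalance:.2f} W m-2")
```

The appeal of the snapshot method is that, given adequate spatial and temporal sampling, it needs no trend fitting at all: two well-sampled states and the elapsed time suffice.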

While the regionally mapped ocean heat data from 0 to 700 m are not available in a ready on-line format to compare with the models, the sea surface temperature anomalies are. Below are the SST anomalies for the week of June 11.

Gleckler et al 2012 present upper-ocean heat changes for different ocean basins: North Atlantic, South Atlantic, North Pacific, South Pacific, North Indian and South Indian. Since we can use time slices, if we had a figure such as the one above for the 0-700 m layer [and I encourage its creation by one of our readers], we could then directly compare the current state of the ocean heat content with the model predictions.

To the extent that the SST anomalies agree with the upper-ocean heat anomalies (a very important caveat, since the SSTs do have time lags because this is not a mass-weighted temperature), what we see (using an eyecrometer) on average is a warm North Atlantic, a cool South Atlantic, a highly variable North Pacific, a less variable but still mixed South Pacific, and a similarly mixed signal in the North and South Indian Oceans. The global average SST anomaly is +0.192°C, as reported in the excellent weblog post by Bob Tisdale

May 2012 Sea Surface Temperature (SST) Anomaly Update

The Gleckler et al paper reports upper-ocean warming of about a tenth of a degree, so the global SST anomaly is close to that value. However, the regional heat content anomalies (to the extent we can compare SSTs with upper-ocean temperatures) are not in as good agreement. Moreover, as recently as 1995 the global average SST anomalies were around zero, and they were lower than +0.1°C in 2008 (see Bob’s analysis).

My bottom line conclusion is that, while Gleckler et al 2012 is an informative paper and makes a comparison with real world data, they understate the role of non-greenhouse-gas forcings and natural climate variations as an explanation for at least part of the recent warming. This disagreement will only be resolved, unfortunately, as time passes and we see more of the evolution of heat changes in the upper ocean.

Also, the authors should quantitatively compare observed regional time slices of heating for the different ocean basins with the models for 2012. They should also make a prediction of what we should see in the coming years, so that validation can be an ongoing process.

Comments Off

Filed under Climate Models, Research Papers

New Paper “Stochastic And Scaling Climate Sensitivities: Solar, Volcanic And Orbital Forcings” By Lovejoy And Schertzer 2012

Anthony Watts alerted me to this paper

Lovejoy, S., and D. Schertzer (2012), Stochastic and scaling climate sensitivities: Solar, volcanic and orbital forcings, Geophys. Res. Lett., 39, L11702, doi:10.1029/2012GL051871.

This is another contribution which documents shortcomings in the ability of multi-decadal global climate models to simulate the real climate.  Global climate model projections neglect important low frequency natural climate effects on time scales of decades and longer according to this paper.

The abstract reads [highlight added]

Climate sensitivity (λ) is usually defined as a deterministic quantity relating climate forcings and responses. While this may be appropriate for evaluating the outputs of (deterministic) GCM’s it is problematic for estimating sensitivities from empirical data. We introduce a stochastic definition where it is only a statistical link between the forcing and response, an upper bound on the deterministic sensitivities. Over the range ≈30 yrs to 100 kyrs we estimate this λ using temperature data from instruments, reanalyses, multiproxies and paleo sources; the forcings include several solar, volcanic and orbital series. With the exception of the latter, we find that λ is roughly a scaling function of resolution Δt: λ ≈ Δt^Hλ, with exponent 0 ≲ Hλ ≲ 0.7. Since Hλ > 0, the implied feedbacks must generally increase with scale and this may be difficult to achieve with existing GCM’s.

The conclusions read

After decreasing over several decades of scale, to a minimum of ≈ +/-0.1 K at around 10–100 yrs, temperature fluctuations begin to increase, ultimately reaching +/-3 to +/-5 K at glacial-interglacial scales. In order to understand the origin of this multidecadal, multicentennial and multimillennial variability, we empirically estimated the climate sensitivities of solar and volcanic forcings using several reconstructions. To make this practical, we introduced a stochastic definition of the sensitivity which could be regarded as an upper bound on the usual (deterministic) sensitivity with the two being equal in the case of full (and causal) correlation between the temperature and driver. Although the RMS temperature fluctuations increased with scale, the RMS volcanic and 10Be based solar reconstructions all decreased with scale, in roughly a power law manner. If any of these reconstructions represented dominant forcings, the corresponding feedbacks would have to increase strongly with scale (with exponent Hλ ≈ 0.7), and this is not trivially compatible with existing GCM’s. Only the sunspot based solar reconstructions were consistent with scale independent sensitivities (Hλ ≈ 0), these are of the order 4.5 K/(W per meter squared) (i.e., implying large feedbacks) and would require quite strong solar forcings of ≈1 W per meter squared at scales of 10 kyrs.
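
The scaling form λ ≈ Δt^Hλ means the exponent Hλ can be read off as a slope in log-log coordinates. Here is a minimal illustration on a fabricated power-law series (it merely stands in for, and is not, the paper's empirical estimates):

```python
import math

# Estimate the scaling exponent H in lambda ~ dt**H from a fabricated
# power-law series (H = 0.7, echoing the exponent quoted above).
TRUE_H = 0.7
dts = [10.0 ** (1.5 + 3.5 * i / 19) for i in range(20)]  # ~30 yr to 100 kyr
lams = [2.0 * dt ** TRUE_H for dt in dts]                # fabricated sensitivities

# Least-squares slope in log-log coordinates recovers the exponent.
xs = [math.log(dt) for dt in dts]
ys = [math.log(lam) for lam in lams]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
H_est = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
print(f"estimated H_lambda = {H_est:.2f}")  # 0.70
```

With real temperature and forcing series the log-log points scatter around the line, and the fitted slope is only an estimate of Hλ rather than an exact recovery as in this noise-free toy.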

A recent analysis of S₂(Δt)^1/2 for forced GCM outputs over the past millennium by S. Lovejoy et al. (Do GCM’s predict the climate…. or low frequency weather?, submitted to Nature Climate Change, 2012) showed that they strongly underestimate the low frequency variability – even when, for example, strong solar forcings were used. Our findings here of the occasionally surprising scale-by-scale forcing variabilities help explain why they were too weak. It seems likely that GCM’s are missing an important mechanism of internal variability. A possible “slow dynamics” candidate is land-ice, whose fluctuations are plausibly scaling over the appropriate ranges of space-time scales but which is not yet integrated into existing GCM’s.

source of image 

Comments Off

Filed under Climate Change Forcings & Feedbacks, Climate Models

Comment on the BAMS article “Two Time Scales for The Price Of One (Almost)” By Goddard Et Al 2012

There is an interesting essay in the May issue of BAMS that urges a focus on seasonal and decadal prediction. It is an informative article, but it completely leaves out the issue of where the huge funding of multi-decadal climate prediction fits. The essay is

Goddard, Lisa, James W. Hurrell, Benjamin P. Kirtman, James Murphy, Timothy Stockdale, Carolina Vera, 2012: Two Time Scales for The Price Of One (Almost). Bull. Amer. Meteor. Soc., 93, 621–629.   doi: http://dx.doi.org/10.1175/BAMS-D-11-00220.1

The article starts with the text [highlight added]

While some might call Decadal Prediction the new kid on the block, it would be better to consider it the latest addition to the Climate Prediction family. Decadal Prediction is the fascinating baby that all wish to talk about, with such great expectations for what she might someday accomplish. Her older brother, Seasonal Prediction, is now less talked about by funding agencies and the research community. Given his capabilities, he might seem mature enough to take care of himself, but in reality he is still just an adolescent and has yet to reach his full potential. Much of what he has learned so far, however, can be passed to his baby sister. Decadal could grow up faster than Seasonal did because she has the benefit of her older brother’s experiences. They have similar needs and participate in similar activities, and thus to the extent that they can learn from each other, their maturation is in some ways a mutually reinforcing process. And, while the attention that Decadal brings to the household might seem to distract from Seasonal, the presence of a sibling is actually healthy for Seasonal because it draws attention to the need for and use of climate information, which can bring funding and new research to strengthen the whole Climate Prediction family.

The conclusion reads

The investments described will take considerable human and financial resources and a commitment to sustain them. Compared to the costs of adaptation, the costs of implementing these recommendations will be low, but substantial enough to highlight the need for international coordination to minimize duplication and share the lessons learned throughout the communities involved. These are actions that would be prudent even in the absence of climate change. However, given that climate change has focused global attention on the need for climate information, climate services could build adaptation incrementally through better awareness, preparedness, and resiliency to climate variability at all time scales.

Seasonal and Decadal should not be treated as competitors for the attention of the scientific community. Rather, we should enable them to “play nicely” together, in order to maximize the efforts invested in each.

The essay, however, ignores the subject of multi-decadal climate predictions and where they fit in this family. One reason for the neglect, of course, is the implicit assumption that such predictions are not contributing significantly to the assessment of either seasonal or decadal predictability.

However, I propose the following. If the images of a child and a toddler are intended to represent seasonal and decadal prediction, respectively, the image below captures multi-decadal climate prediction. :-)

Comments Off

Filed under Climate Models

Further Discussion With Zhongfeng Xu On The Value Of Dynamic Downscaling For Multi-Decadal Predictions

In the post

Question And Answer On The Value Of Dynamic Downscaling For Multi-Decadal Predictions

two colleagues of mine and I discussed the significance of their new paper

Xu, Zhongfeng and Zong-Liang Yang, 2012: An improved dynamical downscaling method with GCM bias corrections and its validation with 30 years of climate simulations. Journal of Climate 2012 doi: http://dx.doi.org/10.1175/JCLI-D-12-00005.1

This post continues this discussion with  Zong-Liang Yang of the University of Texas in Austin and Zhongfeng Xu of the Institute of Atmospheric Physics of the Chinese Academy of Science.

Following is the comment by Zhongfeng, with my responses embedded.

Dear Roger,

Thank you for your interest in our paper.

In terms of your comment that “their results show that they are not adding value to multi-decadal climate projections”, I think the comment is not accurate enough. We did not compare the climate changes simulated by IDD and TDD in the paper.

My Comment:

What you and Liang have very effectively documented are systematic errors in the observationally unconstrained model runs. You did not compare climate change, but you did show that the model results are biased. This bias is an impediment to skillful multi-decadal forecasts, as it shows errors in the model physics and dynamics at that level. The elimination of these errors in the unconstrained runs is a necessary condition for skillful multi-decadal global model predictions.
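
For readers unfamiliar with the approach, the core of a bias-corrected driving-data scheme (as I understand it; this is my own sketch, not the Xu and Yang code) is to remove the GCM's climatological mean bias before the fields are used:

```python
# Simplest form of a GCM mean-bias correction, of the kind used to
# build improved driving data for dynamic downscaling. Illustrative
# sketch only; not the Xu and Yang (2012) implementation.

def mean_bias_corrected(gcm_series, gcm_climatology, obs_climatology):
    """Shift each GCM value so the long-term mean matches observations,
    leaving the GCM's simulated variability untouched."""
    bias = gcm_climatology - obs_climatology
    return [value - bias for value in gcm_series]

# Hypothetical monthly temperatures (deg C): this GCM runs 2 deg warm.
gcm = [16.2, 17.1, 18.4, 17.8]
corrected = mean_bias_corrected(gcm, gcm_climatology=17.4, obs_climatology=15.4)
print(corrected)  # each value shifted down by the 2 deg C mean bias
```

Note that a correction like this fixes the climatological mean by construction; it says nothing about whether the model's simulated changes in climate statistics are themselves skillful, which is the point at issue here.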

Zhongfeng continues

So it’s too early to conclude whether IDD adds value to climate change simulation.

My Response

To show skill, one has to show that changes in regional climate statistics between your control and your “future” are skillfully predicted. For model predictions in the coming decades, it is not enough to predict the same climate statistics, one must also skillfully predict changes to these statistics. Otherwise, the impact community could just as well use reanalyses.

Zhongfeng continues

I guess it’s possible that IDD improves climate change projection when the GCM does a good job in producing climate change signals but produces a bad climatological mean.

My Response

This cannot be correct. If the climatological means are in error, there are clearly problems in the model physics and dynamics. Also, what evidence do you have that the GCM does a good job in terms of multi-decadal predictions? [please see my post http://pielkeclimatesci.wordpress.com/2012/05/08/kevin-trenberth-is-correct-we-do-not-have-reliable-or-regional-predictions-of-climate/]

Zhongfeng continues

I will pay more attention to the IDD performance in climate change projection in our future study. I will keep you updated if we find some interesting results.

My Response

I look forward to learning more on your study. Thanks!

Zhongfeng continues

BTW: The IDD does significantly improve the projection of climatological mean. It’s still better than TDD which shows larger bias than IDD in projecting climatological means.

My Response

However, the global model multi-decadal predictions are still run with these biases. Even if you use IDD for the interior, the global model still has these errors, meaning it has substantive physics and/or dynamics problems.

Zhongfeng’s comment

Thank you for all your comments. They are very informative and make me think more about this dynamical downscaling study.  ^_^

My Reply 

I have also valued the discussion. I will add this as a weblog post follow-up. Your paper is a very important addition to the literature but the bottom line message is, in my view, documentation of why the impacts communities (e.g. for the IPCC assessments) should not be focusing on this methodology as bracketing the future of regional climates.

source of image

Comments Off

Filed under Climate Models, Debate Questions, Q & A on Climate Science

Comments On The Climate Etc Post “CMIP5 Decadal Hindcasts”

Judy Curry has an excellent new paper

Citation: Kim, H.-M., P. J. Webster, and J. A. Curry (2012), Evaluation of short-term climate change prediction in multi-model CMIP5 decadal hindcasts, Geophys. Res. Lett., 39, L10701, doi:10.1029/2012GL051644. [supplemental material]

which she posted on at Climate Etc

CMIP5 decadal hindcasts

I made a suggestion in the comments on her weblog, which I want to also post here on my weblog. First, one of the benchmarks that the dynamical model predictions of atmosphere-ocean circulation features must improve on is clearly captured in the seminal paper

Landsea, Christopher W., and John A. Knaff, 2000: How Much Skill Was There in Forecasting the Very Strong 1997–98 El Niño? Bulletin of the American Meteorological Society, 81 (9), 2107–2119.

As they wrote

 “A …….simple statistical tool—the El Niño–Southern Oscillation Climatology and Persistence (ENSO–CLIPER) model—is utilized as a baseline for determination of skill in forecasting this event”

and that

“….more complex models may not be doing much more than carrying out a pattern recognition and extrapolation of their own.”

Using persistence, in which the benchmark assumes that the initial values remain constant, is not a sufficient test. Persistence-climatology, in which the future of a cycle is predicted based on its past statistical behavior given a set of initial conditions, is the more appropriate evaluation benchmark. This is what Landsea and Knaff so effectively reported in their paper for ENSO events. Real world observations, of course, provide the ultimate test of any model prediction, and the reanalysis products, as Kim et al 2012 have used, are the best choice.
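
The distinction between the two benchmarks can be made concrete with a toy example. The index below is a fabricated sine wave standing in for an idealized cyclical index such as the AMO; it is not real data:

```python
import math

# Toy comparison of two statistical benchmarks on an idealized,
# perfectly cyclical climate index (a fabricated sine, not the AMO).
PERIOD_YEARS = 60.0

def index(t_years):
    """Idealized oscillatory index value at time t (years)."""
    return math.sin(2.0 * math.pi * t_years / PERIOD_YEARS)

start, lead = 30.0, 15.0          # forecast issued at year 30 for year 45
truth = index(start + lead)

# Persistence: assume the index simply stays at its initial value.
persistence_forecast = index(start)

# Persistence-climatology: continue the cycle fitted to past behavior.
# In this noise-free toy the cycle is known exactly, so the forecast
# is perfect; with real data it would only be statistically better.
pers_clim_forecast = index(start + lead)

err_persistence = abs(truth - persistence_forecast)
err_pers_clim = abs(truth - pers_clim_forecast)
print(err_persistence, err_pers_clim)
```

A dynamical model that merely matches persistence has shown no skill against an index like this; to add value it must beat the cycle-following benchmark, which is exactly the point Landsea and Knaff made for ENSO.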

I recommend, therefore, that Kim et al extend their evaluation to use this benchmark, examining the degree to which the CMIP5 decadal predictions can improve on a statistical forecast of the Atlantic Multidecadal Oscillation (AMO) and Pacific Decadal Oscillation (PDO).

However, it is important to realize that the CMIP5 runs also need to be tested in terms of their ability to predict changes in the statistical behavior of the AMO and PDO.

Dynamic models need to improve on that skill (i.e., accurately predict changes in this behavior) if they are going to add any predictive (projection) value in response to human climate forcings. The Kim et al 2012 paper is another valuable, much-needed assessment of global model prediction skill. However, the ability of the CMIP5 models to predict changes in climatological behavior also needs to be assessed. Of course, this requires a period of observed data long enough to adequately develop the statistics.

source of image

Comments Off

Filed under Climate Models