
Additional Information On The “Ocean’s Missing Heat” By Katsman and van Oldenborgh 2011

I discussed the papers

C. A. Katsman and G. J. van Oldenborgh, 2011: Tracing the upper ocean’s ‘missing heat’. Geophysical Research Letters (in press).

Palmer, M. D., D. J. McNeall, and N. J. Dunstone (2011), Importance of the deep ocean for estimating decadal changes in Earth’s radiation balance, Geophys. Res. Lett., 38, L13707, doi:10.1029/2011GL047835.

in my posts

2011 Update Of The Comparison Of Upper Ocean Heat Content Changes With The GISS Model Predictions

New Paper “Importance Of The Deep Ocean For Estimating Decadal Changes In Earth’s Radiation Balance” By Palmer Et Al 2011

I have been sent a summary article from Klimaat wereld on this subject that was published on July 28, 2011 [h/t Erik].

This Klimaat wereld summary article is titled

Tracing the upper ocean’s ‘missing heat’

by Caroline Katsman and Geert Jan van Oldenborgh, KNMI. Although, as discussed below, I have several issues with their interpretations and conclusions, the authors should be commended for publishing a significant new contribution to our understanding of the climate system. This is an effective paper which can be built on to improve our knowledge of the science of climate.

The abstract reads

“Against expectations, the upper ocean (from 0 to 700 meter depth) has not warmed since 2003. A recent KNMI study shows that an eight-year interruption of the rise expected from global warming is not exceptional. It can be explained by natural variability of the climate, in particular the climate oscillation El Niño in the Pacific Ocean and changes in ocean currents in the North Atlantic Ocean. Recent observations point to an upcoming resumption of the heating of the upper ocean.”

I have extracted several passages from the text of this article [with text highlighted] and comment on them below.


“Observations of the sea water temperature show that the upper ocean has not warmed since 2003. This is remarkable as it is expected the ocean would store the lion’s share of the extra heat retained by the Earth due to the increased concentrations of greenhouse gases. The observation that the upper 700 meters of the world ocean have not warmed for the last eight years gives rise to two fundamental questions:

  1. What is the probability that the upper ocean does not warm for eight years as greenhouse gas concentrations continue to rise?
  2. As the heat has not been stored in the upper ocean over the last eight years, where did it go instead?

These questions cannot be answered using observations alone, as the available time series are too short and the data not accurate enough. We therefore used climate model output generated in the ESSENCE project, a collaboration of KNMI and Utrecht University that generated 17 simulations of the climate with the ECHAM5/MPI-OM model to sample the natural variability of the climate system. When compared to the available observations, the model describes the ocean temperature rise and variability well.”

My Comment: If the “questions cannot be answered using observations alone”, how can it be stated that “When compared to the available observations, the model describes the ocean temperature rise and variability well”? This is a circular argument. Models themselves are hypotheses, and the more accurate statement by the authors would be that the available observations do not falsify the model as a replication of reality.

The next extract reads

“Observations of the temperature of the upper few hundred meters of the ocean go back to the 1960s. Up to ten years ago most measurements were taken by simple thermometers that were thrown overboard and sent the temperature back through a wire as they fell (expendable bathythermographs, XBTs). Over the past ten years or so these have been superseded by fully automated ARGO floats that measure temperature down to 2000 m depth and send the data home every ten days. Starting from these raw observations the global temperature distribution down to 700 meters is reconstructed, filling in the gaps in the coverage. Using the heat capacity of water this enables the estimation of the amount of heat stored in the world ocean.”

My Comment:  This is a succinct summary of why we need to focus on the observations and model comparisons over the last ten years. Prior to this time period, the values of the ocean heat content are much less certain.

They further write

“In the model, the fraction of negative eight-year trends decreases as the warming trend accelerates, but between 1990 and 2020 (31 years around 2005) 3% of the trends is still negative. This implies a one in three chance of at least one eight-year period with a negative trend in these 31 years. An eight-year pause in the rise of the upper ocean heat content is therefore not at all rare, even in a warming climate.”

My Comment: If the warming of the upper ocean pauses, then this part of the climate system is, by definition, not accumulating heat during that period.
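The quoted statistics can be illustrated with a simple Monte Carlo sketch (my own construction, not the ESSENCE analysis): superimpose interannual noise on a steady warming trend and count how often an eight-year least-squares trend comes out negative. The trend and noise magnitudes below are purely illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative assumptions (mine, not the ESSENCE model values):
# a steady upper-ocean warming trend plus white interannual noise.
trend = 0.6      # heat gain per year (arbitrary units)
noise_sd = 2.0   # interannual variability (same units)
n_years = 31     # the 1990-2020 window discussed in the article
window = 8       # length of the trend windows
n_sims = 5000

years = np.arange(window, dtype=float)
cy = years - years.mean()     # centered time axis
denom = (cy ** 2).sum()

neg = 0
total = 0
for _ in range(n_sims):
    ohc = trend * np.arange(n_years) + rng.normal(0.0, noise_sd, n_years)
    for start in range(n_years - window + 1):
        # least-squares slope of each overlapping 8-year window
        slope = cy @ ohc[start:start + window] / denom
        total += 1
        neg += slope < 0.0

print(f"fraction of negative 8-year trends: {neg / total:.3f}")
```

With these assumed numbers, roughly 3% of the overlapping eight-year windows have a negative trend, the same order as the quoted figure; and since 24 such windows fit in 31 years, the chance of at least one negative window is considerably larger than 3%, consistent with the article’s “one in three”.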

Next, they provide an effective summary of the importance of the ocean as the reservoir for heating and cooling of the climate system.

“Where does the heat go?

The amounts of CO2 and other greenhouse gases in the atmosphere are steadily increasing. The increased absorption of thermal radiation by these gases causes the radiation to space to emanate from higher in the atmosphere on average, where it is colder. The colder air emits less thermal radiation, so that the incoming solar energy is no longer balanced by outgoing radiation. The excess heat is absorbed by the ocean, slowly warming the water from the top down.

If the upper ocean does not warm for a few years the excess heat from the imbalance between incoming and outgoing radiation has to go elsewhere. The ocean temperature has only risen 0.02 ºC less than expected, but due to the size of the ocean and the large heat capacity of water this represents a huge amount of heat. If this heat would have been used to heat the atmosphere, the air temperature would have increased by 5 ºC. This obviously did not happen, so the heat was not stored in the atmosphere. The ground has a larger heat capacity, but heat penetrates only slowly down. Storing the heat missing from the upper ocean in the ground would have raised its temperature by about 1.5 ºC. This also was not observed, so we can conclude that the bulk of the heat did not go into the ground. If the heat would have been absorbed by land or sea ice it would also have had large consequences that have not been detected, for example a sea level rise of 20 cm if the heat would have been used to melt land ice.

By elimination, only two possibilities remain. Either the Earth radiates more energy to space during these periods of no increase in upper ocean heat content or the heat content of the deep ocean (below 700 meter) increases temporarily. Both possibilities were found to play a role in the climate model.”

My Comment: The authors use the “climate model” to explain where the heat goes. However, in the real world, heat that is transported to deeper levels should be seen in the ARGO observations. A further comparison of this transport, as predicted in the models, with the observations is needed. Moreover, even if heat is transported to deeper ocean depths, the dispersion of this heat at depth would be expected to permit at most a slow transfer back to the surface, so that any subsequent heating of the atmosphere (and, therefore, effects on weather) would be muted.
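As an aside, the heat-capacity arithmetic in the quoted passage can be checked with a rough back-of-the-envelope calculation (my own sketch; the masses and specific heats below are standard textbook approximations, not values taken from the paper):

```python
# Rough check of the quoted equivalences, using standard approximate values.
ocean_area = 3.6e14   # m^2, global ocean surface area
depth = 700.0         # m, upper-ocean layer considered
rho_sw = 1025.0       # kg/m^3, seawater density
c_sw = 3990.0         # J/(kg K), seawater specific heat
dT_missing = 0.02     # K, quoted shortfall in upper-ocean warming

ocean_mass = ocean_area * depth * rho_sw
missing_heat = ocean_mass * c_sw * dT_missing   # joules

# The same heat dumped into the whole atmosphere instead:
atm_mass = 5.1e18     # kg, total mass of the atmosphere
c_air = 1004.0        # J/(kg K), specific heat of air at constant pressure
dT_atm = missing_heat / (atm_mass * c_air)

print(f"missing heat: {missing_heat:.2e} J")
print(f"equivalent atmospheric warming: {dT_atm:.1f} K")
```

This gives on the order of 2 × 10^22 J of “missing” heat and an equivalent atmospheric warming of roughly 4 °C, the same order as the 5 °C quoted above.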

They next write

“The model shows that during periods that the upper layers of the ocean do not heat, the deeper layers show a stronger increase in temperature. This vertical seesaw is strongest in the North Atlantic Ocean south of Greenland. In this area the surface waters cool each winter due to cold winds from Canada. As it gets heavier than the slightly warmer but more salty water at depth, the surface water sinks and the warmer water rises. This exchange therefore cools the deeper ocean. In winters with little mixing the upper ocean stays colder and the deeper layers stay warmer.”

My Comment: There is a problem with their statement that “The model shows that during periods that the upper layers of the ocean do not heat, the deeper layers show a stronger increase in temperature”. If the upper layers do not show heating, how does heat transfer (even in the model) to deeper layers? The Joules of heat cannot just appear below the upper 700 m if the reason for the assumed heating is added greenhouse gas forcing in the atmosphere.

They next provide (to their credit) a forecast

“Outlook for the coming years

About two years ago El Niño was replaced by a series of La Niña events, which should cause a heating trend in the upper ocean. The heat exchange between the upper and deep ocean in the Labrador Sea has also started again recently. We therefore expect that the upper ocean heat content will soon resume its upward trend.”

My Comment: First, what is the observational basis to conclude that “The heat exchange between the upper and deep ocean in the Labrador Sea has also started again recently”? Nonetheless, their expectation (forecast) that the upper ocean heat content will resume its upward trend is a hypothesis that can be tested over the next few years [unlike the IPCC-type forecasts of weather patterns decades from now!]. As of the most recent upper ocean data analysis, however, the heating has not yet restarted; e.g., see from NODC

Global Ocean Heat Content 1955-present 

It is also important to realize in interpreting these data that for the period before the establishment of the Argo network, the quantitative accuracy of the analyses is less. The data are actually constructed by merging two distinct methods of observing the ocean heat content. The jump seen in the data in the first years after 2000 might have occurred due to the temporal inhomogeneity of the data analysis.

Finally, they write at the end of their article

“Because of these natural fluctuations a short trend in the upper ocean heat content is not a good indicator of enhanced greenhouse warming, only the long-term trend is.”

My Comment: This is a recognition of the increasingly well-recognized importance of “natural climate variations”. However, the authors did not state, either in their original paper or in their Klimaat wereld article, how many years of a lack of warming would have to occur before they would reject their models as being skillful replicators of the climate system’s changes in upper ocean heat content.

source of image


Filed under Climate Change Metrics, Climate Models

Richard Muller On NPR On April 11 2011 – My Comments

I have already posted on Rich Muller’s testimony to Congress; see

Informative News Article by Margot Roosevelt In The Los Angeles Times On Richard Muller’s Testimony To Congress

Is There A Sampling Bias In The BEST Analysis Reported By Richard Muller?

Comments On The Testimony Of Richard Muller At the United States House Of Representatives Committee On Science, Space And Technology

He was also interviewed, after that testimony, by NPR on April 11, 2011 (see). I have deferred posting on this interview as I have been seeking to engage Rich in a dialog on the issues with the surface temperature data. However, I understand from an indirect source that he has a Science article under review, and I therefore conclude that presenting comments on his NPR interview now would be informative to others. He has also not replied to my e-mail requests to interact on the surface temperature analysis issue.

In the NPR interview, the following questions and answers occurred (highlight added):

CONAN: Do you find that, though, there is a lot of ideology in this business?
Prof. MULLER: Well, I think what’s happened is that many scientists have gotten so concerned about global warming, correctly concerned I mean they look at it and they draw a conclusion, and then they’re worried that the public has not been concerned, and so they become advocates. And at that point, it’s unfortunate, I feel that they’re not trusting the public. They’re not presenting the science to the public. They’re presenting only that aspect to the science that will convince the public. That’s not the way science works. And because they don’t trust the public, in the end the public doesn’t trust them. And the saddest thing from this, I think, is a loss of credibility of scientists because so many of them have become advocates.
CONAN: And that’s, you would say, would be at the heart of the so-called Climategate story, where emails from some scientists seemed to be working to prevent the work of other scientists from appearing in peer-reviewed journals.
Prof. MULLER: That really shook me up when I learned about that. I think that Climategate is a very unfortunate thing that happened, that the scientists who were involved in that, from what I’ve read, didn’t trust the public, didn’t even trust the scientific public. They were not showing the discordant data. That’s something that – as a scientist I was trained you always have to show the negative data, the data that disagrees with you, and then make the case that your case is stronger. And they were hiding the data, and a whole discussion of suppressing publications, I thought, was really unfortunate. It was not at a high point for science
And I really get even more upset when some other people say, oh, science is just a human activity. This is the way it happens. You have to recognize, these are people. No, no, no, no. These are not scientific standards. You don’t hide the data. You don’t play with the peer review system.

Since I posted on another viewpoint with respect to the surface temperature issue, for example, in

Paper “A Stable Boundary Layer Perspective On Global Temperature Trends” By McNider Et Al 2010,

I have decided to post the e-mail I sent to Rich Muller before his NPR interview. I have waited for several months but have had no reply from him on this e-mail (or more recent ones) that I sent to him.

R. Pielke Sr. e-mail to Rich Muller

The trends in max and min T are central to the computation of Tavg. Two compensating systematic biases do not make the Tavg trend more robust unless one can show this is a universal finding and not a fluke of the current sample of USHCN sites.

Your view on the value of Tavg as being of dominant importance also perpetuates the misuse of this metric to quantify global warming (as contrasted with ocean heat content change which, as a physicist, I would have thought you would emphasize in your testimony and interviews).

 Even with respect to surface air temperature trends, however, we have published on a number of systematic biases and uncertainties that you have not communicated in your interviews.

For example, there is a systematic observed warm bias in Tavg; see

1. Klotzbach, P.J., R.A. Pielke Sr., R.A. Pielke Jr., J.R. Christy, and R.T. McNider, 2009: An alternative explanation for differential temperature trends at the surface and in the lower troposphere. J. Geophys. Res., 114, D21102, doi:10.1029/2009JD011841.

2. The trend in minimum temperatures (and less in Tmax where the surface layer is better mixed during the day) is a function of height near the surface; e.g. see

Steeneveld, G.J., A.A.M. Holtslag, R.T. McNider, and R.A Pielke Sr, 2011: Screen level temperature increase due to higher atmospheric carbon dioxide in calm and windy nights revisited. J. Geophys. Res., 116, D02122, doi:10.1029/2010JD014612.

There must, therefore, be a different trend in Tavg as a function of height near the surface.

3. There is also the issue of how concurrent trends in moist enthalpy affect Tavg; e.g. see

Pielke Sr., R.A., C. Davey, and J. Morgan, 2004: Assessing “global warming” with surface heat content. Eos, 85, No. 21, 210-211.

Your statement that

“And our tentative conclusion, which is what I reported to Congress, is even though these sites can lead to different temperatures, that for the trends, for the thing we called warming, that there does not seem to be a significant effect.”

is erroneous. You made a blanket statement which is not correct.

I would like your permission to post our e-mail exchange on my weblog. Is that okay with you? If not, I will still post my e-mails.



In Richard Muller’s NPR interview he said with respect to Climategate

“They were not showing the discordant data. That’s something that – as a scientist I was trained you always have to show the negative data, the data that disagrees with you”

Clearly he is also not showing “discordant data” and “data that disagrees with” him. I invite him to engage in constructive interactions with those of us in the climate community who have been examining a range of unresolved issues with the multi-decadal land surface temperature analyses.

source of image


Filed under Climate Science Misconceptions, Climate Science Reporting

New Paper “A New, Lower Value Of Total Solar Irradiance: Evidence And Climate Significance” By Kopp and Lean 2011

source of image

There is a new paper that, while further improving our understanding and monitoring of solar forcing, makes several substantively incorrect statements about its significance with respect to the climate system. The article is

G. Kopp and J. L. Lean (2011), A new, lower value of total solar irradiance: Evidence and climate significance, Geophys. Res. Lett., 38, L01706, doi:10.1029/2010GL045777.

The abstract reads [highlight added]

“The most accurate value of total solar irradiance during the 2008 solar minimum period is 1360.8 ± 0.5 W m−2 according to measurements from the Total Irradiance Monitor (TIM) on NASA’s Solar Radiation and Climate Experiment (SORCE) and a series of new radiometric laboratory tests. This value is significantly lower than the canonical value of 1365.4 ± 1.3 W m−2 established in the 1990s, which energy balance calculations and climate models currently use. Scattered light is a primary cause of the higher irradiance values measured by the earlier generation of solar radiometers in which the precision aperture defining the measured solar beam is located behind a larger, view‐limiting aperture. In the TIM, the opposite order of these apertures precludes this spurious signal by limiting the light entering the instrument. We assess the accuracy and stability of irradiance measurements made since 1978 and the implications of instrument uncertainties and instabilities for climate research in comparison with the new TIM data. TIM’s lower solar irradiance value is not a change in the Sun’s output, whose variations it detects with stability comparable or superior to prior measurements; instead, its significance is in advancing the capability of monitoring solar irradiance variations on climate‐relevant time scales and in improving estimates of Earth energy balance, which the Sun initiates.”

Excerpts from this study [with highlights bold-faced], followed by my comments, are given below.

“Instrument inaccuracies are a significant source of uncertainty in determining Earth’s energy balance from space‐based measurements of incoming and reflected solar radiation and outgoing terrestrial thermal radiation. A nonzero average global net radiation at the top of the atmosphere is indicative of Earth’s thermal disequilibrium imposed by climate forcing. But whereas the current planetary imbalance is nominally 0.85 W m−2 [Hansen et al., 2005], estimates of this quantity from space‐based measurements range from 3 to 7 W m−2. SORCE/TIM’s lower TSI value reduces this discrepancy by 1 W m−2 [Loeb et al., 2009]. We note that the difference between the new lower TIM value with earlier TSI measurements corresponds to an equivalent climate forcing of −0.8 W m−2, which is comparable to the current energy imbalance.”

My Comment: The article reports a “climate forcing of −0.8 W m−2” as being “comparable to the current energy imbalance”. First, such a value of −0.8 Watts per meter squared is a radiative forcing, which is only a subset of climate forcings [a substantive terminology error which has been pointed out numerous times; e.g., see NRC (2005) and Pielke et al. (2009)]. Even more important, the use of the more robust metric of upper ocean heat changes shows a much smaller value of the radiative energy imbalance for the period 2003 up to the present (e.g., see Knox and Douglass 2010).
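For reference, the −0.8 W m−2 equivalence quoted above follows from simple geometry: the TSI difference is spread over the full sphere (a factor of 4) and reduced by the planetary albedo. A quick sketch of that arithmetic (the albedo value is a standard approximation assumed here, not taken from the paper):

```python
tsi_old = 1365.4   # W/m^2, canonical 1990s TSI value
tsi_new = 1360.8   # W/m^2, SORCE/TIM value
albedo = 0.3       # planetary albedo (standard approximate value)

delta_tsi = tsi_new - tsi_old   # about -4.6 W/m^2 at Earth's distance
# Spread over the whole sphere (factor 4) and remove the reflected fraction:
delta_forcing = delta_tsi / 4.0 * (1.0 - albedo)

print(f"equivalent forcing change: {delta_forcing:.2f} W/m^2")
```

The result, about −0.8 W m−2, matches the value quoted in the paper.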

“The impact of the new low TSI value on climate models is unknown. A few tenths percent change in the absolute TSI level is typically considered to be of minimal consequence for climate simulations. However, model parameters are adjusted to ensure adequate representation of current climate, for which incoming solar radiation is a decisive factor. Underway are experiments with the GISS Model 3 to investigate the sensitivity of model performance to the TSI absolute value during present and pre‐industrial epochs, and describe, for example, how the irradiance reduction is partitioned between the atmosphere and surface and the effects on outgoing radiation.”

My Comment: Indeed, this is an important consequence of their finding. For example, since ocean surface evaporation increases approximately exponentially with sea surface temperature, changes in the incoming solar radiation could be significant in terms of accurately simulating the water vapor feedback into the atmosphere.
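The exponential dependence referred to here follows from the Clausius–Clapeyron relation: saturation vapor pressure, and hence the evaporative driving of the atmosphere, grows by roughly 6–7% per degree near typical sea surface temperatures. A sketch using the Tetens approximation (a standard empirical formula, used here only for illustration):

```python
import math

def saturation_vapor_pressure(t_celsius):
    """Tetens approximation for saturation vapor pressure over water, in hPa."""
    return 6.112 * math.exp(17.67 * t_celsius / (t_celsius + 243.5))

sst = 25.0   # deg C, a typical tropical sea surface temperature
es0 = saturation_vapor_pressure(sst)
es1 = saturation_vapor_pressure(sst + 1.0)

pct_per_degree = (es1 / es0 - 1.0) * 100.0
print(f"e_s at {sst} C: {es0:.1f} hPa; increase per +1 C: {pct_per_degree:.1f}%")
```

At 25 °C this gives a saturation vapor pressure near 32 hPa and an increase of about 6% per degree, which is why even modest errors in absorbed solar radiation at the sea surface can matter for the simulated water vapor feedback.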

“A stable record of solar irradiance variations combined with reliable global surface temperature observations can provide a tool for quantifying climate response processes to radiative forcing on decadal time scales. The association of the observed 0.1% irradiance solar cycle increase (Figure 1) imparts 0.22 W m−2 instantaneous climate forcing, for which the empirically detected global temperature increase of 0.1°C (Figure 2) suggests a transient climate response of 0.6°C per W m−2 [Douglass and Clader, 2002]. This response is larger by a factor of 2 or more than in the current models assessed by IPCC [Tung et al., 2008], possibly because of the models’ excessive heat uptake by the ocean. With a stable multi‐decadal solar irradiance record, it will be possible to quantify the relationship between transient and (likely larger) equilibrium responses.”

My Comment: First, the authors accept that global surface temperature observations are a robust measure of this metric. As we have shown, however, there is a clear documented warm bias in this metric (e.g., see Klotzbach et al 2009; Christy et al 2010). Second, the authors claim that the real world surface temperature response to a radiative forcing on decadal time scales is underestimated “by a factor of 2 or more” in the current models assessed by the IPCC, “possibly because of the models’ excessive heat uptake by the ocean”. If this is true, where has the heat gone? It certainly is not accumulating in the atmosphere in recent years, as illustrated below.

[Figure: Channel TLT Trend Comparison; images from RSS MSU LT and from UAH MSU]

Indeed, there is no reservoir except the deeper ocean (e.g., see) or out into space (e.g., see) where this heat could have gone. Thus, while the Kopp and Lean 2011 article is an important new contribution to climate science, the authors have misinterpreted some of their conclusions concerning its importance to the debate on climate change.


Filed under Climate Change Forcings & Feedbacks, Research Papers

New Paper “On the Misdiagnosis Of Surface Temperature Feedbacks From Variations In Earth’s Radiant Energy Balance” By Spencer and Braswell 2011

There is a new paper published which raises further questions on the robustness of multi-decadal global climate predictions. It is

Spencer, R.W.; Braswell, W.D. On the Misdiagnosis of Surface Temperature Feedbacks from Variations in Earth’s Radiant Energy Balance. Remote Sens. 2011, 3, 1603-1613.

The University of Alabama in Huntsville has issued a news release on it, which reads [h/t to Phillip Gentry]

Climate models get energy balance wrong, make too hot forecasts of global warming

HUNTSVILLE, Ala. (July 26, 2011) — Data from NASA’s Terra satellite shows that when the climate warms, Earth’s atmosphere is apparently more efficient at releasing energy to space than models used to forecast climate change have been programmed to “believe.”

The result is climate forecasts that are warming substantially faster than the atmosphere, says Dr. Roy Spencer, a principal research scientist in the Earth System Science Center at The University of Alabama in Huntsville.

The previously unexplained differences between model-based forecasts of rapid global warming and meteorological data showing a slower rate of warming have been the source of often contentious debate and controversy for more than two decades.

In research published this week in the journal “Remote Sensing”, Spencer and UA Huntsville’s Dr. Danny Braswell compared what a half dozen climate models say the atmosphere should do to satellite data showing what the atmosphere actually did during the 18 months before and after warming events between 2000 and 2011.

“The satellite observations suggest there is much more energy lost to space during and after warming than the climate models show,” Spencer said. “There is a huge discrepancy between the data and the forecasts that is especially big over the oceans.”

Not only does the atmosphere release more energy than previously thought, it starts releasing it earlier in a warming cycle. The models forecast that the climate should continue to absorb solar energy until a warming event peaks. Instead, the satellite data shows the climate system starting to shed energy more than three months before the typical warming event reaches its peak.

“At the peak, satellites show energy being lost while climate models show energy still being gained,” Spencer said.

This is the first time scientists have looked at radiative balances during the months before and after these transient temperature peaks.

Applied to long-term climate change, the research might indicate that the climate is less sensitive to warming due to increased carbon dioxide concentrations in the atmosphere than climate modelers have theorized. A major underpinning of global warming theory is that the slight warming caused by enhanced greenhouse gases should change cloud cover in ways that cause additional warming, which would be a positive feedback cycle.

Instead, the natural ebb and flow of clouds, solar radiation, heat rising from the oceans and a myriad of other factors added to the different time lags in which they impact the atmosphere might make it impossible to isolate or accurately identify which piece of Earth’s changing climate is feedback from manmade greenhouse gases.

“There are simply too many variables to reliably gauge the right number for that,” Spencer said. “The main finding from this research is that there is no solution to the problem of measuring atmospheric feedback, due mostly to our inability to distinguish between radiative forcing and radiative feedback in our observations.”

For this experiment, the UA Huntsville team used surface temperature data gathered by the Hadley Climate Research Unit in Great Britain. The radiant energy data was collected by the Clouds and Earth’s Radiant Energy System (CERES) instruments aboard NASA’s Terra satellite.

The six climate models were chosen from those used by the U.N.’s Intergovernmental Panel on Climate Change. The UA Huntsville team used the three models programmed using the greatest sensitivity to radiative forcing and the three that programmed in the least sensitivity.


Filed under Climate Change Forcings & Feedbacks, Climate Science Reporting, Research Papers

Comments On “Accuracy Of Climate Change Predictions Using High Resolution” By Matsueda and Palmer 2011

In 1988 I wrote the book chapter

Pielke, R.A., 1988: Evaluation of climate change using numerical models. In “Monitoring Climate for the Effects of Increasing Greenhouse Gas Concentrations. Proceedings of a Workshop”. R.A. Pielke and T. Kittel, Eds., Cooperative Institute for Research in the Atmosphere (CIRA), Fort Collins, Colorado, August 1987, 161-172.

Included in the text is

“The dynamic accuracy of GCMs has not been adequately tested. Such models need to be used to predict short-term weather changes since skill at such forecasts is essential if the models are to demonstrate a numerical fidelity in simulating wave-wave interactions. If the GCMs have insufficient spatial resolution or physics to forecast weather as accurately as current operational numerical weather prediction models, what confidence should be placed on their skill at predicting long-term climate change?”

Now, finally in 2011, a paper examines part of this issue, although it equates high-spatial-resolution and lower-spatial-resolution model runs at two different 25-year time slices with an actual scientific test of climate change, when observations are not, of course, available to test their results decades from now. It is an informative study, however, on the effect of model resolution for the time period 1979 to 2003.

The paper is (h/t Dallas Staley)

Matsueda, M., and T. N. Palmer (2011), Accuracy of climate change predictions using high resolution simulations as surrogates of truth, Geophys. Res. Lett., 38, L05803, doi:10.1029/2010GL046618.

The abstract reads [highlight added]

“How accurate are predictions of climate change from a model which is biased against contemporary observations? If a model bias can be thought of as a state‐independent linear offset, then the signal of climate change derived from a biased climate model should not be affected substantially by that model’s bias. By contrast, if the processes which cause model bias are highly nonlinear, we could expect the accuracy of the climate change signal to degrade with increasing bias. Since we do not yet know the late 21st Century climate change signal, we cannot say at this stage which of these two paradigms describes best the role of model bias in studies of climate change. We therefore study this question using time‐slice projections from a global climate model run at two resolutions ‐ a resolution typical of contemporary climate models and a resolution typical of contemporary numerical weather prediction – and treat the high‐resolution model as a surrogate of truth, for both 20th and 21st Century climate. We find that the magnitude of the regionally varying model bias is a partial predictor of the accuracy of the regional climate change signal for both wind and precipitation. This relationship is particularly apparent for the 850 mb wind climate change signal. Our analysis lends some support to efforts to weight multi‐model ensembles of climate change according to 20th Century bias, though note that the optimal weighting appears to be a nonlinear function of bias.”

They outline their model experiments in the text

“….we make use here of the “timeslice” technique, whereby an atmosphere‐only model is integrated over two periods of 25 years corresponding to the late 20th Century and the late 21st Century with prescribed sea surface temperatures (SSTs). We do, however, recognise that prescribed SST integrations are themselves subject to systematic biases due to one‐way coupling [Douville, 2005]. [7] Model integrations were conducted for the (“control”) period 1979–2003 using observed interannually‐varying HadISST SSTs and sea ice concentrations (SICs) [Rayner et al., 2003] as lower boundary conditions. For the period 2075–2099, the SST and SIC climate‐change signals are estimated by the CMIP3 [Meehl et al., 2007] multi‐model ensemble mean to which the detrended interannual variations in HadISST have been added [Mizuta et al., 2008]. In this way, both control and timeslice integrations are integrated with interannually varying SSTs and SICs. The IPCC SRES A1B scenario was assumed for future emissions of greenhouse gases.”

They describe the results for the time period 1979 to 2003, which is actually the only scientifically robust part of their paper, with the text

“Table 1 shows the 20th Century RMS bias in 850 hPa wind (U850) for all individual Giorgi regions for the low and high resolution models, for December to February (DJF) and June to August (JJA), against real data (Japanese reanalysis [Onogi et al., 2007]). Notice that in general (10 out of 16 entries in Table 1) the high resolution model has lower bias than the low resolution model against the real data ‐ in a further 3 cases the single‐member high‐resolution simulation has equal bias with the smoother low‐resolution ensemble‐mean field.”
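For concreteness, the kind of regional RMS-bias metric reported in their Table 1 can be sketched as below. The grid, the simple rectangular region mask, and the imposed uniform 2 m/s offset are illustrative assumptions for this sketch only, not the paper's actual Giorgi-region definitions or data:

```python
import numpy as np

def rms_bias(model_field, reanalysis_field, mask):
    """RMS difference between a model climatology and a reanalysis
    climatology over the grid points selected by `mask`."""
    diff = (model_field - reanalysis_field)[mask]
    return float(np.sqrt(np.mean(diff ** 2)))

# Illustrative example: a 10 x 20 latitude-longitude grid of seasonal-mean
# 850 hPa zonal wind (m/s), with the model offset from the reanalysis by
# 2 m/s everywhere inside the region of interest.
reanalysis = np.zeros((10, 20))
model = reanalysis.copy()
region = np.zeros((10, 20), dtype=bool)
region[3:7, 5:15] = True          # a stand-in for one analysis region
model[region] += 2.0              # impose a uniform 2 m/s bias

print(rms_bias(model, reanalysis, region))  # 2.0
```

A real calculation would of course use area weighting and observed fields; the point here is only the shape of the metric being compared across resolutions.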

The conclusion reads

“Using high resolution simulations as a surrogate of truth, we have shown that the regionally dependent 20th Century 850 hPa zonal wind and precipitation bias of a climate model is a predictor of the accuracy of its 21st Century climate change signal. In particular, in two regions where model bias was especially large, the low‐resolution model’s climate change signal was negatively correlated with the true climate change signal.

The results give some support to efforts to weight multi‐model ensembles with bias, though our results suggest the weighting should depend nonlinearly with bias, and, for precipitation, may also depend on season. More generally the results in this paper lend support to aims to try to reduce model bias – the notion of a state independent linear bias offset is simply not tenable. A byproduct of our study was the finding that the bias of a model run at typical NWP resolution was typically smaller than that of an equivalent model run at typical climate resolution (though due to some changes of parameters, it cannot be stated unambiguously that the reduction of bias was uniquely due to resolution). Consistent with the seamless prediction methodology [Palmer and Webster, 1993; Palmer et al., 2008], we strongly recommend that a fully rigourous study of the impact of running climate models at today’s NWP resolutions be made using fully comprehensive coupled ocean‐atmosphere climate models, where high‐resolution ocean dynamics is also likely to be important [Shaffrey et al., 2009]. Given the demands of Earth‐System complexity and the need for ensemble integrations, this would require computational facilities with sustained multi‐petaflop performance, dedicated to climate prediction. Such facilities are currently unavailable to the climate modelling community [Palmer, 2005; Nature Editorial, 2008; Shukla et al., 2009]. Given the preeminence of the climate threat, and the need to reduce uncertainty in climate predictions, we believe this to be a matter of importance and urgency.”
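One plausible form for the nonlinear, bias-dependent ensemble weighting the authors allude to is a Gaussian penalty on 20th Century RMS bias. The weighting function, the scale parameter, and all of the numbers below are illustrative assumptions for this sketch, not a scheme taken from the paper:

```python
import numpy as np

def bias_weights(biases, scale):
    """Gaussian down-weighting of ensemble members by their 20th Century
    RMS bias: a member with zero bias gets the largest weight, and the
    penalty grows nonlinearly with bias. `scale` sets how sharply large
    biases are penalized. Weights are normalized to sum to one."""
    w = np.exp(-(np.asarray(biases, dtype=float) / scale) ** 2)
    return w / w.sum()

# Illustrative ensemble of four models with hypothetical RMS wind biases
# (m/s) and hypothetical regional climate change signals.
biases = [0.5, 1.0, 2.0, 4.0]
weights = bias_weights(biases, scale=2.0)
signals = np.array([1.2, 1.0, 0.8, 0.3])

weighted_signal = float(np.dot(weights, signals))
print(weights.round(3), round(weighted_signal, 3))
```

With these numbers the largest-bias member contributes under one percent of the total weight, which illustrates how sharply a nonlinear weighting can discount heavily biased models relative to a simple ensemble mean.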

This paper is an important addition to our understanding of how spatial resolution affects the accuracy of model predictions. Such an examination can only be performed, however, when real-world observational data are available (i.e. 1979 to 2003 in their paper), and passing such a test is a necessary condition for having any confidence in multi-decadal global model predictions.

However, it is far from a sufficient test since, as I summarized in the post

The Failure Of Dynamic Downscaling As Adding Value to Multi-Decadal Regional Climate Prediction

the multi-decadal global model predictions must be able to skillfully predict the changes in the statistics of the climate system. In my post, I wrote

“Finally, there is sometimes an incorrect assumption that although global climate models cannot predict future climate change as an initial value problem, they can predict future climate statistics as a boundary value problem [Palmer et al., 2008]. With respect to weather patterns, for the downscaling regional (and global) models to add value over and beyond what is available from the historical, recent paleo-record, and worst-case sequence of days, however, they must be able to skillfully predict the changes in the regional weather statistics.

There is only value for predicting climate change, however, if they could skillfully predict the changes in the statistics of the weather and other aspects of the climate system. There is no evidence, however, that the model can predict changes in these climate statistics even in hindcast.”

This was not tested in the Matsueda and Palmer paper. The paper is an informative addition to our understanding of the role of spatial resolution in a model of the atmospheric portion of the climate system. However, it is not a robust study of the effect of spatial resolution on model prediction skill for climate decades from now. Indeed, a more accurate title for their paper would be

“Accuracy Of Model Simulations Using High Spatial Resolution In An Atmospheric General Circulation Global Model”

It is not a scientifically robust study of the accuracy of climate change predictions.

source of images

Comments Off

Filed under Climate Models, Research Papers

Comments On The Article “Stratospheric Pollution Helps Slow Global Warming” By David Biello

There is yet another article that documents that the role of humans in the climate system is much more than the radiative effect of CO2 and a few other gases (h/t to Marc Morano). This new study bolsters our conclusions in

Pielke Sr., R., K. Beven, G. Brasseur, J. Calvert, M. Chahine, R. Dickerson, D. Entekhabi, E. Foufoula-Georgiou, H. Gupta, V. Gupta, W. Krajewski, E. Philip Krider, W. K.M. Lau, J. McDonnell,  W. Rossow,  J. Schaake, J. Smith, S. Sorooshian,  and E. Wood, 2009: Climate change: The need to consider human forcings besides greenhouse gases. Eos, Vol. 90, No. 45, 10 November 2009, 413. Copyright (2009) American Geophysical Union

The Scientific American article, however,  still misinterprets climate system heat changes (and climate change more generally) as dominated by added CO2.

The new article is

Stratospheric Pollution Helps Slow Global Warming By David Biello July 22 2011 in Scientific American

with the headline

Particles of sulfuric acid–injected by volcanoes or humans–have slowed the pace of climate change in the past decade.

My comment on this statement is that the injection of aerosols into the stratosphere by humans IS part of human climate change. The implication of the term “pace” is that climate change consists only of the radiative effect of CO2 and a few other greenhouse gases. It is NOT, as we summarize in our EOS article

Although the natural causes of climate variations and changes are undoubtedly important, the human influences are significant and involve a diverse range of first-order climate forcings, including, but not limited to, the human input of carbon dioxide (CO2). Most, if not all, of these human influences on regional and global climate will continue to be of concern during the coming decades.

Excerpts from the Scientific American  text read [highlights added]

Despite significant pyrotechnics and air travel disruption last year, the Icelandic volcano Eyjafjallajokull simply didn’t put that many aerosols into the stratosphere. In contrast, the eruption of Mount Pinatubo in 1991, put 10 cubic kilometers of ash, gas and other materials into the sky, and cooled the planet for a year. Now, research suggests that for the past decade, such stratospheric aerosols—injected into the atmosphere by either recent volcanic eruptions or human activities such as coal burning—are slowing down global warming.

Combined with a decrease in atmospheric water vapor and a weaker sun due to the most recent solar cycle, the aerosol finding may explain why climate change has not been accelerating as fast as it did in the 1990s. The effect also illustrates one proposal for so-called geoengineering—the deliberate, large-scale manipulation of the planetary environment—that would use various means to create such sulfuric acid aerosols in the stratosphere to reflect sunlight and thereby hopefully forestall catastrophic climate change.

But that points up another potential problem: if aerosol levels, whether natural or human-made, decline in the future, climate change could accelerate—and China is adding scrubbing technology to its coal-fired power plants to reduce SO2 emissions and thereby minimize acid rain. In effect, fixing acid rain could end up exacerbating global warming. China “could cause some decreases [in stratospheric aerosols] if that is the source,” Neely says, adding that growing SO2 emissions from India could also increase cooling if humans are the dominant cause of injecting aerosols into the atmosphere. On the other hand, “if some volcanoes that are large enough go off and if they are the dominant cause [of increasing aerosols], then we will probably see some increases” in cooling.

First, the statement that water vapor has been decreasing is remarkable. An increase in atmospheric water vapor is central to the hypothesis that the radiative effect of added CO2 would result in global warming that is significant in terms of effects on society. A lack of such an increase in water vapor is in contradiction to the 2007 IPCC model projections.

Second, the claim that “fixing acid rain could end up exacerbating global warming” somehow seems to suggest we should consider geoengineering that retains these aerosol emissions in order “to forestall catastrophic climate change”. This is an absurd claim. I wrote about this in my post

Health Benefits Of Air Quality Control Should Never Be Sacrificed By Delaying The Clean-Up Of Aerosol Emissions For Climate Reasons

I ended that post with the conclusion

Thus, when I see attempts to delay implementation of any air quality improvement, which will cost lives, in order to provide a climate effect (i.e. through the delay in reducing sulphate emissions), we need to recognize that the priorities of those making such climate recommendations are misplaced.

source of image

Comments Off

Filed under Climate Change Forcings & Feedbacks, Climate Science Reporting

Another Example Of The Misuse Of Climate Science

On my weblog, I continue to post examples of the misrepresentation of climate predictions for decades from now as skillful forecasts for the impact communities. Below, I post yet another example, in this case a seminar presented in mid-June in Boulder. The announcement for the seminar, which was held at NOAA’s David Skaggs Research Center, reads

Brian Ashe
Manager of Business Development,
Riverside Technologies, Inc.

The Climate Change Decision Support System: A Web-Based System for Water Managers and Planners

THURSDAY, JUNE 16, 10:00 a.m., Room 1D708

Riverside will present a briefing and demonstration of their Climate Change Decision Support System (CC-DSS). The aim of the CC-DSS, which was supported by a NOAA SBIR, is to provide a web-based system for widespread and low-cost access to tools used in generating scenarios of future water managers to rapidly assess the impact of projected climate change on natural flows at critical nodes along a river network. The system uses various IPCC driven global climate models that have been downscaled to basin scales to drive calibrated hydrologic models.

Here is yet another example where

“The system uses various IPCC driven global climate models that have been downscaled to basin scales to drive calibrated hydrologic models.”

As presented in our paper

Pielke Sr., R.A., R. Wilby, D. Niyogi, F. Hossain, K. Dairuku, J. Adegoke, G. Kallos, T. Seastedt, and K. Suding, 2011: Dealing with complexity and extreme events using a bottom-up, resource-based vulnerability perspective. AGU Monograph on Complexity and Extreme Events in Geosciences, in press

there is no skill in downscaling multi-decadal climate predictions from global climate models. This approach has never been shown capable of predicting changes on this time scale in the statistics of weather on basin scales (or any other spatial scale).

In my view, such studies will ultimately be recognized as misleading policymakers about the actual threats to water resources. The sooner the funders realize this, the less money and time will be wasted. It would be an informative research project for someone to document how many NSF, NOAA, and other agency funds (in the USA and elsewhere) are being spent to provide such multi-decadal climate forecasts.

source of image

Comments Off

Filed under Climate Science Meetings, Climate Science Misconceptions

News Article On The Earth’s Heat From Radioactive Decay

An intriguing news article by Charles Q. Choi has appeared, titled

Radioactive decay fuels Earth’s inner fires

The article includes the text

“Extraordinary amount of heat remains from primordial days, scientists say

The researchers found the decay of radioactive isotopes uranium-238 and thorium-232 together contributed 20 trillion watts to the amount of heat Earth radiates into space, about six times as much power as the United States consumes. U.S. power consumption in 2005 averaged about 3.34 trillion watts.

As huge as this value is, it only represents about half of the total heat leaving the planet. The researchers suggest the remainder of the heat comes from the cooling of the Earth since its birth.”

To convert the estimate in the MSNBC news article to watts per square meter, 20 trillion watts must be divided by the surface area of the Earth [5.1 x 10^14 square meters], which yields a heat source of 0.039 watts per square meter.
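Both figures quoted above are easy to check directly; the surface-area value used here is the one given in this post:

```python
# Check the two figures quoted above: the radiogenic heat flux per square
# meter of the Earth's surface, and the comparison with US power consumption.
radiogenic_heat_w = 20e12       # W, from the news article
us_consumption_w = 3.34e12      # W, average US power consumption in 2005
earth_surface_m2 = 5.1e14       # m^2, total surface area of the Earth

flux = radiogenic_heat_w / earth_surface_m2
ratio = radiogenic_heat_w / us_consumption_w

print(round(flux, 3))   # 0.039 W per square meter
print(round(ratio, 1))  # 6.0, i.e. about six times US consumption
```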

This is far less than the significant radiative forcings estimated in Figure SPM.2 of the 2007 IPCC WG1 report and, except for local effects where lava flows and volcanic eruptions are occurring, this heat is of minor climatic importance [the outgassing of sulphur dioxide, other chemicals, and ash is, of course, a different issue]. The heating of the interior and the resulting currents in the Earth’s mantle, however, are important for climate on very long time scales, as they help drive plate tectonics and continental drift.

source of image

Comments Off

Filed under Climate Change Forcings & Feedbacks, Climate Science Reporting

My Comments On An AGU Meeting Announcement – Regional Climate Prediction at High Resolution

I was alerted to a session at the upcoming December 2011 AGU meeting. After presenting the session title and abstract, I have some comments. The session is titled Regional Climate Prediction at High Resolution, with the following announcement

Date: Fri, 15 Jul 2011 13:59:17 -0600

From: James Done (of UCAR)

Dear All

We invite you to submit abstracts to our session on Regional Climate Prediction at High Resolution at the American Geophysical Union Fall Meeting, 5-9 Dec 2011, San Francisco, CA. Abstract Deadline: 4th Aug.

Session Details:

GC09: Regional Climate Modeling 3. Regional Climate Prediction at High Resolution
Sponsor: Global Environmental Change (GC)
Co-Sponsor(s): Atmospheric Sciences (A), Earth and Space Science
Informatics (IN), Public Affairs (PA)

1. Greg Holland, NCAR
2. Howard Kunreuther, Wharton, University of Pennsylvania
3. William Skamarock, NCAR

Regional climate predictions at high resolution and decadal time scales are needed by industry, government and society to enable sufficient understanding and mitigate future costs and disruptions. This exciting session will present the latest scientific results and applications in high resolution climate prediction. Presentations are invited on: predictions of regional climate and high-impact weather statistics on decadal time scales, including uncertainty; coupled data assimilation for regional coupled prediction systems; coupled regional Earth system processes; statistical downscaling, and societal decision support tools. This session will stimulate interaction between diverse areas of expertise and promote novel collaboration.

Many thanks,
James Done

My comments on this announcement follow:

First, as outlined on my weblog; e.g. see

The Failure Of Dynamic Downscaling As Adding Value to Multi-Decadal Regional Climate Prediction

and in our paper

Pielke Sr., R.A., R. Wilby, D. Niyogi, F. Hossain, K. Dairuku, J. Adegoke, G. Kallos, T. Seastedt, and K. Suding, 2011: Dealing with complexity and extreme events using a bottom-up, resource-based vulnerability perspective. AGU Monograph on Complexity and Extreme Events in Geosciences, in press.

there has been NO demonstrated multi-decadal regional climate predictive skill. Unless the AGU session is going to present papers that introduce evidence of this skill, the portion of their session on

“….predictions of regional climate and high-impact weather statistics on decadal time scales, including uncertainty…”

is not only worthless but will mislead policymakers.

Second, their statement that

“Regional climate predictions at high resolution and decadal time scales are needed by industry, government and society to enable sufficient understanding and mitigate future costs and disruptions”

expresses a certainly desirable goal. However, the presentation of regional decadal climate predictions as skillful is invalid. What is missing from their session is a new framework for providing estimates of risk on this time scale. We proposed such an approach in our Pielke et al 2011 paper, where we wrote

“We discuss the adoption of a bottom-up, resource–based vulnerability approach in evaluating the effect of climate and other environmental and societal threats to societally critical resources. This vulnerability concept requires the determination of the major threats to local and regional water, food, energy, human health, and ecosystem function resources from extreme events including climate, but also from other social and environmental issues. After these threats are identified for each resource, then the relative risks can be compared with other risks in order to adopt optimal preferred mitigation/adaptation strategies.

This is a more inclusive way of assessing risks, including from climate variability and climate change than using the outcome vulnerability approach adopted by the IPCC. A contextual vulnerability assessment, using the bottom-up, resource-based framework is a more inclusive approach for policymakers to adopt effective mitigation and adaptation methodologies to deal with the complexity of the spectrum of social and environmental extreme events that will occur in the coming decades, as the range of threats are assessed, beyond just the focus on CO2 and a few other greenhouse gases as emphasized in the IPCC assessments.”

The 2011 AGU session talks on regional climate prediction at high resolution on decadal time scales will be scientifically flawed if they do not show how they can claim prediction skill on that time scale, and if they do not also consider a bottom-up, resource-based focus, which is what is actually needed for

“…..industry, government and society to enable sufficient understanding [to] mitigate future costs and disruptions”.

Comments Off

Filed under Climate Science Meetings, Climate Science Misconceptions

Ignored Request For NSF To Respond On The Lack Of Value Of Regional Downscaling Of Climate Forecasts Decades From Now

I am posting below an e-mail I sent in late June to Jay Fein and Margaret Cavanaugh at the National Science Foundation regarding what I view as fatal problems with studies of impacts in the coming decades that are based on multi-decadal global climate predictions. These impact studies assume skill at downscaling to the regional and local scales where the impacts would occur.

from: Roger A Pielke Sr to “Fein, Jay S.” <xxxxxx> with a cc to “Cavanaugh, Margaret A.” <xxxxxx>

Wed, Jun 29, 2011 at 11:04 AM

subject: NSF

Dear Jay

I have not heard further from you or Margaret on the major issues I have raised with the funding by NSF of regional impact studies based on multi-decadal global climate model predictions. I would appreciate your (and/or Margaret’s or other NSF program managers’) responses to the substantive concerns (one could consider these as hypotheses which need to be tested) raised in my post

The Failure Of Dynamic Downscaling As Adding Value to Multi-Decadal Regional Climate Prediction

which are

“1. As a necessary condition for an accurate prediction, the multi-decadal global climate model simulations must include all first-order climate forcings and feedbacks. However, they do not [see for example: NRC, 2005; Pielke Sr. et al., 2009].

2. These global multi-decadal predictions are unable to skillfully simulate major atmospheric circulation features such as the Pacific Decadal Oscillation [PDO], the North Atlantic Oscillation [NAO], El Niño and La Niña, and the South Asian monsoon [Pielke Sr., 2010; Annamalai et al., 2007].

3. While dynamic regional downscaling yields higher spatial resolution, the regional climate models are strongly dependent on the lateral boundary conditions and interior nudging by their parent global models [e.g., see Rockel et al., 2008]. Large-scale climate errors in the global models are retained and could even be amplified by the higher spatial resolution regional models.

4. Since as reported, the global multi-decadal climate model predictions cannot accurately predict circulation features such as the PDO, NAO, El Niño, and La Niña [Compo et al., 2011] they cannot provide accurate lateral boundary conditions and interior nudging to the regional climate models.

5. The regional models themselves do not have the domain scale (or two-way interaction) to skillfully predict these larger-scale atmospheric features.

6. There is also only one-way interaction between regional and global models which is not physically consistent. If the regional model significantly alters the atmospheric and/or ocean circulations, there is no way for this information to alter the larger-scale circulation features which are being fed into the regional model through the lateral boundary conditions and nudging.

7. When higher spatial analyses of land use and other forcings are considered in the regional domain, the errors and uncertainty from the larger model still persist, thus rendering the added complexity and details ineffective [Ray et al. 2010; Mishra et al. 2010].

8. The lateral boundary conditions for input to regional downscaling require regional-scale information from a global forecast model. However the global model does not have this regional-scale information due to its limited spatial resolution. This is, however, a logical paradox since the regional model needs something that can only be acquired by a regional model (or regional observations). Therefore, the acquisition of lateral boundary conditions with the needed spatial resolution becomes logically impossible.

Finally, there is sometimes an incorrect assumption that although global climate models cannot predict future climate change as an initial value problem, they can predict future climate statistics as a boundary value problem [Palmer et al., 2008]. With respect to weather patterns, for the downscaling regional (and global) models to add value over and beyond what is available from the historical, recent paleo-record, and worst-case sequence of days, however, they must be able to skillfully predict the changes in the regional weather statistics.

There is only value for predicting climate change, however, if they could skillfully predict the changes in the statistics of the weather and other aspects of the climate system. There is no evidence, however, that the model can predict changes in these climate statistics even in hindcast. As highlighted in Dessai et al. [2009], the finer and time-space based downscaled information can be “misconstrued as accurate”, but the ability to get this finer-scale information does not necessarily translate into increased confidence in the downscaled scenario [Wilby, 2010].”

These issues have passed peer review in our paper

Pielke Sr., R.A., R. Wilby, D. Niyogi, F. Hossain, K. Dairuku, J. Adegoke, G. Kallos, T. Seastedt, and K. Suding, 2011: Dealing with complexity and extreme events using a bottom-up, resource-based vulnerability perspective. AGU Monograph on Complexity and Extreme Events in Geosciences, in press.

I have also submitted comments to the National Science Board as I report in my post

My Comments On “NSB/NSF Seeks Input on Proposed Merit Review Criteria Revision and Principles”

This is based on my documented negative experiences with the NSF program with respect to climate research in

My Experiences With A Lack Of Proper Diligence And Bias In The NSF Review Process For Climate Proposals

I plan to post on my request for information from you regarding the science issues I have raised with downscaling to obtain multi-decadal regional climate impacts. As you requested, I will not reproduce your e-mails, but I will report that I have contacted you on this.

Quite frankly, in my view, this is a waste of large amounts of NSF funding on climate. However, I welcome responses, which I can post, that seek to refute this conclusion in order to facilitate a much overdue debate on these questions.

The failure of Jay Fein and Margaret Cavanaugh of the National Science Foundation to extend even the courtesy of a reply to the issues I have raised illustrates a failure of accountability at this US federal agency. They hold the authority over funding these multi-decadal climate prediction impact studies without any oversight of the scientific robustness of the methodology. Huge amounts of money are being wasted in this misuse of modeling.

My recommendation is that a Congressional subcommittee examine whether the expenditures of funds for these impact studies are an effective use of federal tax dollars. In my view, this is not only a waste of money but is misleading policymakers about the actual spectrum of risks that society and the environment face in the future, as outlined in our paper

Pielke Sr., R.A., R. Wilby, D. Niyogi, F. Hossain, K. Dairuku, J. Adegoke, G. Kallos, T. Seastedt, and K. Suding, 2011: Dealing with complexity and extreme events using a bottom-up, resource-based vulnerability perspective. AGU Monograph on Complexity and Extreme Events in Geosciences, in press.

source of image

Comments Off

Filed under Climate Models, RA Pielke Sr. Position Statements