
Guest Post By Ben Herman On Regional Climate Modeling

Guest Post By Professor Ben Herman of the University of Arizona. As written on the University’s website

Dr. Herman is primarily concerned with the optics of atmospheric aerosols, polarization and scattering, and the application of inversion techniques to analyze remote sensing data obtained from aircraft and satellites. Currently, he is working on several satellite based remote sensing projects to monitor ozone, temperature, water vapor and aerosols from space.

Following is Ben’s guest post on regional climate modeling.

I have had several discussions here with various people concerning the problem of regional climate prediction, in which climate models are used to set up boundary conditions for a smaller regional area with a much finer grid. The problem is that if the boundary conditions are incorrect, this will obviously degrade any predictions made using those boundaries. That said, there is quite a lot of effort being put into regional prediction using global climate models. The climate models, at this time, use a much larger grid than is required for regional prediction, so the climate of the smaller regional areas of interest is solved for separately, with the global results used as boundary conditions. I have compared this to having a set of numbers accurate to one decimal place, using those measurements in a series of mathematical operations, and providing answers to, say, three decimal places. You can do that, but, of course, the additional decimal places have no practical use.

Another way to look at this is to imagine that the global solution for a given time has been decomposed into a Fourier series. This series will contain spatial frequencies limited by the grid spacing of the global model. Any frequencies beyond this range are not present and thus cannot have an impact within the smaller region, even though higher frequencies within that region may be solved for, owing to its higher resolution. But since these higher frequencies cannot be influenced by similar frequencies outside of the region in question, they will generally be in error. The only way this error could be avoided would be if none of these higher frequencies were present during the valid time of the forecast, generally an unlikely occurrence. It thus appears to me that under most conditions the regional forecasts, using present techniques, are operating with an incomplete set of initial conditions, which will certainly limit their accuracy.
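The sampling argument above can be illustrated numerically. The following is a minimal sketch (my illustration, not part of Professor Herman's post, with arbitrarily chosen grid sizes and wavenumbers): a field containing a wave the coarse grid cannot resolve is sampled on that grid and interpolated onto a finer one. The fine grid recovers essentially no power at the unresolved wavenumber, and instead inherits spurious aliased power at a lower one.

```python
# Sketch: a coarse grid's Nyquist limit caps the spatial frequencies that
# can survive into a fine regional grid driven by coarse-grid output.
import numpy as np

n_fine, n_coarse = 512, 32            # fine "regional" grid vs coarse "global" grid
x_fine = np.linspace(0, 1, n_fine, endpoint=False)
x_coarse = np.linspace(0, 1, n_coarse, endpoint=False)

def field(x):
    # "true" field: a large-scale wave (k=3) plus a small-scale wave (k=40),
    # the latter beyond the coarse grid's Nyquist wavenumber (k=16)
    return np.sin(2 * np.pi * 3 * x) + 0.5 * np.sin(2 * np.pi * 40 * x)

truth = field(x_fine)
coarse = field(x_coarse)                                   # what the coarse grid "sees"
interp = np.interp(x_fine, x_coarse, coarse, period=1.0)   # fed to the fine grid

def power(signal):
    # normalized amplitude spectrum
    return np.abs(np.fft.rfft(signal)) / len(signal)

p_truth, p_interp = power(truth), power(interp)
print("power at k=40, truth:        %.3f" % p_truth[40])
print("power at k=40, interpolated: %.3f" % p_interp[40])  # essentially lost
print("spurious aliased power at k=8: %.3f" % p_interp[8])
```

The k=40 wave is not merely absent from the interpolated field; sampling it on the coarse grid aliases it onto k=8, i.e. the boundary information is not just incomplete but wrong at the scales the fine grid resolves.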



Attribution of the Warm Winter To Global Warming – An Example Of The Misstatement Of Reality By Some Climate Scientists

There is a news article with the header

Not just March, but start of 2012 shatter US records for heat, worrying meteorologists

by Seth Borenstein

Excerpts from the article read [with my comments inserted]

It’s not just March.

“It’s been ongoing for several months,” said Jake Crouch, a climate scientist at NOAA’s National Climatic Data Center in Asheville, N.C.

Meteorologists say an unusual confluence of several weather patterns, including La Nina, was the direct cause of the warm start to 2012. While individual events cannot be blamed on global warming, Crouch said this is like the extremes that are supposed to get more frequent because of man-made climate change. Greenhouse gases come from the burning of fossil fuels such as coal and oil.

My Comment: The highlighted text is yet another example of an unsubstantiated claim made to imply that these extremes are a result of added CO2. Yet what Mr. Crouch ignored is that in March the global average lower tropospheric temperature anomaly was only +0.11C above the 30-year average for March (see)! The lower tropospheric anomalies are ~0.4C cooler than in early 2010.

The magnitude of how unusual the year has been in the U.S. has alarmed some meteorologists who have warned about global warming. One climate scientist said it is the weather equivalent of a baseball player on steroids, with old records obliterated.

“When you look at what’s happened in March this year, it’s beyond unbelievable,” said University of Victoria climate scientist Andrew Weaver.

My Comment: Andrew Weaver is failing to communicate that extreme cool and warm anomalies often occur in specific geographic regions. If he wants to make the case that this warm event was “beyond unbelievable,” he should do so with quantitative analyses, such as we did in our paper

Chase, T.N., K. Wolter, R.A. Pielke Sr., and Ichtiaque Rasool, 2006: Was  the 2003 European summer heat wave unusual in a global context? Geophys.  Res. Lett.,  33, L23709, doi:10.1029/2006GL027470.

rather than make unsubstantiated claims in the media.

The news article continues

NOAA climate scientist Gabriel Vecchi compared the increase in weather extremes to baseball players on steroids: You can’t say an individual homer is because of steroids, but they are hit more often and the long-held records for home runs fall.

They seem to be falling far more often because of global warming, said NASA top climate scientist James Hansen. In a paper he submitted to the journal Proceedings of the National Academy of Sciences and posted on a physics research archive, Hansen shows that heat extremes aren’t just increasing but happening far more often than scientists thought.

What used to be a 1-in-400 hot temperature record is now a 1 in 10 occurrence, essentially 40 times more likely, said Hansen. The warmth in March is an ideal illustration of this, said Hansen, who also has become an activist in fighting fossil fuels.

My Comment: Jim has, as with Andrew Weaver, failed to quantify his statement with real world observations, as shown in the figure at the top of this post. This figure shows that much of the planet, in terms of areal extent, was cooler than average. Weaver’s comment below shows that he is ready to endorse any claim which supports his view, regardless of the scientific rigor with which it has been examined.

Weaver reviewed the Hansen paper and called it “one of the most stunning examples of evidence of global warming.”

Seth did include the important caveat in the middle of his report that reads

It is important to note that this unusual winter heat is mostly a North America phenomenon. Much of the rest of the Northern Hemisphere has been cold, said NOAA meteorologist Martin Hoerling.

My Comment: This statement by Martin in the news report is refreshing, as it properly places the warm March in the United States and Canada in its proper perspective. Martin should be commended for this objective statement, and Seth should be acknowledged for including it, so as to provide a bit of balance in the article.

My recommendation for future press articles of this sort, however, is to be more inclusive of credentialed climate scientists who can provide a complete view of such weather extremes. Internationally respected colleagues such as Joe Daleo, Bill Gray, Peter Webster, Judy Curry, Pat Michaels and Klaus Wolter should also be interviewed.


Multi-Decadal Climate Model Testing Requirements – A Summary

In this post, I summarize two necessary requirements that multi-decadal global climate models must meet before their projections for the coming decades should be communicated to stakeholders and policymakers. I have discussed this in past posts (e.g. see), but am motivated to summarize it here in light of the recognition (finally) of the inability of researchers to attribute changes in disasters to changes in climate statistics, as discussed in my son’s post

A Handy Bullshit Button on Disasters and Climate Change

In terms of testing the models, necessary conditions (but still not a sufficient condition) for the models to have any credibility to predict the future climate on decadal time scales are:

1. They must accurately simulate (hindcast) the statistics of major atmospheric and ocean circulation features over the last few decades (for which real-world data are available).

2. They must accurately simulate (hindcast) the changes in the statistics of these major atmospheric and ocean circulation features over the last few decades.

If they cannot do both #1 and #2, they must be rejected as robust predictive (projection) tools for the coming decades.

A rationalization that the climate forcing in the coming decades could be outside of what has occurred in the past does not in any way remedy this deficiency. If they cannot skillfully predict #1 and #2, model predictions of the coming decades, published in journal articles, news reports, and climate assessments, are misinforming and misleading stakeholders and policymakers.
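What tests #1 and #2 amount to in practice can be sketched with a toy calculation. This is purely hypothetical: the synthetic series below stand in for a reanalysis-derived circulation index and a model hindcast of the same index; nothing here comes from an actual model or dataset.

```python
# Sketch of the two-part hindcast test: (1) reproduce the statistics of an
# index over the hindcast period; (2) reproduce the *change* in those
# statistics between the first and second halves of the period.
import numpy as np

rng = np.random.default_rng(0)
years = 40
obs = 0.02 * np.arange(years) + rng.normal(0.0, 1.0, years)  # "observed" index
mod = rng.normal(0.1, 1.0, years)                            # "model" hindcast

def stats(x):
    # climatological statistics over the full period
    return x.mean(), x.std(ddof=1)

def change_in_stats(x):
    # difference in statistics, second half minus first half
    first, second = x[: len(x) // 2], x[len(x) // 2 :]
    return second.mean() - first.mean(), second.std(ddof=1) - first.std(ddof=1)

# Test #1: does the hindcast reproduce the observed statistics?
print("obs mean/std:", stats(obs))
print("mod mean/std:", stats(mod))

# Test #2 (the harder one): does it reproduce the change in those statistics?
print("obs change in mean/std:", change_in_stats(obs))
print("mod change in mean/std:", change_in_stats(mod))
```

In this toy case the model matches the full-period statistics reasonably well (test #1) yet misses the trend-driven change between halves (test #2), which is exactly the failure mode the second requirement is meant to expose.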

In terms of #1, there is progress, as reported, for example, in

Guest Post Titled “Decadal Prediction Skill In A Multi-Model Ensemble” By Geert Jan van Oldenborgh, Francisco J. Doblas-Reyes, Bert Wouters, Wilco Hazeleger.

but I am aware of no evidence that #2 has been satisfied. Even with #1, we could use reanalyses and would have no need for the climate hindcasts (other than to investigate climate processes).

The real barrier that must be overcome is #2.  To date, to my knowledge, this issue has not even been discussed as part of the current IPCC assessment. They clearly have more work to do.

If, however, the IPCC ignores the need to satisfy #1 and #2 but they present projections as part of the IPCC reports, policymakers, the public and impact scientists should be ready to press the “bullshit button”.



Funding Agency Bias – A Short Summary

The current focus and funding priorities of the NSF, NOAA, the UK Met Office and other agencies can be succinctly summarized by the introductory sentence of the BAMS paper

Ho, C.K., D.B. Stephenson, M. Collins, C.A.T. Ferro, and S.J. Brown, 2012: Calibration strategies: A source of additional uncertainty in climate change projections. Bull. Amer. Meteor. Soc., 93, 21-26.

which states

“Reliable projections of weather variables from climate models are required for the assessment of future climate change impacts (e.g., flooding, drought, temperature-related mortality, and crop yield).”

This is just one example of the top-down approach which we have shown in our papers (and with referral to other studies) to be a scientifically flawed methodology, but it is a mindset that permeates the funding agencies in the USA, UK and elsewhere.

In contrast, as I report in the post

The NSF CREATIV Initiative On Interdisciplinary Research – Another Example Of Thinking Inside The Box

the alternative  bottom-up, resource-based perspective, which we conclude is not only scientifically robust but is of direct and immediate benefit to stakeholders and policymakers, is cavalierly dismissed. As the NSF Program Officer wrote

“The notion that one can usefully look at the incremental threat to a sector from any particular hazard is not a great conceptual leap forward. “

Until these funding agencies become “honest brokers” of the issues in climate science, we are going to continue to be prevented from a robust examination of many of the issues in climate science and, more broadly, in environmental science.



An Example Of The Reasons That Skillful Multi-Decadal Predictions Of Climate Change Have Not Been Achieved – “Long Tails In Regional Surface Temperature Probability Distributions With Implications For Extremes Under Global Warming” By Ruff and Neelin 2012

The recognition that the models need to skillfully predict changes in the statistics of climate variables (and in hindcasts they have not), if properly accepted by the IPCC impacts assessment group, would have major implications. So far, that group has mostly ignored this issue when seeking to convince people that the multi-decadal regional and global model predictions should be considered robust.

Jos de Laat has alerted us to a new paper which addresses part of this issue. While the article contains the usual acceptance of the multi-decadal model predictions as robust, it actually illustrates why the models have so far not passed this test. The paper is

Ruff, T. W., and J. D. Neelin (2012), Long tails in regional surface temperature probability distributions with implications for extremes under global warming, Geophys. Res. Lett., 39, L04704, doi:10.1029/2011GL050610.

The abstract reads [highlight added]

“Prior work has shown that probability distributions of column water vapor and several passive tropospheric chemical tracers exhibit longer-than-Gaussian (approximately exponential) tails. The tracer-advection prototypes explaining the formation of these long-tailed distributions motivate exploration of observed surface temperature distributions for non-Gaussian tails. Stations with long records in various climate regimes in the National Climatic Data Center Global Surface Summary of Day observations are used to examine tail characteristics for daily average, maximum and minimum surface temperature probability distributions. Each is examined for departures from a Gaussian fit to the core (here approximated as the portion of the distribution exceeding 30% of the maximum). While the core conforms to Gaussian for most distributions, roughly half the cases exhibit non-Gaussian tails in both winter and summer seasons. Most of these are asymmetric, with a long, roughly exponential, tail on only one side. The shape of the tail has substantial implications for potential changes in extreme event occurrences under global warming. Here the change in the probability of exceeding a given threshold temperature is quantified in the simplest case of a shift in the present-day observed distribution. Surface temperature distributions with long tails have a much smaller change in threshold exceedances (smaller increases for high-side and smaller decreases for low-side exceedances relative to exceedances in current climate) under a given warming than do near-Gaussian distributions. This implies that models used to estimate changes in extreme event occurrences due to global warming should be verified regionally for accuracy of simulations of probability distribution tails.”
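The abstract’s central contrast can be reproduced with a toy calculation. This sketch is mine, with assumed parameter values (a 3-sigma record threshold, a 0.5-sigma warming shift, and an arbitrary tail fraction p0), not numbers from Ruff and Neelin: under the same shift, a Gaussian tail multiplies its exceedance probability far more than an exponential tail does.

```python
# Toy comparison of threshold-exceedance changes under a uniform warming
# shift, for a Gaussian tail vs an exponential (long) tail.
import math

def gauss_exceed(t, mu=0.0, sigma=1.0):
    # P(X > t) for a Gaussian distribution
    return 0.5 * math.erfc((t - mu) / (sigma * math.sqrt(2.0)))

def exp_tail_exceed(t, mu=0.0, scale=1.0, p0=0.1):
    # crude exponential high-side tail: P(X > t) = p0 * exp(-(t - mu)/scale)
    return p0 * math.exp(-(t - mu) / scale)

threshold, shift = 3.0, 0.5   # 3-sigma record threshold, 0.5-sigma warming

gauss_ratio = gauss_exceed(threshold, mu=shift) / gauss_exceed(threshold)
exp_ratio = exp_tail_exceed(threshold, mu=shift) / exp_tail_exceed(threshold)
print("Gaussian tail: exceedances increase by factor %.1f" % gauss_ratio)
print("Exponential tail: exceedances increase by factor %.1f" % exp_ratio)
```

For an exponential tail the ratio is simply exp(shift/scale) regardless of the threshold, whereas the Gaussian ratio grows with the threshold, which is why a model that erroneously produces a Gaussian rather than a long tail will overstate the change in extreme exceedances.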

The conclusion of the paper has the text

“The sensitive dependence of tail characteristics on regional effects noted here suggests that it will be (i) useful to understand the physical mechanisms that produce them (including the observed asymmetry, and the sources of regional dependence); and (ii) essential to verify whether high-resolution models accurately reproduce observed tail characteristics for any region for which an assessment of extreme events is being conducted. A model that has an error in the nature of the tail, e.g., erroneously produces a Gaussian rather than a long tail under current climate for a particular region, will likely have serious errors in quantitatively predicting the increase in exceedances under future climate.”

As we wrote in our article

Pielke Sr., R.A., and R.L. Wilby, 2012: Regional climate downscaling – what’s the point? Eos Forum,  93, No. 5, 52-53, doi:10.1029/2012EO050008.

“….for regional downscaling (and global) models to add value (beyond what is available to the impacts community via the historical, recent paleorecord and a worst case sequence of days), they must be able to skillfully predict changes in regional weather statistics in response to human climate forcings. This is a greater challenge than even skillfully simulating current weather statistics.”

The new Ruff and Neelin (2012) paper provides support for this conclusion.



A Proposal “Comparison Of GHCN Temperature Anomalies And Trends With Long Term Fluxnet Temperature Anomalies And Trends”

We [Markus Reichstein and I] have been unsuccessful in obtaining funding for the proposal below, so I have posted it here to encourage others to pursue it. It builds on the issue of station siting quality that we discuss in our paper

Fall, S., A. Watts, J. Nielsen-Gammon, E. Jones, D. Niyogi, J. Christy, and R.A. Pielke Sr., 2011: Analysis of the impacts of station exposure on the U.S. Historical Climatology Network temperatures and temperature trends. J. Geophys. Res., 116, D14120, doi:10.1029/2010JD015146. Copyright (2011) American Geophysical Union.

A Proposal 

The comparison of GHCN temperature anomalies and trends with long term Fluxnet temperature anomalies and trends

The Global Historical Climatology Network [GHCN] is the foundation for the land portion of the annual average multi-decadal global surface temperature trends [Peterson et al., 1998; Karl et al., 2006]. These temperature data are assumed to be robust with respect to assessing anomalies and long term trends, as reported, for example, by Parker (2004).

However, questions have been raised with respect to the existence of systematic biases in the data due to the local landscape around the observing sites, as well as a need to attribute what fraction of the anomalies and long term trends is due to added CO2 and other greenhouse gases, aerosols and landscape change [e.g. Mahmood et al., 2010].

This need to further examine the quantitative robustness of this land surface temperature data was highlighted at the 2010 Exeter meeting –  Surface temperature datasets for the 21st Century . For example, Matt Menne, Claude Williams and Jay Lawrimore reported that

“[GHCN Monthly] Version 2 [was] released in 1997….but without station histories for stations outside the USA”

and

“Undocumented changes [in the USHCN] can be as prevalent as documented changes even when extensive (digitized) metadata are available”.

There is also a growing divergence between multi-decadal lower tropospheric temperature trends and surface temperature trends (Klotzbach et al., 2009, 2010), a need to simultaneously assess long term temperature and humidity trends (i.e. moist enthalpy; e.g. see Fall et al., 2010), and a need to determine the specific landscape in which the temperature measurements are made and how it affects the absolute humidity and dry bulb temperature (e.g. see Fall et al., 2009).

This debate is overviewed in Pielke et al (2007,2009) and Parker et al (2009). At FLUXNET sites a recent related study has shown the contrasting behavior of forest and grassland sites in terms of radiative, sensible and latent energy fluxes during heatwaves (Teuling et al. 2010). This study also showed the potential of remote sensing (e.g. land surface temperatures) in this context.

The fundamental questions include:

  • What is the role of the local landscape in the immediate vicinity of the GHCN sites on long term temperature trends? Fall et al (2010) has found that poorly sited locations in the USA (i.e. those that are not representative of the larger scale region) have biases in averaged maximum and minimum temperatures and in diurnal range.
  • What is the importance of anomalies and multi-decadal trends in absolute humidity on the anomalies and multi-decadal trends in the dry bulb temperature?  Pielke et al 2004 discussed how the same trends in heat (moist enthalpy) can be accommodated by a variety of different trends in humidity and temperature.  Land use-land cover change clearly can influence both.
  • What is the role of landscape type on temperature (and moist enthalpy) on anomalies and long term trends? Diffenbaugh et al 2009, for example, found statistically significant cooling in areas of the Great Plains where crop/mixed farming has replaced short grass, areas of the Midwest and southern Texas where crop/mixed farming has replaced interrupted forest, and areas of the western United States containing irrigated crops.
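The moist enthalpy point in the second bullet above is easy to make concrete. The following is a small sketch with illustrative values (my numbers, not from Pielke et al. 2004): since heat content can be written H = cp*T + Lv*q, identical heat content trends can come from a pure warming or a pure moistening.

```python
# Same moist enthalpy, different (T, q) combinations: a +0.5 K warming at
# constant humidity matches a ~0.2 g/kg moistening at constant temperature.
CP = 1005.0       # specific heat of air at constant pressure, J kg^-1 K^-1
LV = 2.5e6        # latent heat of vaporization, J kg^-1

def moist_enthalpy(t_kelvin, q_kg_per_kg):
    # H = cp*T + Lv*q, in J per kg of air
    return CP * t_kelvin + LV * q_kg_per_kg

# Case A: warming with constant humidity; Case B: no warming, moistening.
h_a = moist_enthalpy(288.0 + 0.5, 0.008)            # +0.5 K, q unchanged
h_b = moist_enthalpy(288.0, 0.008 + CP * 0.5 / LV)  # T unchanged, +~0.2 g/kg
print("H (case A): %.1f J/kg" % h_a)
print("H (case B): %.1f J/kg" % h_b)
```

A dry-bulb temperature record alone cannot distinguish the two cases, which is why land use-land cover change, which alters the partitioning between T and q, matters for trend attribution.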

The long term measurements at the Fluxnet sites provide an opportunity to quantitatively assess the spatial representativeness of anomalies and long term trends at sites within the GHCN network that are close to Fluxnet sites. At the FLUXNET sites, air temperature and humidity are measured together with energy and carbon fluxes in the boundary layer above the vegetation canopy at a half-hourly time step. In the most recent standardized collection, the La Thuile 2007 data set, there are around 950 site-years containing observations from a total of 253 sites (documented and available subject to specific use policies). The FLUXNET network has the highest density of sites in Europe and North America, but data from all other continents are available as well. Information on the exact instrument and configuration for temperature and humidity measurements is not generally available. External data available to characterize the landscape context include images from Google at a maximum of five resolutions (at some remote sites the highest resolution is not available) (Reichstein, pers. comm.), Visual Earth, and MODIS cutouts (ORNL DAAC). Some sites (but not many) also have webcams installed, but those are not in the current data set.

Figure 1: Distribution of FLUXNET sites within the LaThuile database (A) in geographical space, (B) in simplified climate space. In (A), map colors code mean annual temperature (CRU) according to the legend; grey areas are not covered by FLUXNET sites in terms of climate space (climate-space distance threshold). In (A) and (B), symbols represent land cover classes as in the legend of (B). (from Reichstein et al., in prep.)

Our proposal is to compare the anomalies and long term trends in dry bulb temperature and absolute humidity at the Fluxnet sites with GHCN measurements of these quantities at sites in the vicinity of the Fluxnet locations. The sensible, latent and radiative fluxes at the Fluxnet sites can then be used to explain the observed anomalies and trends.

Among the research questions are:

  • Are there statistically different anomalies and trends between the nearly collocated Fluxnet and GHCN sites? Do they occur predominantly during specific synoptic situations?
  • If so, what is the reason for the differences? Can a landscape type component be used to explain some or all of the differences?
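A hypothetical sketch of the proposed comparison pipeline, using synthetic records in place of real GHCN and Fluxnet data (the trends, noise levels, and 20-year record length are all invented for illustration): compute monthly anomalies by removing the mean annual cycle, then compare least-squares trends at the paired sites.

```python
# Sketch: monthly anomalies and least-squares trends at a GHCN station vs
# a nearby Fluxnet site, on synthetic stand-in data.
import numpy as np

rng = np.random.default_rng(1)
months = np.arange(240)                        # 20 years of monthly data
seasonal = 10.0 * np.sin(2 * np.pi * months / 12.0)

# Invented records: same climate, different imposed trends and noise
ghcn = 15.0 + seasonal + 0.003 * months + rng.normal(0, 0.5, months.size)
flux = 15.0 + seasonal + 0.001 * months + rng.normal(0, 0.5, months.size)

def anomalies(series):
    # remove the mean annual cycle month-by-month
    clim = np.array([series[m::12].mean() for m in range(12)])
    return series - clim[months % 12]

def trend_per_decade(series):
    # least-squares linear trend of the anomalies, in deg C per decade
    slope = np.polyfit(months, anomalies(series), 1)[0]
    return slope * 120.0

print("GHCN trend:    %.2f C/decade" % trend_per_decade(ghcn))
print("Fluxnet trend: %.2f C/decade" % trend_per_decade(flux))
```

The trend difference between paired sites, stratified by synoptic situation and landscape type, would then be the quantity tested for statistical significance in the first research question above.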

Photographs of the GHCN sites that are used for these comparisons need to be obtained, as has been completed for the USHCN (see Watts, 2009).


Diffenbaugh, N.S., 2009: Influence of modern land cover on the climate of the United States. Climate Dynamics, DOI: 10.1007/s00382-009-0566-z.

Fall, S., D. Niyogi, A. Gluhovsky, R. A. Pielke Sr., E. Kalnay, and G. Rochon, 2009: Impacts of land use land cover on temperature trends over the continental United States: Assessment using the North American Regional Reanalysis. Int. J. Climatol., DOI: 10.1002/joc.1996.

Fall, S., N. Diffenbaugh, D. Niyogi, R.A. Pielke Sr., and G. Rochon, 2010: Temperature and equivalent temperature over the United States (1979 – 2005). Int. J. Climatol., DOI: 10.1002/joc.2094.

Fall, S., A. Watts, J. Nielsen-Gammon, E. Jones, D. Niyogi, J. Christy, and R.A. Pielke Sr., 2011: Analysis of the impacts of station exposure on the U.S. Historical Climatology Network temperatures and temperature trends. J. Geophys. Res., 116, D14120, doi:10.1029/2010JD015146. Copyright (2011) American Geophysical Union.

Karl, T.R., S.J. Hassol, C.D. Miller, and W.L. Murray (editors), 2006: Temperature Trends in the Lower Atmosphere: Steps for Understanding and Reconciling Differences. A Report by the Climate Change Science Program and the Subcommittee on Global Change Research, Washington, DC.

Klotzbach, P.J., R.A. Pielke Sr., R.A. Pielke Jr., J.R. Christy, and R.T. McNider, 2009: An alternative explanation for differential temperature trends at the surface and in the lower troposphere. J. Geophys. Res., 114, D21102, doi:10.1029/2009JD011841.

Klotzbach, P.J., R.A. Pielke Sr., R.A. Pielke Jr., J.R. Christy, and R.T. McNider, 2010: Correction to: “An alternative explanation for differential temperature trends at the surface and in the lower troposphere. J. Geophys. Res., 114, D21102, doi:10.1029/2009JD011841″, J. Geophys. Res., 115, D1, doi:10.1029/2009JD01365

Mahmood, R., R.A. Pielke Sr., K.G. Hubbard, D. Niyogi, G. Bonan, P. Lawrence, B. Baker, R. McNider, C. McAlpine, A. Etter, S. Gameda, B. Qian, A. Carleton, A. Beltran-Przekurat, T. Chase, A.I. Quintanar, J.O. Adegoke, S. Vezhapparambu, G. Conner, S. Asefi, E. Sertel, D.R. Legates, Y. Wu, R. Hale, O.W. Frauenfeld, A. Watts, M. Shepherd, C. Mitra, V.G. Anantharaj, S. Fall,R. Lund, A. Nordfelt, P. Blanken, J. Du, H.-I. Chang, R. Leeper, U.S. Nair, S. Dobler, R. Deo, and J. Syktus, 2010: Impacts of land use land cover change on climate and future research priorities. Bull. Amer. Meteor. Soc., 91, 37–46, DOI: 10.1175/2009BAMS2769.1

Parker, D. E. (2004), Climate: Large-scale warming is not urban, Nature, 432, 290(18 November 2004); doi:10.1038/432290a.

Parker, D. E., P. Jones, T. C. Peterson, and J. Kennedy (2009), Comment on ‘Unresolved Issues with the Assessment of Multi-Decadal Global Land Surface Temperature Trends’ by Roger A. Pielke, Sr. et al., J. Geophys. Res., doi:10.1029/2008JD010450

Reichstein, M., Papale, D., Baldocchi, D. et al. (in prep). A new global harmonized eddy covariance data set from FLUXNET: uncertainties, limitations and robust global patterns

Teuling A.J., Seneviratne S.I., Stöckli R., Reichstein M., Moors E., Ciais P., Luyssaert S., van den Hurk B., Ammann C., Bernhofer C., Dellwik E., Gianelle D., Gielen B., Grünwald T., Klumpp K., Montagnani L., Moureaux C., Sottocornola M. & Wohlfahrt G. (2010) Contrasting response of European forest and grassland energy exchange to heatwaves. Nature Geoscience, doi:10.1038/ngeo950

Peterson, T.C., R. Vose, R. Schmoyer, and V. Razuvaëv, 1998: Global historical climatology network (GHCN) quality control of monthly temperature data. Int. J. Climatol., 18, DOI: 10.1002/(SICI)1097-0088(199809)18:11<1169::AID-JOC309>3.0.CO;2-U.

Pielke Sr., R.A., C. Davey, and J. Morgan, 2004: Assessing “global warming” with surface heat content. Eos, 85, No. 21, 210-211

Pielke Sr., R.A., C. Davey, D. Niyogi, S. Fall, J. Steinweg-Woods, K. Hubbard, X. Lin, M. Cai, Y.-K. Lim, H. Li, J. Nielsen-Gammon, K. Gallo, R. Hale, R. Mahmood, S. Foster, R.T. McNider, and P. Blanken, 2007: Unresolved issues with the assessment of multi-decadal global land surface temperature trends. J. Geophys. Res., 112, D24S08, doi:10.1029/2006JD008229.

Pielke Sr., R.A., C. Davey, D. Niyogi, S. Fall, J. Steinweg-Woods, K. Hubbard, X. Lin, M. Cai, Y.-K. Lim, H. Li, J. Nielsen-Gammon, K. Gallo, R. Hale, R. Mahmood, S. Foster, R.T. McNider, and P. Blanken, 2009: Reply to comment by David E. Parker, Phil Jones, Thomas C. Peterson, and John Kennedy on “Unresolved issues with the assessment of multi-decadal global land surface temperature trends. J. Geophys. Res., 114, D05105, doi:10.1029/2008JD010938.

Watts, A. 2009: Is the U.S. Surface Temperature Record Reliable? 28 pages, March 2009 The Heartland Institute.



An NRC Study “A National Strategy for Advancing Climate Modeling” – A Missed Opportunity

There is an NRC project underway titled “A National Strategy for Advancing Climate Modeling.” The Statement of Task is reproduced below [highlight added].
Statement of Task

Climate models are the foundation for understanding and projecting climate and climate-related changes and are thus critical tools for supporting climate-related decision making.  This study will develop a strategy for improving the nation’s capability to accurately simulate climate and related Earth system changes on decadal to centennial timescales.  The committee’s report is envisioned as a high level analysis, providing a strategic framework to guide progress in the nation’s climate modeling enterprise over the next 10-20 years.  Specifically, the committee will:

1. Engage key stakeholders in a discussion of the status and future of climate modeling in the United States over the next decade and beyond, with an emphasis on decade to century timescales and local to global resolution.  This discussion should include both the modeling and user communities, broadly defined, and should focus on the strengths and challenges of current modeling approaches, including their usefulness to decision making, the observations and research activities needed to support model development and validation, and potential new directions in all of these spheres.

2. Describe the existing landscape of domestic and international climate modeling efforts, including approaches being used in research and operational settings, new approaches being planned or discussed, and the relative strengths and challenges of the various approaches, with an emphasis on models with decade to century timescales and local to global resolution.

3. Discuss, in broad terms, the observational, basic and applied research, infrastructure, and other requirements of current and possible future climate modeling efforts, and develop a strategic approach for identifying the priority observations, research, and decision support activities that would lead to the greatest improvements in our understanding and ability to monitor, model, and respond to climate change on local to global space scales and decade to century timescales.

4. Provide recommendations for developing a comprehensive and integrated national strategy for climate modeling over the next decade (i.e., 2011-2020) and beyond. This advice should include discussion of different modeling approaches (including the relationship between decadal-to-centennial scale modeling with modeling activities at other timescales); priority observations, research activities, and infrastructure for supporting model development; and how all of these efforts can be made most useful for decision making in this decade and beyond.

Examples of the types of strategic questions to be addressed include:  What is the appropriate balance between improving resolution and adding complexity as computing power improves?  What are the advantages and disadvantages of different approaches to projecting regional climate change (e.g., embedded regional models, statistical downscaling, etc.)?  What are the benefits and tradeoffs associated with multi-model versus unified modeling frameworks?  What opportunities might exist to develop better interfaces and integration between Earth system models and models of human systems?  What observations and process studies are needed to initialize climate predictions on both regional and global scales, advance our understanding of relevant physical processes and mechanisms, and validate model results?  What critical infrastructure constraints, including high performance computing and personnel issues, currently limit model development and use?  What steps can be taken to improve the communication of climate model results (e.g., presentation of uncertainties) and ensure that the climate modeling enterprise remains relevant to decision making?  What modeling approaches and activities are likely to provide the most value for the investments required?

The membership of this committee includes internationally well-respected scientists. However, while they are tasked to

“Engage key stakeholders in a discussion of the status and future of climate modeling in the United States over the next decade and beyond, with an emphasis on decade to century timescales and local to global resolution.”


and to “develop a strategic approach for identifying the priority observations, research, and decision support activities that would lead to the greatest improvements in our understanding and ability to monitor, model, and respond to climate change on local to global space scales and decade to century timescales”

there are no stakeholders on the Panel!  Instead of taking advantage of this opportunity to outline a robust way forward to reduce the risks to key (as specified by the stakeholders) societal and environmental resources, this Panel perpetuates the top-down, global climate model dominated approach to providing information to the impacts community.  However, as we presented in our article

Pielke Sr., R.A., R. Wilby, D. Niyogi, F. Hossain, K. Dairaku, J. Adegoke, G. Kallos, T. Seastedt, and K. Suding, 2012: Dealing with complexity and extreme events using a bottom-up, resource-based vulnerability perspective. AGU Monograph on Complexity and Extreme Events in Geosciences, in press

the IPCC top-down approach has not shown any skill at predicting multi-decadal changes in the climate statistics on regional and local scales. As we wrote in that article, the bottom-up (i.e. non-multi-decadal climate model prediction based) contextual approach

“is a more inclusive way of assessing risks, including from climate variability and climate change, than using the outcome vulnerability approach adopted by the IPCC. A contextual vulnerability assessment, using the bottom-up, resource-based framework is a more inclusive approach for policymakers to adopt effective mitigation and adaptation methodologies to deal with the complexity of the spectrum of social and environmental extreme events that will occur in the coming decades, as the range of threats are assessed, beyond just the focus on CO2 and a few other greenhouse gases as emphasized in the IPCC assessments.”

We also have summarized the fundamental deficiencies of the regional downscaling of multi-decadal climate predictions in our EOS Forum article

Pielke Sr., R.A., and R.L. Wilby, 2012: Regional climate downscaling – what’s the point? Eos Forum. January 31, 2012

The members of the panel are [and I have noted who are also current IPCC authors]

1. Dr. Chris Bretherton (Chair) – University of Washington – Current IPCC Author

Chris Bretherton is currently a Professor in the University of Washington Departments of Atmospheric Science and Applied
Mathematics, where he teaches classes on weather, atmospheric turbulence and cumulus convection, tropical meteorology, geophysical fluid dynamics, numerical methods, and classical analysis of ODEs and PDEs. He directs the University of Washington Program on Climate Change, which organizes graduate courses, seminars, a summer institute, and research on climate science and its relevance to our society and future. His group developed the parameterizations of shallow cumulus convection used in the cutting-edge versions of two leading US climate models, the National Center for Atmospheric Research Community Atmosphere Model, version 5 (CAM5), and the Geophysical Fluid
Dynamics Laboratory Atmosphere Model, version 3 (AM3). They also developed the turbulence parameterization used in CAM5, and have versions of both schemes for the Weather Research and Forecast (WRF) regional modeling system.

2. Dr. Venkatramani Balaji Princeton University

V. Balaji heads the Modeling Systems Group serving developers of Earth System models at GFDL and Princeton University. With a background in physics and climate science, he has become an expert in the area of parallel computing and scientific infrastructure,
providing high-level programming interfaces for expressing parallelism in scientific algorithms. He has pioneered the use of frameworks (such as the Flexible Modeling System: FMS, as well as community standards such as ESMF and PRISM) allowing the construction of climate models out of independently developed components sharing a technical architecture; and of curators (FMS Runtime Environment FRE) for the execution of complex workflows to manage the complete climate modeling process. The Earth System Curator (US) and Metafor (EU) projects, in which he plays a key role, have developed the use of a common information model which allows the execution of complex scientific queries on model data archives. V. Balaji plays advisory roles on NSF, NOAA and DOE review panels, including the recent series of exascale workshops. He is a
sought-after speaker and lecturer and is committed to providing training in the use of climate models in developing nations, leading workshops for advanced students and researchers in South Africa and India.

3. Dr. Thomas L. Delworth Geophysical Fluid Dynamics Laboratory

Thomas L. Delworth is a Research Scientist and group leader in the Climate Change, Variability and Prediction Group at NOAA’s GFDL.
His research is largely focused on decadal to centennial climate variability and change through the synthesis of climate models and observational data. On these time scales the behavior of the climate system is a mixture of natural variability and the response of the climate system to changing radiative forcing induced by changing greenhouse gases and aerosols. Understanding the natural variability of the climate system on decadal scales is critical to the ability to detect climate change, and to understand the processes responsible for observed change from the global to the regional scale.

4. Dr. Robert E. Dickinson The University of Texas at Austin

Robert E. Dickinson joined the Department of Geological Sciences in August of 2008. For the previous 9 years, he was Professor of Atmospheric Sciences and held the Georgia Power/ Georgia Research Alliance Chair at the Georgia Institute of Technology, the 9 years before that he was Professor of Atmospheric Sciences and Regents Professor at the University of Arizona, and for the previous 22 years a Senior Scientist at the National Center for Atmospheric Research. He was elected to the U.S. National Academy of Sciences in 1988, to the U.S. National Academy of Engineering in 2002, and a foreign member of the Chinese Academy of Sciences in 2006. His research interests are in climate modeling, climate variability and change, aerosols, the hydrological cycle and droughts, land surface processes, the terrestrial carbon cycle, and the application of remote sensing data to modeling of land surface processes.

5. Dr. James A. Edmonds – Joint Global Change Research Institute – Current IPCC Author

James Edmonds is a Chief Scientist and Laboratory Fellow at the Pacific Northwest National Laboratory’s Joint Global Change Research Institute, a collaboration with the University of Maryland at College Park. His research in the areas of long-term, global, energy, technology, economy, and climate change spans three decades, producing several books, numerous scientific papers and countless presentations. He is one of the pioneers in the field of integrated assessment modeling of climate change. His principal research focus is the role of energy technology in addressing climate change. He is the Chief Scientist for the Integrated Assessment Research Program in the Office of Science at the U.S. Department of Energy. He has been an active participant in all of the major assessments of the Intergovernmental Panel on Climate Change.

6. Dr. James S. Famiglietti University of California, Irvine

James S. Famiglietti holds a joint faculty appointment in Earth System Science and in Civil and Environmental Engineering at the University of California, Irvine, where he is the Founding Director of the system-wide UC Center for Hydrologic Modeling. He holds a B.S. in Geology from Tufts University, an M.S. in Hydrology from the University of Arizona, and an M.A. and a Ph.D. in Civil Engineering and Operations Research from Princeton University. He completed his postdoctoral studies in hydrology and climate system modeling at Princeton and at the National Center for Atmospheric Research. Before joining the faculty at UCI in 2001, Dr. Famiglietti was an Assistant and Associate Professor in the Department of Geological Sciences at the University of Texas at Austin, and was the Associate Director of the UT Environmental Science Institute. He is the past Chair of the Board of the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI), and past Editor-in-Chief of Geophysical Research Letters. His research concerns the role of hydrology in the coupled Earth system. Areas of activity include the continued development of the hydrologic components of climate models; climate system modeling for studies of land-ocean-atmosphere-human interaction; and remote sensing of the terrestrial and global water cycles, including groundwater depletion and freshwater availability. Famiglietti is currently leading the Community Hydrologic Modeling Platform (CHyMP) effort to accelerate the development of hydrological models for use in addressing national and international priorities related to water, food, economic, climate, and national security.

7. Dr. Inez Y. Fung University of California, Berkeley

Inez Fung is a Professor in the Department of Earth and Planetary Science and the Department of Environmental Science, Policy and Management. Since 2005, she has also been a Founding Co- Director of the Berkeley Institute of the Environment. Inez Fung has been studying climate change for the last 20 years. She is a principal architect of large-scale mathematical modeling approaches and numerical models to represent the geographic and temporal variations of sources and sinks of CO2, dust and other trace substances around the globe. Dr. Fung’s recent work in climate modeling predicts the co-evolution of CO2 and climate and concludes that the diminishing capacities of the land and oceans to store carbon act to accelerate global warming. Inez Fung received her S.B. in Applied Mathematics and her Sc.D. in Meteorology from MIT. She joined the Berkeley faculty in 1998 as the first Richard and Rhoda Goldman Distinguished Professor in the Physical Sciences and the founding Director of the Berkeley Atmospheric Sciences Center.

8. Dr. James J. Hack Oak Ridge National Laboratory

James J. Hack directs the National Center for Computational Sciences (NCCS), a leadership computing facility at Oak Ridge National Laboratory supporting transformational science. He identifies major high performance computing needs from scientific and hardware perspectives and puts forth strategies to meet those needs as machines evolve to the petascale, able to carry out a quadrillion calculations per second. An atmospheric scientist, Hack also leads ORNL’s Climate Change Initiative. Dr. Hack became a research staff member at the IBM Thomas J. Watson Research Center, where he worked on the design and evaluation of high-performance computing architectures. In 1984 he moved to the National Center for Atmospheric Research, a National Science Foundation-sponsored center, where his roles included Senior Scientist, head of the Climate Modeling Section, and Deputy Director of the Climate and Global Dynamics Division. He was one of the principal developers of the climate model that ran on NCCS supercomputers to provide more than one-third of the simulation data jointly contributed by the Department of Energy and the National Science Foundation to the most recent assessment report of the United Nations’ Intergovernmental Panel on Climate Change, the group that shared the 2007 Nobel Peace Prize with Al Gore.

9. Dr. James W. Hurrell – National Center for Atmospheric Research – Current IPCC Author

James (Jim) W. Hurrell is Senior Scientist in the Climate and Global Dynamics Division of the Earth System Laboratory at the National Center for Atmospheric Research (NCAR). NCAR is a federally funded research and development center that works with partners at universities and researchers to explore and understand the atmosphere and its interactions with the sun, the oceans, the
biosphere, and human society. Jim joined NCAR after earning his doctorate in atmospheric science from Purdue University. Jim’s research has centered on empirical and modeling studies and diagnostic analyses to better understand climate, climate variability and climate change. Jim has been involved in assessment activities of the Intergovernmental Panel on Climate Change and the U.S. Global Change Research Program. Jim has been extensively involved in the World Climate Research Programme (WCRP) on Climate Variability and Predictability (CLIVAR), including roles as co-chair of the Scientific Steering Group (SSG) of both U.S. and International CLIVAR and membership on several other CLIVAR panels. His current position at NCAR is Chief Scientist of the Community Earth System Model (CESM). Jim has given testimony
on climate change issues for congressional subcommittees and has received numerous prestigious honors and awards in his field of atmospheric science.

10. Dr. Daniel J. Jacob – Harvard University – Current IPCC Author

Daniel J. Jacob is a Professor of atmospheric chemistry and environmental engineering at Harvard University. The goal of his research is to understand the chemical composition of the atmosphere, its perturbation by human activity, and the implications for climate change and life on Earth. His approaches include global modeling of atmospheric chemistry and climate, aircraft measurement campaigns, satellite data retrievals, and analyses of atmospheric observations.

11. Dr. James L. Kinter, III – Center for Ocean-Land-Atmosphere Studies

James L. Kinter is Director of the Center for Ocean-Land-Atmosphere Studies (COLA) where he manages all aspects of basic and applied climate research conducted by the Center. Dr. Kinter’s research includes studies of climate predictability on seasonal and longer time scales. Of particular interest in his research are prospects for prediction of El Niño and the extratropical response to tropical sea surface temperature anomalies using high-resolution coupled general circulation models of the Earth’s atmosphere, oceans and land surface. Dr. Kinter is also an Associate Professor in the Climate Dynamics Ph.D. Program and the Atmospheric, Oceanic and Earth Sciences department at George Mason University, where he has responsibilities for curriculum development and teaching undergraduate and graduate courses on climate change, as well as advising Ph.D. students. After earning his doctorate in geophysical fluid dynamics at Princeton University in 1984, Dr. Kinter served as a National Research Council Associate at NASA Goddard Space Flight Center, and as a faculty member of the University of Maryland (teaching faculty 1984-1987; research faculty 1987-1993) prior to joining COLA. Dr. Kinter has served on many national review panels
for both scientific research programs and supercomputing programs for computational climate modeling.

12. Dr. Lai-Yung R. Leung Pacific Northwest National Laboratory

L. Ruby Leung is a recognized leader in modeling regional climate and the hydrological cycle. Her research focuses on understanding and modeling of regional climate variability and change, land-atmosphere interactions, orographic processes, and aerosol effects on the water cycle. She has led important efforts in defining research priorities and needs in regional climate modeling and coordinated community efforts to develop capability in community mesoscale models to simulate regional climate. Her research on climate change and aerosol effects has been featured in Science, Popular Science, Wall Street Journal, National Public Radio, and many major newspapers. Her research crosses scientific disciplines to advance the state of the art in predicting climate change and its regional impacts.

13. Dr. Shawn Marshall  University of Calgary

Shawn Marshall joined University of Calgary’s Department of Geography in January 2000, following Ph.D. and Postdoctoral research at the University of British Columbia (UBC). Since earning a B.A.Sc. in Engineering Physics at the University of Toronto he has been on a progressively geographical path, with Ph.D. work in Geophysics and Postdoctoral work in UBC’s Department of Earth and Ocean Sciences. His research interests are in glacier and ice sheet dynamics, ice-climate interactions, and paleoclimatology.

14. Dr. Linda O. Mearns – National Center for Atmospheric Research – Current IPCC Author

Linda O. Mearns is Director of the Weather and Climate Impacts Assessment Science Program (WCIASP), Head of the Regional Integrated Sciences Collective (RISC) within the Institute for Mathematics Applied to Geosciences (IMAGe), and Senior Scientist at the National Center for Atmospheric Research, Boulder, Colorado. She served as Director of the Institute for the Study of Society and Environment (ISSE) for three years ending in April 2008. She holds a Ph.D. in Geography/Climatology from UCLA. She has performed research and published mainly in the areas of climate change scenario formation, quantifying uncertainties, and climate change impacts on agro-ecosystems. She has worked extensively with regional climate models. She has been an author in the IPCC Climate Change 1995, 2001, and 2007 Assessments regarding climate variability, impacts of climate change on agriculture, regional projections of climate change, climate scenarios, and uncertainty in future projections of climate change. For the Fifth Assessment Report (due out in 2013) she is a lead author of Chapter 21 on Regions in WG2. She leads the multi-agency supported North American Regional Climate Change Assessment Program (NARCCAP), which is providing multiple high-resolution climate change scenarios for the North American impacts community. She has been a member of the National Research Council Climate Research Committee (CRC), the NAS Panel on Adaptation of the America’s Climate Choices Program, and is currently a member of the Human Dimensions of Global Change (HDGC) Committee. She was made a Fellow of the American Meteorological Society in January 2006.

15. Dr. Richard B. Rood University of Michigan

Richard B. Rood is currently a Professor of atmospheric, oceanic and space sciences at the University of Michigan. His current physical climate
research is focused on bridging the study of weather and climate. He is funded by NASA to study dynamical features as objects and to develop new methods for analyzing climate models. He is also funded by the Department of Energy to study sub-scale mixing processes in climate models. In addition, he has funding to study urban heat waves, human heat health warning systems, and how to govern open source / open innovation communities. He is a co-investigator on Michigan’s NOAA-funded Regional Integrated Sciences and Assessments Center.

16. Dr. Larry L. Smarr University of California, San Diego

Larry Smarr is the founding Director of the California Institute for Telecommunications and Information Technology (Calit2), a UC San
Diego/UC Irvine partnership, and holds the Harry E. Gruber professorship in Computer Science and Engineering (CSE) at UCSD’s Jacobs School. At Calit2, Smarr has continued to drive major developments in information infrastructure (including the Internet, Web, scientific visualization, virtual reality, and global telepresence) begun during his previous 15 years as founding Director of the National Center for Supercomputing Applications (NCSA). Smarr served as principal investigator on NSF’s OptIPuter project and currently is principal investigator of the Moore Foundation’s CAMERA project and co-principal investigator on NSF’s GreenLight project.

17. Dr. Wieslaw Maslowski U.S. Naval Postgraduate School

Wieslaw Maslowski is a research professor of oceanography at the Naval Postgraduate School in Monterey, CA. Dr. Maslowski’s research interests include polar oceanography and sea ice; regional ocean, sea-ice and climate modeling and prediction; mesoscale processes in the ocean and sea ice and their interaction with and impact on general ocean circulation, climate change and climate variability; ocean-ice sheet and air-sea-ice interactions and feedbacks. He is currently leading a DOE-supported research program to develop a Regional Arctic System Model (RASM). Dr. Maslowski earned his Ph.D. from the University of Alaska in 1994.

This is an impressive list of scientists. But they clearly have a vested interest in continuing to focus on the top-down global climate model approach to assessing risks that society and the environment may face in the coming decades. They have failed to consider the issue from the perspective of the stakeholders.

Comments Off on An NRC Study “A National Strategy for Advancing Climate Modeling” – A Missed Opportunity

Filed under Uncategorized

Missing Ocean Heat Study Reported On By Climate Wire – Response From Josh Willis

source of image – NOAA’s PMEL [note the data have not yet been updated for the last couple of years]

There is a Climate Wire report on a new Nature Geoscience paper.

The Climate Wire article reads [highlight added]

Researchers puzzle over measurements of ocean-stored heat (Monday, January 23, 2012)

Lauren Morello, E&E reporter

Earth’s “missing heat” might not be missing after all.

That’s the conclusion of a new study that examines how accurately satellites and floating ocean instruments track the flow of energy from the sun to Earth and back again.

Those measurements are at the heart of a puzzle climate scientists have been trying hard to crack: why, as greenhouse gas emissions rose and satellite data showed an increasing amount of energy trapped in the planet’s atmosphere, the amount of heat absorbed by the world’s oceans — a major heat sink — wasn’t rising as quickly.

One answer to the puzzle came from climate scientists Kevin Trenberth and John Fasullo of the National Center for Atmospheric Research, who coined the term “missing heat” — and later suggested it may be stored in the deep ocean, where there are few measurements to track the energy’s path.

But new research, published yesterday in the journal Nature Geoscience, argues that what Trenberth and Fasullo dubbed “missing heat” isn’t missing, after all — that the amount of radiation trapped in Earth’s atmosphere, as measured by satellite sensors, is consistent with measurements of heat absorbed by the ocean.

Any discrepancy falls within the margin of error on those measurements, say the study’s authors, led by NASA climate scientist Norman Loeb.

Part of the problem, Loeb said, is that the margin of error on the ocean measurements is large, a legacy of the early 2000s switch from an instrument originally developed in the 1960s — the expendable bathythermograph, or XBT — to the more accurate Argo float.

Today, roughly 3,200 Argos are traveling the world’s oceans, collecting data as they repeatedly sink to prescribed depths, pop back up again and transmit the information they’ve collected to waiting satellites.

Diving into uncertainty

“Given that there’s a lot of uncertainty in the ocean measurements, given that there was this transition from XBT to Argo right around the time that satellite data and ocean data deviated, it raises a lot of questions in my mind about whether you can say there is missing energy,” Loeb said.

His analysis examines the amount of solar radiation entering and leaving the atmosphere and estimates the heat content of the upper ocean using three different data sets.

Loeb’s conclusion? That, if you consider the margin of error on the satellite and ocean measurements, the two data sources are in agreement — and there may not be any “missing energy.”

“It’s not to say that it’s not happening,” Loeb said. “It’s just that you can’t easily make that conclusion from the data.”
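Loeb's conclusion is, at bottom, a statistical consistency test: call two estimates "in agreement" when their difference is smaller than their combined measurement uncertainty. A minimal sketch of that kind of test (the function name and all numbers are illustrative, not taken from the paper):

```python
import math

def consistent_within_error(x, sigma_x, y, sigma_y, k=2.0):
    """True if two estimates agree within k combined standard errors.

    Assumes independent, roughly Gaussian errors, so the two
    uncertainties add in quadrature.
    """
    return abs(x - y) <= k * math.sqrt(sigma_x ** 2 + sigma_y ** 2)

# Illustrative (made-up) numbers in W/m^2: a satellite-derived
# top-of-atmosphere imbalance vs. an ocean-heat-uptake estimate.
print(consistent_within_error(0.80, 0.40, 0.50, 0.35))  # True: within errors
print(consistent_within_error(1.00, 0.10, 0.00, 0.10))  # False: a real gap
```

The first case mirrors Loeb's point: a 0.3 W/m^2 discrepancy disappears inside error bars of several tenths of a W/m^2, so one cannot conclude any energy is "missing".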

Not so fast, says Trenberth. “One of the key points of our paper was, when you try to do this inventory and things didn’t add up, if you take things at face value, that is an indicator by itself that the error bars are very large,” Trenberth said. “We were very aware of that — but they shouldn’t be that large.”

Trenberth said he also believes Loeb overestimated the error bars for the satellite data, which show the potential margin of error for those measurements.

But both scientists agree that the ongoing debate over the accounting of Earth’s energy budget demonstrates the need to improve monitoring of the Earth’s climate and to better understand sources of error in older measurements, like the ocean data collected for decades by XBTs.

“There are at least 10 estimates of upper ocean heat content,” Trenberth said. “They are all over the place, in spite of the fact that we have the best ocean observing system, with Argo floats, that we’ve ever had.”

My Request To Josh Willis of JPL for a response [Josh, as most of you already know, is an internationally well-respected expert on ocean heat content analyses]

Hi Josh

Would you be willing to comment on this for my weblog?


Josh’s Response

Hi Roger,

You bet.  You can post these comments on your blog.  However, since I comment on Kevin’s quote, perhaps you could be sure to include the paragraph below in its entirety.

I think that the Loeb et al. paper is an important step forward in our understanding of the Earth’s energy balance and our ability to observe it. As I have said for some time, I think a fair accounting of the uncertainties in the observations would cast serious doubt on the “missing heat” hypothesis, and I think the Loeb et al. paper confirms that.  I also disagree with Trenberth’s comment that the estimates of ocean warming are all over the place.  All the estimates that I am aware of agree quite well over the period from 2005 to the present, which is dominated by the Argo data.  It is true, however, that there are still large uncertainties for the period before 2005 due to unresolved biases in the XBT data.  But even with these biases, it is still possible to see the human-caused signal over a long enough period of time, like 15 to 20 years.

Hope this helps.

Cheers, Josh

My Comment: 

As I have urged in my papers

Pielke Sr., R.A., 2003: Heat storage within the Earth system. Bull. Amer. Meteor. Soc., 84, 331-335.

Pielke Sr., R.A., 2008: A broader view of the role of humans in the climate system. Physics Today, 61, No. 11, 54-55.

the assessment of ocean heat content changes is the robust approach to diagnosing climate system heat changes (global warming and cooling). The ocean itself performs the time and space integration needed to diagnose the accumulation or loss of heat in the climate system over time. Diagnosing this heating from radiative fluxes viewed from space is much more difficult. We should have the most confidence in the upper ocean data, particularly since 2005, as Josh reports.
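The integration argument can be made concrete with a back-of-envelope calculation: layer-by-layer temperature anomalies, weighted by seawater density and specific heat, sum to a heat-content change per unit area, and dividing by the elapsed time gives the equivalent average heating rate in W/m^2. All values below (constants, profile, time span) are typical illustrative numbers, not observational data:

```python
RHO_SEAWATER = 1025.0  # kg/m^3, typical near-surface seawater density
CP_SEAWATER = 3990.0   # J/(kg K), typical seawater specific heat

def ohc_per_area(temp_anomalies_k, layer_thicknesses_m):
    """Heat-content anomaly per unit area (J/m^2) from layered anomalies."""
    return sum(RHO_SEAWATER * CP_SEAWATER * dt * dz
               for dt, dz in zip(temp_anomalies_k, layer_thicknesses_m))

def equivalent_flux_w_m2(ohc_change_j_m2, seconds):
    """Average heating rate (W/m^2) implied by an OHC change over a period."""
    return ohc_change_j_m2 / seconds

# Hypothetical decade of upper-ocean (0-700 m) warming, largest near surface.
anomalies_k = [0.10, 0.05, 0.02]
thicknesses_m = [100.0, 300.0, 300.0]

ohc = ohc_per_area(anomalies_k, thicknesses_m)
decade_s = 10 * 365.25 * 24 * 3600
print(f"{equivalent_flux_w_m2(ohc, decade_s):.3f} W/m^2")  # roughly 0.4
```

A few hundredths of a degree spread over hundreds of meters of water thus corresponds to a sustained heating of order tenths of a W/m^2, which is exactly the magnitude at issue in the satellite-versus-ocean comparison above.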

The papers [with more on the way by these internationally well-respected climate scientists]

Knox, R.S., and D.H. Douglass, 2010: Recent energy balance of Earth. International Journal of Geosciences, vol. 1, no. 3 (November), in press, doi:10.4236/ijg2010.00000.

Douglass, D.H., 2011: The Pacific sea surface temperature. Physics Letters A, doi:10.1016/j.physleta.2011.10.042.

provide quantitative examples of the value of using this ocean data in order to improve our understanding of the climate system.
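Josh's remark above, that the human-caused signal becomes visible only over a 15-20 year record, can be illustrated with a toy calculation: fit a least-squares trend to synthetic series that share the same underlying trend and year-to-year noise, and compare the typical estimation error for short versus long records. The trend, noise level, and record lengths below are invented for illustration:

```python
import random
import statistics

def fitted_trend(series):
    """Ordinary least-squares slope of a series against its time index."""
    n = len(series)
    t_mean = (n - 1) / 2
    y_mean = statistics.fmean(series)
    num = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series))
    den = sum((t - t_mean) ** 2 for t in range(n))
    return num / den

def mean_trend_error(true_trend, noise_sd, years, trials, rng):
    """Average absolute error of the fitted trend over many noisy samples."""
    total = 0.0
    for _ in range(trials):
        series = [true_trend * t + rng.gauss(0.0, noise_sd)
                  for t in range(years)]
        total += abs(fitted_trend(series) - true_trend)
    return total / trials

rng = random.Random(42)
for years in (5, 30):
    err = mean_trend_error(true_trend=0.5, noise_sd=2.0, years=years,
                           trials=200, rng=rng)
    print(years, round(err, 2))
```

With these made-up numbers the 5-year records give trend estimates scattered by roughly the size of the trend itself, while the 30-year records pin it down to a few percent, which is the qualitative point about record length.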


BBC Interview With Joe Golden On Project Storm Fury

The BBC has a very insightful interview with Joe Golden on hurricane modification (Project Storm Fury).  Information about Joe’s biography can be viewed here.  The interview with Joe highlights aspects of this project (including that the Russians were also seeding hurricanes).  The abstract of the interview is

Fifty years ago the USA launched an ambitious attempt to control the weather. Its aim was to change the course of hurricanes away from populated areas. Hear from Joe Golden, who worked on Storm Fury.

The interview is at

Fifty years ago the USA launched an ambitious attempt to control the weather.

This project is directly relevant to the current interest in geoengineering.  Project Storm Fury is discussed in our books

Pielke, R.A., Jr. and R.A. Pielke, Sr., 1997: Hurricanes: Their nature and impacts on society. John Wiley and Sons, England, 279 pp.

Pielke, R.A., 1990: The hurricane. Routledge (Routledge Revivals reissue, 2011), London, England, 226 pp.


Happy New Year!

I am taking this week off from posting and will start again on January 2, 2012. Meanwhile, enjoy this holiday week!
