Comments On Numerical Modeling As The New Climate Science Paradigm

UPDATE MAY 3 2010: I e-mailed each author of the Navarra et al 2010 paper and invited them to respond to my post; as of today’s date, none has replied, even to acknowledge receipt of my e-mail.

Dick Lindzen has presented a summary of how climate science has changed over the last decade or so. In his article he writes [h/t to David L. for posting on Climate Audit]

“In brief, we have the new paradigm where simulation and programs have replaced theory and observation, where government largely determines the nature of scientific activity, and where the primary role of professional societies is the lobbying of the government for special advantage.”

There is an article in the March 2010 issue of the Bulletin of the American Meteorological Society that exemplifies the first of the issues raised by Dick Lindzen. The article is

Navarra, A., J. L. Kinter III, and J. Tribbia, 2010: Crucial experiments in climate science. Bull. Amer. Meteor. Soc., 91, 343–352.

I have provided excerpts from this article below, with comments after each indicating points of agreement and disagreement.

There is a delicate web of interactions among the different components of the climate system. The interplay among the time scales is quite intricate, as the fast atmosphere interacts with the slow upper ocean and the even slower sea ice and deep-soil and groundwater processes. Spatial scales are tightly connected too, as small-scale cloud systems, for instance, affect the large-scale energy balance. Furthermore, everything is connected by water in its various forms. Water flows easily from place to place and exchanges energy with the environment every time it changes phase. Evaporation, condensation, freezing, and melting processes must be taken into account and evaluated as accurately as possible. The past 40 years of climate simulation have made it apparent that no shortcut is possible; every process can and ultimately does affect climate and its variability and change. It is not possible to ignore some components or some aspects without paying the price of a gross loss of realism.

This summary is a much-needed, belated recognition of the accuracy of the 2005 NRC report [uncited in the Navarra et al 2010 BAMS article]

National Research Council, 2005: Radiative forcing of climate change: Expanding the concept and addressing uncertainties. Committee on Radiative Forcing Effects on Climate Change, Climate Research Committee, Board on Atmospheric Sciences and Climate, Division on Earth and Life Studies, The National Academies Press, Washington, D.C., 208 pp.

Figure 1-1 in the NRC report schematically illustrates what is written in the Navarra et al paper.

The Navarra et al 2010 article then has the text

A strict application of the scientific method requires a process of isolation of constituent subsystems and experimental verification of a hypothesis. For the climate system, this is only possible by using numerical models. Such models have become the central pillar of the quantitative scientific approach to climate science [emphasis added] because they allow us to perform “crucial” experiments under the controlled conditions that science demands. Sometimes crucial experiments are recognized as such at the design phase, like the quest for the Higgs boson currently going on at the European Organization for Nuclear Research [Conseil Européen pour la Recherche Nucléaire (CERN)]. Other times it is only in historical perspective that some experiments are recognized as truly “crucial.” This was the case of the 1887 test by Michelson and Morley that rejected the hypothesis of the existence of the “luminiferous aether” (Tipler and Llewellyn 2003), an undetected medium through which light was deemed to propagate (see Fig. 1 on title page; http://quantumrelativity.calsci.com/Relativity/images/Michelson_Morley.jpg). Their result led to a reformulation of a physical theory of electromagnetic radiation and to special relativity and the invariance of the speed of light. “Crucial” experiments test competitive theories and the most successful one is finally selected.

This text seeks to equate climate modeling with the development of fundamental concepts in basic physics. However, these are not the same. Whereas fundamental physical constants such as the speed of light were the focus of the Michelson and Morley study, climate models rely on tunable parameters and functions in their parameterizations of clouds, precipitation, vegetation dynamics, etc. Climate models are engineering code, not basic physics. Only advection, the pressure gradient force, and gravity provide the fundamental physics in a climate model. The combination of a fundamental component of the model with an engineering component (in which the physics is tuned) results in engineering code, not basic physics.
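To make this distinction concrete, here is a minimal sketch (in Python; entirely schematic and hypothetical, not drawn from any actual climate model) of a single model time step that combines a dynamical core resting on fundamental physics with a parameterization whose coefficient must be tuned:

```python
import numpy as np

# Schematic, hypothetical illustration only -- not code from any actual model.
# The dynamical core (advection) follows from fundamental physics; the
# parameterization contains a tunable coefficient with no first-principles value.

def advect(q, u, dx, dt):
    """Fundamental physics: 1-D upwind advection, a discretized conservation law."""
    return q - u * dt / dx * (q - np.roll(q, 1))

def cloud_parameterization(q, tunable_coeff=0.3):
    """Engineering component: condensation sketched as a tunable relaxation.
    The coefficient is adjusted ("tuned") to match observed behavior rather
    than derived from first principles."""
    q_sat = 1.0  # hypothetical saturation threshold
    excess = np.maximum(q - q_sat, 0.0)
    return q - tunable_coeff * excess

def model_step(q, u, dx, dt):
    """One hybrid step: fundamental dynamics followed by a tuned parameterization."""
    return cloud_parameterization(advect(q, u, dx, dt))

q0 = 1.0 + 0.5 * np.sin(np.linspace(0, 2 * np.pi, 64))  # toy moisture field
print(model_step(q0, u=1.0, dx=1.0, dt=0.5))
```

Nothing in physics dictates the value of tunable_coeff; it is adjusted until the model output looks reasonable, which is precisely what makes the parameterized component engineering rather than basic physics.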

I summarized the types of climate models in my post

What Are Climate Models? What Do They Do?

There are three basic classes: process studies, diagnosis, and prediction. As I discuss in that post, the IPCC assessment models are actually process studies, although they have been marketed by the IPCC as predictions. With respect to the Navarra et al paper, their proposed modeling framework, in reality, would develop a more comprehensive climate process assessment tool. The models are hypotheses.

Navarra et al 2010 continue with the text

There have been no revolutionary changes in numerical models of climate since their advent over 30 years ago. The models make use of the same dynamical equations, with improved numerical methods, and have comparable resolution and similar parameterizations. Over the past 30 years, computing power has increased by a factor of 106. Of the millionfold increase in computing capability, about a factor of 1,000 was used to increase the sophistication of the model. Model resolution, the inclusion of more physical and biogeochemical processes, and more elaborate parameterizations of unresolved phenomena have all been modestly improved.

This is an accurate summary. An interesting and important oversight, however, is the absence of any discussion of improvements in the predictive skill of the models on different time scales (i.e., seasonal, annual, multi-year, decadal). Of course, this absence reflects the general lack of a demonstration of predictive skill beyond a few months at most, by the IPCC or anyone else.

Navarra et al 2010 write

These trends indicate that the problem of weather and climate modeling can be organized in terms of four dimensions: resolution, complexity, integration length, and ensemble size.

There is an interesting oversight here: there is no mention of observational verification of model skill.
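As a concrete (and entirely illustrative) rendering of that four-dimensional framing, the following sketch encodes an experiment as a configuration along those axes, with a hedged rule-of-thumb cost scaling of my own; note that observational verification of skill appears nowhere among the dimensions:

```python
from dataclasses import dataclass

@dataclass
class ExperimentConfig:
    """The four dimensions named by Navarra et al (illustrative only)."""
    resolution_km: float      # horizontal grid spacing
    complexity: int           # number of represented processes (a crude proxy)
    integration_years: float  # length of the simulation
    ensemble_size: int        # number of ensemble members

    def relative_cost(self, baseline_km: float = 100.0) -> float:
        """Hedged rule of thumb: halving the grid spacing costs roughly 2**3
        (two horizontal dimensions plus a shorter time step), multiplied
        linearly by complexity, integration length, and ensemble size."""
        refinement = baseline_km / self.resolution_km
        return (refinement ** 3) * self.complexity * self.integration_years * self.ensemble_size

base = ExperimentConfig(resolution_km=100, complexity=1, integration_years=100, ensemble_size=1)
big = ExperimentConfig(resolution_km=25, complexity=4, integration_years=100, ensemble_size=10)
print(big.relative_cost() / base.relative_cost())  # ~2560x the baseline cost
```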

Increasingly, century-long climate projection will become an initial-value problem requiring the current observed state of all components of the Earth system: the global atmosphere, the world oceans, cryosphere, and land surface (including physical quantities, such as temperature and soil moisture, as well as biophysical quantities, such as leaf area index, etc.) to produce the best projections of the Earth system and also giving state-of-the-art decadal and interannual predictions. The shorter time scales and weather are known to be important in their feedback on the longer-time-scale behavior. In addition, the regional manifestations of longer-time-scale changes will be felt by society mainly through the changes in the character of the shorter time scales, including extremes.

This is an accurate summary of the challenges in climate prediction. The admission that climate prediction is an initial value problem was ignored in the 2007 IPCC assessment; a short sketch of this initial-value sensitivity follows the citation below. See, for example, my recent post

Comments On A New Paper “A Unified Modeling Approach to Climate System Prediction” By Hurrell Et Al 2009

which refers to my paper

Pielke, R.A., 1998: Climate prediction as an initial value problem. Bull. Amer. Meteor. Soc., 79, 2743-2746. [With respect to my Comments on the Hurrell et al paper, which were sent to BAMS last year, they were only sent out for review in the past month!]
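A minimal sketch of why the initial value matters, using the classic Lorenz (1963) system (with illustrative parameters and a perturbation of my own choosing; this is not code from either paper): two trajectories that begin almost identically diverge, so any prediction inherits a sensitivity to the observed initial state.

```python
import numpy as np

# Hypothetical illustration of the initial-value point using the classic
# Lorenz (1963) system: two nearly identical initial states diverge,
# so the starting state matters to the forecast.

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz (1963) equations."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return state + dt * np.array([dx, dy, dz])

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-6, 0.0, 0.0])  # tiny perturbation to the initial state

for step in range(2000):
    a, b = lorenz_step(a), lorenz_step(b)
    if step % 500 == 499:
        print(f"step {step + 1}: separation = {np.linalg.norm(a - b):.3e}")
```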

Navarra et al 2010 further write

The era of industrial computing. The changes that we have described will usher in a new era of calculation on such a large scale that it will be comparable to the transition from the artisan shop to the modern factory: it will be the era of industrial computing. Issues like quality control, procedure certifications, and data integrity will no longer be the subject of discussions by researchers, but they will be matters of procedural control and monitoring. It will free climate scientists from much of the engineering work that is now necessary in the preparation of the experimental apparatus they are using in their laboratory but that is hardly necessary to the core of climate science.

It will also create some new problems. It is unclear at this point if the field is going to need more software engineers and programmers or fewer as the computing power is concentrated in larger and fewer centers. A new professional figure may emerge who will maintain the laboratory and the experiment as the routine day-by-day simulations, developing along well-planned lines, may stretch for months or years. Questions about how such professionals will be trained arise without obvious answers.

This is a remarkable proposal for a new approach to climate modeling, as it removes the climate modeller from working with real-world data. This exemplifies what Dick Lindzen stated

“we have the new paradigm where simulation and programs have replaced theory and observation….”

The Navarra et al article concludes with the text

The discussions conducted for the simulations needed for the IPCC assessments have already gone in this direction, but they are still examples of a loose coordination, rather than the tight coordination that will be required by the petascale machines. The transition is similar to what happened in astronomy when that community went from coordinating observations at different telescopes to creating a consortium for the construction of one larger instrument. Industrial computing and numerical missions will rely on that capability even more to allow climate science to address problems that have never before been attempted.

The global numerical climate community soon will have to begin a proper discussion forum to develop the organization necessary for the planning of experiments in the industrial computing age.

The proposal put forth in Navarra et al 2010, if adopted, would concentrate climate modeling in a few well-funded institutions, and would focus the use of models on multi-decadal predictions of the real climate system (for which we do not, of course, have observational validation data), rather than on testing scientific hypotheses against real-world observations. Policy decisions will be made from these unvalidated model predictions (as they have already been made based on the global-average and regional-scale results from the IPCC multi-decadal model forecasts).

This is a path that will likely lead to the eventual discrediting of the part of the climate science community that participates in this activity if, as I expect, the multi-decadal regional (and even global-average) forecasts generally fail to show skill in the coming years.

Even more importantly, they are unlikely to be useful for most of the actual needs of resource stakeholders in their plans to reduce their vulnerability to climate and other environmental and social threats; e.g., see Table E.7 in

Pielke, R.A. Sr., and L. Bravo de Guenni, 2004: Conclusions. Chapter E.7 in: Vegetation, Water, Humans and the Climate: A New Perspective on an Interactive System. Global Change – The IGBP Series, P. Kabat et al., Eds., Springer, 537–538.

While I support the use of climate models to examine climate processes, they must be solidly based on observational validation. It also must not be forgotten that climate models (and indeed all models) are hypotheses. Real-world observations must be the standard against which the models are tested.
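A minimal sketch of what such observational validation looks like in practice (the data below are synthetic placeholders; a real test would use archived hindcasts and the observed record): a model forecast is scored against observations and against a trivial climatology baseline.

```python
import numpy as np

# Hypothetical illustration of observational validation: score a model
# hindcast against observations, relative to a climatology baseline.
# All data below are synthetic stand-ins, not real model output.

rng = np.random.default_rng(0)
n_years = 30
obs = rng.normal(0.0, 1.0, n_years)                 # observed anomalies (placeholder)
model = 0.5 * obs + rng.normal(0.0, 1.0, n_years)   # hindcast with partial skill
climatology = np.zeros(n_years)                     # trivial baseline: zero anomaly

def rmse(forecast, observed):
    """Root-mean-square error of a forecast against the observed record."""
    return np.sqrt(np.mean((forecast - observed) ** 2))

# Anomaly correlation: does the forecast vary with the observations?
acc = np.corrcoef(model, obs)[0, 1]

# Skill score > 0 means the model beats the climatology baseline.
skill = 1.0 - rmse(model, obs) / rmse(climatology, obs)

print(f"anomaly correlation: {acc:.2f}, skill vs climatology: {skill:.2f}")
```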

The Navarra et al conclusion that

“Such models have become the central pillar of the quantitative scientific approach to climate science because they allow us to perform “crucial” experiments under the controlled conditions that science demands”

is not how climate science should proceed. The “central pillar” must be real-world observations.

The American Meteorological Society, as represented by the Editor-in-Chief of the Bulletin of the AMS, Jeff Rosenfeld, agrees with the view of models as the central pillar of the quantitative scientific approach to climate science. He writes in his “Letter From The Editor” [which, unfortunately, is not online at the BAMS website]

“If climate science develops the way Navarra et al suggest will this be proof that the age of numerical experimentation has matured? Perhaps so. A science shaped by Franklin and Lorenz’s critical experiments is now a critical experiment itself – a test of the viability of science when it is dependent on numerical modeling for methodology. For better or worse, the result of this grand experiment – the very state of climatology – will forever be ingrained in popular consciousness.”

Dick Lindzen’s perceptive statement that “simulation and programs have replaced theory and observation” accurately (and unfortunately) represents the current position of the leadership of the American Meteorological Society.
