Guest Post By Dan Hughes
Your recent post presented an excellent example of validation of mathematical models of physical phenomena and processes. Rigorous validation necessitates setting objective technical evaluation requirements, and associated success criteria, for all important system-response functions.
The information in the post can be directly put into this form by setting the objective technical requirement to be:
In terms of testing the models, necessary (but not sufficient) conditions for the models to have any credibility in predicting the future climate on decadal time scales are:
1. They must accurately simulate (hindcast) the statistics of major atmospheric and ocean circulation features over the last few decades, the period for which real-world observational data are available.
2. They must accurately simulate (hindcast) the statistics of the changes in the statistics of these major atmospheric and ocean circulation features over the last few decades.
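The two conditions above can be sketched as a concrete check. Assuming, purely for illustration, a single scalar circulation statistic sampled annually and a 10% relative tolerance (the statistic, the sampling, and the tolerance are all assumptions, not part of the post):

```python
import numpy as np

def hindcast_passes(observed, simulated, rel_tol=0.1):
    """Illustrative sketch of the two necessary conditions.

    Condition 1: the simulated statistic of the feature (here, the
    decadal mean) must agree with the observed statistic within rel_tol.
    Condition 2: the simulated statistic of the *changes* (here, the
    linear trend) must likewise agree.  All choices (mean, trend,
    10% tolerance) are hypothetical stand-ins for the real evaluation
    requirements, which must be set independently.
    """
    obs = np.asarray(observed, dtype=float)
    sim = np.asarray(simulated, dtype=float)

    # Condition 1: statistics of the feature itself.
    mean_ok = abs(sim.mean() - obs.mean()) <= rel_tol * abs(obs.mean())

    # Condition 2: statistics of the changes (linear trend over time).
    t = np.arange(obs.size)
    obs_trend = np.polyfit(t, obs, 1)[0]
    sim_trend = np.polyfit(t, sim, 1)[0]
    trend_ok = abs(sim_trend - obs_trend) <= rel_tol * max(abs(obs_trend), 1e-12)

    return bool(mean_ok and trend_ok)
```

The point of the sketch is only that, once the requirement and tolerance are stated explicitly, the success metric reduces to a yes/no answer rather than a qualitative impression.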
The associated success metric can be stated as follows:
The evaluation requirement will be satisfied by a positive answer to the following question:
Are the calculated responses in agreement with the evaluation requirements?
Setting the evaluation requirements and determining the success metric should be carried out by personnel or organizations independent of those responsible for development of the models, methods, and software: well-qualified users and, where necessary, other personnel qualified in statistical methodologies, for example.
You have provided an excellent summary. I have often wondered why evaluation requirements and success metrics are generally not stated when extreme events are attributed to climate change. The proper characterization of the deltas should be the sole focus.