Category Archives: Climate Science Op-Eds

My Comment On “A Closer Look At Why The Climate Change Debate Is So Polarized” By Keith L. Seitter

There is an interesting write-up in the September 2012 issue of BAMS titled

A Closer Look at Why the Climate Change Debate Is So Polarized [unfortunately, there is no url for it except for AMS members, but I have reproduced it below].

This article by Keith L. Seitter, Executive Director of the AMS, is an important reaching out to the climate science community. He states that [highlight added]

“It is not uncommon for those who are convinced that human activities are significantly influencing the climate to suggest that anyone who is unconvinced simply does not understand the science or is incapable of following the logical sequence provided by the evidence. Yet there are a number of distinguished scientists who are quite outspoken in their dismissal of anthropogenic influences being among the major causes for the Earth’s recent warming and/or projections of future warming.”

This reaching out to those who do not accept statements such as the one recently promulgated by the AMS, i.e.

Climate Change

is a refreshing recognition of the actual diversity of viewpoints in this professional society.

While I accept that human activity has played a significant role in altering the Earth’s climate system (including its heating), I welcome the recognition that those who do not agree with some or all of this statement are still respected. We need more such reaching out by all viewpoints in the climate issue.

In terms of the context of the “cultural issues” that Keith discusses, I recommend they also be considered in the context of Graves Value Theory. This concept categorizes individuals according to what they find important; e.g. see

Graves’ value theory

In this theory, as discussed in the above link

Graves theorized that there are eight value systems which evolved over the course of the past 100,000 years of human history. This evolutionary process has affected us biologically, psychologically and culturally.

Graves formulated the following starting points for his value system:

  • Each fundamental value system is the result, on the one hand, of someone’s circumstances and the problems that come with it (life conditions), and on the other hand of the way he deals with it based on his neurological ‘wiring’ (mind conditions).
  • Every adult contains all value systems within himself.
  • A person’s value system changes depending on the circumstances he finds himself in.
  • The development of value systems is like a pendulum, moving back and forth between value systems focused on the individual and those focused on the collective.
  • The more complex people’s circumstances, the more complex the value systems which are required.
  • Value systems depend on the context. In different contexts (family, work, etc.) people may experience their immediate environment in a different way. This means that different value systems may predominate in these different contexts.

I am certainly not an expert on this topic, but recommend those who are to pursue this line of research in the context, as Keith presents it, of

“cultural cognition and the role it plays in polarizing our community – and our nation – on the subject of climate change.”

Thanks to Keith Seitter for seeking to broaden the climate discussion!

Comments Off

Filed under Climate Science Op-Eds

My Comment On The Nature Article “Extreme Weather – Better Models Are Needed Before Exceptional Events Can Be Reliably Linked To Global Warming”

As a comment at Nature on the Editorial on 19 September 2012

Extreme weather – Better models are needed before exceptional events can be reliably linked to global warming. Nature, 489, 335–336 (20 September 2012), doi:10.1038/489335b

I posted the following comment -

Roger Pielke Sr said:

I recommend readers read our articles

Pielke Sr., R.A., and R.L. Wilby, 2012: Regional climate downscaling ‘what’s the point?’ Eos Forum, 93, No. 5, 52-53, doi:10.1029/2012EO050008. http://pielkeclimatesci.files.wordpress.com/2012/02/r-361.pdf

and

Pielke Sr., R.A., R. Wilby, D. Niyogi, F. Hossain, K. Dairuku, J. Adegoke, G. Kallos, T. Seastedt, and K. Suding, 2012: Dealing with complexity and extreme events using a bottom-up, resource-based vulnerability perspective. AGU Monograph on Complexity and Extreme Events in Geosciences. http://pielkeclimatesci.files.wordpress.com/2011/05/r-365.pdf

where we overview the challenges of skillful multi-decadal regional and global climate predictions. Not only must current multi-decadal climate statistics be accurately simulated, but CHANGES in these statistics must be skillfully predicted.

As documented in these weblog posts

More CMIP5 Regional Model Shortcomings http://pielkeclimatesci.wordpress.com/2012/09/11/more-cmip5-regional-model-shortcomings/

CMIP5 Climate Model Runs - A Scientifically Flawed Approach. http://pielkeclimatesci.wordpress.com/2012/07/20/cmip5-climate-model-runs-a-scientifically-flawed-approach/

The Hindcast Skill Of The CMIP Ensembles For The Surface Air Temperature Trend –  By Sakaguchi Et Al 2012. http://pielkeclimatesci.wordpress.com/2012/09/19/the-hindcast-skill-of-the-cmip-ensembles-for-the-surface-air-temperature-trend-by-sakaguchi-et-al-2012/

the global models have not yet passed the needed tests.

Better models are not what is needed, in my view, but a new approach to assess risk from climate and other environmental (and social) threats, as we present in our second paper listed above.

Comments Off

Filed under Climate Science Op-Eds

Wall Street Journal Poll On Wind Energy

John Droz  Jr. alerted us to this Wall Street Journal poll. It is titled

Vote: Should Solar and Wind Power be Subsidized?

and reads in part

Federal subsidies have spurred the growth of renewable-energy production in recent years, but many of those subsidies are set to expire soon unless Congress acts.

Supporters say the subsidies will allow renewable technologies to grow enough to become cost-competitive with conventional energy sources—and that their benefits include reduced pollution and decreased dependence on foreign oil.

Critics want to scale back or eliminate the subsidies, arguing that renewable sources have had decades to get established but still aren’t cost-competitive with conventional energy.

Tell us what you think in advance of a special report we’ll be publishing in The Wall Street Journal. We may use some of your comments in print.

The poll is listed within the article.

My view is that the federal research program should support research and proof-of-concept development of wind energy (from a feasibility and cost-benefit perspective), but that subsidizing any energy source (e.g. through tax credits) should be, at most, a short-term approach, except for the poor members of our society.

With respect to wind energy, there is a need to assess whether it is, for example, a reliable source of energy, and whether its footprint on the landscape, including its effect on the local ecology (e.g. birds) and the removal of this land from other uses, makes it a viable contribution to the sources of energy for society.

source of image

Comments Off

Filed under Climate Science Op-Eds

A Comment On The Nature Article – “Time To Raft Up – Climate Scientists Should Learn From The Naysayers And Pull Together To Get Their Message Across, Says Chris Rapley”

There is an interesting Nature article which was published this week

Rapley, C., 2012: Climate science: Time to raft up. Nature, 488, 583-585, doi:10.1038/488583a

with the subtitle

Climate scientists should learn from the naysayers and pull together to get their message across, says Chris Rapley.

In this Nature Comment, Chris Rapley makes a serious fundamental error, in my view.  He writes that

“….the voices of dismissal are trumping the messages of science.”

However, in my view, that is not the correct way to frame the problem. One reason that there has been little progress in effective climate policy is that the message on climate science presented to the public and policymakers is too narrow. Climate issues, as influenced by human activities, are much more than a global average surface temperature anomaly threshold.

The issue of whether limits should be sought on atmospheric concentrations of CO2 is not all there is to the role of humans in the climate system, nor should assessments of climate mitigation and adaptation risks ignore the variations and longer-term trends in the natural climate system. As Dan Sarewitz and my son said with respect to the threat from adding too much CO2 into the atmosphere, “We know enough!”

My son’s book

The Climate Fix by Roger A. Pielke Jr.

makes this case convincingly.

However, the added CO2, as important as it is, is but a part of what humans are doing to the climate.

The need for this broader viewpoint has been emphasized in international and national assessments;

Kabat, P., Claussen, M., Dirmeyer, P.A., J.H.C. Gash, L. Bravo de Guenni, M. Meybeck, R.A. Pielke Sr., C.J. Vorosmarty, R.W.A. Hutjes, and S. Lutkemeier, Editors, 2004: Vegetation, water, humans and the climate: A new perspective on an interactive system. Springer, Berlin, Global Change – The IGBP Series, 566 pp.

National Research Council, 2005: Radiative forcing of climate change: Expanding the concept and addressing uncertainties. Committee on Radiative Forcing Effects on Climate Change, Climate Research Committee, Board on Atmospheric Sciences and Climate, Division on Earth and Life Studies, The National Academies Press, Washington, D.C., 208 pp.

but the leadership of the climate science community who are communicating with policymakers and the public are ignoring these assessments.

This was the reason we wrote our paper [of which all of the authors are AGU Fellows]

Pielke Sr., R., K.  Beven, G. Brasseur, J. Calvert, M. Chahine, R. Dickerson, D.  Entekhabi, E. Foufoula-Georgiou, H. Gupta, V. Gupta, W. Krajewski, E.  Philip Krider, W. K.M. Lau, J. McDonnell,  W. Rossow,  J. Schaake, J.  Smith, S. Sorooshian,  and E. Wood, 2009: Climate change: The need to consider human forcings besides greenhouse gases.   Eos, Vol. 90, No. 45, 10 November 2009, 413. Copyright (2009) American   Geophysical Union.

As just two examples, first in my post from yesterday

Follow Up On The AMS Statement On “Climate Change”

after a set of e-mail exchanges, Danny Rosenfeld agreed that

1. On a regional scale, the aerosols can be the dominant anthropogenic climate forcing.

2. On a global scale, the aerosols might be on par with the GHGs. We just don’t know.

In our paper,

Pielke Sr., R.A., A. Pitman, D. Niyogi, R. Mahmood, C. McAlpine, F. Hossain, K. Goldewijk, U. Nair, R. Betts, S. Fall, M. Reichstein, P. Kabat, and N. de Noblet-Ducoudré, 2011: Land  use/land cover changes and climate: Modeling analysis  and  observational evidence. WIREs Clim Change 2011, 2:828–850. doi: 10.1002/wcc.144

the abstract reads

This article summarizes the changes in landscape structure because of human land management over the last several centuries, and using observed and modeled data, documents how these changes have altered biogeophysical and biogeochemical surface fluxes on the local, mesoscale, and regional scales. Remaining research issues are presented including whether these landscape changes alter large-scale atmospheric circulation patterns far from where the land use and land cover changes occur. We conclude that existing climate assessments have not yet adequately factored in this climate forcing. For those regions that have undergone intensive human landscape change, or would undergo intensive change in the future, we conclude that the failure to factor in this forcing risks a misalignment of investment in climate mitigation and adaptation.

The public is more perceptive of reality than is often realized. It is the neglect of proper consideration of the actual diversity of climate science issues that, I feel, explains at least part of what Chris reported in the Nature article as

A recent UK survey found that about one-third of the public agrees with the statement “We can trust climate scientists to tell us the truth about climate change” and that about one-third disagrees.

As a way forward, we have proposed a different approach that is summarized in our paper

Pielke Sr., R.A., R. Wilby, D. Niyogi, F. Hossain, K. Dairuku, J. Adegoke, G. Kallos, T. Seastedt, and K. Suding, 2012: Dealing  with complexity and extreme events using a bottom-up, resource-based  vulnerability perspective. AGU Monograph on Complexity and  Extreme Events in Geosciences, in press.

where our abstract reads

We discuss the adoption of a bottom-up, resource–based vulnerability approach in evaluating the effect of climate and other environmental and societal threats to societally critical resources. This vulnerability concept requires the determination of the major threats to local and regional water, food, energy, human health, and ecosystem function resources from extreme events including climate, but also from other social and environmental issues. After these threats are identified for each resource, then the relative risks can be compared with other risks in order to adopt optimal preferred mitigation/adaptation strategies.

This is a more inclusive way of assessing risks, including from climate variability and climate change, than using the outcome vulnerability approach adopted by the IPCC. A contextual vulnerability assessment, using the bottom-up, resource-based framework, is a more inclusive approach for policymakers to adopt effective mitigation and adaptation methodologies to deal with the complexity of the spectrum of social and environmental extreme events that will occur in the coming decades, as the range of threats is assessed beyond just the focus on CO2 and a few other greenhouse gases as emphasized in the IPCC assessments.

The adoption of the bottom-up, contextual vulnerability approach fits better with the concept of the “honest broker” as discussed in my son’s book

The Honest Broker: Making Sense of Science in Policy and Politics by Roger A. Pielke Jr.

than does the current climate science leadership’s excessively narrow top-down, outcome vulnerability approach. In my view, the “naysayers” include this leadership, as exemplified by the new AMS Statement on Climate Change. I recommend rewriting Chris Rapley’s statement as

Climate scientists should include the diversity of perspectives on the role of humans and natural processes and pull together to get this broader message across.

Then, perhaps, by using the bottom-up, contextual vulnerability approach, we can finally make progress not only on the CO2 issue, but also on all of the other aspects of risks from the climate system.

source of image

Comments Off

Filed under Climate Science Op-Eds, Climate Science Reporting

Toby Carlson Op-Ed “The Everlasting Argument Over Climate Change”

I invited a colleague of Barry Lynn who was listed on our e-mail interaction which culminated in the guest post

Guest Post By Barry Lynn Of The Hebrew University of Jerusalem

to also write a guest post. I have known Toby Carlson for decades, and while we disagree on key issues that he discusses below, I respect Toby and want to give him this forum to present his views. His short biographical summary is below.

Dr. T. N. Carlson, Ph.D., Imperial College, University of London, is an emeritus Professor of Meteorology at Penn State University. Professor Carlson’s scientific contributions, over 90 papers published in refereed journals, reflect a wide range of interests: synoptic and dynamic meteorology, radiative transfer, severe local storms, plant-atmosphere interactions, aerosol transport and chemistry, remote sensing of land surface properties and surface energy processes, and, most recently, applications of remote sensing to the study of urban sprawl and small watershed runoff. In 1991 Professor Carlson published a widely used book on meteorology (Mid Latitude Weather Systems). He created two new web products related to his current interest in land surface processes: an online land surface process model (“Simsphere”) and a database of impervious surface area and fractional vegetation cover determined from Landsat 5 digital imagery at 25 m resolution for all of Pennsylvania, 1985 and 2000. In addition he has helped create a web-based tool which allows one to assess the health (nutrient load) and surface runoff potential of a user-defined stream basin in Pennsylvania or in the Chesapeake Bay Basin.

Below is Toby’s Op-Ed, followed by a set of e-mail interchanges between Toby and me which are designed to expand on the Op-Ed [the e-mails were edited to focus on the Op-Ed issues].

The everlasting argument over climate change

Until about a decade or so ago, I was a global warming skeptic. Back in the 1990s all sorts of claims were being made about climate change based on climate model simulations. At that time, the evidence was not clear and some of the research was underwhelming. I resolved to remain skeptical unless and until it could be demonstrated that these models were capable of simulating the indisputable increase in global temperature that seems to have occurred during the previous century, by initializing the models with atmospheric conditions one hundred years earlier. Only when these models showed that they are capable of predicting changes over a century up to the present would I begin to take them seriously.

This set of conditions has finally been satisfied. Climate model simulations made by scientists have finally produced some convincing evidence of the effects of human activity on global climate change. Unlike previous types of computer simulations, the latest ones adopt the novel approach of predicting the present global temperature starting in the past, for example with the year 1890. I will now describe just one of several such climate simulations, albeit one of the first of its kind made over a decade ago.

Two series of four computer simulations were made under the auspices of the National Center for Atmospheric Research (NCAR) and the Department of Energy using several different climate models, called General Circulation Models (GCMs). One series of computer runs included only the effect of volcanic eruptions and solar variations on the earth’s radiation budget. Aerosol particles such as sulfates and the notorious greenhouse gas, carbon dioxide, were kept constant at their 1890 levels. Another series of simulations allowed the sulfates and greenhouse gases to vary according to their observed values. For convenience, I refer to the temperature trend simulated by the first set of runs as the ‘natural’ variation and the second set as the ‘total’ variation, as the latter contains both natural and anthropogenic effects. The difference between the two sets of simulations constitutes a measure of the human-induced effects on global climate. Unlike the unverifiable and more contentious predictions of future climate, these simulations are verifiable in that they can be compared with measured mean global temperature changes over the same period. In that sense the simulations can validate or invalidate themselves.

Results of the computer runs were summarized in a letter from Richard Anthes, president of the University Corporation for Atmospheric Research, to Senator John McCain of Arizona. In that letter, Dr. Anthes emphasized the role of human activity in global warming and urged the senator to treat global warming as a serious issue.

I redrew the graph included by Anthes in the letter to Senator McCain, smoothing out the wiggles to show only the essential details of the two series of simulations. Zero on the temperature scale is an arbitrary reference corresponding to the average temperature between 1890 and 1919. I don’t show the observations because they fall almost exactly on the smoothed temperature line for the total simulations, which therefore assume a high degree of credibility.

An interesting aspect of this graph is that the warming trend from the 19th century until some time after 1960 can be accounted for by natural variability. Yet, I am impressed that one can reasonably ascribe about 1°F in the temperature rise during the past thirty years or so to human activity; (the last point on each graph is extrapolated). According to Jerry Meehl, a scientist involved in these simulations at NCAR, carbon dioxide emissions have accelerated since 1960, raising global carbon dioxide concentrations by the year 2000 from about 315 parts per million to 360 parts per million during that time interval. (This is to be compared with about 275 parts per million in 1850.) As of the year 2012, the carbon dioxide concentrations have exceeded 390 parts per million.

For me (a once avowed skeptic of the global climate brouhaha) the graph is the first convincing evidence I have seen that global warming due to fossil fuel burning is significantly raising global temperature. Since then, the evidence for a human cause of global warming has become even more convincing: yet more such simulations, the accumulation of much more observational evidence (including temperatures showing an even steeper slope to the warming curve after the year 2000 than that shown in the figure), the 2007 IPCC report, sea level rise, Arctic ice disappearance, etc. Moreover, further simulations reproducing the results originally made at NCAR have subsequently been made. In my opinion, further denials of the global warming evidence are likely to be based more on political than on scientific motives.

My Comment

Hi Toby

I will be asking questions on your views also, and you might like to add to your post based on this, which we can add. The first question is

“How do you reconcile your confidence in the model skill when the hindcast multi-decadal regional climate predictions are so poor, as I reported in my post

http://pielkeclimatesci.wordpress.com/2012/07/20/cmip5-climate-model-runs-a-scientifically-flawed-approach/

The second question is

“Which of the three hypotheses in

Pielke Sr., R., K. Beven, G. Brasseur, J. Calvert, M. Chahine, R. Dickerson, D. Entekhabi, E. Foufoula-Georgiou, H. Gupta, V. Gupta, W. Krajewski, E. Philip Krider, W. K.M. Lau, J. McDonnell,  W. Rossow,  J. Schaake, J. Smith, S. Sorooshian,  and E. Wood, 2009: Climate change: The need to consider human forcings besides greenhouse gases. Eos, Vol. 90, No. 45, 10 November 2009, 413. Copyright (2009) American Geophysical Union http://pielkeclimatesci.files.wordpress.com/2009/12/r-354.pdf

do you see as not being refuted?”

Best Regards

Roger

Toby’s Reply

Roger, I have not involved myself so deeply in the controversy as to address your questions. I am simply something more than a layperson but not a specialist. The graph I showed was an amalgam of four simulations combined for each of two conditions, with CO2 and without CO2. The simulations were made by a reputable group at NCAR and the simulations reproduced the observations exactly with CO2 and not without CO2. That was enough to convince me. I don’t know what you mean by poor predictions. The ones I have seen more recently seemed a good fit, though I have not studied the papers in detail.

My Comment

Hi Toby

Those questions are central to the issue of attributing all, most, some or none of the observed warming to the added CO2. That added CO2 has a warming effect is not in disagreement by anyone. The ability of regional models to explain behavior and provide attribution for changes in drought frequency, heat waves, etc. is, in my view, the central issue. The global average surface temperature is almost irrelevant in this.

If you prefer just to focus on the correspondence between CO2 increase and  the global average temperature increase without any further discussion, we  can still post your comments, but it may result in your being asked by readers to respond to the type of questions I asked. Do you still want to  post given this might occur?

I would also like to post your reply below to my questions, which you (as I  would understand) might not feel comfortable with. But let me know.

Roger

Toby’s Reply

Roger….. I neglected to add a few more reasons why I changed my mind about human impact on global warming a decade ago, but these are well known and more conventional reasons: the IPCC 2007 report, the loss of sea ice in the Arctic, the rise in sea level, etc.

My Comment

Thanks Toby!

I will work with what we have. It will post sometime next week with our mails and your statement.

We differ significantly in our viewpoints, but your perspective should be presented. I will add a short bio on you, but please send me a paragraph so I can introduce you on the post.

Best Regards

Roger

Toby’s Comment

Roger, I am really just a bystander in the global warming debate. I don’t want to get drawn into a kind of biblical debate here, the kind that theologists tend to have amongst themselves.

I am certainly familiar with the issue and the physics and, to some extent, the models. But I have no ax to grind. I am simply posting my educated opinion and the  reasons for my change of mind. I would not be able to handle detailed questions. For those, one should contact my PSU colleague, Michael Mann.

My arguments, besides that of the NCAR models and the hockey stick graph are as follows:

*  CO2 concentrations have increased almost 20% since 1960. This is unprecedented even in geological time. They are higher now than at any time in the past million years. If you want to argue that issue, see my other colleague, Richard Alley.

* Someone had made a calculation that the total amount of fossil fuel burned over some period of time corresponds roughly to the increase in the CO2 during that period. This would be a relatively easy calculation to make if one had the time to do it. Therefore, the CO2 increase is almost all human made.

* To say that an increase of 20% in CO2 would not make a difference means that the laws of radiative transfer must be discarded. It is no good to resort to Richard Lindzen’s arguments that feedbacks mitigate the effect (I understand that he has considered only negative feedbacks) or that the warming (which even he admits is occurring) will be no more than 0.5 C (even he would admit to a factor of two uncertainty in this sort of highly theoretical estimate).

Second, this is specious because his argument as to the unimportance of the increase is judgmental: that this is not an important increase. He made the same mistake that was made in a Wall Street Journal article saying that the increase in global temperature has only been 0.8 C over the past century; it’s really a bit more than that, but anyway…. But he didn’t realize that the increase in global temperature since the Little Ice Age is ‘only’ about that amount; in effect, he is saying that the rise in temperature is no more important than the difference in climate between the Little Ice Age and the present. I don’t think he meant to emphasize the importance of such a small increase. Such is the WSJ mindset.

Anyway, I doubt if my article will really provoke many to reply. It is simply an opinion article and the facts are not really in question in any case. I was simply indicating the issues that changed my mind on the subject.

Thanks for taking this so seriously and for including the essay on your web site.

Toby

source of image

Comments Off

Filed under Climate Science Op-Eds, Guest Weblogs

Mike Smith’s Post “Science by Press Release: The Story About Washington, DC’s Heat”

Update: On the weblog Climate Depot it is written that

Climatologist Dr. Pielke Sr. Taunts Hansen

This is a complete mischaracterization of the intent of my post. I respect Jim Hansen, and, while I disagree with him on a number of climate issues, we agree on others (such as the dominant role of the oceans as the metric to diagnose global warming). In my post below, I am challenging Jim to reconcile his view that the added CO2 footprint on the climate is so convincing with the large funds being spent to make multi-decadal regional climate predictions for the coming decades. The term “taunt” is pejorative and is not appropriate.

************Original Post********************************

Mike Smith, author of the outstanding book When the Sirens Were Silent and writer of the weblog “Mike Smith Enterprises Blog” has an insightful summary of Jim Hansen’s climate predictions.

Mike’s post was motivated by Jim’s op-ed article in the Washington Post based on his PNAS paper that appeared on Monday. I highly recommend reading Mike’s post of August 6 2012 that assesses the lack of skill in Jim’s forecasts

“Science by Press Release: The Story About Washington, DC’s Heat”

See also

“Is Jim Hansen’s Global Temperature Skillful?” Guest Post by John R. Christy

In addition to Mike’s review, there is another implication from Jim Hansen’s claim in his Washington Post op-ed

Climate change is here — and worse than we thought

Jim wrote [highlight added]

In a new analysis of the past six decades of global temperatures, which will be published Monday, my colleagues and I have revealed a stunning increase in the frequency of extremely hot summers, with deeply troubling ramifications for not only our future but also for our present.

This is not a climate model or a prediction but actual observations of weather events and temperatures that have happened. Our analysis shows that it is no longer enough to say that global warming will increase the likelihood of extreme weather and to repeat the caveat that no individual weather event can be directly linked to climate change. To the contrary, our analysis shows that, for the extreme hot weather of the recent past, there is virtually no explanation other than climate change.

Since Jim does not need a climate model to reach his conclusion, and since the climate models have shown no skill in predicting the changes in the regional climate statistics that he discusses in his post, he is actually telling us we do not need to spend the millions of dollars in making climate predictions for the impacts communities for the coming decades.

Jim supervises such modeling at GISS (e.g. Gavin Schmidt). While I endorse the use of climate models to improve our understanding of climate processes and to assess the limits on predictability, vast sums of money are being used (wasted – e.g. see) just to make multi-decadal climate forecasts for the impacts communities. If Jim is to be consistent with his message, he should call for the end of funding for such multi-decadal regional climate predictions and redirect that funding to effective mitigation and adaptation activities.

If Jim would like to respond to this request for consistency, I would be glad to present his response as a guest weblog post. Jim has contributed a guest post on my weblog in the past:

Guest Weblog By James E. Hansen

so there is a precedent for such a dialog.

If Jim elects to respond to Mike Smith’s (or John Christy’s) post, or to my comments on the need to redirect funding away from multi-decadal regional climate predictions, I will be glad to post his response unedited.

source of image

Comments Off

Filed under Climate Science Op-Eds, Climate Science Reporting

Exchange Of Viewpoints On Canadian Extreme Weather

I was alerted to two op-eds in the Star Phoenix by Madhav Khandekar. I reproduce them below with highlights added. I agree with Madhav’s perspective. [see also ‘Extreme weather is an integral part of the Earth’s climate’ at WUWT].

Extreme weather becoming norm by Lindsay Olson of the Star Phoenix on June 28 2012

Olson is the Insurance Bureau of Canada’s vice-president for British Columbia, Saskatchewan and Manitoba.

Environment Canada’s summary of 2011 reads like an annus horribilis for extreme weather.

Prairie flooding featured the highest water levels and flows in modern history across parts of Manitoba and Saskatchewan. Slave Lake, Alta., burned down. In the East, Richelieu flooded in Quebec’s longest lived disaster. Fish swam where grain should grow. Nineteen tropical storms formed in the Atlantic Basin, almost twice the average.

Unusual weather? Or are we seeing a continuing trend and a long-term norm of severe weather in Canada?

Gordon McBean, a leading Canadian climatologist, has completed a key report following current peer-reviewed research, and examines Canada’s historical weather trends and projects them to 2050. He concluded that Canada has entered an era of extreme weather, with shorter and wetter winters, hotter summers and longer spring and fall seasons.

Commissioned by the Insurance Bureau of Canada, the paper shows a clear connection to my industry’s historical experience with increasing severe weather damages.

It also conveys a strong message: Canadians need to adapt to severe weather realities that have been hitting them, are hitting them, and will be hitting them, hard.

The reason IBC commissioned the research is practical.

We want to provide clear information to support adaptation of public and private infrastructure (municipalities, private homes).

And we want to help home and business insurers anticipate factors that are likely to affect property insurance costs in the years ahead. Keeping those costs down isn’t just an insurance industry issue.

It matters to everyone who buys insurance.

Insured losses from weather related catastrophes during the past three years have been near or above $1 billion.

We know well the stories behind the numbers – communities and individual Canadians have been hit hard, lives lost, homes destroyed, livelihoods threatened, bridges collapsed, roads ruined.

Our communities urgently need to make increased natural disaster resilience a priority.

McBean’s work analyzed trends for Canada as a whole and its regions. Here are some projections:

Atlantic Canada: An increase in hurricane and storm activity in the region is likely, with resulting storm surges. Freezing rain events will increase by 50 per cent in Newfoundland. Nova Scotia could see increases of about 20 per cent.

Quebec: More hot days. Trends point to three times as many days over 30 degrees C for Quebec City as there were during the period 1961-90. Montreal is expected to see a 60 per cent increase in hot days by 2050.

More heavy precipitation, and more freezing rain events longer than six hours are probable. Forest fire frequency increases.

Ontario: Summertime warming is likely to rise by two to three degrees. Frost-free days in winter are expected to double by 2050. The research projects more heavy precipitation. As well, more freezing rain, flash flooding and wildfires are projected, with the highest increases in northwestern Ontario.

Manitoba and Saskatchewan: Temperature increases are likely to be greatest in winter and spring in the south, while drought and water scarcity are likely to be a growing climate risk through the prairies. More extreme precipitation events and flash flooding are expected.

Alberta: The province will be hit hard by drought and water scarcity due to reduced summer precipitation, falling lake levels, retreating glacier, decreasing soil-water content and more dry years. More hail, storms and wildfires are likely. Lightning flash density could increase by 20 per cent, with consequences for wildfires. More heavy rainfall events that can cause flash flooding are projected.

British Columbia: While the weather in B.C. will be variable, overall projections show warmer and wetter weather.

The mountain snowpack is expected to decline. Wildfires could increase significantly in forests.

The North: The likelihood of the temperature in Iqaluit exceeding 25C by 2050 could be five times greater than during the ’80s. Overall, the temperature is likely to increase by two to four degrees. The fire season in the Yukon and Northwest Territories will likely extend by 10 days, and sea levels could be 15-25 centimetres higher.

Unusual weather is becoming the norm in Canada.

This is clear in both the regional and national trends. We now need, as a country, to focus on adaptation to the new climate reality of more severe weather.

I have a lot of respect for Gordon McBean, and agree with him on his views on weather forecasting as we wrote in the Report Of The 2004-2009 Research Review Of The Koninklijk Nederlands Meteorologisch Instituut, which I posted on in March 2012 (see). Gordon and I were members of that Committee.

In our KNMI  report, which Gordon signed off on, we concluded that

The global climate model projections…… only a subset of possible climate conditions in the coming decades…

Thus, in the Canadian report that Gordon completed, he is ignoring a finding on the limits of the multi-decadal climate predictions that he agreed with in the KNMI report. Both of his viewpoints cannot be correct!

I summarized the lack of regional predictive skill in the post

Kevin Trenberth Was Correct – “We Do Not Have Reliable Or Regional Predictions Of Climate”

I agree with what Madhav writes below in his July 6 2012 response in the Star Phoenix;

Extreme caution best in assessing future weather by Madhav Khandekar

Khandekar is a retired Environment Canada scientist with more than 50 years of experience in weather and climate science, and an expert reviewer of the IPCC 2007 Climate Change Assessment.

In the viewpoint article Extreme weather becoming norm (SP, June 28), Lindsay Olson, vice-president of the Insurance Bureau of Canada, provides a glimpse of weather extremes for various regions of Canada and warns Canadians to be prepared to live with such extremes over the next several decades.

Olson refers to the study on future weather extremes done by Gordon McBean, former assistant deputy minister of Environment Canada. When did Canada ever witness a climate free of extreme weather? That is what Olson fails to explain to Canadians.

Extreme weather is an integral part of the Earth’s climate. Throughout the recorded history of the Earth’s climate, extreme weather events have always occurred somewhere, and are caused by large-scale atmosphere ocean flow patterns and their complex interaction with local/regional weather and climate features.

An examination of the 20th century climate of North America reveals that the decades of 1920s and 1930s, known as the Dust Bowl years, witnessed perhaps the most extreme climate over the Great American Plains and elsewhere. There were recurring droughts and heat waves on the Canadian/American Prairies.

The prairies also witnessed some extreme cold winters during the 1910s and 1920s – for example in 1907 and 1920.

We meteorologists still do not fully understand why the climate of North America was so anomalous during the 1920s and 1930s.

During the 1950s and 1960s most of Canada witnessed extreme cold winters, especially on the prairies where record-breaking low temperatures (Edmonton at minus 45C and below in the 1960s) were registered. In Ontario and Quebec, cold and snowy winters were the norm during the 1960s and early 1970s.

Parts of the Canadian Atlantic witnessed long winters with lots of snow. Spring ice jam on the St. John’s River was a common occurrence during the 1960s and 1970s.

The recent decades of the 1980s and 1990s have witnessed a warmer climate across most of North America and worldwide.

Several hot spells of varying durations (from few days to a week or more) have been recorded in North America, Europe and elsewhere. The year 1998 has been adjudged the “hottest year” in a 150-year long temperature record, according to the Intergovernmental Panel on Climate Change, the UN Body of climate scientists and environmentalists.

Will the Earth’s climate become significantly warmer in future? There is no definite answer so far. The best value for climate sensitivity (increase in the Earth’s mean temperature in future for a doubling of the atmospheric CO2 concentration) is now estimated to be just about 1C or so.

Would such a modest increase in future lead to increased severe extreme weather events, as Olson claims?

Would future extreme weather be any different from what Canadians have witnessed in the past?

Extreme weather will always be with us, no matter what. The best way to cope with future extreme weather is to develop an “early warning system” with improved long-lead weather/climate forecasting capabilities. Such an early warning system can help minimize adverse impacts from future extreme weather events.

Canadians from coast-to-coast should be able to live and cope with future weather extremes with adequate precaution and need not be psyched into accepting increased insurance in future, as Olson’s article seems to suggest.

source of image 

Comments Off

Filed under Climate Science Op-Eds

The Need For Precise Definitions In Climate Science – The Misuse Of The Terminology “Climate Change”

source of image

UPDATE JUNE 17 2012

My son had an insightful discussion on this subject in his post

The Narrow Definition of Climate Change

where he refers to two of his papers

Pielke, Jr., R.A., 2005. Misdefining ‘‘climate change’’: consequences for science and action, Environmental Science & Policy, Vol. 8, pp. 548-561.

Pielke, Jr., R. A., 2004. What is Climate Change?, Issues in Science and Technology, Summer, 1-4.

*********ORIGINAL POST*************

The terminology in the field of climate and environmental science is filled with jargon words and the misuse of definitions. I have posted on this issue before with respect to the terms “global warming” and “climate change” in my posts

The Media (and Presidential Candidates) Remain In Error On The Distinction Between Global Warming And Climate Change

and

Recommended Definitions of “Global Warming” And “Climate Change”

To properly define these two terms, I recommended

Global Warming is an increase in the global annual average heat content measured in Joules.

Climate Change is any multi-decadal or longer alteration in one or more physical, chemical and/or biological components of the climate system.
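To make the Joules-based definition of global warming concrete, here is a minimal back-of-envelope sketch (my own illustration, not from any of the posts or papers cited here) converting an assumed global-mean radiative imbalance into an annual heat-content change; the 0.5 W per meter squared value and the constants are illustrative assumptions.

```python
# Back-of-envelope sketch (illustrative assumptions, not from the post):
# convert a global-mean radiative imbalance (W per meter squared) into the
# annual change in climate-system heat content (Joules), the quantity the
# "global warming" definition above refers to.

EARTH_SURFACE_AREA_M2 = 5.1e14   # ~5.1 x 10^14 m^2 (assumed)
SECONDS_PER_YEAR = 3.156e7

def annual_heat_gain_joules(imbalance_w_m2):
    """Heat accumulated by the climate system in one year for a given
    global-mean radiative imbalance."""
    return imbalance_w_m2 * EARTH_SURFACE_AREA_M2 * SECONDS_PER_YEAR

# Example: an assumed imbalance of 0.5 W per meter squared
print(f"{annual_heat_gain_joules(0.5):.1e} J per year")   # ~8e21 J per year
```

Monitoring that quantity (Joules), rather than a surface temperature anomaly, is the point of the definition above.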

Today’s post is to further elaborate on the terms that are used.

With respect to the terminology “climate change”, this term is being extensively used to mean “anthropogenically caused changes in climate” from nearly “static” climatic conditions; e.g. see the figure below [source of image]

This is why terminology such as “climate stabilization” is misused; e.g. see

Climate Stabilization Targets: Emissions, Concentrations, and Impacts Over Decades to Millennia (2010)

where this National Academy report writes

This new report from the National Research Council concludes that emissions of carbon dioxide from the burning of fossil fuels have ushered in a new epoch where human activities will largely determine the evolution of Earth’s climate.

However, as documented in another Academy report

National Research Council, 2005: Radiative forcing of climate change: Expanding the concept and addressing uncertainties. Committee on Radiative Forcing Effects on Climate Change, Climate Research Committee, Board on Atmospheric Sciences and Climate, Division on Earth and Life Studies, The National Academies Press, Washington, D.C., 208 pp.

and summarized in the article

Pielke Sr., R., K.  Beven, G. Brasseur, J. Calvert, M. Chahine, R. Dickerson, D.  Entekhabi, E. Foufoula-Georgiou, H. Gupta, V. Gupta, W. Krajewski, E.  Philip Krider, W. K.M. Lau, J. McDonnell,  W. Rossow,  J. Schaake, J.  Smith, S. Sorooshian,  and E. Wood, 2009: Climate change: The need to consider human forcings besides greenhouse gases.   Eos, Vol. 90, No. 45, 10 November 2009, 413. Copyright (2009) American   Geophysical Union.

the natural causes of climate variations and changes are important, as are the human influences. The human climate forcings involve a diverse range of first-order climate forcings, including, but not limited to, the human input of carbon dioxide (CO2). Most, if not all, of these human influences on regional and global climate will continue to be of concern during the coming decades.

As reported in the NRC (2005) report and written  in the Pielke et al 2009 article with respect to human climate forcings

In addition to greenhouse gas emissions, other first-order human climate forcings are important to understanding the future behavior of Earth’s climate. These forcings are spatially heterogeneous and include the effect of aerosols on clouds and associated precipitation [e.g., Rosenfeld et al., 2008], the influence of aerosol deposition (e.g., black carbon (soot) [Flanner et al. 2007] and reactive nitrogen [Galloway et al., 2004]), and the role of changes in land use/land cover [e.g., Takata et al., 2009]. Among their effects is their role in altering atmospheric and ocean circulation features away from what they would be in the natural climate system [NRC, 2005]. As with CO2, the lengths of time that they affect the climate are estimated to be on multidecadal time scales and longer.

With respect to natural climate forcings and feedbacks, in the article

Rial, J., R.A. Pielke Sr., M. Beniston, M. Claussen, J. Canadell, P. Cox,  H. Held, N. de Noblet-Ducoudre, R. Prinn, J. Reynolds, and J.D. Salas,  2004: Nonlinearities, feedbacks and critical thresholds within the Earth’s  climate system. Climatic Change, 65, 11-38.

we wrote

The Earth’s climate system is highly nonlinear: inputs and outputs are not proportional, change is often episodic and abrupt, rather than slow and gradual, and multiple equilibria are the norm.

Thus, the assumption of a stable climate system, in the absence of human intervention, is a mischaracterization of the behavior of the real climate system.

“Climate change” is, and always has been, occurring. Humans are now adding to the complexity of forcings and feedbacks, but change has always been a part of the climate system.

Thus, rather than using terminology such as “climate change” [which has come to mean the human-caused part, mostly due to added greenhouse gases], I recommend just using the term “climate” or “climate system”. When change is discussed, the specific component that is being discussed should be presented, such as an increase in annual averaged surface air temperatures, a decrease in the length of the growing season, etc. Phrases such as “changes in regional and global climate statistics” could be used.

There is a very important reason to scrap the use of “climate change” by the impacts community. Key societal and environmental resources, such as water, food, energy, ecosystem function, and human health, respond to climate, not just to an incremental change in the climatic conditions.

Another misused term is “global change”, when really what is almost always meant is a local and/or regional change in environmental conditions, including from climate. The more accurate terminology would be “environmental change”.

Thus, my recommendations are to replace terminology such as climate change, climate stabilization, climate disruption and global change with accurate terminology. With respect to impacts on key resources, climate is one of the stressors, and it is the climate as a whole that matters, not just the “change” part. When changes in climatic conditions are discussed, present the actual climate variable(s) that are being altered.

This issue of terminology has been important as we work to complete the 5 volume set of books for Elsevier titled

“Climate Vulnerability  – Understanding and Addressing Threats to Essential Resources”. 2013:  Eds  R.A. Pielke Sr., Faisal Hossain, Dev Niyogi, George Kallos, Jimmy Adegoke, Caradee Y. Wright, Timothy Seastedt, Katie Suding and Dallas Staley. Elsevier

which will appear early in 2013. Our edits for the chapters have required us to address the improper use of the terminology by some of the authors. The current weblog post is intended to alert others to the frequent mischaracterization of the climate system.

source of image 

Comments Off

Filed under Climate Science Misconceptions, Climate Science Op-Eds

A Summary Of Why The Global Annual-Average Surface Temperature Is A Poor Metric To Diagnose Global Warming

Figure from Ellis et al 1978

The use of a global average surface temperature trend as a diagnostic to monitor global warming is, at best, a crude approach and, at worst, an erroneous tool for that purpose. This post summarizes why.

First, to describe global warming, let’s use the seminal paper

Ellis et al. 1978: The annual variation in the global heat balance of the Earth. J. Geophys. Res., 83, 1958-1962

While the specific values they reported in their paper can be updated with the newer data since 1978, the framework and general conclusions are equally valid today. Excerpts read [highlight added] – with their figure 4 presented at the top of this post:

A graph of the global components is shown in Figure 4. It shows that the rate of ocean storage is in close agreement with the net radiation flux except for the months of January and February. (This disagreement may be due in large part to possible errors in southern hemisphere ocean data).

The annual variation in the earth’s net radiation balance may largely be accounted for by considering the effects which the present day earth-sun geometry and the asymmetrical distribution of continents between the northern and southern hemispheres have on the net radiation balance. The orbit of the earth about the sun is such that the earth is closest to the sun in January and farthest from the sun in July. This creates an annual 11.2 W per meter squared amplitude variation in the solar flux received by the planet earth. This variation is a purely external driving mechanism, since it depends only on earth-sun geometry.

When a value of 30.4% (Table 2) for annual mean global albedo is used, the annual 11.2 W per meter squared amplitude variation of incoming solar flux translates into an approximate 7.8 W per meter squared variation in absorbed solar flux at the top of the atmosphere.

Atmospheric data show an annual cycle in the global average near-surface temperature with an amplitude of 2C [Van Loon,1972]. Maximum and minimum values are found in July and January, respectively. This temperature variation may be interpreted as an amplitude variation of 7 W per m squared in the long-wave flux emission to space if typical atmospheric emissions are considered and all temporal variations in the intervening atmosphere are ignored [Ellis,1977]. This effect in the long-wave flux combines with the effect in the absorbed flux to give a 15 W per meter squared amplitude variation in the annual net radiation balance profile.

The global ocean can maintain equilibrium by an average change in its heat content between times of maximum storage and maximum release of less than 1C over a 50-m-thick layer.
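The numbers quoted in this excerpt can be checked with a short calculation. The sketch below is my own rough check, not a computation from Ellis et al; the solar constant, orbital eccentricity, and seawater properties are assumed nominal values, and the seasonal storage is treated as a simple sinusoid absorbed by a 50-m water layer.

```python
import math

# Rough check of the values quoted from Ellis et al. (1978); the constants
# below are assumed nominal values, not taken from the paper itself.

SOLAR_CONSTANT = 1361.0        # W/m^2 (assumed modern value)
ECCENTRICITY = 0.0167          # Earth's orbital eccentricity (assumed)
ALBEDO = 0.304                 # annual mean global albedo quoted in the excerpt

# Annual amplitude of globally averaged incoming solar flux from the varying
# Earth-sun distance: roughly (S0/4) * 2e.
solar_amplitude = (SOLAR_CONSTANT / 4.0) * 2.0 * ECCENTRICITY
absorbed_amplitude = solar_amplitude * (1.0 - ALBEDO)
print(f"Incoming-flux amplitude: {solar_amplitude:.1f} W/m^2")     # ~11.4 (quoted: 11.2)
print(f"Absorbed-flux amplitude: {absorbed_amplitude:.1f} W/m^2")  # ~7.9 (quoted: 7.8)

# Heat stored over half a year by a sinusoidal net-flux variation of
# amplitude A: integral of A*sin(2*pi*t/T) over half a period = A*T/pi.
NET_AMPLITUDE = 15.0           # W/m^2, quoted amplitude of the net radiation balance
SECONDS_PER_YEAR = 3.156e7
stored_j_per_m2 = NET_AMPLITUDE * SECONDS_PER_YEAR / math.pi

# Temperature swing of a 50-m seawater layer storing that heat.
RHO, CP, DEPTH = 1025.0, 3985.0, 50.0   # kg/m^3, J/(kg K), m (assumed)
delta_t = stored_j_per_m2 / (RHO * CP * DEPTH)
print(f"Seasonal swing of a 50-m layer: {delta_t:.2f} C")  # ~0.7 C, i.e. less than 1 C
```

The results are consistent with the quoted 7.8 W per meter squared absorbed-flux amplitude and with a seasonal ocean-layer temperature change of less than 1C.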

Unfortunately, instead of basing a global warming analysis on a framework such as that in the Ellis et al 1978 paper, the long-term, annual-averaged global surface temperature trend is typically used as the icon to describe global warming (e.g. see). The value of +2C is often presented as a threshold beyond which major climate disruption will occur (e.g. see).

However, there are fundamental problems with the use of the global surface temperature anomaly to diagnose global warming. I have presented one of these issues in the post

Torpedoing Of The Use Of The Global Average Surface Temperature Trend As The Diagnostic For Global Warming

where I wrote

1. If heat is being sequestered in the deeper ocean, it must transfer through the upper ocean. In the real world, this has not been seen, as far as I am aware. In the models, this heat clearly must be transferred (upwards and downwards) through this layer. The Argo network is spatially dense enough that this should have been seen.

2. Even more important is the failure of the authors to recognize that they have devalued the use of the global average surface temperature as the icon to use to communicate the magnitude of global warming.  If this deeper ocean heating actually exists in the real world, it is not observable in the ocean and land surface temperatures. To monitor global warming, we need to keep track of the changes in Joules in the climate system, which, as clearly indicated in the new study by Meehl and colleagues, is not adequately diagnosed by the global, annual-averaged surface temperature trends.

and that

A final comment on this paper: if heat really is deposited deep into the ocean (i.e. Joules of heat), it will be dispersed through the ocean at these depths and is unlikely to be transferred back to the surface on short time periods, but will only leak back upwards, if at all. The deep ocean would be a long-term damper of global warming that has not been adequately discussed in the climate science community.

In the paper

Barnett, T.P., D.W. Pierce, and R. Schnur, 2001: Detection of anthropogenic  climate change in the world’s oceans. Science, 292, 270-274

they wrote

“…..a climate model that reproduces the observed change in global air temperature over the last 50 years, but fails to quantitatively reproduce the observed change in ocean heat content, cannot be correct. The PCM [Parallel Climate Model] has a relatively low sensitivity (less anthropogenic impact on climate) and captures both the ocean- and air-temperature changes. It seems likely that models with higher sensitivity, those predicting the most drastic anthropogenic climate changes in the future, may have difficulty satisfying the ocean constraint.”

This text, as well as the Ellis et al 1978 study, seems to have been forgotten by the climate modeling community. In the post

“World Ocean Heat Content And Thermosteric Sea Level Change (0-2000), 1955-2010″ By Levitus Et Al 2012

Levitus et al concluded that

The world ocean accounts for approximately 90% of the warming of the earth system that has occurred since 1955.

One third of the observed warming occurs in the 700-2000 m layer of the ocean.

The heat content of the world ocean for the 0-700 m layer increased by 16.7 × 10^22 J, corresponding to a rate of 0.27 W per meter squared (per unit area of the world ocean) and a volume mean warming of 0.18°C.
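These quoted figures are internally consistent, which can be verified with a short calculation; this is my own sketch, with an assumed nominal world-ocean area and seawater properties (not values taken from Levitus et al).

```python
# Consistency check of the Levitus et al. (2012) figures quoted above.
# The ocean area and seawater properties are assumed nominal values.

HEAT_GAIN_J = 16.7e22          # 0-700 m ocean heat content increase, 1955-2010 (quoted)
YEARS = 55.0                   # 1955-2010
SECONDS_PER_YEAR = 3.156e7
OCEAN_AREA_M2 = 3.6e14         # ~3.6 x 10^14 m^2 world-ocean area (assumed)

rate_w_m2 = HEAT_GAIN_J / (YEARS * SECONDS_PER_YEAR * OCEAN_AREA_M2)
print(f"Rate per unit ocean area: {rate_w_m2:.2f} W/m^2")   # ~0.27, as quoted

RHO, CP = 1025.0, 3985.0       # kg/m^3 and J/(kg K), assumed seawater values
LAYER_DEPTH_M = 700.0
volume_m3 = OCEAN_AREA_M2 * LAYER_DEPTH_M
mean_warming_c = HEAT_GAIN_J / (RHO * CP * volume_m3)
print(f"Volume-mean warming, 0-700 m: {mean_warming_c:.2f} C")  # ~0.16 (quoted: 0.18)
```

The small difference in the volume-mean warming reflects the nominal area and volume assumed here rather than the actual ocean geometry used in the paper.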

The obvious conclusion for the climate community should be that

  • Using a global annual-averaged surface temperature trend captures only a small part of the analysis used to create figure 4 in Ellis et al. Their framework requires the absolute value of temperatures, both spatially and temporally, in order to construct a global annual-average surface longwave emission.
  • Seeking to diagnose the magnitude of global warming (as represented by an annual average global radiative imbalance of the ocean-land-atmosphere system) using a global average surface (~2m) temperature trend is fundamentally flawed. If ~90% of the heating is in the oceans, what is the value of diagnosing global warming using the land portion of the surface temperature record, even if it did not have the warm systematic bias associated with the minimum temperatures that we have reported on?
  • The annual variation in the radiative imbalance is on the order of 30 Watts per meter squared. Diagnosing a multi-decadal change in its mean annual value to the order of tenths of a Watt per meter squared is hard enough using long-term, annual-averaged changes in ocean heat storage. Trying to do this with surface temperatures, in values of tenths of a degree C (even up to 2C), with their large spatial and temporal variations, is an even more difficult task.

My recommendation to the climate community is that an updated version of the figure at the top of this post be presented for as many years as possible.  With the new data, such as the more robust Argo data and satellite monitoring of tropospheric temperatures and top of the atmosphere radiative fluxes, this should become the gold standard of monitoring global warming. The use of the global annual-averaged surface temperature trends for this purpose would be relegated to where it deserves to be – an historical relic.

Comments Off

Filed under Climate Change Metrics, Climate Science Op-Eds

Climate Science Malpractice – The Promotion Of Multi-Decadal Regional Climate Model Projections As Skillful

If a company developed a drug for the treatment of a disease but did not do clinical tests, it would not be prescribed by reputable physicians. Indeed, for the health-benefit claims made by supplement companies, the Food and Drug Administration requires adding

“This statement has not been evaluated by the FDA. This product is not intended to diagnose, treat, cure, or prevent any disease.”

There is a clear analog with multi-decadal climate model predictions where no skill has been shown in hindcast predictions of changes in multi-decadal regional climate statistics.  As we have reported in our paper

Pielke Sr., R.A., and R.L. Wilby, 2012: Regional climate downscaling – what’s the point? Eos Forum,  93, No. 5, 52-53, doi:10.1029/2012EO050008

“It is ….. inappropriate to present [multi-decadal regional climate forecasts]…… to the impacts community as reflecting more than a subset of possible future climate risks.”

Skill in multi-decadal regional climate model predictions of changes in climate statistics has not been shown (i.e. there are no “clinical trials” to show that the approach is robust).

If future studies in the literature and media releases present their results as anything more than a model sensitivity experiment (which should only be interpreted as, at best, a subset of what is plausible for the future climate), they would be guilty of climate science malpractice.

As just one example of what this means, statements such as “temperatures will increase by 1C” should be written as “temperatures could increase by 1C”. The use of the term “will” indicates a certainty in the climate prediction which is not correct. The term “could” means the prediction is plausible.

Also, if they still insist on presenting their model results in figures with decadal time periods on them (e.g. 2040-2049, etc.), they must make it clear that the results are intended to improve our understanding of climate processes and are not an actual forecast for those decades that should be used by the impacts community to represent the envelope of what the regional climate will be decades from now.

Even for those studies that present their results as sensitivity studies, their papers should have an FDA-like disclaimer:

“The multi-decadal regional climate model results presented in this paper have not shown skill at predicting changes in multi-decadal climate statistics. The model results in our study should not be used to quantify the envelope of the risks from climate to societal and environmental resources in the coming decades. Our model sensitivity results are provided only to assist in improving our understanding of climate processes.  “

Without this disclaimer in papers, assessments and other communications which report on multi-decadal regional and local simulations of changes in climate statistics, they are committing climate science malpractice.

This label, of course, can be avoided if the researchers provide quantitative model and observational comparisons of multi-decadal regional and local predictions of changes in climate statistics, and show them to be skillful in terms of the metrics that are needed by the impacts community. I invite anyone who has published such a study to present a guest post on this weblog alerting us to such a robust scientific study.

source of image

Comments Off

Filed under Climate Science Misconceptions, Climate Science Op-Eds