Monthly Archives: December 2011

Happy New Year!

I am taking this week off from posting and will start again on January 2, 2012. Meanwhile, enjoy this Holiday week!


Filed under Uncategorized

Merry Christmas!!!

I wish everyone, on all sides of the climate science debate, an enjoyable Holiday Season!!!


Filed under Uncategorized

Further Confirmation Of Klotzbach Et Al 2009

Update:

After I wrote and scheduled this post, I saw that John Christy and Roy Spencer have a directly related weblog post titled

Addressing Criticisms of the UAH Temperature Dataset at 1/3 Century

which appeared yesterday.

As many of you may know, Gavin Schmidt was critical of our paper

Klotzbach, P.J., R.A. Pielke Sr., R.A. Pielke Jr., J.R. Christy, and R.T. McNider, 2009: An alternative explanation for differential temperature trends at the surface and in the lower troposphere. J. Geophys. Res., 114, D21102, doi:10.1029/2009JD011841

where we concluded that

 “This paper investigates surface and satellite temperature trends over the period from 1979 to 2008. Surface temperature data sets from the National Climate Data Center and the Hadley Center show larger trends over the 30-year period than the lower-tropospheric data from the University of Alabama in Huntsville and Remote Sensing Systems data sets. The differences between trends observed in the surface and lower-tropospheric satellite data sets are statistically significant in most comparisons, with much greater differences over land areas than over ocean areas. These findings strongly suggest that there remain important inconsistencies between surface and satellite records.”

We looked at the issues he raised and clarified our analysis further in

Klotzbach, P.J., R.A. Pielke Sr., R.A. Pielke Jr., J.R. Christy, and R.T. McNider, 2010: Correction to “An alternative explanation for differential temperature trends at the surface and in the lower troposphere. J. Geophys. Res., 114, D21102, doi:10.1029/2009JD011841”, J. Geophys. Res., 115, D1, doi:10.1029/2009JD013655

where we showed that the findings in our 2009 paper are robust. In the 2010 correction, we concluded that, with respect to our 2009 paper,

“….no changes are needed in our paper’s conclusions.”

There is now an independent confirmation of our results as reported by Steve McIntyre in his weblog post

Un-Muddying the Waters

on November 7, 2011. I expected a further response from Gavin Schmidt on this issue, but have finally decided to write about Steve’s post and comments, as there have been no additional interactions on this subject that I am aware of.

In Steve’s post, he concluded that

“….the discrepancy between the revised downscaling of Klotzbach et al 2010 and Schmidt’s Nov 2009 realclimate post – is now totally reconciled. The amended numbers of Klotzbach et al 2010 appear reasonable.

Gavin Schmidt’s criticism of the downscaling over land in Klotzbach et al 2009 and of my original graphic in Closing BEST Comments post was justified, but his own figures for downscaling were incorrect. The diagnosis of the discrepancy was complicated by the fact that his actual method did not correspond to the most reasonable interpretation of his realclimate article. Thanks to Gavin’s clarifications, we now have what seems to be a definitive diagnosis of the discrepancy and where Gavin got wrongfooted. It seems to me that it would be constructive to note the resolution of the discrepancy in the original RC post.”

The significance of the discussion between Steve and Gavin is that the large discrepancy between the global annual average lower tropospheric and surface temperature anomalies remains.  The Muller BEST analysis does not in any way alter this conclusion.

The current discrepancy can be seen in the three figures below. The BEST analysis is only for the land, but the ocean anomalies would have to be significantly negative to result in an anomaly plot close to the RSS analysis of lower tropospheric temperatures. The GISS analysis presented in the third figure shows that the positive temperature anomalies, which are much higher than the RSS (and UAH) lower tropospheric anomalies, persist in the global average.

Fig 1. Lower tropospheric global average temperature anomaly from RSS, with the bottom axis running from 1979 to 2012 in yearly tick marks [from http://www.ssmi.com/msu/msu_data_description.html]

Fig 2. BEST global-land average surface temperature anomalies [from  http://berkeleyearth.org/faq/]

Fig 3. Global surface temperature anomalies from NASA GISS [ http://data.giss.nasa.gov/gistemp/graphs/]
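For readers who want to examine this discrepancy themselves, a minimal Python sketch for comparing least-squares trends of two monthly anomaly series is given below. The file names and column layout are illustrative assumptions; substitute whatever surface and lower-tropospheric anomaly files you download (e.g., from RSS, UAH, GISS or BEST).

```python
import numpy as np

def decadal_trend(years, anomalies):
    """Ordinary least-squares trend, in degrees C per decade."""
    slope, _intercept = np.polyfit(years, anomalies, 1)
    return 10.0 * slope

# Illustrative file names and layout: two columns, decimal year and anomaly (C).
surface = np.loadtxt("surface_anomalies.txt")
lower_trop = np.loadtxt("lower_troposphere_anomalies.txt")

surf_trend = decadal_trend(surface[:, 0], surface[:, 1])
lt_trend = decadal_trend(lower_trop[:, 0], lower_trop[:, 1])

print(f"Surface trend:            {surf_trend:+.3f} C/decade")
print(f"Lower-tropospheric trend: {lt_trend:+.3f} C/decade")
# Klotzbach et al. (2009, 2010) focus on this difference, since amplification
# arguments expect the lower troposphere to warm at least as fast as the surface.
print(f"Surface minus LT:         {surf_trend - lt_trend:+.3f} C/decade")
```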

 


Filed under Climate Change Metrics, Research Papers

Example Of A Bottom-Up, Resource-Based Perspective On Environmental Risks – CBS 60 Minutes – “The Gardens of the Queen”

CBS 60 Minutes presented an excellent example of the value of a bottom-up, resource-based perspective on risk, as we have proposed in our paper

Pielke Sr., R.A., R. Wilby, D. Niyogi, F. Hossain, K. Dairuku, J. Adegoke, G. Kallos, T. Seastedt, and K. Suding, 2011: Dealing  with complexity and extreme events using a bottom-up, resource-based  vulnerability perspective. AGU Monograph on Complexity and  Extreme Events in Geosciences, in press.

We urge focusing on five resources: water, food, energy, human health, and ecosystem function.

The 60 Minutes segment, televised on December 18, 2011, and reported by Anderson Cooper, is titled

The Gardens of the Queen

and looks at ecosystem function of a tropical coral reef.

Excerpts of text from the show include the following from Anderson Cooper [highlights added]:

Coral reefs are often called “the rainforests of the ocean.” They’re not just biologically diverse and stunningly beautiful, they’re a source of food and income for nearly a billion people. They’re also in danger. Scientists estimate that 25 percent of the world’s reefs have died off and much of what’s left is at risk. There is, however, one spot in the Caribbean that marine biologists describe as a kind of “under-water Eden,” a coral reef largely untouched by man.

Anderson Cooper stated that

 I’ve been diving in many places all over the world and I’ve never seen so many large fish. Like this grouper here. There’s about six or seven Caribbean reef sharks like this circling around right now. Scientists will tell you the presence of so many sharks and different species of sharks, is a sign of a very healthy reef.

David Guggenheim [an American marine biologist and a senior fellow at the Ocean Foundation in Washington, D.C.] said that

I went to Veracruz, Mexico, and I was told about the magnificence of the Veracruz Reefs. And when we got there, we saw that 95 percent of that reef had died and it had died quickly since the last time scientists were there. And I felt like I was going through a city, a magnificent civilization that had once stood there, but it was burned out. Nobody was there.

Scientists say the world’s reefs are being harmed by a complex combination of factors; including pollution, agricultural runoff, coastal development, and overfishing. It turns out fish are essential to the health of a reef. Researchers at the National Oceanic and Atmospheric Administration and other leading institutions are also very concerned about climate change because they believe rising ocean temperatures are triggering a process called “bleaching” in which the coral weakens, turns white and often dies.

and

The reason this reef’s doing so well, Fabian Pina believes, is that it’s far from the mainland and well-protected.

Anderson Cooper stated that

Fabian [Fabian Pina is a Cuban marine biologist] and David have noticed some bleaching here, but the coral tends to recover after a few months, leading them to wonder whether there’s something about this reef that’s making it more resistant to threats.

David Guggenheim stated that

Maybe it’s because this ecosystem is being protected, it’s got a leg up on other ecosystems around the world that are being heavily fished and heavily impacted by pollution. So that makes it more resilient. That’s one of the theories that if we do what we can locally that these reefs have a better chance of being resilient to what’s happening globally.

There is a very important message from this news report. The risks to coral reefs are dominated by local human interference with their ecosystem function. Such effects include local pollution (e.g., runoff from rivers, shorelines, and shipping) and overfishing, including of major predator species such as sharks.

Any warming of the ocean (i.e., global warming) appears to be a minor, or even an inconsequential, effect, despite the reference by NOAA in the CBS show to bleaching (they also showed a calving glacier :-)).

Despite this short reference to global warming, the CBS report is an important addition to the broadening of environmental issues beyond the myopic focus on global warming. The contrast between reef health near Veracruz, Mexico, and the Cuban preserve should convince objective readers that coral bleaching from global warming is clearly not the largest threat to the health of tropical coral reefs.


Filed under Climate Science Reporting, Vulnerability Paradigm

The Proposed Multi-Dimensional Growth Of The EPA In Climate Science

There is a news article from Fox News titled

EPA Ponders Expanded Regulatory Power In Name of ‘Sustainable Development’

which includes the text [highlight added]

“Environmental impact assessment tends to focus  primarily on the projected environmental effects of a particular action and  alternatives to that action,” the study says. Sustainability impact assessment  examines “the probable effects of a particular project or proposal on the  social, environmental, and economic pillars of sustainability”—a greatly  expanded approach.

One outcome: “The culture change being proposed here will require EPA to conduct an expanding number of assessments.”

As a result, “The agency can become more  anticipatory, making greater use of new science and of forecasting.”

The catch, the study recognizes, is that under the  new approach the EPA becomes more involved than ever in predicting the  future.”

In my post on May 15 2009

Comments On The EPA “Proposed Endangerment And Cause Or Contribute Findings For Greenhouse Gases Under The Clean Air Act”

I wrote

I have generally supported most EPA actions which have been designed to support environmental improvement. These regulations have resulted in much cleaner water and air quality over the past several decades; e.g. see

National Research Council, 2003: Managing carbon monoxide pollution in meteorological and topographical problem areas. The National Academies Press, Washington, DC, 196 pp.

However, the EPA Endangerment Findings for CO2 as a climate forcing falls far outside of the boundary of the type of regulations that this agency should be seeking.

The EPA on April 17, 2009 released this finding in “Proposed Endangerment and Cause or Contribute Findings for Greenhouse Gases under the Clean Air Act”.

This report is a clearly biased presentation of the science which continues to use the same reports (IPCC and CCSP) to promote a particular political viewpoint on climate (and energy) policy.

The text includes the statements

“The Administrator signed a proposal with two distinct findings regarding greenhouse gases under section 202(a) of the Clean Air Act:

Action

“The Administrator is proposing to find that the current and projected concentrations of the mix of six key greenhouse gases—carbon dioxide (CO2), methane (CH4), nitrous oxide (N2O), hydrofluorocarbons (HFCs), perfluorocarbons (PFCs), and sulfur hexafluoride (SF6)—in the atmosphere threaten the public health and welfare of current and future generations. This is referred to as the endangerment finding.

The Administrator is further proposing to find that the combined emissions of CO2, CH4, N2O, and HFCs from new motor vehicles and motor vehicle engines contribute to the atmospheric concentrations of these key greenhouse gases and hence to the threat of climate change. This is referred to as the cause or contribute finding.”

As Climate Science has shown in the past; e.g. see

New Plans To Regulate CO2 As A Pollutant

Comments On The Plan To Declare Carbon Dioxide as a Dangerous Pollutant

A Carbon Tax For Animal Emissions – More Unintended Consequences Of Carbon Policy In The Guise Of Climate Policy

Will Climate Effects Trump Health Effects In Air Quality Regulations?

Supreme Court Rules That The EPA Can Regulate CO2 Emissions

Science Issues Related To The Lawsuit To The Supreme Court As To Whether CO2 is a Pollutant

the “cause” for their endangerment finding can cover any human-caused climate forcing.

In my May 15 2009 post, I gave an example of how their finding could be rewritten to cover other human climate forcings. As another example, based on our paper

Pielke Sr., R.A., A. Pitman, D. Niyogi, R. Mahmood, C. McAlpine, F. Hossain, K. Goldewijk, U. Nair, R. Betts, S. Fall, M. Reichstein, P. Kabat, and N. de Noblet-Ducoudré, 2011: Land  use/land cover changes and climate: Modeling analysis  and  observational evidence. WIREs Clim Change 2011. doi: 10.1002/wcc.144

the paragraph above for an EPA Action could be rewritten as

“The Administrator is further proposing to find that certain land use changes result in a threat of climate change. This is referred to as the cause or contribute finding.”

The EPA, according to this news report, could be developing justification to move into areas of regulation that it has not been involved with in the past, including land management.

They also, as implied in the article, rely on multi-decadal climate predictions of societal and environmental impacts, which, as has been shown in our paper

Pielke Sr., R.A., R. Wilby, D. Niyogi, F. Hossain, K. Dairuku, J. Adegoke, G. Kallos, T. Seastedt, and K. Suding, 2011: Dealing  with complexity and extreme events using a bottom-up, resource-based  vulnerability perspective. AGU Monograph on Complexity and  Extreme Events in Geosciences, in press

and weblog posts; e.g. see

The Huge Waste Of Research Money In Providing Multi-Decadal Climate Projections For The New IPCC Report

have NO predictive skill.  The EPA would be seeking broader regulatory ability to influence policy but without a sound scientific basis.

I have always been a strong supporter of clean air and water, as exemplified by my two terms on the Colorado Air Quality Commission during the administration of Governor Romer (D). I have published numerous papers and taught classes on air quality, including the use of mesoscale and boundary layer models to develop improved procedures to assess the risk of pollution from power plant plumes, vehicular emissions, and other sources of these contaminants. The EPA has been a leader in the effort to reduce human and environmental exposure to toxic and hazardous pollutants.

However, the broadening of the EPA's role into regulating climate forcings based on model predictions, as reported in the Fox News article, is a significant concern.

I would be interested in a dialog with them, based on the bottom-up, resource-based vulnerability perspective presented in our paper

Pielke Sr., R.A., R. Wilby, D. Niyogi, F. Hossain, K. Dairuku, J. Adegoke, G. Kallos, T. Seastedt, and K. Suding, 2011: Dealing  with complexity and extreme events using a bottom-up, resource-based  vulnerability perspective. AGU Monograph on Complexity and  Extreme Events in Geosciences, in press

to see what areas of risk should fit within their regulatory framework. As we wrote in that paper, the bottom-up, resource-based

“concept requires the determination of the major threats to local and regional water, food, energy, human health, and ecosystem function resources from extreme events including climate, but also from other social and environmental issues. After these threats are identified for each resource, then the relative risks can be compared with other risks in order to adopt optimal preferred mitigation/adaptation strategies.”

In my view, this is the way forward with respect to assessing “sustainability”, and discussions should be undertaken to ascertain if the EPA is the right venue to do this.
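To make that comparison step concrete, here is a purely illustrative sketch: the five resource categories come from our paper, but the threats and relative-risk scores are invented placeholders, not actual assessments.

```python
# Hypothetical organization of a bottom-up, resource-based risk comparison.
# Resource categories follow Pielke Sr. et al. (2011); threats and scores are placeholders.
risks = {
    "water": {"drought": 8, "upstream land-use change": 6, "contamination": 4},
    "food": {"drought": 7, "pest outbreaks": 5, "market disruption": 5},
    "energy": {"heat-wave demand peaks": 6, "fuel supply interruption": 5},
    "human health": {"heat waves": 6, "vector-borne disease": 4},
    "ecosystem function": {"land fragmentation": 7, "invasive species": 6, "pollution": 5},
}

# Rank the largest relative risk for each resource so that mitigation and
# adaptation effort can be directed at the biggest threats first.
for resource, threats in risks.items():
    top_threat, score = max(threats.items(), key=lambda kv: kv[1])
    print(f"{resource}: largest relative risk is '{top_threat}' (score {score})")
```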

As reported in the Fox News article, however, the EPA is considering broadening its regulatory authority without building on a sound scientific foundation. There is no evidence that its approach to sustainability uses the inclusive, bottom-up assessment approach given in our 2011 paper.

If the EPA persists in using the top-down IPCC approach to develop impact assessments, they will inevitably develop seriously flawed policy responses.


Filed under Climate Change Regulations, Climate Science Op-Eds, Examples Of Waste Of Funding

Meeting Announcement “Non-CO2 Influences Of Land Cover Changes On Climate” At The European Geosciences Union General Assembly – Vienna, 22 – 27 April 2012

I received this meeting announcement today. I am pleased to see this much-needed broadening beyond CO2 and a few other greenhouse gases. The use of carbon stocks as the measure of climate change is, by itself, insufficient to characterize the climate system. The meeting will add important new insight into non-CO2 climate forcings. The information on the meeting follows.

I wanted to bring to your attention a session entitled “Non-CO2 influences of land cover changes on climate” at the European Geosciences Union General Assembly (Vienna, 22 – 27 April 2012), that Pierre Bernier, Vivek Arora, Alvaro Montenegro and I are organizing. Abstracts must be submitted by 17 January 2012. We hope that you can participate. It would be a chance to get researchers in this field to come together to present and discuss findings.

Sincerely

Neil Bird (on behalf of Pierre, Vivek and Alvaro)

Ps. Please forward this email to colleagues that may be interested

BG2.4 Non-CO2 influences of land cover changes on climate

Convener: P.Y. Bernier. Co-Conveners: D. N. Bird, V. Arora, A. Montenegro. Abstract submission: http://meetingorganizer.copernicus.org/EGU2012/abstractsubmission/10343


Changes in land cover properties that accompany land use changes can impact climate. Changes in carbon stocks are used as a convenient proxy for these climate impacts, but changes in other land cover properties can also affect the climate in ways that can amplify or diminish the effect of carbon stock changes. This session will be open to presentations on changes in albedo, latent heat transfer and other mechanisms through which land cover changes affect climate at regional to global scales.

http://meetingorganizer.copernicus.org/EGU2012/session/10343

source of image 

Comments Off on Meeting Announcement “Non-CO2 Influences Of Land Cover Changes On Climate” At The European Geosciences Union General Assembly – Vienna, 22 – 27 April 2012

Filed under Climate Science Meetings

What Climategate 2.0 Says About The Prediction Of Multi-Decadal Regional Climate Change

Marcel Crok has alerted us to the post by Maurizio Morabito titled

On The Slow, Painful (and Deadly) Demise Of The IPCC

at the weblog The Unbearable Nakedness of Climate Change

The post involves a discussion of the predictive skill of multi-decadal regional climate predictions. It is essential to recall that climate models must not only be able to skillfully predict current climate statistics but also how these statistics would change due to human intervention in the climate system.

Vast amounts of money are being spent (and wasted) on the claim that such multi-decadal climate change predictions are accurate and can be used by the impacts community. See, for example, my recent posts

The Huge Waste Of Research Money In Providing Multi-Decadal Climate Projections For The New IPCC Report

Misleading Climate Science – An Example Of Multi-Decadal Regional Climate Predictions With No Demonstrated Skill On That Time Scale

I have reproduced the post On The Slow, Painful (and Deadly) Demise Of The IPCC below, retaining the highlighted text.

Climategate 2.0 is helping filling some knowledge gaps, for example in the way the IPCC has been slowing killing itself, and several thousands humans to. The following concerns Regional Projections, and it’s a tragedy of communication.

Willingly or not, the IPCC has become a source of deadly confusion exactly because it has provided the information its audience wanted, even if it was scientifically unprepared to prepare that information.

It’s the year 2000 and the IPCC Third Assessment Report (TAR) is being prepared. As we can see in file 0598.txt, there is a frank exchange of opinions about WG1-Chapter 10 “Regional Climate Information – Evaluation and Projections”, between Filippo Giorgi (Coordinating Lead Author), Hans Von Storch (Lead Author), Jens H. Christensen (Lead Author) (my emphasis of course):

Giorgi:

Under the “encouragement” of Sir John, we also decided to add a text box on what we can say about regional climate change over different continents. This will probably be the most-read part of the chapter, so we need to be very careful with it. I and Peter will produce a draft to circulate. I know that originally we did not want to do this, but this is what they are asking us to do and it is now very clear that it is the main purpose of the chapter, so we have to do it.

Von Storch:

First, I don’t think that John Houghton is particularly qualified in saying anything about regional assessments. So far as I know he has no relevant official capacity in the process,and he has not been particularly helpful in SAR. Actually, I consider him a politically interested activist and not as a scientist.[…] I do not agree [about adding the text box]. What were the arguments we originally did not want to do this? What are the new arguments overriding our previous concerns? I am sure that people would love to read this statement in New York Times. We don’t feel confident to make a statement, and then, suddenly, under the encouragement of Sir John, we include it? This is truly embarrassing. If the purpose of the Chapter is to produce statements on regions, and we found we can not do that, what should the assessment be? Simply: “We can not do it at this time, but we have a variety of techniques to derive scenarios. However, for various reasons, we can not say that they are consistent, even if there is some convergence.”

Giorgi:

This is an important point […] In my eyes Sir John represents the typical reader of this report and if he made that comment and “encouragement” it means that our chapter is not sending the proper message (after all he is one of the chairs of IPCC WGI). You may remember that I was always of the opinion that we were talking too much about techniques and too little about climate change. Now I think that we need to change that to the extent possible: reduce technical issues, increase climate change information. We actually already have a lot of that information in there, especially in the AOGCM part. What Sir John asked was to make it more “legible”, and we decided in Auckland to make it in the form of a box. We cannot invent information of course, but we can condense it in this box by including 1) the info relative to what AOGCMs sy for different continent, which is already there; 2) all possible other info from the techniques. If there is none or if we can say nothing we’ll say we cannot do it for that specific region. but I think we need to do something because the way it is, the chapter does not address the right audience, which is not only made up only of scientists.

AGAIN, I WOULD LIKE TO KNOW FROM ALL OF YOU WHETHER WE SHOULD HAVE A BOX IN THERE WITH INFORMATION FOR REGIONS OR NOT. I DO BELIEVE THAT THIS WOULD BE THE MOST READ PART OF THE CHAPTER, WHICH WOULD BE A PROOF THAT NOW WE ARE MISSING THE TARGET. SO LET EVERYONE ELSE KNOW.

Christensen:

I just want to add my opinion on this. I do agree with the point that we have to offer the regional information available. By setting up the box with the regions, we will provide the obvious assessment over many regions, which Hans has put forward so simple: The quality of the global models are too poor to give any clear information about regional climate change. We can state for the various regions, where there is some information, to what extend there is agreement between models etc. However, even agreement amongst models does not at this stage allow for any thorough assessment about uncertainties about changes. This must come out crystal clear, even if this will be the message for all regions! At least we will make out point about assessing regional climate change very clear this way.

Months later, the report comes out. Houghton’s text box has become “Box 10.1: Regional climate change in AOGCMs which use SRES emission scenarios” (page 600 here). Caveats are in place:

Introduction This box summarises results on regional climate change obtained from a set of nine AOGCM simulations undertaken using SRES preliminary marker emission scenarios A2 and B2. […].These results should be treated as preliminary only.

However, Christensen’s cautionary suggestion is totally reversed, and agreement among models is seen as a measure of certainty of changes “relatively speaking”:

Agreement across the different scenarios and climate models suggests, relatively speaking, less uncertainty about the nature of regional climate change than where there is disagreement

It’s now 2007. The equivalent chapter is AR4 WG 1-Chapter 11 “Regional Climate Projections“. Christensen is now a Coordinating Lead Author, Giorgi a Review Editor. And what has happened to the chapter?

  1. The “Summary of the Third Assessment Report” is mostly a summary of Box 10.1 described above. Everything else has been thrown in the classical bin
  2. The whole chapter in 2007 is actually a giant version of Box 10.1 in 2003
  3. Amazingly (and unscientifically) we’re being spoken of some “robust findings on regional climate change for mean and extreme precipitation, drought and snow”
Giorgi’s “target” has been achieved. The “most read part in the chapter” has become the whole chapter. A description of the current knowledge has become less important than providing what people asked. The audience has won, and the science has lost.
Then it gets worse.
Even in 2007, regional changes described by the IPCC are for the 2080-2099. Captions are very explicit on the subject. For example look IPCC AR4 WG1 report, section 11.2  about Africa. In particular, figure 11-2 is about “Temperature and precipitation changes over Africa from the MMD-A1B simulations“. Both in the text and in the caption, the projected time period of 2080-2099 is clearly indicated.
In 2011 however, Chris Funk feels compelled to write a column for Nature, published on Aug 3 as “We thought trouble was coming“, describing “how his group last year forecast the drought in Somalia that is now turning into famine — and how that warning wasn’t enough” and in particular lamenting that
The global climate models used by the Intergovernmental Panel on Climate Change were never intended to provide rainfall trend projections for every region. These models say that East Africa will become wetter, yet observations show substantial declines in spring rainfall in recent years. Despite this, several agencies are building long-term plans on the basis of the forecast of wetter conditions..

Those agencies might have foolishly misunderstood the IPCC message. Perhaps, they believe too much in it, missing therefore the small print indicating wetter conditions are expected 70 years since.

And so we have gone full circle. Originally provided by scientists ready to stretch the science on the “encouragement” by Sir John Houghton, in the space of a single decade Regional Projections have gone on to become an unwittingly deadly tool.

As added information, I was invited to serve as one of the contributing authors of the regional modeling part of the 1995 IPCC report [with which Filippo Giorgi was also involved]. I resigned from the IPCC as documented in the letter below (see also my post on this letter).

The erroneous IPCC presentation of multi-decadal regional climate prediction skill continues today (2011).  The views I expressed in the letter are further bolstered by these Climategate 2.0 e-mails.


Filed under Climate Models, Climategate e-mails

November 2011 University Of Alabama Analysis Of Global Lower Tropospheric Temperatures

Phil Gentry has provided the November 2011 update of the University of Alabama lower tropospheric temperature analysis.

Global temperature record reaches one-third century

Global Temperature Report: November 2011

Global climate trend since Nov. 16, 1978: +0.14 C per decade

November temperatures (preliminary)

Global composite temp.: +0.12 C (about 0.22 degrees Fahrenheit) above 30-year average for November.

Northern Hemisphere: +0.07 C (about 0.15 degrees Fahrenheit) above 30-year average for November.

Southern Hemisphere: +0.17 C (about 0.31 degrees Fahrenheit) above 30-year average for November.

Tropics: +0.02 C (about 0.04 degrees Fahrenheit) above 30-year average for November.

October temperatures (revised):

Global Composite: +0.12 C above 30-year average

Northern Hemisphere: +0.17 C above 30-year average

Southern Hemisphere: +0.06 C above 30-year average

Tropics: -0.05 C below 30-year average

(All temperature anomalies are based on a 30-year average (1981-2010) for the month reported.)
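For readers unfamiliar with how such anomalies are constructed, the sketch below illustrates the baseline arithmetic using the 1981-2010 reference period noted above; the data handling is an illustrative assumption, not the UAH processing code. It also shows why a +0.12 C anomaly corresponds to roughly +0.22 F, since temperature differences convert with the factor 9/5 only (no +32 offset).

```python
import numpy as np

def monthly_anomalies(years, months, temps, base_start=1981, base_end=2010):
    """Anomaly = observed monthly mean minus the base-period mean for that calendar month."""
    years, months, temps = map(np.asarray, (years, months, temps))
    anomalies = np.empty(temps.shape, dtype=float)
    for m in range(1, 13):
        in_base = (months == m) & (years >= base_start) & (years <= base_end)
        climatology = temps[in_base].mean()           # 30-year mean for this calendar month
        anomalies[months == m] = temps[months == m] - climatology
    return anomalies

# Converting an anomaly (a temperature difference) from C to F:
print(0.12 * 9.0 / 5.0)   # about 0.22 F
```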

Notes on data released Dec. 16, 2011:

The end of November 2011 completes 33 years of satellite-based global temperature data, according to John Christy, a professor of atmospheric science and director of the Earth System Science Center (ESSC) at The University of Alabama in Huntsville. Globally averaged, Earth’s atmosphere has warmed about 0.45 Celsius (about 0.82° F) during the almost one-third of a century that sensors aboard NOAA and NASA satellites have measured the temperature of oxygen molecules in the air.

This is at the lower end of computer model projections of how much the atmosphere should have warmed due to the effects of extra greenhouse gases since the first Microwave Sounding Unit (MSU) went into service in Earth orbit in late November 1978, according to satellite data processed and archived at UAHuntsville’s ESSC.

“While 0.45 degrees C of warming is noticeable in climate terms, it isn’t obvious that it represents an impending disaster,” said Christy. “The climate models produce some aspects of the weather reasonably well, but they have yet to demonstrate an ability to confidently predict climate change in upper air temperatures.”

The atmosphere has warmed over most of the Earth’s surface during the satellite era. Only portions of the Antarctic, two areas off the southwestern coast of South America, and a small region south of Hawaii have cooled. On average, the South Pole region has cooled by about 0.05 C per decade, or 0.16 C (0.30° F) in 33 years. The globe’s fastest cooling region is in the central Antarctic south of MacKenzie Bay and the Amery Ice Shelf. Temperatures in that region have cooled by an annual average of about 2.36 C (4.25° F).

The warming trend generally increases as you go north. The Southern Hemisphere warmed 0.26 C (0.46° F) in 33 years while the Northern Hemisphere (including the continental U.S.) warmed by an average of 0.65 C (1.17° F).

The greatest warming has been in the Arctic. Temperatures in the atmosphere above the Arctic Ocean warmed by an average of 1.75 C (3.15° F) in 33 years. The fastest warming spot is in the Davis Strait, between the easternmost point on Baffin Island and Greenland. Temperatures there have warmed 2.89 C (about 5.2° F).

While Earth’s climate has warmed in the last 33 years, the climb has been irregular. There was little or no warming for the first 19 years of satellite data.  Clear net warming did not occur until the El Niño Pacific Ocean “warming event of the century” in late 1997.  Since that upward jump, there has been little or no additional warming.

“Part of the upward trend is due to low temperatures early in the satellite record caused by a pair of major volcanic eruptions,” Christy said. “Because those eruptions pull temperatures down in the first part of the record, they tilt the trend upward later in the record.”

Christy and other UAHuntsville scientists have calculated the cooling effect caused by the eruptions of Mexico’s El Chichon volcano in 1982 and the Mt. Pinatubo volcano in the Philippines in 1991. When that cooling is subtracted, the long-term warming effect is reduced to 0.09 C (0.16° F) per decade, well below computer model estimates of how much global warming should have occurred.

Although volcanoes are a natural force, eruptions powerful enough to affect global climate are rare and their timing is random. Since that timing has a significant impact on the long-term climate trend (almost as much as the cooling itself), it makes sense to take their chaotic effect out of the calculations so the underlying climate trend can be more reliably estimated.

What it doesn’t do is tell scientists how much of the remaining warming is due to natural climate cycles (not including volcanoes) versus humanity’s carbon dioxide emissions enhancing Earth’s natural greenhouse effect.
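The report does not spell out the exact adjustment procedure, but the underlying point, that cool anomalies concentrated early in a record tilt a fitted linear trend upward, can be illustrated with a simple sketch. The synthetic data and the post-eruption windows below are assumptions for illustration only, not the UAH method.

```python
import numpy as np

def trend_per_decade(t, y):
    return 10.0 * np.polyfit(t, y, 1)[0]

# Synthetic monthly anomalies with a 0.14 C/decade underlying trend plus noise.
rng = np.random.default_rng(0)
t = 1979 + np.arange(12 * 33) / 12.0
anom = 0.014 * (t - t[0]) + 0.1 * rng.standard_normal(t.size)

# Illustrative post-eruption windows (El Chichon 1982, Pinatubo 1991); impose a
# temporary 0.3 C cooling. The real analysis may subtract an estimated signal instead.
volcanic = ((t >= 1982.3) & (t <= 1984.3)) | ((t >= 1991.5) & (t <= 1993.5))
cooled = anom - 0.3 * volcanic

print("Trend with volcanic cooling included :", trend_per_decade(t, cooled))
print("Trend with affected months excluded  :", trend_per_decade(t[~volcanic], cooled[~volcanic]))
```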

“That is the Holy Grail of climatology,” said Roy Spencer, a principal research scientist in the ESSC, a former NASA scientist and Christy’s partner in the satellite thermometer project for more than 20 years. “How much of that underlying trend is due to greenhouse gases? While many scientists believe it is almost entirely due to humans, that view cannot be proved scientifically.”

When the first MSU went into orbit in 1978, it wasn’t designed for monitoring long-term changes in the climate. Instead, it was built to give meteorologists two temperature readings a day over about 96 percent of the planet to provide input into computerized weather prediction models, the forerunners of climate models.

“All of the satellite instruments but one were designed to measure day-to-day weather changes, not long-term climate,” said Spencer. “It has been a challenge to make the necessary corrections to the data so we can use the instruments for long-term climate monitoring.”

While the satellite data record is shorter than the surface thermometer record, it has several strengths. It has the greatest global coverage: With 96 percent coverage of the globe (except for small areas around the north and south poles), the satellite sensors cover more than twice as much of Earth’s surface as do thermometers.

It is also less likely than surface-based thermometers to be influenced by local development, Spencer said.  Urbanization typically contributes to local warming due to the asphalt effect, when paving and buildings absorb and convert into heat sunlight that would naturally have been reflected back into space.

While that heat can raise temperatures recorded by thermometers at surface weather stations, the effect on the atmosphere is so local and so shallow that it dissipates before it can heat the deep atmosphere above it. As a result, satellite measurements have shown no indication of an urban contamination effect, Spencer said.

Another strength is that the microwave sensors gather temperature data for a deep layer of the atmosphere, rather than just at the surface.

“What we look at is a bulk measurement of the atmosphere’s heat content,” Christy said. “That is the physical quantity you want to measure to best monitor changes in the climate.  Plus, it’s consistent. You can take a single satellite ‘thermometer’ and measure the temperature of the whole Earth, rather than just at a single spot.”

While the satellite dataset has its strengths, unlike thermometers and temperature probes used on weather balloons the Microwave Sounding Units were new, largely untested tools when they were put into space. Spencer, Christy and other scientists have had to develop small corrections that they use every month to reduce errors caused by the satellites losing altitude or drifting in their orbits.

While year-to-year temperature variations measured by the satellite sensors closely match those measured by both surface thermometers and weather balloons, it is the long-term warming trend on which the satellites and the surface thermometers disagree, Spencer said, with the surface warming faster than the deep layer of the atmosphere.

If both instruments are accurate, that means something unexpected is happening in the atmosphere.

“The satellites should have shown more deep-atmosphere warming than the surface, not less” he said. “Whatever warming or cooling there is should be magnified with height. We believe this is telling us something significant about exactly why the climate system has not warmed as much as expected in recent decades.”

Publication of the November 2011 Global Temperature Report was delayed by several days due to a ground station malfunction.

Archived color maps of local temperature anomalies are available on-line at:

http://nsstc.uah.edu/climate/

The processed temperature data is available on-line at:

vortex.nsstc.uah.edu/data/msu/t2lt/uahncdc.lt

As part of an ongoing joint project between UAHuntsville, NOAA and NASA, Christy and Spencer use data gathered by advanced microwave sounding units on NOAA and NASA satellites to get accurate temperature readings for almost all regions of the Earth. This includes remote desert, ocean and rain forest areas where reliable climate data are not otherwise available.

The satellite-based instruments measure the temperature of the atmosphere from the surface up to an altitude of about eight kilometers above sea level. Once the monthly temperature data is collected and processed, it is placed in a “public” computer file for immediate access by atmospheric scientists in the U.S. and abroad.

Neither Christy nor Spencer receives any research support or funding from oil, coal or industrial companies or organizations, or from any private or special interest groups. All of their climate research funding comes from federal and state grants or contracts.


Filed under Climate Change Metrics

Interesting Article From Reason.Com On The Relative Roles Of Climate And Societal Influences On Vulnerability

There is an excellent article on Reason.com which fits within the bottom-up, resource-based vulnerability perspective that we present in our paper

Pielke Sr., R.A., R. Wilby, D. Niyogi, F. Hossain, K. Dairuku, J. Adegoke, G. Kallos, T. Seastedt, and K. Suding, 2011: Dealing  with complexity and extreme events using a bottom-up, resource-based  vulnerability perspective. AGU Monograph on Complexity and  Extreme Events in Geosciences, in press.

It also further supports the conclusions on the relative roles of climate change and societal change that have been reported in my son's book

The Climate Fix

and in his other publications.

It is also critically important to distinguish between risks from extreme weather events that have occurred in the historical and recent paleo-record and possible CHANGES in the statistics of extreme weather events due to human influences on the climate system.

The article by Ronald Bailey is titled

Weathering Man-Made Climate Change

The article reads [highlight added]

Poverty, not global warming, is the cause of death and destruction in the face of extreme weather.

Ronald Bailey | November 22, 2011

A new United Nations report projects man-made global warming will boost the damage caused by heat waves, coastal floods, and droughts as they get worse by the end of the century.

In a press release about the report, Special Report for Managing the Risks of Extreme Events and Disasters to Advance Climate Change Adaptation (SREX), Intergovernmental Panel of Climate Change Co-Chair Qin Dahe expressed high confidence that temperatures have increased due to man-made global warming. The study further expressed medium confidence that droughts had increased in some areas as a result of man-made climate change. However, the researchers could not draw firm conclusions about the effects of climate change on any trends in hurricanes, typhoons, hailstorms, or tornadoes. (The full report detailing the scientific work behind the study will not be released until February.)

It is generally agreed that the average temperatures over land have increased by about 1° Celsius[PDF] since the 1950s. Looking toward the end of the 21st century, the report relies on computer model projections which suggest that 1-in-20 year hottest day events are to become a 1-in-2 year events. The report also projects that inundations that once happened every 20 years are likely to occur every five years.

Sounds bad, but that’s a hundred years from now. With regard to the next few decades, the researchers more sanguinely report,“Projected changes in climate extremes under different emissions scenarios generally do not strongly diverge in the coming two to three decades, but these signals are relatively small compared to natural climate variability over this time frame. Even the sign of projected changes in some climate extremes over this time frame is uncertain.” That means that weather extremes for the next several decades will likely be within the limits of natural variation, making it almost impossible to discern any effect of man-made climate change on them. In other words, whatever weather disasters do occur will not be on a scale or frequency beyond those that humanity has experienced in recent decades.

More crucially, the U.N. report acknowledges, “In many regions, the main drivers for future increases in economic losses due to some climate extremes will be socioeconomic in nature.” The upshot is that any increase in weather disaster damage is largely due to an increase in what can potentially be destroyed and the number of people exposed to it.

Can researchers discern any effect that the recent increase in global average temperature has had on people and their property? Not really.

For example, a recent Reason Foundation report [PDF],Wealth and Safety: The Amazing Decline in Deaths from Extreme Weather in an Era of Global Warming, 1900–2010, notes,“Aggregate mortality attributed to all extreme weather events globally has declined by more than 90 percent since the 1920s, in spite of a four-fold rise in population and much more complete reporting of such events.” The death rate from droughts is 99.9 percent lower than it was in the 1920s; the death rate from floods is 98 percent lower; and the death rate from big storms like hurricanes has declined more than 55 percent since the 1970s.

Keep in mind that the death rate due to extreme weather between 2001 and 2010 averaged about 38,000 per year compared to about 59 million annual deaths for all causes. The Reason Foundation report concludes, “While extreme weather-related events, because of their episodic nature, garner plenty of attention worldwide, their contribution to the global mortality burden—0.07 percent of global deaths—is relatively minor.”

What about economic losses? Proponents of catastrophic man-made climate change have been seeking evidence that it is boosting risks among the weather damage and loss data. A recent review article[PDF], “Have Disaster Losses Increased Due to Anthropogenic Climate Change?” by Laurens Bouwer, published in the Bulletin of the American Meteorological Society (BAMS), surveyed 22 studies looking at trends in natural hazard losses. Bouwer, a researcher in the Institute for Environmental Studies at Vrije University in the Netherlands, included studies that all looked at economic losses, covered at least 30 years of data, and were peer reviewed.

Generally loss data are normalized to take into account inflation, and changes in exposure and vulnerability associated with increases in wealth and population. The BAMS review found, “The studies show no trends in the losses, corrected for change (increases) in population and capital at risk, that could be attributed to anthropogenic climate change. Therefore, it can be concluded that anthropogenic climate change so far has not had a significant impact on losses from natural disasters.”

Another recent study, “Normalizing economic loss from natural disasters: global analysis,” by Eric Neumayer and Fabian Barthel, two researchers associated with the Grantham Research Institute on Climate Change and the Environment at the London School of Economics and Political Science, also probed trends in weather disaster loss data in search of a global warming signal. Besides using conventional techniques that take into account increases in population and wealth to normalize losses, they also develop an alternative technique that looks at relative losses over time. Briefly, their new measure looks at how much actual loss occurred relative to the amount that was at risk. For example, what percentage of wealth in Miami was destroyed by hurricanes in 1920 versus 2010? If the actual-to-potential-loss ratio is increasing over time, this suggests that the weather is having a growing impact.

Analyzing weather disasters between 1980 and 2009, Neumayer and Barthel find, “Both methods lead to the same result for all disasters: no significant trend over time according to the conventional method, a marginally significant downward trend according to the alternative method.” Applying both normalization methods, they find no significant trends in weather related losses for both developed and developing countries. Looking regionally at North America, Western Europe, Latin America and the Caribbean, and South and East Asia also uncovers no statistically significant trend in losses caused by weather disasters. In addition, two 2009 studies found no upward trend in normalized losses dues to windstorm or floods in Western Europe since 1970. One concluded,“Results show no detectable sign of human-induced climate change in normalized flood losses in Europe.”

Neumayer and Barthel, using their alternative normalization method, do identify a “strongly negative trend” in normalized weather disaster damages in developed countries. They speculate,“This could possibly indicate a stronger capability of richer nations to fund defensive mitigating measures, which decrease vulnerability to natural disasters over time.” Richer societies are likely reducing their weather losses by establishing better early warning systems, enacting stronger building codes, and constructing firmer levees. People may be protecting themselves ever better against the consequences of storms and floods, even though the weather is getting worse.

Although no upward trend in weather damages can be found in developing countries, the U.N.’s SREX report does note that fatality rates and economic losses as a proportion of GDP from weather disasters are higher in poor countries. In fact, between 1970 and 2008, 95 percent of deaths from natural disasters occurred in developing countries. Bad weather produces death and destruction largely when it encounters poverty.

Let’s conclude with two observations: First, recent research indicates that man-made climate change has not been nor is it likely to be a big contributor to losses stemming from weather disasters in the next few decades. Second, boosting the wealth of poor people through economic growth is their best protection against meteorological disasters in the long run, whether fueled by future man-made climate change or not.

Ronald Bailey is Reason’s science correspondent. His book Liberation Biology: The Scientific and Moral Case for the Biotech Revolution is available from Prometheus Books.
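For a concrete sense of the loss "normalization" discussed in the excerpt above, here is a minimal sketch of the usual logic: a past loss is scaled by changes in prices, population, and per-capita wealth so that losses from different years reflect comparable exposure. The function and all numbers below are placeholders, not the datasets or code used in the studies cited.

```python
def normalize_loss(loss, year_cpi, base_cpi, year_pop, base_pop, year_wealth_pc, base_wealth_pc):
    """Scale a historical loss to a base year's price level, population, and
    per-capita wealth, so losses across years reflect comparable exposure."""
    inflation_adj = base_cpi / year_cpi
    population_adj = base_pop / year_pop
    wealth_adj = base_wealth_pc / year_wealth_pc
    return loss * inflation_adj * population_adj * wealth_adj

# Hypothetical storm: $1.0 billion of damage in 1980, normalized to 2010 conditions.
print(normalize_loss(1.0e9,
                     year_cpi=82.4, base_cpi=218.1,           # price level (illustrative)
                     year_pop=226e6, base_pop=309e6,          # population at risk (illustrative)
                     year_wealth_pc=1.0, base_wealth_pc=1.8)) # real wealth per capita index
```

The Neumayer and Barthel alternative described above is a ratio rather than a rescaling: divide each year's actual loss by an estimate of the total wealth exposed, and test whether that ratio trends upward over time.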


Filed under Vulnerability Paradigm

An Important 2011 AGU Presentation "Tree-Ring Extension Of Precipitation Variability For Eastern Nevada: Implications For Drought Analysis In The Great Basin Region, USA" By F. Biondi And S. Strachan

There is an interesting abstract that was presented at the 2011 AGU Conference in San Francisco by Franco Biondi and Scotty Strachan.

The title and abstract are [highlight added]

ABSTRACT FINAL ID: H42G-08;
TITLE: Tree-Ring Extension of Precipitation Variability for Eastern Nevada: Implications for
Drought Analysis in the Great Basin Region, USA
SESSION TYPE: Oral
SESSION TITLE: H42G. The Past, Present, and Future of Global and Regional Droughts I
AUTHORS (FIRST NAME, LAST NAME): Franco Biondi1, Scotty D Strachan1
INSTITUTIONS (ALL): 1. DendroLab, University of Nevada, Reno, NV, United States.

ABSTRACT BODY: In the Great Basin of North America, ecotonal environments characterized as lower forest border sites are ideally suited for tree-ring reconstructions of hydroclimatic variability. A network of 22 tree-ring chronologies, some longer than 800 years, from single-leaf pinyon (Pinus monophylla) tree-ring samples for eastern Nevada, in the central Great Basin of North America was used to analyze long-term precipitation variability. The period in common among all tree-ring chronologies, i.e. 1650-1976, was used to reconstruct October-May total precipitation using the Line of Organic Correlation (LOC) method. Individual site reconstructions were then combined using spatio-temporal kriging to produce annual maps of drought on a 12×12 km grid. Hydro-climatic episodes were numerically identified and modeled using their duration, magnitude, and peak, to estimate the likelihood of severe and sustained drought in this region. According to a numerical scoring rule explained in detail by Biondi et al. 2008, the most remarkable episode in the entire reconstruction was the early 1900s pluvial, followed by the late 1800s drought. The 1930s ‘Dust Bowl’ drought was in 8th position, making it one of the more remarkable episodes in the past few centuries. This result is consistent with other studies that show how regional drought severity varies going from western to eastern Nevada, and directly addresses the needs of water managers with respect to planning for ‘worst case’ scenarios of drought duration and magnitude. For instance, it is possible to analyze which geographical areas and hydrographic basins are more likely to be impacted during the most extreme droughts, at the annual (see Figure) or multiannual timescale. In the semi-arid western USA, multi-century long dendroclimatic records with km-scale spatial resolution can therefore provide water managers with a quantitative evaluation of climate episodes well beyond the envelope of instrumental records, thereby increasing the ability to design management practices for single watersheds with the objective to achieve drought resiliency.
http://dendrolab.org
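As a rough illustration of the calibration step named in the abstract, the Line of Organic Correlation fits a line whose slope is the ratio of the standard deviations (with the sign of the correlation) and which passes through the means of the two series. The sketch below applies that definition to placeholder ring-width and precipitation values; it is not the authors' code.

```python
import numpy as np

def loc_fit(x, y):
    """Line of Organic Correlation: slope = sign(r) * sd(y) / sd(x),
    with the intercept chosen so the line passes through the means."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    r = np.corrcoef(x, y)[0, 1]
    slope = np.sign(r) * y.std(ddof=1) / x.std(ddof=1)
    intercept = y.mean() - slope * x.mean()
    return slope, intercept

# Placeholder calibration data: ring-width index vs. October-May precipitation (mm).
ring_index = np.array([0.80, 1.10, 0.90, 1.30, 0.70, 1.00, 1.20, 0.95])
precip_mm = np.array([180., 260., 210., 300., 150., 230., 280., 220.])

slope, intercept = loc_fit(ring_index, precip_mm)
# Apply the calibration to the pre-instrumental part of the chronology to
# reconstruct past October-May precipitation totals.
print(slope * np.array([1.05, 0.85, 1.20]) + intercept)
```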

The publications of this outstanding research group can be viewed here.

One implication from their study is that we can use this paleo and recent historical data to estimate what the consequences would be if these past climatic events recurred with today's society and environment. This type of analysis fits with the bottom-up, resource-based [contextual] vulnerability perspective that we have proposed in our paper

Pielke Sr., R.A., R. Wilby, D. Niyogi, F. Hossain, K. Dairuku, J. Adegoke, G. Kallos, T. Seastedt, and K. Suding, 2011: Dealing  with complexity and extreme events using a bottom-up, resource-based  vulnerability perspective. AGU Monograph on Complexity and  Extreme Events in Geosciences, in press.


Filed under Climate Change Metrics, Vulnerability Paradigm