Monthly Archives: August 2011

Study Of Value-Added By Type 2 Dynamic Downscaling – “Regional Climate Models Add Value To Global Model Data” By Feser Et Al 2011

UPDATE #2 September 6 2011 I have corrected the journal where the Feser et al paper is in press. It will appear in the Bulletin of the American Meteorological Society (BAMS).

UPDATE September 2 2011

I sent the e-mail below to the authors of the paper, and the reply follows

COMMENT E-MAIL

From: Roger Pielke Date: Sat, Aug 27, 2011 at 5:47 PM
Subject: Your in press J of Climate paper Feser et al 2011
To: Hans von Storch, Burkhardt Rockel, Frauke.Feser@hzxx

Hi Dr. Feser, Hans and Burkhardt

I plan to post the information below later this coming week. Your
paper is an important contribution but I am concerned that readers
will interpret that your conclusions apply to the value of dynamic
downscaling from multi-decadal global climate predictions. I would be
glad if you could respond and I will post as a Reply on my weblog.

With Best Regards

Roger

REPLY

From: Frauke Feser Frauke.Feser@xxx
To: pielkesr@xxx
Cc: Hans von Storch, Burkhardt Rockel
Date: Fri, Sep 2, 2011 at 2:21 AM
Subject: Your in press J of Climate paper Feser et al 2011

Dear Prof. Pielke,

thanks a lot for your comments on our article. So far it has to our knowledge not been shown that the RCM added value results also apply for future scenario simulations of Type 4. We will take this as a motivation to look into this more systematically soon, until then our results are valid only for Type 2 RCM hindcast simulations.

With best regards,

Frauke Feser

*************************************************

There is a new paper which adds to our understanding of value-added from Type 2 dynamic downscaling.  Type 2 dynamic downscaling is defined in Castro et al (2005) and has recently been defined further in a new submitted paper with Rob Wilby [which I will post on in the future]  as

Type 2 dynamic downscaling refers to regional weather (or climate) simulations. In this case, the regional model’s initial atmospheric conditions are forgotten, but the results still depend on the lateral boundary conditions from a numerical global model weather prediction (in which the initial observed atmospheric conditions are not yet forgotten), or from a global reanalysis, along with the land surface boundary conditions.  Reanalyses such as ERA-40, NCEP, and JRA-55 assimilate spatially discontinuous weather observations in order to estimate temperature, humidity, wind speeds and so forth at grid points covering the entire globe. Downscaling from reanalysis products defines the maximum forecast skill that is achievable with Types 3 and 4 downscaling.

Type 2 dynamic downscaling is quite distinct from Type 4 downscaling [Type 1 is with respect to numerical weather prediction and Type 3 is with respect to seasonal weather prediction, for example, where certain parts of the climate system, such as sea surface temperatures are prescribed]. Type 4 is defined below.

Type 4 dynamic downscaling takes lateral boundary conditions from an earth system model in which coupled interactions between the atmosphere, ocean, biosphere and cryosphere are predicted. Other than terrain, all components of the climate system are predicted except for human forcings, including greenhouse gas emissions scenarios, which are prescribed. Type 4 downscaling is widely used to provide policymakers with projections of climate impacts decades into the future.

Unfortunately, the new paper does not highlight that its findings do not apply to Type 4 dynamic downscaling (the framework of the IPCC multi-decadal climate predictions), except as an upper bound on the value-added that is possible.

Type 2 dynamic downscaling from reanalyses, in particular, has the advantage that the lateral boundary conditions and interior nudging are based on sampling of the continuous real-world atmospheric conditions. In stark contrast, Type 4 dynamic downscaling knows nothing about spatial scales smaller than those resolved by the global model.

The new paper is

Frauke Feser, Burkhardt Rockel, Hans von Storch, Jörg Winterfeldt, and Matthias Zahn, 2011: Regional Climate Models Add Value to Global Model Data – A Review and Selected Examples. Bull. Amer. Meteor. Soc., in press.

The abstract reads [highlight added]

An important challenge in current climate modeling is to realistically describe small scale weather statistics such as topographic precipitation, coastal wind patterns or regional phenomena like polar lows. Global climate models simulate atmospheric processes with increasingly higher resolutions, but still regional climate models have a lot of advantages. They consume less computation time due to their limited simulation area and thereby allow for higher resolution both in time and space as well as for longer integration times. Regional climate models can be used for dynamical downscaling purposes, as their output data can be processed to produce higher resolved atmospheric fields, allowing the representation of small-scale processes and a more detailed description of physiographic details (such as mountain ranges, coastal zones, and details of soil properties).

But does higher resolution add value when compared to global model results? Most studies implicitly assume that dynamical downscaling leads to output fields superior to the driving global data, but little work has been carried out to substantiate these expectations. Here, we review a series of articles that evaluate the benefit of dynamical downscaling by explicitly comparing results of global and regional climate model data to observations. These studies show that the regional climate model generally performs better for the medium spatial scales, but not always for the larger spatial scales.

We conclude that regional models can add value, but only for certain variables and locations; particularly those influenced by regional specifics such as coasts or mesoscale dynamics such as polar lows. Therefore, the decision of whether a regional climate model simulation is required depends crucially on the scientific question being addressed.

The following text documents that this study is about Type 2 downscaling and not Type 4 downscaling

“In this article, efforts to determine such added value in case studies as well as in multi-decadal simulations with different RCMs are summarized and evaluated. The simulations presented here comprise mostly ‘reconstructions’, e. g. simulations of the weather dynamics since 1948 until today of Western Europe or the Northwestern Pacific. Most of these simulations use a grid distance of about 50 km, have been constrained with spectral nudging (von Storch et al., 2000) and use global NCEP/NCAR reanalysis (Kalnay et al. 1996; hereafter referred to as the NCEP reanalysis) as forcing data.”

Among their findings is

The regional model does not add value over the open ocean, due to the lack of orographic details and infrequent meso-scale phenomena here. It may even be worse than the reanalyses, which is reflected by the negative BSSs [Brier Skill Score].
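The negative Brier Skill Scores mentioned here follow directly from the definition BSS = 1 − BS_forecast / BS_reference, which drops below zero whenever the forecast’s Brier score exceeds that of the reference (here, the reanalysis). A minimal sketch, using made-up probabilities rather than the paper’s data:

```python
import numpy as np

def brier_skill_score(forecast_probs, reference_probs, outcomes):
    """Brier Skill Score: BSS = 1 - BS_forecast / BS_reference.

    BSS > 0 means the forecast beats the reference; BSS < 0
    (as reported for the RCM over the open ocean) means the
    forecast is worse than the reference.
    """
    forecast_probs = np.asarray(forecast_probs, dtype=float)
    reference_probs = np.asarray(reference_probs, dtype=float)
    outcomes = np.asarray(outcomes, dtype=float)
    bs_forecast = np.mean((forecast_probs - outcomes) ** 2)
    bs_reference = np.mean((reference_probs - outcomes) ** 2)
    return 1.0 - bs_forecast / bs_reference

# Toy illustration (hypothetical numbers, not the paper's data):
outcomes = [1, 0, 1, 0, 1]                # observed event / no event
rcm = [0.9, 0.4, 0.5, 0.3, 0.6]           # hypothetical RCM probabilities
reanalysis = [0.8, 0.2, 0.9, 0.1, 0.7]    # hypothetical reanalysis probabilities

print(brier_skill_score(rcm, reanalysis, outcomes))  # negative: RCM worse here
```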

Their statement that

“We conclude that RCMs do indeed add value to global models for a number of applications, variables, and areas. If examined only at the regional scale, added value emerges very distinctly for many model variables, justifying the additional computational effort of RCM simulations.”

is correct but ONLY for Type 2 dynamic downscaling.



Filed under Climate Models, Research Papers

New Paper “Observed Changes In Surface Atmospheric Energy Over Land” By Peterson Et Al 2011

Several years ago, we proposed the use of surface moist enthalpy as the preferred metric for diagnosing the heat content of the near-surface air:

Pielke Sr., R.A., C. Davey, and J. Morgan, 2004: Assessing “global warming” with surface heat content. Eos, 85, No. 21, 210-211.

As we write in that paper [highlight added]

“Surface air temperature alone does not capture the real changes in surface air heat content of the Earth system. Even using the limited definition of the term “global warming,” the moisture content of the surface air must be included. Future assessments should include trends and variability of surface heat content in addition to temperature.”

In our follow-up paper

Davey, C.A., R.A. Pielke Sr., and K.P. Gallo, 2006: Differences between near-surface equivalent temperature and temperature trends for the eastern United States – Equivalent temperature as an alternative measure of heat content. Global and Planetary Change, 54, 19–32.

we wrote

There is currently much attention being given to the observed increase in near-surface air temperatures during the last century. The proper investigation of heating trends, however, requires that we include surface heat content to monitor this aspect of the climate system. Changes in heat content of the Earth’s climate are not fully described by temperature alone. Moist enthalpy or, alternatively, equivalent temperature, is more sensitive to surface vegetation properties than is air temperature and therefore more accurately depicts surface heating trends. The microclimates evident at many surface observation sites highlight the influence of land surface characteristics on local surface heating trends. Temperature and equivalent temperature trend differences from 1982–1997 are examined for surface sites in the Eastern U.S. Overall trend differences at the surface indicate equivalent temperature trends are relatively warmer than temperature trends in the Eastern U.S. Seasonally, equivalent temperature trends are relatively warmer than temperature trends in winter and are relatively cooler in the fall. These patterns, however, vary widely from site to site, so local microclimate is very important.

A new paper has appeared which examines this issue on a global scale [which I read about on Climate Abyss in a comment from Peter Thorne in response to my comment to John Nielsen-Gammon on his latest post on the Texas drought and heat]

Peterson, T. C., K. M. Willett, and P. W. Thorne (2011), Observed changes in surface atmospheric energy over land, Geophys. Res. Lett., 38, L16707, doi:10.1029/2011GL048442

with the abstract

“The temperature of the surface atmosphere over land has been rising during recent decades. But surface temperature, or, more accurately, enthalpy which can be calculated from temperature, is only one component of the energy content of the surface atmosphere. The other parts include kinetic energy and latent heat. It has been advocated in certain quarters that ignoring additional terms somehow calls into question global surface temperature analyses. Examination of all three of these components of atmospheric energetics reveals a significant increase in global surface atmospheric energy since the 1970s. Kinetic energy has decreased but by over two orders of magnitude less than the increases in both enthalpy and latent heat which provide approximately equal contributions to the global increases in heat content. Regionally, the enthalpy or the latent heat component can dominate the change in heat content. Although generally changes in latent heat and enthalpy act in concert, in some regions they can have the opposite signs.”

The Peterson et al article is an effective examination using the current data analyses from the HadCRUH land dataset and the Global Historical Climatology Network-Monthly (GHCN-M) Version 3.  It should, of course, be reevaluated when all of the uncertainties and biases we have identified, for example, in Pielke et al 2009, Klotzbach et al 2009 and Fall et al 2011, are remedied.

They show that global warming, as diagnosed from surface measurements over land, is actually larger than that diagnosed from the dry bulb temperature trends alone (as we also found in the Davey et al 2006 study).  The analysis of moist enthalpy should also be extended into the lower troposphere in order to see if the divergence between the surface and lower tropospheric temperature trends that we identified in Klotzbach et al 2009 can be explained when the trends in water vapor are included.

I disagree, however, with the following text in Peterson et al

The heat content of the upper ocean has become a heavily utilized metric of global climate change [e.g., Palmer et al., 2010]. Some authors argue that the heat content of the surface atmosphere should also be a key metric. Indeed, the “concept of ‘global warming’ requires assessments of units of heat (that is, Joules)” according to Pielke et al. [2004]. Davey et al. [2006] argue that global surface temperature is not a “proper” measure of the heat content of the Earth’s climate system; which is true as it is just a measure of temperature. But Pielke et al. [2007] go even further to claim that “ignoring concurrent trends in surface air absolute humidity therefore introduces a bias in the analysis of surface air temperature trends” and that we “need to include absolute humidity in order to describe observed temperature trends.”

Temperature and humidity are distinctly different physical parameters as implied by their units of K and g kg−1, and they are measured by different instrumentation. Therefore, we do not understand how ignoring humidity could bias an analysis of temperature trends or why an assessment of humidity would be required in order to describe trends in temperature. We do, however, have concerns about the potential for the general public to misinterpret heat content analysis.  Figure 1 shows that heat content tends to be decreasing in Australia despite increases in surface temperature. Presenting heat content as the primary metric for global warming could lead lay readers to erroneously perceive Australia as cooling – after all, its heat (content) is decreasing. Our concern is not just nomenclature. Heat content by any other name if used as a global warming metric has the potential to imply cooling even in places with increasing temperature simply because the location is becoming dryer.

The Peterson et al paper is incorrect in claiming that temperature and humidity are distinctly different physical parameters. Both are associated with heat, as they clearly show in their equation 3; i.e.

H = Cp T + L q

where Cp is the heat capacity at constant pressure and L is the heat of vaporization.
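As a quick numerical illustration of this equation (using the standard approximate constants Cp ≈ 1005 J kg-1 K-1 and L ≈ 2.5 x 10^6 J kg-1, and hypothetical temperatures and humidities, not values from the paper), a small drying can outweigh a modest warming in the moist enthalpy:

```python
# Moist enthalpy per kilogram of air, H = Cp*T + L*q.
# Constants are standard approximate values for air, not from the paper.
CP = 1005.0   # specific heat of air at constant pressure, J kg-1 K-1
LV = 2.5e6    # latent heat of vaporization, J kg-1

def moist_enthalpy(temp_k, q_kg_per_kg):
    """Moist enthalpy in J per kg of air."""
    return CP * temp_k + LV * q_kg_per_kg

# Hypothetical Australia-style case discussed in the text: the air
# warms by 0.5 K but dries by 1 g/kg, so H nevertheless decreases.
h_before = moist_enthalpy(298.0, 0.012)   # 25.0 C, 12 g/kg
h_after  = moist_enthalpy(298.5, 0.011)   # 25.5 C, 11 g/kg

print(h_after - h_before)  # about -2000 J/kg: warmer, yet less heat content
```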

A more important issue is that they have not recognized that the use of surface measurements to diagnose the radiative imbalance of the climate system requires identifying where the heat from this imbalance goes. Their paper is quite effective at bringing in the concept of the Bowen ratio, but they do not recognize that we need to monitor both types of heat (sensible, i.e. dry bulb, and latent) in order to diagnose the heat content changes in the surface air, and to properly interpret the observed trends in the dry bulb temperatures.

Perhaps this would be clearer to them if instead of writing that we

“need to include absolute humidity in order to describe observed temperature trends.”

we had written that we

“need to include absolute humidity in order to correctly explain observed temperature trends.”

Finally,  they write

“Presenting heat content as the primary metric for global warming could lead lay readers to erroneously perceive Australia as cooling – after all, its heat (content) is decreasing.”

However, if the heat content is decreasing because of drying, even though the dry bulb temperature trend is positive, it IS cooling in terms of Joules per kilogram of air!  There is less heat energy in this air than when the moist enthalpy was higher.

I suggest that both dry bulb temperature and moist enthalpy trends and anomalies be presented in climate analyses and in model predictions. This will provide a more complete diagnosis of the climate of the atmosphere (and of surface global warming) than using the dry bulb temperatures alone.



Filed under Climate Change Metrics, Research Papers

Hurricanes And Nor’easters in the late Holocene – Evidence For Large Natural Variability

We have been alerted by Ken Haapala of the Science and Environmental Policy Project to an interesting paper on paleo-climate which documents a large natural variation in the occurrence of hurricanes and nor’easters along the coast of North Carolina.

It is

Mallinson, D.J., Smith, C.W., Mahan, S., Culver, S.J. and McDowell, K. 2011. Barrier island response to late Holocene climate events, North Carolina, USA. Quaternary Research 76: 46-57

The abstract reads [highlight added]

The Outer Banks barrier islands of North Carolina, USA, contain a geologic record of inlet activity that extends from ca. 2200 cal yr BP to the present, and can be used as a proxy for storm activity. Optically stimulated luminescence (OSL) dating (26 samples) of inlet-fill and flood tide delta deposits, recognized in cores and geophysical data, provides the basis for understanding the chronology of storm impacts and comparison to other paleoclimate proxy data. OSL ages of historical inlet fill compare favorably to historical documentation of inlet activity, providing confidence in the technique. Comparison suggests that the Medieval Warm Period (MWP) and Little Ice Age (LIA) were both characterized by elevated storm conditions as indicated by much greater inlet activity relative to today. Given present understanding of atmospheric circulation patterns and sea-surface temperatures during the MWP and LIA, we suggest that increased inlet activity during the MWP responded to intensified hurricane impacts, while elevated inlet activity during the LIA was in response to increased nor’easter activity. A general decrease in storminess at mid-latitudes in the North Atlantic over the last 300 yr has allowed the system to evolve into a more continuous barrier with few inlets.

The conclusion reads

This study demonstrates that OSL is a viable tool for dating subtidal to intertidal barrier island inlet and flood-tide delta facies, and can provide valuable insight into barrier evolution and coastal response to varying climate conditions. Ages of inlet facies along the North Carolina Outer Banks indicate a period of large-scale inlet activity concentrated between Rodanthe and Ocracoke during the MWP, which we attribute to hurricane impacts, and a later period of elevated inlet activity during the Little Ice Age, which we attribute to an increase in nor’easter activity. Closure of most of these inlets occurred over the last 300 yr, probably reflecting more stable climate conditions, fewer storm impacts (both hurricane and nor’easter), and a decrease in the average wind intensity and wave energy field in the mid-latitudes of the North Atlantic.

This study suggests that we have been in a period of reduced storminess along the coast of North Carolina relative to what is typical of the longer record. It also illustrates the difficulty of extracting a human-caused climate change component from this large natural variability.



Filed under Research Papers

Follow On Comment To The Post – New Paper “Why Do Tornados And Hail Storms Rest On Weekends” By Rosenfeld and Bell 2011

I received an e-mail comment on the post

New Paper “Why Do Tornados And Hail Storms Rest On Weekends” By Rosenfeld and Bell 2011

The very informative comment is

Not all aerosols peak during the week. It appears that elemental (also called black) carbon, mainly from diesels, is at its lowest concentrations on Sunday and Monday in rural middle America. Natural dusts are also lower on Sundays and Mondays – perhaps also due to less driving generally, less stirring up of road dust?  Or, is it possible that farmers do less dirt-moving on the weekends, or at least on Sunday?  Is the black carbon missing on Sundays possibly due to less use of farm equipment?  Nitrates (mainly but not exclusively a vehicular emission) exhibit the same weekly cycle. Surprisingly, power plant particle emissions do not, although sulfur dioxide levels (a gas) are lower on weekends. See this link:

http://www.atmos-chem-phys.org/8/2729/2008/acp-8-2729-2008.pdf

Here are parts of the Abstract:

“Data from the Interagency Monitoring of Protected Visual Environments (IMPROVE) network of aerosol samplers and NOAA monitoring sites are examined for weekly cycles. At remote and rural sites, fine particle elemental carbon, crustal elements, and coarse particle mass had pronounced (up to 20%) weekly cycles with minima on Sunday or Monday. Fine particle organic carbon and mass had smaller amplitude cycles, also with Sunday or Monday minima. There was no statistically significant weekly cycle in fine particle sulfate despite a 5 to 15% weekly cycle in power plant SO2 emissions. Although results for nitrate may be more susceptible to sampling artifacts, nitrate also showed a pronounced weekly cycle with an amplitude similar to elemental carbon……These results support a large role of diesel emissions in elemental carbon aerosol over the entire United States and suggest that a large fraction of the airborne soil dust is anthropogenic. They also suggest that studies of  weekly cycles in temperature, cloudiness, precipitation, or other meteorological variables should look for causes more in light-absorbing particles and possible ice nucleation by dust rather than sulfate or total aerosol….”

Hope this is of interest,

Tom Grahame


Filed under Climate Change Forcings & Feedbacks, Guest Weblogs, Research Papers

An Example Of The Need For A Bottom-Up Resource-Based Perspective Of Vulnerability With Respect To Electric Power

 

Power outages

In our paper

Pielke Sr., R.A., R. Wilby, D. Niyogi, F. Hossain, K. Dairuku, J. Adegoke, G. Kallos, T. Seastedt, and K. Suding, 2011: Dealing with complexity and extreme events using a bottom-up, resource-based vulnerability perspective. AGU Monograph on Complexity and Extreme Events in Geosciences, in press. 

we present a bottom-up, resource-based approach for assessing risks to society from extreme events, including hurricanes. We have concluded that this is a much more robust approach than relying on a top-down global climate model prediction of changes in climatology (such as hurricane frequency) in the coming decades.

Hurricane Irene presents an example of why we need this bottom-up approach as the primary framework for the reduction of risk.

Hurricane Irene has caused large losses of electric power. For future hurricanes (regardless of how they might change) a program to reduce these outages should be a high priority.

An example of a news article on this vulnerability of the electrical system is by Childs Walker of the Baltimore Sun on August 26 2011, titled

Irene’s greatest aggravation: power outages

Its subtitle reads

Marylanders try to accept outages gracefully, admit their patience might wane after a few days

Excerpts read [highlight added]

Though Irene did not cause widespread flooding in Maryland or smash buildings to the degree many feared, the storm left as many as 800,000 businesses and households without power.

Maryland appeared to rank second in outages among states hit by Irene. Virginia reported about 2.5 million residents without power, the second-most in state history. North Carolina, where Irene made landfall, reported more than 400,000 customers without power. Pennsylvania, New York, New Jersey, Connecticut and Massachusetts each reported between 300,000 and 500,000 outages at various times Sunday.

Since hurricanes will continue to occur along the east coast, improving the resiliency of the electric power grid should be a goal. We do not need to know anything about how the climatology of these storms might change in the coming decades. Just pruning trees before each hurricane season (as homeowners do in the western USA before the fire season) would be a cost-effective way to reduce risk.



Filed under Vulnerability Paradigm

Set #3 Of The Photographs Of Surface Climate Observing Sites

This post presents the next three photographs of surface climate observing sites that I introduced in my post on August 11 2011

Quality Of Global Climate Surface Observing Sites

Some of these sites are reasonably well-sited while others are not. There is, however, a clear need to document each of the sites that are used in the Global Climate Reference Network.

1. Veracruz, Mexico

2. Tampico, Mexico

3. Salina Cruz, Mexico


Filed under Climate Change Metrics

Kudos To The National Weather Service And The National Hurricane Center For An Excellent Forecast Of Hurricane Irene!

The successful forecast of Hurricane Irene, as it traveled across the Bahamas and up along the east coast of the United States, needs to be widely appreciated and applauded!

This forecast was based on a suite of numerical weather prediction models which provided a solid basis for predicting its track and intensity.  Radar and satellite provided detailed real-time information.

Many lives have been saved by this outstanding service by this federal agency. 

Congratulations to the National Weather Service and the National Hurricane Center for a job well done!



Filed under Climate Science Reporting

Another Climate Forecast Paper Masquerading As A Robust Scientific Result

Readers of my weblog know I have been very critical of peer-reviewed papers and research projects that present multi-decadal global climate model predictions as robust science; e.g. see

A New AGU EOS Article Titled “Guidelines For Constructing Climate Scenarios” By Mote Et Al 2011 Which Inadvertently Highlights This Flawed Climate Science Approach

The model predictions used to create these scenarios cannot even be verified until decades from now, and, even in a hindcast mode, they have never been able to satisfactorily predict regional climate variability under the current climate, much less how the regional weather features would change under human climate forcings.

 The paper below is just one example of this genre of publication, but it illustrates this issue quite well. The article is
 
Shongwe, Mxolisi E., Geert Jan van Oldenborgh, Bart van den Hurk, Maarten van Aalst, 2011: Projected Changes in Mean and Extreme Precipitation in Africa under Global Warming. Part II: East Africa. J. Climate, 24, 3718–3733. doi: 10.1175/2010JCLI2883.1

The abstract reads [highlight added]

“Probable changes in mean and extreme precipitation in East Africa are estimated from general circulation models (GCMs) prepared for the Intergovernmental Panel on Climate Change Fourth Assessment Report (AR4). Bayesian statistics are used to derive the relative weights assigned to each member in the multimodel ensemble. There is substantial evidence in support of a positive shift of the whole rainfall distribution in East Africa during the wet seasons. The models give indications for an increase in mean precipitation rates and intensity of high rainfall events but for less severe droughts. Upward precipitation trends are projected from early this (twenty first) century. As in the observations, a statistically significant link between sea surface temperature gradients in the tropical Indian Ocean and short rains (October–December) in East Africa is simulated in the GCMs. Furthermore, most models project a differential warming of the Indian Ocean during boreal autumn. This is favorable for an increase in the probability of positive Indian Ocean zonal mode events, which have been associated with anomalously strong short rains in East Africa. On top of the general increase in rainfall in the tropics due to thermodynamic effects, a change in the structure of the Eastern Hemisphere Walker circulation is consistent with an increase in East Africa precipitation relative to other regions within the same latitudinal belt. A notable feature of this change is a weakening of the climatological subsidence over eastern Kenya. East Africa is shown to be a region in which a coherent projection of future precipitation change can be made, supported by physical arguments. Although the rate of change is still uncertain, almost all results point to a wetter climate with more intense wet seasons and less severe droughts.”
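The abstract mentions Bayesian weighting of ensemble members without spelling out the scheme. As a generic sketch only, and emphatically not the authors’ actual method, such weighting is often illustrated by making each member’s weight proportional to a Gaussian likelihood of its hindcast value against an observation, with an assumed observation-error scale:

```python
import math

def bayesian_weights(model_values, observed, sigma=1.0):
    """Weight each ensemble member in proportion to a Gaussian likelihood
    of its hindcast value given the observation.

    Generic illustration of likelihood-proportional model weighting with
    an assumed error scale `sigma`; NOT the specific scheme of Shongwe et al.
    """
    likelihoods = [math.exp(-0.5 * ((m - observed) / sigma) ** 2)
                   for m in model_values]
    total = sum(likelihoods)
    return [lk / total for lk in likelihoods]

# Hypothetical hindcast seasonal-mean rainfall (mm/day) from four GCMs,
# weighted against an observed 3.0 mm/day:
weights = bayesian_weights([2.8, 3.1, 4.5, 1.0], observed=3.0, sigma=0.5)
print(weights)  # members closest to the observation get the most weight
```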

An excerpt from the text reads

“The higher frequency of flooding observed in East Africa in recent years could give indications that the CGCM-simulated precipitation responses are already occurring. Although the time series of the simulated precipitation show upward trends from early in the present century, parts of East Africa could still be experiencing drier conditions. For example, local trends in Rwanda and Burundi (region III) have been negative over the last decades of the twentieth century, either because of natural variability or model deficiencies in this complex region. From an applications perspective, there have also been reports of continued decline in streamflow and water levels in, for example, Lake Victoria, which may seem paradoxical given the recent high frequency of flooding in East Africa. We note that river/ dam levels are also determined by other factors (e.g., water use, drainage, and evaporation), which have not been considered in this paper.”

This text shows that, even with the limited hindcast validation of their model results, the models do not agree in some regions with the forecast trend. They provide qualitative rationalizations to dismiss these areas of disagreement. Such inadequacies should have alerted the Editor who handled this paper that the study is not robust, as it is unable even to accurately simulate the current climate.

Until the funding agencies and journals recognize that this is not robust science, we will continue to read such flawed studies.



Filed under Climate Science Misconceptions, Research Papers

An Independent Way To Assess The Fossil Fuel Input Of Gases, Including CO2, Into The Atmosphere

GLOBALVIEW movie

The post on Watts Up With That titled

The Emily Litella moment for climate science and CO2 ?

starts with the text

“There is quite a bit of buzz surrounding a talk and pending paper from Prof. Murry Salby, the Chair of Climate of Macquarie University. Aussie Jo Nova has excellent commentary, as has Andrew Bolt in his blog. I’m sure others will weigh in soon.

In a nutshell, the issue is rather simple, yet powerful. Salby is arguing that atmospheric CO2 increase that we observe is a product of temperature increase, and not the other way around, meaning it is a product of natural variation.”

My view is that the added carbon dioxide in the last century and up to the present, as documented at a variety of observation sites around the globe (e.g. see the ESRL Global Monitoring Division), is indeed from human causes (industrial and vehicular emissions, biomass burning, etc.).

There is a new paper that could help further resolve this issue. The paper is

Sano, Y., Furukawa, Y. and Takahata, N. Atmospheric helium isotope ratio: Possible temporal and spatial variations. Geochimica et Cosmochimica Acta 74, 4893-4901, 2010.

The abstract reads [highlight added]

The atmospheric 3He/4He ratio has been considered to be constant on a global scale, because the residence time of helium is significantly longer than the mixing time in the atmosphere. However, this ratio may be decreasing with time owing to the anthropogenic release of crustal helium from oil and natural gas wells, although this observation has been disputed. Here, we present the 3He/4He ratios of old air trapped in historical slags in Japan and of modern surface air samples collected at various sites around the world, measured with a newly developed analytical system. In air helium extracted from metallurgical slag found at refineries in operation between AD 1603 and 1907 in Japan, we determined a mean 3He/4He ratio of (5106 ± 108) × 10⁻⁵ RHESJ (where RHESJ is the 3He/4He ratio of the Helium Standard of Japan), which is consistent with the previously reported value of (5077 ± 59) × 10⁻⁵ RHESJ for historical slags in France and the United Arab Emirates and about 4% higher than that of average modern air, (4901 ± 4) × 10⁻⁵ RHESJ. This result implies that the air 3He/4He ratio has decreased with time as expected from anthropogenic causes. Our modern surface air samples revealed that the 3He/4He ratio increases from north to south at a rate of (0.16 ± 0.08) × 10⁻⁵ RHESJ/degree of latitude, suggesting that the low 3He/4He ratio originates in high-latitude regions of the northern hemisphere, which is consistent with the fact that most fossil fuel is extracted and consumed in the northern hemisphere.
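As a quick reader's sanity check (a back-of-envelope calculation, not from the paper itself), the "about 4% higher" figure and the implied pole-to-pole spread follow directly from the numbers quoted in the abstract:

```python
# Ratios below are in units of 1e-5 R_HESJ, as quoted in the Sano et al.
# (2010) abstract.
old_air = 5106     # mean 3He/4He ratio from Japanese slags, AD 1603-1907
modern_air = 4901  # average modern surface air

# How much higher was the historical ratio than the modern one?
percent_higher = 100 * (old_air - modern_air) / modern_air
print(f"Historical air ratio is {percent_higher:.1f}% higher than modern air")
# -> 4.2%, consistent with the "about 4% higher" stated in the abstract

# The reported north-south gradient, (0.16 +/- 0.08) x 1e-5 R_HESJ per
# degree of latitude, implies a full 90N-to-90S (180 degree) spread of:
gradient = 0.16
print(f"Implied pole-to-pole spread: ~{gradient * 180:.0f} x 1e-5 R_HESJ")
# a rough illustration only; the uncertainty on the gradient is large
```

This is simply arithmetic on the abstract's stated values, not a reanalysis of the authors' data.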

The introduction starts with the text

The present terrestrial atmosphere has a helium concentration of 5.24 ppm by volume (Glukauf, 1946), which reflects an approximate balance between the supply from degassing of the solid Earth and thermal escape from the top of the atmosphere into space (Kockarts, 1973). Since the mixing time in the atmosphere is less than 10 years, much shorter than the residence time of helium, about 10⁶ years, the atmospheric 3He/4He ratio has been considered to be globally uniform (Ozima and Podosek, 1983; Mamyrin and Tolstikhin, 1984). On this basis, most noble gas laboratories use air helium as an isotopic standard. Human activities such as oil and natural gas production and coal mining, however, may release non-negligible amounts of crustal helium with a low 3He/4He ratio (Oliver et al., 1984).

The text ends with the information

Because helium is chemically inert, further study of 3He/4He changes in space and time may provide a marker for calibration of the absolute flux and retention of anthropogenic CO2 in the atmosphere if we can separate production-produced helium from consumption-produced helium.

This is an important new study that should be followed closely.

source of image

Comments Off on An Independent Way To Assess The Fossil Fuel Input Of Gases, Including CO2, Into The Atmosphere

Filed under Climate Change Forcings & Feedbacks, Climate Change Metrics, Research Papers

More Complexity Found In The Climate System

It is clear that as we further study the climate system, it is finally becoming recognized that the system is more complex than the 2007 IPCC assessment concluded. Today an article by Geoff Brumfiel appeared in Nature titled (h/t to Don Bishop)

Cloud formation may be linked to cosmic rays. Published online 24 August 2011 | Nature | doi:10.1038/news.2011.504

where an excerpt reads [highlight added]

“It sounds like a conspiracy theory: ‘cosmic rays’ from deep space might be creating clouds in Earth’s atmosphere and changing the climate. Yet an experiment at CERN, Europe’s high-energy physics laboratory near Geneva, Switzerland, is finding tentative evidence for just that.”

In the associated paper

Kirkby, J., et al., 2011: Role of sulphuric acid, ammonia and galactic cosmic rays in atmospheric aerosol nucleation. Nature, 476, 429–433, doi:10.1038/nature10343 (received 9 September 2010; accepted 24 June 2011; published online 24 August 2011).

it is concluded that

“Time-resolved molecular measurements reveal that nucleation proceeds by a base-stabilization mechanism involving the stepwise accretion of ammonia molecules. Ions increase the nucleation rate by an additional factor of between two and more than ten at ground-level galactic-cosmic-ray intensities, provided that the nucleation rate lies below the limiting ion-pair production rate.”

This paper is discussed by Michael Le Page in a New Scientist article titled

Cloud-making: Another human effect on the climate

Excerpts from the article include

Organic vapours released by organisms such as trees, marine bacteria and livestock appear to play a far more important role in cloud formation than suspected.

Anything that affects cloud formation can in theory affect climate, because clouds can either reflect or trap the sun’s heat depending on conditions. Cloud droplets can form only on particles above 50 nanometres. In much of the atmosphere, dust, smoke and sea-spray provide more than enough of these cloud condensation nuclei, or CCNs.

Aerosol nucleation is known to require sulphuric acid, but Kirkby’s team found that it is not enough by itself at low altitudes – the presence of an additional organic trace vapour is needed (Nature, DOI: 10.1038/nature10343). “If there is too little of either component then nucleation will not occur at an appreciable rate in the low atmosphere,” says Kirkby. That means the organic component – and thus the role of living organisms – is more important than had been thought, although the full implications are not yet understood.

“If it is significant on a global scale, it might mean that the natural emissions of organics is also important in cloud formation,” says Bart Verheggen of the Energy Research Centre of the Netherlands in Petten.

The role of organics in cloud formation has actually been known for quite some time, but now there is evidence of a cosmic ray interaction.

Examples of seminal papers that show that biogenic emissions exert an important role on clouds and precipitation include

Schnell, R. C., and G. Vali, 1976: Biogenic Ice Nuclei: Part I. Terrestrial and Marine Sources. J. Atmos. Sci., 33, 1554–1564.

who report

“Using numerous measurements from around the globe, atmospheric ice nucleus concentrations, and also freezing nucleus concentrations in rainfall, were shown to exhibit a climatic dependence similar to that of biogenic nuclei sources at the surface. This correlation suggests that large proportions of atmospheric ice nuclei are possibly of biogenic origin.”

and also

Vali, G., M. Christensen, R. W. Fresh, E. L. Galyan, L. R. Maki, R. C. Schnell, 1976: Biogenic Ice Nuclei. Part II: Bacterial Sources. J. Atmos. Sci., 33, 1565–1570.
doi: 10.1175/1520-0469(1976)033<1565:BINPIB>2.0.CO;2

“Transient appearance of ice nuclei active at temperatures of −2 to −5°C has been noted to accompany the natural decay of plant leaf materials. It was shown that the development of these nuclei results from the presence of a bacterium which was identified as Pseudomonas syringae. These bacteria produce highly active nuclei in a variety of growth media. Evidence points to the fact that the bacterial cells themselves are the nuclei, but that nucleating capacity is a rare and changeable property of the cells. The findings raise the possibility that bacteria may play a role in atmospheric precipitation processes.”

Since land use/land cover change necessarily alters the patterning and amount of these biogenic (organic) sources of nuclei, there is a poorly understood, complex interaction between the human climate forcing and the forcing from cosmic rays, if this extraterrestrial component is an important aspect of the climate, as this new research suggests.

Anthony Watts also presents a report on this new research in his (as usual) excellent post

BREAKING NEWS – CERN Experiment Confirms Cosmic Rays Influence Cloud Seeds

source of images

Comments Off on More Complexity Found In The Climate System

Filed under Climate Change Forcings & Feedbacks