Monthly Archives: November 2010

Further Confirmation Of A Need To Broaden Out The Assessment Of Climate Beyond CO2 Effects

As reported on my weblog since its inception, a focus on CO2 as the main driver of the climate system is grossly inadequate for describing how the real climate system works.

The failure of this narrow perspective has been reported in several multi-authored papers and assessment reports; e.g.

Kabat, P., Claussen, M., Dirmeyer, P.A., J.H.C. Gash, L. Bravo de Guenni, M. Meybeck, R.A. Pielke Sr., C.J. Vorosmarty, R.W.A. Hutjes, and S. Lutkemeier, Editors, 2004: Vegetation, water, humans and the climate: A new perspective on an interactive system. Springer, Berlin, Global Change – The IGBP Series, 566 pp.

National Research Council, 2005: Radiative forcing of climate change: Expanding the concept and addressing uncertainties. Committee on Radiative Forcing Effects on Climate Change, Climate Research Committee, Board on Atmospheric Sciences and Climate, Division on Earth and Life Studies, The National Academies Press, Washington, D.C., 208 pp.

Rial, J., R.A. Pielke Sr., M. Beniston, M. Claussen, J. Canadell, P. Cox, H. Held, N. de Noblet-Ducoudre, R. Prinn, J. Reynolds, and J.D. Salas, 2004: Nonlinearities, feedbacks and critical thresholds within the Earth’s climate system. Climatic Change, 65, 11-38.

Pielke Sr., R., K. Beven, G. Brasseur, J. Calvert, M. Chahine, R. Dickerson, D. Entekhabi, E. Foufoula-Georgiou, H. Gupta, V. Gupta, W. Krajewski, E. Philip Krider, W. K.M. Lau, J. McDonnell,  W. Rossow,  J. Schaake, J. Smith, S. Sorooshian,  and E. Wood, 2009: Climate change: The need to consider human forcings besides greenhouse gases. Eos, Vol. 90, No. 45, 10 November 2009, 413. Copyright (2009) American Geophysical Union.

McAlpine, C.A., W.F. Laurance, J.G. Ryan, L. Seabrook, J.I. Syktus, A.E. Etter, P.M. Fearnside, P. Dargusch, and R.A. Pielke Sr. 2010: More than CO2: A broader picture for managing climate change and variability to avoid ecosystem collapse. Current Opinion in Environmental Sustainability, 2:1–13, DOI 10.1016/j.cosust.2010.10.001.

The need to broaden out the assessment of the human role in the climate system, as well as to more accurately consider natural climate forcings and feedbacks, has received important new confirmation from an article in the Bulletin of the American Meteorological Society.

 The excellent new article is

Nobre et al., 2010: Addressing the complexity of the Earth system. Bull. Amer. Meteor. Soc., DOI: 10.1175/2010BAMS3012.1.

The abstract reads

“Integration of physical, biogeochemical, and societal processes would accelerate advances in Earth system prediction”

with the following excerpts from the text

“Earth system science addresses natural and human-driven processes affecting the evolution and ultimately the habitability of the planet. We must recognize that the Earth system encompasses interactions among the atmosphere, ocean, ice, land, biochemistry, and humanity. Humanity has advertently and inadvertently perturbed the entire system, with both positive and negative consequences. Thus, the accelerated development of a monitoring and prediction system that integrates physical, biogeochemical, and societal processes is essential if we are to provide quantitative information that can initiate and guide the mitigation of, and adaptation to, future changes in the Earth system.”

“The role of the biosphere. The biosphere is the “life zone” of Earth system. It is composed of living beings and their multi-way interaction with the geophysical and biological elements within the lithosphere (solid Earth), hydrosphere, and atmosphere. Until recently, the biosphere was primarily studied within the context of its response to geophysical influences, with less attention to the feedback of biospheric processes on weather and climate. However, this is beginning to change with new components of land cover, including urban areas (e.g., Oleson et al. 2008) and fire (e.g., Golding and Betts 2008), being implemented in the global models.”

“Many active biogeochemical feedback systems exhibit highly nonlinear behavior. Changes of system dynamics can be initiated by both natural and human activities. These changes can be abrupt “tipping points” between significantly differing states of the Earth system that society might not want to transgress (Steffen et al. 2003; Lenton et al. 2008; Rockström et al. 2009). The biosphere is also intertwined in the geochemical cycling that can contribute to natural and anthropogenic contributions to climate variability and change. The examples below illustrate this for anthropogenic changes in global nitrogen and ocean carbon cycles.”

The excellent Nobre et al 2010 paper provides further evidence that the 2007 IPCC WG1 report was much too narrow in terms of its assessment of the climate system. While we first need to assess the predictability of the Earth system (a necessary condition before we can possibly provide accurate forecasts (predictions)), the recognition that

“…. the Earth system encompasses interactions among the atmosphere, ocean, ice, land, biochemistry, and humanity. Humanity has advertently and inadvertently perturbed the entire system, with both positive and negative consequences.”

is a major step forward in better reporting on the climate system.


Filed under Climate Change Forcings & Feedbacks, Research Papers

Guest Post By Jos de Laat On the Post By Andy Lacis

Guest Post by Jos de Laat, Ph.D., Royal Netherlands Meteorological Institute.

Following the ongoing debate on this weblog with Andy Lacis, I could not resist weighing in on the recent discussion. Although Dr. Lacis notes that “aerosols are the really big uncertainty”, it is later suggested that the aerosol direct and indirect effects are nevertheless “well known”. It is a sort of ambiguity that I have run across very often, and from what I can determine, this ambiguity has everything to do with aerosols acting as a “tuning knob” to constrain climate models in their reproduction of 20th century temperature change.

Recently I have been focusing in my work a little bit more on aerosols as well as climate modeling. I have started working on some interesting new aspects of aerosols (GRL paper in press: http://www.agu.org/journals/pip/gl/2010GL045171-pip.pdf), have been contributing to a more philosophical paper about the assessment of climate models, and I’m currently writing a book chapter about aerosols. While gathering information for this book I came across the 2009 CCSP report on aerosols. It contains some interesting statements that I think are worth mentioning.

http://www.climatescience.gov/Library/sap/sap2-3/final-report/default.htm

http://downloads.climatescience.gov/sap/sap2-3/sap2-3-final-report-ExecSummary.pdf

For example, the executive summary states that (boldface added):

“Calculated change of surface temperature due to forcing by anthropogenic aerosols was reported in IPCC AR4 based on results from more than 20 … modeling groups. Despite a wide range of climate sensitivity … exhibited by the models, they all yield a global average temperature change very similar to that observed over the past century. This agreement across models appears to be a consequence of the use of very different aerosols forcing values which compensate for the range of climate sensitivity. For example, the direct cooling effect of sulfate aerosol varied by a factor of six among the models. An even greater disparity was seen in the model treatment of black carbon and organic carbon. … For those models that include the indirect effect, the aerosol effect on cloud brightness (reflectivity) varied by up to a factor of nine. Therefore, THE FACT THAT MODELS HAVE REPRODUCED THE GLOBAL TEMPERATURE IN THE PAST DOES NOT IMPLY THAT THEIR FUTURE FORECASTS ARE ACCURATE.”

“On a global average basis, the sum of direct and indirect forcing by anthropogenic aerosols at the top of the atmosphere is almost certainly negative (a cooling influence), and thus almost certainly offsets a FRACTION of the positive (warming) due to anthropogenic greenhouse gases. However, because of the spatial and temporal non-uniformity of the aerosol RF, and likely differences in the effects of shortwave and longwave forcings, THE NET EFFECT ON EARTH’S CLIMATE IS NOT SIMPLY A FRACTIONAL OFFSET TO THE EFFECTS OF FORCING BY ANTHROPOGENIC GREENHOUSE GASES.”

“Although the nature and geographical distribution of forcings by greenhouse gases and aerosols are quite different, it is often assumed that to first approximation these forcings on global mean surface temperature are additive, so that the negative forcing by anthropogenic aerosols has partly offset the positive forcing by incremental greenhouse gas increases over the industrial period. … … However, since aerosol forcing is much more pronounced on regional scales than on the global scale because of the highly variable aerosol distributions, IT WOULD BE INSUFFICIENT OR EVEN MISLEADING TO PLACE TOO MUCH EMPHASIS ON THE GLOBAL AVERAGE. Also, aerosol RF at the surface is stronger than at the TOA, exerting large impacts within the atmosphere to alter the atmospheric circulation patterns and water cycle. THEREFORE, IMPACTS OF AEROSOLS ON CLIMATE SHOULD BE ASSESSED BEYOND THE LIMITED ASPECT OF GLOBALLY AVERAGE RADIATIVE FORCING AT TOA.”

Hence, aerosols remain a big enigma and will continue to do so until – as the report notes – “a firmer estimate of the radiative forcing by aerosols, as well as climate sensitivity, is available”. To what extent this can be achieved in the near future probably depends on the continuation of the current remote sensing capacity, such as that assembled with the A-Train satellites. Unfortunately, and despite its tremendous success, there are no immediate plans to continue, upgrade, or replace most of the A-Train missions beyond the current orbiting satellites. Hence, it is not inconceivable that the uncertainties with regard to aerosol radiative forcing, and thus the constraints on climate sensitivity, will remain for a long time.


Filed under Climate Change Forcings & Feedbacks, Guest Weblogs

BMW’s Comment On Ethanol

I came across an interesting brochure from BMW titled “Beyond octane: How additives in gasoline are affecting your BMW’s performance”. We have scanned and posted one side of this brochure below (click on the image for a larger view).

The text reads

“In combustion, ethanol provides less energy than gasoline, resulting in reduced fuel economy. When ethanol burns inside the engine, it tends to form a weaker mixture that may cause misfire, rough idle and cold start issues in your vehicle. In addition, engine components may deteriorate over time when in contact with ethanol”.

This is hardly an endorsement for this fuel component that is promoted as one way to reduce carbon dioxide emissions.


Filed under Climate Change Regulations

Atmospheric CO2 Thermostat: Continued Dialog by Andy Lacis

Andy Lacis has graciously continued the dialog that was started in our weblog posts

Guest Post “CO2: The Thermostat That Controls Earth’s Temperature” By Andy Lacis

Further Comment By Andy Lacis On CO2 As A Climate Thermostat

My Comments On The Andy Lacis Post On CO2 As A Climate Thermostat

Today, I am posting a new constructive contribution. I will respond (and invite Roy Spencer to respond) next week.

Guest Post by Andy Lacis  – Atmospheric CO2 Thermostat: Continued Dialog (Part I)

The interaction here has been both useful and informative, as opposed to what so frequently happens in the more typical climate blog interactions where only predictable and otherwise immutable opinions get tossed about with no real exchange of information or ideas.

It seems to me that the root cause of the varied differences in interpretation regarding our Science paper conclusions may well originate from the perceived understanding (or misunderstanding) of what exactly the GISS ModelE climate model is capable of simulating, and what specifically is or is not being assumed in the ModelE climate experiment simulations that we describe.

This point is particularly relevant to Roy Spencer’s remark that “our assumptions determine our conclusions”. Or more specifically, that “after assuming clouds and water vapor are no more than feedbacks upon temperature”, they then  “prove their paradigm that CO2 drives climate by forcing the model with a CO2 change.” And, further along these lines that, “if they had forced the model with a water vapor change, it would have done the same thing.”

Basically, Roy’s comments would have been applicable had we used a 1D radiative/convective model (as in Hansen et al., 1981, Science, 213, 957–966) for our Science paper calculations. As you well know, in 1D RCMs there is no real capability for including model ‘physics’. Instead, all of the cloud and water vapor feedback effects are either implicitly or explicitly prescribed, and thus by definition ‘assumed’. However, with respect to our ModelE climate experiment simulations, Roy’s comments are completely off target because nothing is being assumed about cloud and water vapor feedbacks, other than that clouds and water vapor behave according to established physics. Climate feedbacks are simply the end result of model physics.

Roy’s last point about forcing the model with water vapor brings up an interesting point. If water vapor is a feedback, can it also be a forcing? The answer is “Yes, absolutely!” Any externally imposed water vapor change beyond the ‘feedback equilibrium’ distribution of water vapor will constitute a radiative forcing. To illustrate this point, we performed two GCM runs – one with instantaneously doubled water vapor, the other with instantaneously zeroed water vapor (actually reduced by a factor of 1000 to avoid a divide check in a diagnostic routine that expects finite column water vapor). Initial model temperatures (and model physics) are the same as in the control run. The instantaneous net TOA forcing is 12 W/m2 (warming) for doubled water vapor, and –60 W/m2 (cooling) for zeroed water vapor (see Schmidt et al., 2010, J. Geophys. Res., 115, D20106).

As expected, for the doubled water vapor experiment, there is enhanced rapid rainout, while for the zeroed water vapor experiment there is rapid evaporation into a very dry atmosphere.  This is because the condensation and evaporation are significantly faster acting processes than the changes in atmospheric and ocean temperature in response to the applied radiative forcings (which diminish rapidly as atmospheric water vapor returns to its control-run equilibrium distribution). Within a year, atmospheric water vapor distribution is back to being virtually indistinguishable from the control-run climate with no significant long term impact.

As you can see, Roy’s point about water vapor forcing is actually an excellent example to illustrate the feedback role of water vapor, even more directly and emphatically than by zeroing out the non-condensing GHGs in our Science paper. From all this, it is abundantly clear that it is the non-condensing GHGs that control the terrestrial greenhouse effect (and thus the global equilibrium temperature of Earth). Water vapor and clouds, although accounting for 75% of the total greenhouse effect, participate only to the extent of feedback amplification. This then leads to the conclusion that atmospheric CO2 (accounting for 80% of the non-condensing GHG forcing) acts as a thermostat in controlling the temperature of Earth by regulating the strength of the terrestrial greenhouse effect. Since the non-condensing GHGs (CO2 in particular) are all being accurately measured and monitored, and since humans are directly linked to these GHG increases, it then follows that the global warming aspect (increase in terrestrial greenhouse effect strength) of global climate change is directly the result of human industrial activity.
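The 75% and 80% figures above can be cross-checked against the attribution percentages quoted later in this post (50% water vapor, 25% clouds, 20% CO2, 5% other trace constituents). A minimal sketch of the arithmetic, for bookkeeping only:

```python
# Cross-check of the greenhouse attribution percentages quoted in this post:
# 50% water vapor, 25% clouds, 20% CO2, 5% other non-condensing GHGs.
attribution = {"water vapor": 50, "clouds": 25, "CO2": 20, "other GHGs": 5}

# Water vapor and clouds (the feedback constituents) account for 75%.
feedback_share = attribution["water vapor"] + attribution["clouds"]

# CO2 as a fraction of the non-condensing GHG total (20 of 25 -> 80%).
non_condensing = attribution["CO2"] + attribution["other GHGs"]
co2_fraction = attribution["CO2"] / non_condensing

print(feedback_share, co2_fraction)  # 75 0.8
```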

NOAA lists the current level of atmospheric CO2 at Mauna Loa at 387.18 ppmv. This is indicative of the precision with which the non-condensing GHG forcing is known, both for recent trends and going back into the geological ice core record. Some uncertainty (possibly substantial) is associated with the magnitude of the water vapor and cloud feedback amplification. Similarly, there is uncertainty in the climate response time associated with ocean heat capacity. The GISS ModelE produces an overall feedback amplification (climate sensitivity) of about 3 °C for doubled CO2 (or 0.75 °C/Wm–2). This climate feedback sensitivity is corroborated by the 400,000 year Antarctic ice core record (Hansen et al., 2008, Open Atmos. Sci. J., 2, 217–231). Hansen et al. also show that the global climate response takes about 5 years to achieve 40% of the equilibrium warming, 100 years to reach 60%, and about 1500 years to approach the 100% level of the eventual global equilibrium warming for a given radiative forcing.
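As a quick consistency check on the sensitivity numbers above (an illustrative sketch, not a GISS calculation): a sensitivity of 0.75 °C per W/m2 together with ~3 °C equilibrium warming for doubled CO2 implies a doubled-CO2 forcing of about 4 W/m2, in line with commonly quoted values.

```python
# Consistency check on the sensitivity numbers quoted above.
sensitivity = 0.75    # deg C per (W/m^2), GISS ModelE value from the text
warming_2xco2 = 3.0   # deg C, equilibrium warming for doubled CO2

implied_forcing = warming_2xco2 / sensitivity  # W/m^2
print(implied_forcing)  # 4.0
```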

So much for global warming by the anthropogenic GHGs. We are in basic agreement with you that global warming fueled by increasing GHGs is not the only thing that is causing global climate change. Our best estimate of climate forcings for the period 1750–2000 (Hansen et al., 2005, J. Geophys. Res., 110, D18104), is that GHG increases account for 2.9 W/m2 (of which CO2 contributes 1.5 W/m2). Aerosols are the really big uncertainty in global climate forcing with black carbon type aerosols causing 0.8 W/m2 warming, non-absorbing aerosols –1.1 W/m2 cooling, and indirect aerosol effect producing about –1.0 W/m2 cooling. Smaller radiative forcings are attributed to land use change (–0.15 W/m2) and to solar irradiance (0.30 W/m2).
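Tallying the 1750–2000 forcing estimates listed above gives a net forcing of about 1.75 W/m2, with the three aerosol terms combining to roughly –1.3 W/m2. A sketch for bookkeeping only (the component values are the ones quoted in the text):

```python
# Sum of the Hansen et al. (2005) forcing estimates quoted above (W/m^2).
forcings = {
    "greenhouse gases":        2.9,
    "black carbon aerosols":   0.8,
    "non-absorbing aerosols": -1.1,
    "indirect aerosol effect": -1.0,
    "land use change":        -0.15,
    "solar irradiance":        0.30,
}

net = sum(forcings.values())
aerosol_net = sum(v for k, v in forcings.items() if "aerosol" in k)
print(round(net, 2), round(aerosol_net, 2))  # 1.75 -1.3
```

The aerosol terms nearly cancel the CO2 contribution of 1.5 W/m2 quoted above, which is exactly why their large uncertainty matters so much for the net forcing.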

Aerosol and land use forcings are particularly troublesome because the observational constraints on these forcings are so poor. This is largely because current satellite instruments measure only spectral intensity, which makes it impossible to attribute intensity contributions unambiguously to the land surface, aerosols, or undetected cloud contamination. Hopefully, polarimetric measurements on the upcoming NASA Glory mission will greatly reduce the aerosol forcing uncertainty. Because of their short atmospheric lifetimes, aerosols (especially black carbon) are more attractive targets for mitigating global warming than GHGs with their long atmospheric lifetimes.

Natural (unforced) climate variability (e.g., El Niño, La Niña, decadal temperature fluctuations in the Pacific and Atlantic oceans) is another factor that is an important part of the ongoing global climate change. But all of these are fluctuations about the global equilibrium temperature and do not by themselves produce a long term temperature trend. Still, they do confuse trend analysis of the existing climate record (which is too short to establish statistical certainty).

Atmospheric CO2 Thermostat: Continued Dialog (Part II)

The GISS ModelE is specifically designed to be a ‘physical’ model, so that Roy Spencer’s water vapor and cloud feedback ‘assumptions’ never actually need to be made. There is of course no guarantee that the model physics operate without flaw or bias. In particular, given the nature of atmospheric turbulence, a ‘first principles’ formulation for water vapor and cloud processes is not possible. Because of this, there are a number of adjustable coefficients that have to be ‘tuned’ to ensure that the formulation of evaporation, transport, and condensation of water vapor into clouds, and its dependence on wind speed, temperature, relative humidity, etc., will be in close agreement with current climate distributions. However, once these coefficients have been set, they become part of the model physics and are not subject to further change. As a result, the model clouds and water vapor are free to change in response to local meteorological conditions. Cloud and water vapor feedbacks are the result of model physics and are thus in no way “assumed” or arbitrarily prescribed. A basic description of ModelE physics and performance is given by Schmidt et al. (2006, J. Climate, 19, 153–192).

Of the different physical processes in ModelE, radiation is the closest to being ‘first principles’ based. This is the part of model physics that I am most familiar with, having worked for many years to design and develop the GISS GCM radiation modeling capability. The only significant assumption being made for radiation modeling is that the GCM cloud and absorber distributions are defined in terms of plane parallel geometry. We use the correlated k-distribution approach (Lacis and Oinas, 1991, J. Geophys. Res., 96, 9027–9063) to transform the HITRAN database of atmospheric line information into absorption coefficient tables, and we use the vector doubling adding method as the basis and standard of reference for GCM multiple scattering treatment.

Direct comparison of the upwelling and downwelling LW radiative fluxes, cooling rates, and flux differences between line-by-line calculations and the GISS ModelE radiation model results for the Standard Mid-latitude atmosphere is shown in Figure 1 below. 

As you can see, the GCM radiation model can reproduce the line-by-line calculated fluxes to better than 1 W/m2. This level of accuracy is representative of the full range of temperature and water vapor profiles that are encountered in the atmosphere for the current climate, as well as for excursions to substantially colder and warmer climate conditions. The radiation model also fully accounts for the overlapping absorption by the different atmospheric gases, including absorption by aerosols and clouds. In my early days of climate modeling, when computer speed and memory were strong constraints, the objective was to develop simple parameterizations for weather GCM applications (e.g., Lacis and Hansen, 1974, J. Atmos. Sci., 31, 118–133). Soon after, when the science focus shifted to real climate modeling, it became clear that an explicit radiation model was needed that responds accurately to any and all changes that might take place in ground surface properties, atmospheric structure, and solar illumination. Thus the logarithmic behavior of the radiative forcings for CO2 and other GHGs is behavior derived from the GCM radiation model’s radiative response (e.g., the radiative forcing formulas in Hansen et al., 1988, J. Geophys. Res., 93, 9341–9364) rather than a constraint placed on the GCM radiation model.
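The logarithmic forcing behavior referred to above can be illustrated with the widely used simplified fit ΔF ≈ 5.35 ln(C/C0) W/m2. Note that this coefficient comes from the later literature, not from the Hansen et al. (1988) GISS formulas; it is used here only as an illustrative sketch of the functional form.

```python
import math

def co2_forcing(c_ppmv, c0_ppmv=280.0):
    """Approximate CO2 radiative forcing in W/m^2, using the commonly
    cited simplified logarithmic fit dF = 5.35 * ln(C/C0). This is an
    illustrative approximation, not the GISS radiation model itself."""
    return 5.35 * math.log(c_ppmv / c0_ppmv)

print(round(co2_forcing(560.0), 2))  # doubled CO2: 3.71
print(round(co2_forcing(387.18), 2))  # present-day level vs. 280 ppmv preindustrial
```

The key point is the saturation: each successive increment of CO2 adds less forcing than the last, which is the logarithmic behavior the radiation model derives.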

Climate is primarily a boundary value problem in physics, and the key boundary value, at the top of the atmosphere, is defined entirely by the incoming (absorbed) solar radiation and the outgoing LW thermal radiation. The global mean upwelling LW flux at the ground surface is about 390 W/m2 (for 288 K), and the outgoing LW flux at TOA is about 240 W/m2 (or 255 K equivalent). The LW flux difference of 150 W/m2 (or 33 K equivalent) between the ground and TOA is a measure of the terrestrial greenhouse effect strength. We should note that the transformation of the LW flux emitted upward by the ground into the LW flux that eventually leaves the top of the atmosphere is entirely by radiative transfer means. Atmospheric dynamical processes participate in this LW flux transformation only to the extent of helping define the atmospheric temperature profile, and in establishing the local atmospheric profiles of water vapor and cloud distributions that are used in the radiative calculations.
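The flux numbers in this paragraph follow directly from the Stefan–Boltzmann law F = σT⁴; a minimal sketch with the temperatures quoted above:

```python
# Global-mean LW budget from the text, via the Stefan-Boltzmann law.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

surface_flux = SIGMA * 288.0**4          # ~390 W/m^2 at the ground (288 K)
toa_flux = SIGMA * 255.0**4              # ~240 W/m^2 at TOA (255 K equivalent)
greenhouse_strength = surface_flux - toa_flux  # ~150 W/m^2

print(round(surface_flux), round(toa_flux), round(greenhouse_strength))  # 390 240 150
```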

 

Armed with a capable radiative transfer model, it is then straightforward to take apart and reconstruct the entire atmospheric structure, constituent by constituent, or in any particular grouping, to attribute what fraction of the total terrestrial greenhouse effect each atmospheric constituent is responsible for. That is where the 50% water vapor, 25% cloud, and 20% CO2 attribution in the Science paper (for the atmosphere as a whole) came from. “Follow the money!” is the recommended strategy to get to the bottom of murky political innuendos. A similar approach, using “Follow the energy!” as the guideline, is an effective means for fathoming the working behavior of the terrestrial climate system. By using globally averaged radiative fluxes in the analysis, the complexities of advective energy transports get averaged out. The climate energy problem is thereby reduced to a more straightforward global energy balance problem between incoming (absorbed) SW solar energy and outgoing LW thermal energy, which is fully amenable to radiative transfer modeling analysis. The working pieces in the analysis are the absorbed solar energy input, the atmospheric temperature profile, surface temperature, including the atmospheric distribution of water vapor, clouds, aerosols, and the minor greenhouse gases, all of which can be taken apart and re-assembled at will in order to quantitatively characterize and attribute the relative importance of each radiative contributor. 

Validation of GCM climate modeling performance is in terms of how well the model-generated temperature, water vapor, and cloud fields resemble observational data for these quantities, including their spatial and seasonal variability. It would appear that ModelE does a generally credible job of reproducing most aspects of the terrestrial climate system. However, direct observational validation of the GCM radiation model to a useful precision is not really feasible, since the atmospheric temperature profile and absorber distributions cannot all be measured simultaneously with available instrumentation to the precision required for a meaningful closure experiment. As a result, validation of the GCM radiation model must necessarily rely on the established theoretical foundation of radiative transfer, and on comparisons to more precise radiative transfer benchmarks such as line-by-line and vector doubling calculations that utilize laboratory measurements of cloud and aerosol refractive indices and absorption line properties.

 

Atmospheric CO2 Thermostat: Continued Dialog (Part III) 

As you commented earlier, attribution of the greenhouse effect would be of greater interest if performed for climate forcing perturbations relative to the current climate (such as doubled CO2). Long ago, we described such an attribution for cases of doubled CO2 and a 2% solar irradiance increase (Hansen et al., 1984, AGU Geophysical Monograph, 29, 130–163), both of which produced a global equilibrium warming of about 4 °C. The most acute question is how large the cloud feedback sensitivity is in the current climate. While clouds may account for 25% of the total atmospheric greenhouse strength, a strongly positive cloud feedback response is not representative of current climate modeling analyses, which suggest a near-zero cloud feedback sensitivity. This point can be investigated by performing a detailed flux change attribution study for doubled CO2. The radiative responses to both CO2 and water vapor changes relative to current climate amounts are basically logarithmic, i.e., indicative of strong saturation. It would appear that the radiative response to cloud changes relative to the current climate cloud distribution is even more strongly saturated than the CO2 and water vapor responses.

 

We performed a zonal feedback analysis (described in part by Lacis and Mishchenko, 1995, in Aerosol Forcing of Climate, Dahlem Workshop Reports, 17, 11–42) of the Hansen et al. (1984) climate sensitivity experiments. This analysis showed cloud feedback to be rather complicated, comprised of changes in cloud cover, cloud height, and column optical depth with a latitudinal dependence that produced positive cloud feedback in low to middle latitudes, but negative cloud feedback at high latitudes, as shown (by the orange curve) in Figure 2.

 

Water vapor feedback (blue curve, positive at all latitudes) is also complicated: there is latitudinal dependence, a significant part of the response is due to the vertical redistribution of water vapor, and the moist adiabatic lapse rate change produces a strong negative feedback at low latitudes but a positive feedback in the polar regions. As expected, the snow/ice feedback (green curve) is also strong, but confined to polar latitudes. In the above, all of the feedback responses have been expressed in terms of temperature equivalents, i.e., the fraction of the zonal temperature change attributable to that particular feedback effect.

In Figure 2, the solid black curve is the zonal mean equilibrium surface temperature change for doubled CO2 (4.2 °C global mean). The dotted black line is ΔTo = 1.2 °C, the (near-constant with latitude) no-feedback equilibrium temperature change due directly to doubled CO2, i.e., the temperature response the climate system would produce in the absence of feedbacks. The red curve is the ‘advective’ feedback, depicting the net effect (and near cancellation) of changes in the advective transport of latent heat, sensible heat, and geopotential energy (each of which is an order of magnitude larger than the radiative water vapor and cloud feedback contributions). When averaged globally, however, the advective feedbacks must reduce to zero.
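The 4.2 °C global mean and the 1.2 °C no-feedback response together imply a net feedback amplification factor of 3.5 (equivalently, a system gain of about 0.71). These are standard textbook constructs, sketched here for illustration rather than taken from the original analysis:

```python
# Feedback amplification implied by the Figure 2 numbers quoted above.
dT_full = 4.2         # deg C, equilibrium doubled-CO2 warming with feedbacks
dT_no_feedback = 1.2  # deg C, no-feedback response

f = dT_full / dT_no_feedback        # amplification factor f = dT/dT0
g = 1.0 - dT_no_feedback / dT_full  # system gain g = 1 - dT0/dT
print(round(f, 2), round(g, 2))  # 3.5 0.71
```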

 

As has been pointed out by Aires and Rossow (2003, Q. J. Royal Meteorol. Soc., 129, 239–275), the feedback interactions of the climate system are non-linear, and the feedback sensitivities are state-dependent and therefore variable in time, and thus are not ‘constants’ of the system. This means that inferring climate feedbacks from linear regressions of cloud or water vapor changes with respect to applied forcings is not likely to be successful. Nevertheless, by performing a detailed radiative flux change attribution for all contributing radiative components between two different climate equilibrium states (say, control vs doubled CO2), it is possible to infer on the basis of the flux change attribution how much forcing was provided by the changes in non-condensing radiative forcing agents, compared to the flux changes attributable to the feedback contributing components. From this radiative flux change comparison, we can infer the relative magnitude of the feedback sensitivity for that particular radiative forcing experiment. From analysis of many such radiative forcing experiments, we can get a better idea of the general characteristics of the climate feedback response, and how the feedback response may depend on the nature of the applied forcing, as described in the radiative forcing and climate response analyses by Hansen et al., 1997, J. Geophys. Res., 102, 6831–6864. 

We are in the process of doing a feedback attribution analysis (as in Figure 2) for doubled CO2 with the GISS ModelE. The analysis is straightforward, but tedious, in having to swap water vapor, cloud, and temperature profile fields between the control and double CO2 equilibrium run results and evaluating the instantaneous TOA radiative flux changes. A feedback sensitivity of 3°C for doubled CO2 is strongly supported by the geological record, suggesting that this analysis will provide realistic feedback sensitivities for climate perturbations relative to current climate.

 

The real uncertainties in climate modeling lie in the area of understanding the natural (unforced) climate fluctuations that occur on inter-annual and decadal time scales, and on regional spatial scales. This variability occurs because the local climate system responses to energy imbalances strongly overshoot the imbalance, achieving energy balance only in a global and time averaged sense. Fortunately, this natural (unforced) climate variability produces fluctuations about the equilibrium climate state, and therefore does not contribute to the long term climate trend.

Perspective and Overview 

The complexity of the physical processes that constitute the terrestrial climate system is undeniable. Clearly, full understanding of climate is not likely to be achieved in the foreseeable future since everything from microscopic to cosmic makes some contribution to climate, even if that contribution is miniscule. On the other hand, an adequate understanding of how the climate system works and operates is within reach.

 

Things that we know well

Terrestrial climate is established as the result of energy balance between SW solar radiation absorbed by the Earth and the LW thermal radiation emitted by the Earth. 

Atmospheric absorption of LW radiation by water vapor, clouds, CO2, and other trace gases  produces a greenhouse effect that keeps the surface temperature of Earth about 33 °C warmer than it otherwise would be without the atmospheric greenhouse absorbers.
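As a quick check of the 33 °C figure, one can compare the effective emission temperature implied by SW–LW balance with the observed global-mean surface temperature. The solar constant and planetary albedo below are standard textbook values, not numbers taken from this post.

```python
# Back-of-envelope check of the ~33 C greenhouse effect:
# effective emission temperature vs. observed surface temperature.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0        # solar constant, W m^-2 (standard value)
ALBEDO = 0.30      # planetary albedo (standard value)

absorbed = S0 * (1 - ALBEDO) / 4          # global-mean absorbed SW, W m^-2
t_eff = (absorbed / SIGMA) ** 0.25        # effective emission temperature, K
t_surface = 288.0                         # observed global-mean surface T, K

print(f"absorbed SW   = {absorbed:.0f} W/m^2")
print(f"T_effective   = {t_eff:.0f} K")
print(f"greenhouse dT = {t_surface - t_eff:.0f} K")
```

The ~255 K emission temperature against a 288 K surface yields the ~33 K (33 °C) greenhouse effect quoted above.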

 

Of the 33 °C terrestrial greenhouse effect, water vapor is responsible for about 50% of the effect, 25% is due to clouds, 20% is due to CO2, and the remaining 5% is contributed by CH4, N2O, O3, CFCs, and other lesser constituents. 

The atmospheric distribution of water vapor and clouds is the result of feedback processes, hence the water vapor and cloud amounts are determined by the prevailing meteorological conditions.

 

The non-condensing greenhouse gases (CO2, CH4, N2O, O3, and CFCs) provide the ultimate support structure for the terrestrial greenhouse effect, even though by themselves they account only for 25% of the total atmospheric greenhouse effect. 

Accurate measurement and monitoring of the non-condensing GHGs shows unrelenting increase in atmospheric GHG concentrations, with an accumulated radiative forcing of about 3 W/m2 since 1880.

 

Since the non-condensing GHG increase is due almost entirely to human industrial activity, primarily the burning of fossil fuel, humans are fully responsible for the global warming. 

Accurate measurements of solar irradiance over three solar cycles since the late 1970s show solar cycle variability to be of roughly 1 W/m2 amplitude, but with no significant trend.

 

Aerosols are important contributors of climate forcing, with non-absorbing aerosols and associated cloud-aerosol indirect effect contributing about –1 W/m2 apiece, and black carbon aerosols contributing about 0.8 W/m2 of warming. 

The current climate model sensitivity (for doubled CO2) of 3 °C per 4 W/m2 forcing is in good agreement with the geological (400,000-year) ice core record.
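The arithmetic behind these numbers is simple, and folding in the ~3 W/m2 of accumulated GHG forcing since 1880 quoted above gives the implied equilibrium response. Note this is the equilibrium response only; ocean thermal lag means the realized warming to date is smaller.

```python
# Arithmetic implied by the figures above: 3 C per 4 W/m^2 gives a sensitivity
# parameter lambda = 0.75 K per (W/m^2); applied to the ~3 W/m^2 of accumulated
# GHG forcing since 1880, the equilibrium response would be ~2.25 C.
sensitivity_per_doubling = 3.0   # C
forcing_per_doubling = 4.0       # W/m^2
lam = sensitivity_per_doubling / forcing_per_doubling  # K per (W/m^2)

ghg_forcing_since_1880 = 3.0     # W/m^2 (figure quoted earlier in the post)
equilibrium_dT = lam * ghg_forcing_since_1880

print(f"lambda = {lam:.2f} K/(W/m^2); equilibrium dT = {equilibrium_dT:.2f} C")
```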

 

The climate system also undergoes natural (unforced) variability about its global equilibrium state with regional shifts in climate patterns on inter-annual and decadal time scales.

Things that we know less well

We know that aerosols are significant contributors to global climate change, but aerosol radiative properties, cloud-aerosol indirect effect, and the trend in aerosol changes are poorly constrained by intensity-only measurements that have great difficulty separating aerosol radiative properties from sub-pixel cloud contamination and from changes in spectral surface reflectivity. 

The long term trend in solar irradiance change must be inferred indirectly based on sunspot cycle changes and proxy information.

 

While changes in cloud distribution between two equilibrium climate states can be interpreted as ‘cloud feedback’, cloud response to changing meteorological conditions can impact multiple cloud characteristics (cloud cover, cloud height, cloud life time, water/ice phase, optical depth, particle size, diurnal phase), all of which have radiative consequences, some affecting the SW more, others the LW, but which are not readily confirmable with available observational data. 

While climate models do exhibit natural variability on inter-annual and decadal time scales that is qualitatively comparable to the real world, climate models have limited skill in modeling the regional and inter-annual climate fluctuations that take place in the climate system even in the absence of external forcing.

 

The bulk of the problems related to the realistic modeling of regional climate patterns and unforced variability are undoubtedly attributable to the still primitive state of ocean circulation and heat transport modeling, which is decades behind the advances made in atmospheric modeling. 

The state of modeling of ice sheet dynamics, in particular the rate at which ice sheets will disintegrate in the face of continued global warming, is in an even more primitive state than that of the ocean climate response modeling.

 

There is anticipation, perhaps even support from observational evidence but with considerable uncertainty, that the frequency and magnitude of extreme weather events may be increasing as the strength of the hydrological cycle intensifies in step with global warming.  

Besides the geological evidence that the Earth could not support polar ice caps when atmospheric CO2 was greater than about 450 ppm (and sea level was more than 200 ft higher than present), there is little clear indication of a ‘tipping point’ beyond which recovery from polar ice cap meltdown might become problematic.

Policy implications

We currently seem to be operating under the ‘no regrets’ climate policy first formulated under the first Bush administration, which basically states that if anything undesirable should happen because of global climate change, we will then deal with that problem after the fact.

This approach appears to have saved about $200 million in not upgrading New Orleans levees. Unfortunately, the cost of dealing with Katrina after the fact was about $200 billion.

Hundreds of billions will be saved by not combating global warming. But the eventual cost may be hundreds of trillions to relocate major cities to higher ground ahead of rising sea levels.  

Comments Off

Filed under Climate Change Forcings & Feedbacks, Guest Weblogs

Repost Of “Further Discussion Of Global Warming”

In response to a comment on November 18, 2010 on Judy Curry’s post

Michael’s controversial testimony

by Don B., in which he referred to an earlier post of mine from September 2010, I have reposted that earlier post below. It is titled

Further Discussion Of Global Warming

and reads

The report National Research Council, 2005: Radiative forcing of climate change: Expanding the concept and addressing uncertainties. Committee on Radiative Forcing Effects on Climate Change, Climate Research Committee, Board on Atmospheric Sciences and Climate, Division on Earth and Life Studies, The National Academies Press, Washington, D.C., 208 pp

provides a valuable summary of the concept of global warming as utilized by the IPCC and others. The discussion of global warming below will be based on the equations in that report.

“According to the radiative-convective equilibrium concept, the equation for determining global average surface temperature of the planet is

dH/dt = f − T′/λ

where

H = ∫ ρ Cp T dz (integrated from the surface to depth zb)

is the heat content of the land-ocean-atmosphere system, with ρ the density, Cp the specific heat, T the temperature, and zb the depth to which the heating penetrates. These equations describe the change in the heat content, where f is the radiative forcing at the tropopause, T′ is the change in surface temperature in response to a change in heat content, and λ is the climate feedback parameter (Schneider and Dickinson, 1974), also known as the climate sensitivity parameter, which denotes the rate at which the climate system returns the added forcing to space as infrared radiation or as reflected solar radiation (by changes in clouds, ice and snow, etc.). In essence, λ accounts for how feedbacks modify the surface temperature response to the forcing. In principle, T′ should account for changes in the temperature of the surface and the troposphere, and since the lapse rate is assumed to be known or is assumed to be a function of surface temperature, T′ can be approximated by the surface temperature.”
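The quoted balance, dH/dt = f − T′/λ, can be illustrated with a minimal numerical integration. The effective heat capacity below (roughly 200 m of ocean) and λ = 0.75 K per W/m2 are illustrative assumptions, not values from the NRC report; the point is that T′ relaxes toward the equilibrium value λf.

```python
# Numerical illustration of dH/dt = f - T'/lambda: with an effective heat
# capacity C (so H = C * T'), T' relaxes toward equilibrium lambda * f.
# C and lambda are illustrative assumptions, not values from the NRC report.
f = 4.0            # radiative forcing, W/m^2
lam = 0.75         # K per (W/m^2), so equilibrium T' = lam * f = 3 K
C = 8.36e8         # effective heat capacity, J m^-2 K^-1 (~200 m of ocean)
dt = 86400.0 * 30  # one-month time step, s

T = 0.0
for _ in range(12 * 500):      # 500 years of monthly steps (forward Euler)
    dH_dt = f - T / lam        # W/m^2
    T += dt * dH_dt / C        # convert heat gain to temperature change

print(f"T' after 500 years = {T:.2f} K (equilibrium = {lam * f:.2f} K)")
```

The e-folding time of the relaxation is C·λ (about 20 years with these numbers), which is why the integration is run well past that before comparing with equilibrium.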

The use of T′ to diagnose global warming [which is actually given by dH/dt] clearly involves estimates of f and λ, which necessarily introduces complexity into obtaining dH/dt. Moreover, as we have documented in our papers, e.g.

Pielke Sr., R.A., C. Davey, D. Niyogi, S. Fall, J. Steinweg-Woods, K. Hubbard, X. Lin, M. Cai, Y.-K. Lim, H. Li, J. Nielsen-Gammon, K. Gallo, R. Hale, R. Mahmood, S. Foster, R.T. McNider, and P. Blanken, 2007: Unresolved issues with the assessment of multi-decadal global land surface temperature trends. J. Geophys. Res., 112, D24S08, doi:10.1029/2006JD008229

and

Klotzbach, P.J., R.A. Pielke Sr., R.A. Pielke Jr., J.R. Christy, and R.T. McNider, 2009: An alternative explanation for differential temperature trends at the surface and in the lower troposphere. J. Geophys. Res., 114, D21102, doi:10.1029/2009JD011841

there remain major problems with the accuracy of obtaining a global average value of T’. Among the unresolved problems are:

  1. There is no one value of T′. The estimates for this number are constructed from sea and land measurements of temperature anomalies and their absolute values across the globe. The actual loss of heat to space by radiation is proportional to T to the fourth power, so a warm anomaly has a greater longwave flux to space in the tropics than at higher latitudes in the winter [e.g., Liljegren, 2008];
  2. The neglect of concurrent trends and anomalies of absolute humidity biases the dry-bulb temperature estimates of T′, since it is really moist surface air enthalpy that should be measured [e.g., Davey et al., 2006];
  3. The use of land surface minimum temperatures introduces a bias, since the minimum temperature responds in an amplified manner to changes in heating and cooling higher in the boundary layer. This has been a warm bias in recent years [e.g., see Klotzbach et al., 2009];
  4. The sites for land observations have often been poorly located and, in the USA, have introduced a warm bias. The homogenization of the poorly and well sited locations smears this warm bias into the homogenized data analysis [a paper in preparation which will be submitted within the next two weeks on this research].
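Item 1 is easy to quantify for the idealized blackbody case: because emission scales as σT⁴, the same +1 °C anomaly radiates away substantially more energy from a warm tropical surface than from a cold winter high-latitude one. The base temperatures below are illustrative.

```python
# Illustration of item 1: OLR scales as sigma * T^4 for a blackbody, so a
# +1 K anomaly produces a larger flux increase at a warm base temperature
# than at a cold one.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def extra_flux(T_base, dT=1.0):
    """Additional blackbody emission (W/m^2) from a dT-warm anomaly."""
    return SIGMA * ((T_base + dT) ** 4 - T_base ** 4)

tropics = extra_flux(300.0)      # warm tropical base temperature
winter_pole = extra_flux(250.0)  # cold winter high-latitude base temperature

print(f"tropics: +{tropics:.1f} W/m^2, winter high latitude: +{winter_pole:.1f} W/m^2")
```

With these base temperatures the tropical anomaly radiates away roughly 70% more energy per degree than the high-latitude one, which is why a single global-mean T′ conflates physically different heat losses.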

There is an obvious solution to these problems with respect to diagnosing global warming and cooling: we should adopt a finite-difference measurement of dH/dt [e.g., at monthly intervals].

There has been quite a bit of discussion in the posts about the accuracy of monitoring dH/dt from upper ocean heat content. The first requirement, however, is to agree that this is now the preferred metric for measuring global warming and cooling, and that it should replace T′ [or, at least, be used in addition to that metric].
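A minimal sketch of such a finite-difference diagnostic follows; the monthly heat content values are made up for illustration, and the conversion divides by the Earth's surface area to express dH/dt as a global-mean flux in W/m2.

```python
# Sketch of the finite-difference dH/dt diagnostic: given a monthly
# upper-ocean heat content series H (Joules), difference it in time and
# express the result as a global-mean flux (W/m^2).
EARTH_AREA = 5.1e14          # Earth's surface area, m^2
SECONDS_PER_MONTH = 2.63e6   # ~30.4 days

# Hypothetical monthly upper-ocean heat content anomalies, in Joules
H = [1.00e22, 1.02e22, 1.05e22, 1.06e22]

dH_dt = [(H[i + 1] - H[i]) / SECONDS_PER_MONTH for i in range(len(H) - 1)]
flux = [rate / EARTH_AREA for rate in dH_dt]  # global-mean flux, W/m^2

for i, w in enumerate(flux):
    print(f"month {i}->{i + 1}: {w:+.2f} W/m^2")
```

A positive flux indicates the climate system accumulating heat over that interval (warming), a negative flux the reverse, with no need to estimate f or λ.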

My recommendation for the report of the meeting that was held September 7-9 2010 in the United Kingdom in Exeter titled

Surface temperature datasets for the 21st Century

is to focus on improved temperature data sets for regional studies (in which the issues we have raised in items #1 to #4, among others, need to be addressed), while accepting that the time to use T′ as the primary metric to diagnose global warming and cooling has passed.

 It is my opinion that they should recommend we move to the monitoring and reporting of dH/dt [in real time] to policymakers and the public using upper ocean heat content.

Follow up – November 23, 2010:

My recommendations, as far as I know, remain ignored (rather than accepted or refuted) by NCDC, GISS and CRU.

Comments Off

Filed under Climate Change Metrics

Two Recommended House Of Representatives Hearing Topics

I have posted on the failure of this week’s House of Representatives Subcommittee to properly assess the state of climate science; see

http://pielkeclimatesci.wordpress.com/2010/11/18/the-perpetuation-of-climate-misunderstandings-by-the-u-s-house-of-representatives-subcommitee-on-energy-and-environment/

In this post I want to present recommendations for two future Panels, whether led by Democrats or Republicans. Neither party has properly vetted the issues outlined below.

Panel 1 Recommendation

As we presented in our paper

Pielke Sr., R., K. Beven, G. Brasseur, J. Calvert, M. Chahine, R. Dickerson, D. Entekhabi, E. Foufoula-Georgiou, H. Gupta, V. Gupta, W. Krajewski, E. Philip Krider, W. K.M. Lau, J. McDonnell, W. Rossow, J. Schaake, J. Smith, S. Sorooshian, and E. Wood, 2009: Climate change: The need to consider human forcings besides greenhouse gases. Eos, Vol. 90, No. 45, 10 November 2009, 413. Copyright (2009) American Geophysical Union

there are three hypotheses that should be discussed at such a Panel.

They are

Hypothesis 1: Human influence on climate variability and change is of minimal importance, and natural causes dominate climate variations and changes on all time scales. In coming decades, the human influence will continue to be minimal.

Hypothesis 2a: Although the natural causes of climate variations and changes are undoubtedly important, the human influences are significant and involve a diverse range of first-order climate forcings, including, but not limited to, the human input of carbon dioxide (CO2). Most, if not all, of these human influences on regional and global climate will continue to be of concern during the coming decades.

Hypothesis 2b: Although the natural causes of climate variations and changes are undoubtedly important, the human influences are significant and are dominated by the emissions into the atmosphere of greenhouse gases, the most important of which is CO2. The adverse impact of these gases on regional and global climate constitutes the primary climate issue for the coming decades.

The House Hearing

A Rational Discussion of Climate Change: the Science, the Evidence, the Response

did not ask the presenters which two of these three hypotheses should be rejected. If Hypotheses 1 and 2b are rejected, as we have concluded in our EOS paper, then as we write

“…the cost-benefit analyses regarding the mitigation of CO2 and other greenhouse gases need to be considered along with the other human climate forcings in a broader environmental context, as well as with respect to their role in the climate system. Because hypothesis 2a is the one best supported by the evidence, policies focused on controlling the emissions of greenhouse gases must necessarily be supported by complementary policies focused on other first-order climate forcings. The issues that society faces related to these other forcings include the increasing demands of the human population, urbanization, changes in the natural landscape and land management, long-term weather variability and change, animal and insect dynamics, industrial and vehicular emissions, and so forth. All of these issues interact with and feed back upon each other.”

and

“The evidence predominantly suggests that humans are significantly altering the global environment, and thus climate, in a variety of diverse ways beyond the effects of human emissions of greenhouse gases, including CO2. Unfortunately, the 2007 Intergovernmental Panel on Climate Change (IPCC) assessment did not sufficiently acknowledge the importance of these other human climate forcings in altering regional and global climate and their effects on predictability at the regional scale.”

Further discussion of these three hypotheses, which could be used to fine-tune them, is reported in the post

http://pielkeclimatesci.wordpress.com/2010/07/08/feedback-on-my-invitation-on-the-three-hypotheses-of-climate/

Panel 2 Recommendation

My second Panel recommendation is with respect to the “Response” to our current understanding of climate risks as contrasted with other environmental and social risks.  I posted on this in

http://pielkeclimatesci.wordpress.com/2010/08/03/a-way-forward-in-climate-science-based-on-a-bottom-up-resourse-based-perspective/

where I wrote

“There are 5 broad areas that we can use to define the need for vulnerability assessments : water, food, energy, [human] health and ecosystem function. Each area has societally critical resources. The vulnerability concept requires the determination of the major threats to these resources from climate, but also from other social and environmental issues. After these threats are identified for each resource, then the relative risk from natural- and human-caused climate change (estimated from the GCM projections, but also the historical, paleo-record and worst case sequences of events) can be compared with other risks in order to adopt the optimal mitigation/adaptation strategy.”

and suggested the following questions be asked

 1. Why is this resource important? How is it used? To what stakeholders is it valuable?

2. What are the key environmental and social variables that influence this resource?

3. What is the sensitivity of this resource to changes in each of these key variables? (This includes, but is not limited to, the sensitivity of the resource to climate variations and change on short (e.g., days), medium (e.g., seasons), and long (e.g., multi-decadal) time scales.)

4. What changes (thresholds) in these key variables would have to occur to result in a negative (or positive) response in this resource?

5. What are the best estimates of the probabilities for these changes to occur? What tools are available to quantify the effect of these changes? Can these estimates be skillfully predicted?

6. What actions (adaptation/mitigation) can be undertaken in order to minimize or eliminate the negative consequences of these changes (or to optimize a positive response)?

7. What are specific recommendations for policymakers and other stakeholders?

These two Panels, in my view, provide a way to move beyond rehashing the same perspectives as presented in this week’s House Hearing.

Comments Off

Filed under Climate Science Op-Eds

New Paper “Screen Level Temperature Increase Due To Higher Atmospheric Carbon Dioxide In Calm And Windy Nights Revisited” By Steeneveld Et Al 2010

There is a new paper that further assesses the issue of long term temperature trends as a function of height near the ground. This new study was motivated by our paper

Pielke Sr., R.A., and T. Matsui, 2005: Should light wind and windy nights have the same temperature trends at individual levels even if the boundary layer averaged heat content change is the same? Geophys. Res. Letts., 32, No. 21, L21813, 10.1029/2005GL024407.

The new paper is

Steeneveld, G.J., A.A.M. Holtslag, R.T. McNider, and R.A. Pielke Sr., 2010: Screen level temperature increase due to higher atmospheric carbon dioxide in calm and windy nights revisited. J. Geophys. Res., in press.

The abstract reads

“Long-term surface observations over land have shown temperature increases during the last century, especially during nighttime. Observations analyzed by Parker [2004] show similar long-term trends for calm and windy conditions at night, and on basis of this it was suggested that the possible effect of urban heat effects on long-term temperature trends are small. On the other hand, a simplified analytic model study by Pielke and Matsui [2005, henceforth PM05] suggests that at night the resultant long-term temperature trends over land should depend on height and strongly on wind speed (mostly due to alterations in the rate of nocturnal cooling in the stable boundary layer (SBL)). In this paper we expand the PM05 study by using a validated atmospheric boundary-layer model with elaborated atmospheric physics compared to PM05, in order to explore the response of the SBL over land to a change in radiative forcing. We find that the screen level temperature response is surprisingly constant for a rather broad range of both geostrophic wind speed (5-15 ms-1) and 10 meter wind (2-4.0 ms-1). This is mostly due to land surface-vegetation-atmosphere feedbacks taken into account in the present study which were not considered by PM05.”

Among the conclusions, we write

“Our study shows that the competing effects of boundary-layer height and wind speed dependent fluxes yield changes in shelter temperature that are largely independent of wind speed. PM05 only considered the role of wind speed and boundary-layer height. It is likely that urban heat island effects, as observed, are due to the residence time of a parcel of air over a city and not due to the flux changes considered here or in PM05. Furthermore, the parameter spaces investigated in this paper are limited. For example, the CASES-99 observational site simulated here is quite smooth (the roughness length used in the simulations was 0.03 m). It is possible that larger roughnesses might provide more sensitivity to wind speed. This will be further explored in McNider et al. [2010].

Finally, we agree with PM05 that additional work is needed to understand SBL responses to both land use change and radiative forcing. Klotzbach et al. [2009], for example, which shows a statistically significant divergence between the long-term trends of the surface air and lower tropospheric temperatures at higher latitudes in the winter, indicates that the changes in the SBL over time remain an important climate change issue that has not yet been completely examined and understood.”

The answer to the question of whether long term temperature trends near the surface are a significant function of height is an important climate metric issue, as these trends are used in the construction of the annual average global surface temperature trend. From this new study, it appears that feedbacks mute temperature trends near the surface; however, this was for a specific situation and may not generalize to other landscapes. The new McNider et al. paper, which is in preparation, will examine this issue for other situations, and we will report on this weblog when that study is complete.

Comments Off

Filed under Climate Change Metrics, Research Papers, Uncategorized