Monthly Archives: October 2009

Examiner.com’s First Annual Survey on Global Warming

Update November 3 2009, 3:30pm EST: The poll is going to close in less than an hour. If you intend to participate, please do so now.

Tom Fuller of Examiner.com is conducting a survey – Examiner.com’s First Annual Survey on Global Warming. I encourage readers of my weblog to participate. Tom’s weblog has been an important addition to the discussion of the climate issue, and this new survey is another contribution.

Click here to get started: Examiner.com’s First Annual Survey on Global Warming.

Tom’s introduction to the survey is reproduced below. Have fun!
 
 Thank you for participating in Examiner.com’s First Annual Survey on Global Warming.

First, let’s start with the ground rules. Your participation is completely anonymous, and no attempt will be made to contact you for any reason as a result of your participation or anything you write in this survey.

Second, this survey is not intended to be used as an opinion poll or a census, and will not be used as such. We are not trying to find out how many people ‘believe’ or ‘disbelieve’ in global warming. Our purpose is to try and find out if there are areas of agreement on possible policy initiatives going forward.

Comments Off on Examiner.com’s First Annual Survey on Global Warming

Filed under Climate Surveys

Interesting Post On Biofuels By Katharine Sanderson On The Nature.com Website

There is an informative post titled Biofuel woes by Katharine Sanderson on the Nature.com/climatefeedback website.

It reads in part

“Two papers in Science yesterday have poured cold water on the promise of second generation biofuels.

Biofuels derived from the cellulosic, woody parts of plants are not having their greenhouse gas emissions properly accounted for, says Jerry Melillo from the Marine Biological Laboratory at Woods Hole. Melillo’s study suggests that changes in the way land is used, as a consequence of growing crops for biofuels, is not taken into account, and if it were then those biofuels would be shown to actually cause more greenhouse gases to be released than fossil fuels. Nitrous oxide emissions from increased use of fertilisers are a big part of the problem.

“The problem is, we have a finite amount of land where new crops could be grown. Melillo and colleagues now report that if biofuel crops replace food crops on current farmlands, then the clearing of forested land for additional food crops will release more carbon from the soil there than in the areas where the biofuel crops themselves are being grown,” says the press release.

In a related policy forum article, Timothy Searchinger from Princeton University and a bunch of colleagues point out flaws in the ways that carbon emissions are counted for cap-and-trade schemes in both Europe and the US.

They say that the assertion that fuels made from biomass can be counted as carbon neutral is wrong. “Harvesting existing forests for electricity adds net carbon to the air,” the report says. “If bioenergy crops displace forest or grassland, the carbon released from soils and vegetation, plus lost future sequestration, generates carbon debt, which counts against the carbon the crops absorb.”

“In the near-term I think, irrespective of how you go about the cellulosic biofuels program, you’re going to have greenhouse gas emissions exacerbating the climate change problem,” Melillo is reported as saying in Reuters.

Energy efficiency news says the report is damning for biofuels.

More bad news comes from a UNEP report, highlighted by the New York Times. The report calls for greater debate about biofuels before ploughing headlong into a completely biofuel-powered society, although it focuses mainly on first generation fuels, unlike the Science papers.”
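The “carbon debt” bookkeeping described in the Searchinger et al. excerpt above can be illustrated with a minimal sketch. The numbers below are purely hypothetical placeholders, not values from the Science papers: an upfront release of carbon when land is cleared, an ongoing loss of future sequestration, and an annual fossil-fuel offset from the biofuel crop.

    # Hypothetical illustration of "carbon debt" accounting; all numbers are
    # made-up placeholders, not values from the papers discussed above.
    upfront_debt = 100.0      # tC/ha released from soil and vegetation at clearing
    lost_sequestration = 1.0  # tC/ha/yr of future uptake forgone on the cleared land
    annual_offset = 3.0       # tC/ha/yr of fossil emissions displaced by the crop

    net_annual_benefit = annual_offset - lost_sequestration
    payback_years = upfront_debt / net_annual_benefit
    print(f"Years until the carbon debt is repaid: {payback_years:.0f}")  # 50

Until the debt is repaid, the biofuel is a net source of carbon to the atmosphere relative to the fossil fuel it replaces, which is essentially the accounting point made in the excerpts above.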

As has been discussed on my weblog; e.g. see

Comments On The Testimony Of Senator Dick Lugar On Climate Change and Deforestation On April 22 2008

  • The conversion of the landscape by deliberate management practices is itself a climate change forcing (Kabat et al, 2004; NRC, 2005; Feddema et al, 2005; Pielke 2005). 
  • The net effect of deliberate landscape change such as afforestation may actually result in a radiative warming effect even though CO2 is extracted from the atmosphere by the plants. This occurs if the resulting surface albedo is less than for the original landscape and due to the added water vapor that is transpired into the atmosphere from the vegetation (i.e. see Pielke Sr., R.A., 2001: Carbon sequestration — The need for an integrated climate system approach. Bull. Amer. Meteor. Soc., 82, 2021.).
  • Further discussion of these issues is in the papers

    Pielke Sr., R.A., G. Marland, R.A. Betts, T.N. Chase, J.L. Eastman, J.O. Niles, D. Niyogi, and S. Running, 2002: The influence of land-use change and landscape dynamics on the climate system- relevance to climate change policy beyond the radiative effect of greenhouse gases. Phil. Trans. A. Special Theme Issue, 360, 1705-1719.

    Marland, G., R.A. Pielke, Sr., M. Apps, R. Avissar, R.A. Betts, K.J. Davis, P.C. Frumhoff, S.T. Jackson, L. Joyce, P. Kauppi, J. Katzenberger, K.G. MacDicken, R. Neilson, J.O. Niles, D. dutta S. Niyogi, R.J. Norby, N. Pena, N. Sampson, and Y. Xue, 2003: The climatic impacts of land surface change and carbon management, and the implications for climate-change mitigation policy. Climate Policy, 3, 149-157.

     Unless these issues are addressed in the context of developing climate policy that includes rewards for landscape management, the desired goal of reducing the human impact on climate will not be achieved.”

    The new papers, while already raising serious issues with biofuels, still have not examined the effect of the alteration of surface heat and moisture fluxes by biofuel agriculture, and the resultant effect on weather such as temperatures, clouds and precipitation. The concerns may be even greater than expressed in these new studies.

    Comments Off on Interesting Post On Biofuels By Katharine Sanderson On The Nature.com Website

    Filed under Climate Change Forcings & Feedbacks

    Comments On Len Ornstein’s Post “How To Quickly Lower Climate Risks, At ‘Tolerable’ Costs?”

    On October 26 2009 Len Ornstein posted a guest weblog titled “How To Quickly Lower Climate Risks, At ‘Tolerable’ Costs?”.  He has requested that I comment on his proposal to reduce carbon dioxide concentrations in the atmosphere. 

    As I have written previously, I am very concerned about geoengineering as a way to mitigate climate change from the addition of CO2 and other greenhouse gases; e.g. see

    Comments On The Physics Today Article “Will Desperate Climates Call for Desperate Geoengineering Measures?” by Barbara Goss Levi.

    I wrote in that post

    The claim in the Levi Physics Today article that geoengineering “intervention” [can] prevent or slow changes in the climate system is completely wrong. Geoengineering would cause changes in the climate system! The Levi article’s almost exclusive focus on the role of the addition of carbon dioxide to the atmosphere is blind to the importance of altering the spatial pattern of climate forcing as a result of geoengineering.

    I do find that Len’s study further confirms the role of landscape change (in this case deliberate change) as a first order climate forcing. However, this means that weather patterns will be altered, since the spatial distribution of diabatic heating in the atmosphere will be different (e.g. see also our study of this diabatic heating effect due to aerosols in Matsui and Pielke 2006). The teleconnection effects seen in their model runs appear muted at very long distances (e.g. see Figure 5), but they are present. For example, there is a possible effect on Atlantic hurricanes, as noted in Section 6 of Ornstein et al. This raises the issue of unintended consequences. Atlantic tropical cyclones bring much needed rain to land areas bordering the western tropical and subtropical Atlantic Ocean, as well as to the southeast USA. If this rainfall is altered, as suggested in the model results, that would be an unintended negative effect on those regions.

    I do agree with Len’s concern about the biogeochemical effect of added atmospheric concentrations of CO2. We do not know all of the potential effects, but there will be some. Thus the elevation of CO2 to too high a concentration should be prevented, and the engineering of Len’s proposal seems feasible. However, as written above, unintended consequences for the climate elsewhere would need to be very thoroughly studied.

     I remain convinced that the mitigation approach with the least negative effects is the air capture of CO2 as discussed in

    Pielke, Jr., R. A., 2009. An Idealized Assessment of the Economics of Air Capture of Carbon Dioxide in Mitigation Policy, Environmental Science & Policy, Vol. 12, Issue 3, pp. 216-225.

    Comments Off on Comments On Len Ornstein’s Post “How To Quickly Lower Climate Risks, At ‘Tolerable’ Costs?”

    Filed under Climate Change Forcings & Feedbacks

    Further Support For Temperature Trends Associated With Land Use Change – Rosenzweig Et Al 2009 “Mitigating New York City’s Heat Island”

    There is an important, well-written new paper that provides further evidence that land use change significantly influences surface air temperatures, and therefore affects the use of temperatures from these areas in the construction of a global average surface temperature anomaly.

    The paper is

    Rosenzweig, Cynthia, William D. Solecki, Lily Parshall, Barry Lynn, Jennifer Cox, Richard Goldberg, Sara Hodges, Stuart Gaffin, Ronald B. Slosberg, Peter Savio, Frank Dunstan, and Mark Watson, 2009: Mitigating New York City’s Heat Island: Integrating Stakeholder Perspectives and Scientific Evaluation. Bulletin of the American Meteorological Society, Volume 90, Issue 9 (September 2009), pp. 1297–1312.

    The abstract reads

    “This study of New York City, New York’s, heat island and its potential mitigation was structured around research questions developed by project stakeholders working with a multidisciplinary team of researchers. Meteorological, remotely-sensed, and spatial data on the urban environment were brought together to understand multiple dimensions of New York City’s heat island and the feasibility of mitigation strategies, including urban forestry, green roofs, and high-albedo surfaces. Heat island mitigation was simulated with the fifth-generation Pennsylvania State University–NCAR Mesoscale Model (MM5). Results compare the possible effectiveness of mitigation strategies at reducing urban air temperature in six New York City neighborhoods and for New York City as a whole. Throughout the city, the most effective temperature-reduction strategy is to maximize the amount of vegetation, with a combination of tree planting and green roofs. This lowered simulated citywide surface urban air temperature by 0.4°C on average, and 0.7°C at 1500 Eastern Standard Time (EST), when the greatest temperature reductions tend to occur. Decreases of up to 1.1°C at 1500 EST occurred in some neighborhoods in Manhattan and Brooklyn, where there is more available area for implementing vegetation planting. New York City agencies are using project results to guide ongoing urban greening initiatives, particularly tree-planting programs.”

    The paper is not written specifically with respect to the issue of diagnosing regional representative multi-decadal surface air temperature trends. However, it clearly shows the magnitude of the effect of land use change on surface air temperatures. For example,  Table 3 presents a summary of the effect of increased vegetation and higher surface albedo on urban air temperatures during heat waves for different areas of New York City. The average differences for different parts of New York range up to over 1 degree Celsius at 1500 EST and are even larger at individual locations for the maximum effect as shown in Table 4.

    This paper effectively shows how deliberate land management can alter the urban temperature environment. It also shows that, as the region became urbanized, temperature trends of these magnitudes occurred due to these landscape changes.

    The new Rosenzweig et al 2009 paper, while silent on the issue in its text, is an effective rebuttal of the papers

    Parker, D.E., 2004: Large-scale warming is not urban. Nature, 432, 290, doi:10.1038/432290a

     Peterson, T.C., 2003: Assessment of urban versus rural in situ surface temperatures in the contiguous United States: No difference found. J. Climate, 16, 2941-2959.

    As we have shown in

    Pielke Sr., R.A., C. Davey, D. Niyogi, S. Fall, J. Steinweg-Woods, K. Hubbard, X. Lin, M. Cai, Y.-K. Lim, H. Li, J. Nielsen-Gammon, K. Gallo, R. Hale, R. Mahmood, S. Foster, R.T. McNider, and P. Blanken, 2007: Unresolved issues with the assessment of multi-decadal global land surface temperature trends. J. Geophys. Res., 112, D24S08, doi:10.1029/2006JD008229

    there remain significant issues with the use of surface air temperatures from land-based observations as a diagnostic of global warming and cooling.

    Comments Off on Further Support For Temperature Trends Associated With Land Use Change – Rosenzweig Et Al 2009 “Mitigating New York City’s Heat Island”

    Filed under Climate Change Metrics

    Further Comments On The Vulnerability Perspective

    On September 21 2009 I posted The Vulnerability Perspective. In it, I identified 5 major resource areas that should be the focus of assessments as to the spectrum of risks from climate variability and change, as well as from other environmental and social threats.  I wrote

    There are 5 broad areas that we can use to define the need for vulnerability assessments: water, food, energy, health and ecosystem function. Each area has societally critical resources. The vulnerability concept requires the determination of the major threats to these resources from climate, but also from other social and environmental issues. After these threats are identified for each resource, then the relative risk from natural- and human-caused climate change (estimated from the GCM projections, but also the historical, paleo-record and worst case sequences of events) can be compared with other risks in order to adopt the optimal mitigation/adaptation strategy.

    In my book chapter with Dev Niyogi

    Pielke Sr. R.A., and D. Niyogi, 2009: The role of landscape processes within the climate system. In: Otto, J.-C. and R. Dikau, Eds., Landform – Structure, Evolution, Process Control: Proceedings of the International Symposium on Landforms organised by the Research Training Group 437. Lecture Notes in Earth Sciences, Springer, Vol. 115, in press

    we presented a section that introduces a framework to investigate vulnerabilities. The section reads

    “Within the climate system, the need to consider the broader role of land-surface feedback becomes important not only for assessing the impacts but also for developing regional vulnerability and mitigation strategies.

    The IPCC fourth assessment second and third working groups deal with a range of issues targeted to these topics (Schneider et al. 2007). The IPCC identifies seven criteria for “key” vulnerabilities. They are: magnitude of impacts, timing of impacts, persistence and reversibility of impacts, likelihood (estimates of uncertainty) of impacts and vulnerabilities and confidence in those estimates, potential for adaptation, distributional aspects of impacts and vulnerabilities, and the importance of the system(s) at risk. While a number of potential vulnerabilities and uncertainties are considered (such as irreversible change in urbanization), the resulting feedback on the atmospheric processes due to such changes is still poorly understood or unaccounted for in these assessments. Indeed the UNFCCC Article 1 states: “‘Adverse effects of climate change’ means changes in the physical environment or biota resulting from climate change which have significant deleterious effects on the composition, resilience or productivity of natural and managed ecosystems or on the operation of socio-economic systems or on human health and welfare.” Thus, while the role of landscape is inherent within the UNFCCC framework, the corresponding translation for the assessments still remains largely greenhouse gas driven.

    Further, while the climate change projections have largely been at coarser resolution, the impacts and potential mitigation policies are often at local to regional scales. For example, climate models often project increasing drought at a regional scale. The resilience to such increased occurrence as well as changes in the intensity of droughts is, however, dependent on the local scale environmental conditions (such as moisture storage, and convective rainfall), and farming approaches (access to irrigation, timing of rain or stress, etc). As summarized in Adger (1996), an important issue for IPCC-like global assessments is to assess if the top-down approach can incorporate the “aggregation of individual decision-making in a realistic way, so that results of the modelling are applicable and policy relevant”.

    Therefore, as the community braces to develop resilience strategies, it will become increasingly important to consider a bidirectional impact, i.e., not just the role of atmospheric changes (such as temperature and rainfall) on the physical environment or biota, but also a feedback of the biota and other land-surface processes on further changes in the atmospheric processes – such as reviewed in this chapter.

    Klein et al. (1999) sought to assess whether the IPCC guidelines for assessing climate change impacts as well as adaptive strategies can be applied to one example of coastal adaptation. They recommend that a broader approach is needed which has more local-scale information and input for assessing as well as monitoring the options. Again, the missing link between local-scale features and global-scale projections becomes apparent. The expanded eight-step approach of Schroter et al. (2005), designed to assess vulnerability to climate change, states the need for considering multiple interacting stresses. They recognize that climate change can be a result of greenhouse gas changes which are coupled to socioeconomic developments, which in turn are coupled to land-use changes – and that all of these drivers are expected to interactively affect the human – environmental system (such as crop yields).

    To extract the significance of the individual versus multiple stressors on crop yields, Mera et al. (2006) developed a crop modeling study with over 25 different climatic scenarios of temperature, rainfall, and radiation changes at a farm scale for both C3 and C4 types of crops (e.g., soybean and maize). As seen in many crop yield studies, the results suggested that yields were most sensitive to the amount of effective precipitation (estimated as rainfall minus physical evaporation/transpiration loss from the land surface). Changes in radiation had a nonlinear response, with crops showing increased productivity for some reduction in radiation (as a result of cloudiness and increased diffuse radiation) and a decline in yield with further reduction in radiation amounts. The impact of temperature changes, which has been at the heart of many climate projections, however, was quite limited, particularly if the soils did not have moisture stress. The analyses from the multiple climate change settings do not agree with those from individual changes, making a case for multivariable, ensemble approaches to identify the vulnerability and feedbacks in estimating climate-related impacts (cf. Turner et al. 2003).

    Another issue is the coupled vulnerability of the land surface to socioeconomic and climate change processes. This question was addressed by Metzger et al. (2006). They concluded that most assessment studies cannot provide needed information on regions or on ecosystem goods that are vulnerable. To address this question, we can hypothesize that the vulnerability of landscape (V) change is a product of the probability of the landscape change (Lc) and the service (S) provided by the landscape:

    V = prob(Lc) ∗ S

    The service provided is a broad term and could mean societal benefits (such as recreation), or economic benefits (such as timber and food), or physical feedback as in terms of the modulating impact a landscape may have on regional temperatures or precipitation. While a variety of studies on vulnerability have sought to look at the economic and the societal feedbacks, the physical feedback of the fine-scale land heterogeneities have been critically missing in the literature. It is however important that land heterogeneity and transformation potential be considered at a finer scale because the landscape changes will in turn affect the regional and local vulnerability.

    Current economic assessment studies (Stern 2007) conclude that controlling land-use change such as from deforestation provides an opportunity cost in excess of $5 billion per annum. This estimate, however, appears to consider only the land transformation impact of deforestation and the resulting greenhouse emissions. As summarized in this chapter, the dynamical effects such as changes in rainfall, evaporation, convection, and temperature patterns due to landform changes can cause additional vulnerability (or resilience in some cases) and need to be considered in such assessments (Marland et al. 2003). Similarly, the UNFCCC Article 3 also seeks afforestation (reforestation minus deforestation) since 1990 as a country’s commitment towards greenhouse gas emission controls. Not considering the dynamical feedbacks due to such forest land transformation can lead to additional vulnerabilities as described in Pielke et al. (2001a, 2002).”
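    To make the V = prob(Lc) ∗ S relation from the excerpt concrete, here is a minimal sketch with hypothetical landscapes, change probabilities, and service values (the units of S are arbitrary); it is an illustration only, not a calculation from the chapter.

        # Illustrative only: hypothetical probabilities of landscape change (Lc)
        # and the service (S) each landscape provides, in arbitrary units.
        landscapes = {
            "irrigated cropland": (0.30, 60.0),
            "urban fringe":       (0.60, 40.0),
            "protected forest":   (0.05, 95.0),
        }

        for name, (prob_change, service) in landscapes.items():
            vulnerability = prob_change * service  # V = prob(Lc) * S
            print(f"{name:18s}  V = {vulnerability:5.1f}")

    On these made-up numbers the urban fringe is the most vulnerable landscape, because a moderate service value is multiplied by a high probability of change.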

    I plan further posts on this topic in future weblogs, focusing on the 5 resource areas of water, food, energy, health and ecosystem function.

    Comments Off on Further Comments On The Vulnerability Perspective

    Filed under Vulnerability Paradigm

    Comments On AP Story “Statistics Experts Reject Global Cooling Claims”

    UPDATE: October 27 2009: Seth Borenstein has alerted us to a full version of his article, which does include more details on the study [only the version I posted below was seen on the google news search yesterday]. The study approach itself is also available (see). My recommendation to focus on the more recent years using the more appropriate metric, upper ocean heat content trends, remains. I have suggested to Seth that he interview Jim Hansen to update what he wrote in 2005.  I also deleted the statement about the independence of the study as requested by Seth and substantiated by the longer AP story. It was completed independently of NOAA.

    There is a news report titled “Statistics experts reject global cooling claims” by Seth Borenstein which appeared today.

    The article reads 

    “WASHINGTON — The Earth is still warming, not cooling as some global warming skeptics are claiming, according to an analysis of global temperatures by independent statistics experts.

    The review of years of temperature data was conducted at the request of The Associated Press. Talk of a cooling trend has been spreading on the Internet, fueled by some news reports, a new book and temperatures that have been cooler in a few recent years.

    The statisticians, reviewing two sets of temperature data, found no trend of falling temperatures over time. And U.S. government figures show that the decade that ends in December will be the warmest in 130 years of record-keeping.

    Global warming skeptics are basing their claims on an unusually hot year in 1998. They say that since then, temperatures have fallen — thus, a cooling trend. But it’s not that simple.

    Since 1998, temperatures have dipped, soared, dropped again and are now rising once more. Records kept by the British meteorological office and satellite data used by climate skeptics still show 1998 as the hottest year. However, data from the U.S. National Oceanic and Atmospheric Administration and NASA show 2005 has topped 1998.

    “The last 10 years are the warmest 10-year period of the modern record,” said NOAA climate monitoring chief Deke Arndt. “Even if you analyze the trend during that 10 years, the trend is actually positive, which means warming.”

    Statisticians said the ups and downs during the last decade repeat random variability in data as far back as 1880.”

    This article, however, is not based on the much more robust metric assessment of global warming as diagnosed by upper ocean heat content. Nor does it consider the warm bias issues with respect to surface land temperatures that we have raised in our peer reviewed papers; e.g. see and see

    With respect to ocean heat content changes, as summarized in the articles

    Ellis et al., 1978: The annual variation in the global heat balance of the Earth. J. Geophys. Res., 83, 1958-1962.

    Pielke Sr., R.A., 2003: Heat storage within the Earth system. Bull. Amer. Meteor. Soc., 84, 331-335

    Pielke Sr., R.A., 2008: A broader view of the role of humans in the climate system. Physics Today, 61, Vol. 11, 54-55

    and

    Douglass, D.H. and R. Knox, 2009: Ocean heat content and Earth’s radiation imbalance. Physics Letters A

    trends and anomalies in the upper ocean heat content permit a quantitative assessment of the radiative imbalance of the climate system.

    Jim Hansen agrees on the use of the upper ocean heat content as an important diagnostic of global warming; he discussed this subject in 2005 (see). In his write-up, he stated

    “The Willis et al. measured heat storage of 0.62 W/m2 refers to the decadal mean for the upper 750 m of the ocean. Our simulated 1993-2003 heat storage rate was 0.6 W/m2 in the upper 750 m of the ocean. The decadal mean planetary energy imbalance, 0.75 W/m2, includes heat storage in the deeper ocean and energy used to melt ice and warm the air and land. 0.85 W/m2 is the imbalance at the end of the decade.

    Certainly the energy imbalance is less in earlier years, even negative, especially in years following large volcanic eruptions. Our analysis focused on the past decade because: (1) this is the period when it was predicted that, in the absence of a large volcanic eruption, the increasing greenhouse effect would cause the planetary energy imbalance and ocean heat storage to rise above the level of natural variability (Hansen et al., 1997), and (2) improved ocean temperature measurements and precise satellite altimetry yield an uncertainty in the ocean heat storage, ~15% of the observed value, smaller than that of earlier times when unsampled regions of the ocean created larger uncertainty.”

    As discussed on my weblog and elsewhere (e.g. see and see), the upper ocean heat content trend, as evaluated by its heat anomalies, has been essentially flat from mid 2003 through at least June of this year. Since mid 2003, the heat storage rate, rather than being the 0.6 W/m2 in the upper 750 m found for 1993-2003, has been essentially zero.
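    For readers who want the arithmetic behind reading an ocean heat content trend as a radiative imbalance, a minimal sketch follows. It assumes, as in the Hansen excerpt, that the heat storage rate is expressed per unit area of the Earth’s surface; the numbers are for illustration, not new analysis.

        # Convert a global-mean heat storage rate (W per m^2 of Earth's surface)
        # into Joules accumulated per year.
        EARTH_SURFACE_AREA = 5.1e14  # m^2
        SECONDS_PER_YEAR = 3.156e7   # s

        def joules_per_year(imbalance_w_m2):
            return imbalance_w_m2 * EARTH_SURFACE_AREA * SECONDS_PER_YEAR

        print(f"{joules_per_year(0.60):.1e} J/yr")  # ~9.7e21 J/yr for 0.60 W/m^2
        print(f"{joules_per_year(0.00):.1e} J/yr")  # a flat heat content trend implies ~0

    A heat storage rate of 0.60 W/m2 thus corresponds to roughly 1 x 10^22 Joules accumulating each year, so an essentially flat upper ocean heat content since mid 2003 implies a near-zero contribution from that layer to the radiative imbalance over that period.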

    Nonetheless, the article is correct that the climate system has not cooled, even over the last 6 years. Moreover, the consensus is that the climate system has warmed over the longer period back to 1880. Perhaps the current absence of warming is a shorter term natural feature of the climate system. However, to state that “the Earth is still warming” is in error. The warming has, at least temporarily, halted.

    The article (and apparently the NOAA study itself), therefore, suffers from a significant oversight since it does not comment on an update of the same upper ocean heat content data that Jim Hansen has used to assess global warming.

    Comments Off on Comments On AP Story “Statistics Experts Reject Global Cooling Claims”

    Filed under Climate Change Metrics

    Guest Weblog By Len Ornstein “How To Quickly Lower Climate Risks, At ‘Tolerable’ Costs?”

    In keeping with my goal to permit a diversity of views to be posted on my weblog by published climate scientists, below is a post by Len Ornstein.

    Guest Weblog By Len Ornstein  titled “How to Quickly Lower Climate Risks, at ‘Tolerable’ Costs?”

    Preamble:

    The data on climate change are very noisy. The physics of hydrodynamic systems like the oceans and atmosphere behave somewhat ‘erratically’ and ‘chaotically’ (especially in comparison, for example, to the physics of the ‘predictability’ of the Earth’s orbit around the sun), and in addition, the choices that are made about how to collect climate data can also be subject to some uncertainty and error. So it’s not surprising that attempts to discern ‘trends’ in climate data are subject to a good deal of uncertainty. This is characteristic of all scientific data; only it’s especially severe in climate science.

    Scientists construct models of the world and then they (or others) observe the behavior of relevant, discrete, worldly events to test whether the models are useful for ‘prediction’ of future events and/or interpolation of unobserved ‘past events’ in between already observed events. In general, the larger the number of ‘pertinent’ observations, and the more similar are the ‘results’ to one another, the more ‘likely’ it is that calculated means (or trends of means) are representative of ‘reality’. Likewise, the closer a model prediction comes to such a measured trend, the more robust may be its ability to ‘predict’. To communicate how well reality has been estimated by the measurements and by the model, science tries to cope with likelihood by using ‘agreed upon’ metrics of uncertainty – such as confidence intervals – to help make discussion of uncertainty more tractable. But the public is used to ‘statements of fact’, and mistrusts the weasel words of confidence intervals; most haven’t yet learned that nothing that can be said about real world ‘facts’ is either absolutely certain – or absolutely false.

     So when some scientist suggests that the mean of a ‘calculated trend’ of  some kind of climate ‘feature’ (e.g., global mean surface temperature (GMST)) is biased on the high side because of measurement errors of a particular kind – and another says that the trend is underestimated for perhaps just the opposite reasons – the public often sees it as an ideological difference (which it sometimes may be!). But more commonly, it’s an honest difference of opinion that stems from the different data histories with which these scientists have experience. Both respect the general significance of the confidence interval around the mean of the trend. But because they differ on what they consider pertinent, one may favor the data closer to the bottom of the confidence interval – and the other, closer to the top.

     On a small number of issues, I differ with Roger. His experience and mine differ widely, and I expect we can each learn from one another. His comments on the following matters will be appreciated:

    AGW

    The very wide 90% confidence interval of the GMST trend includes as little as 1.5ºC/century. But even that would be an only slightly delayed “unmitigated catastrophe” – with business as usual. This is something that I conjecture Roger also believes. Obviously different aspects of ‘catastrophe’ cut in on different time scales, and can be ‘mitigated’ to different degrees. Those who are wealthy enough can keep comfortable for some time with air conditioning. But for most sea creatures dependent for survival on the stability of the aragonite (a form of calcium carbonate) in their ‘shells’ (snails, clams, corals, foraminifera, etc.), a drop of sea surface water pH of about 0.4 units (expected as a result of a doubling of atmospheric CO2) will be a catastrophe – whether beginning 50 yrs or 200 yrs from now! Cooling by geo-engineering – without decarbonation – can’t save them.

    That’s a good example of why uncertainty is not an excuse for inaction. 

    Roger recently posed the following question and answer on his thread, 

    Roger A. Pielke Sr. Answers To A Survey “Futures Of The Global Energy Game By Year 2030″

    “9. What do you think are the most important long-term external risks that players in the global energy game have under-attended to?

    His answer: The exclusion of energy sources such as coal, before there are adequate replacements, risks serious economic and social upheaval.”

    I’m one of those who is certain that we must stop burning coal as soon as possible. (Roger Jr’s recent ‘kudos’ to Greenpeace shows he also shares this opinion.) And I believe that can begin much sooner than most believe – without risks of “serious economic and social upheaval”.

    I have 2 papers in press at the journal, Climatic Change, that are already available online: 

    Ornstein L, Aleinov I and Rind D, 2009: “Irrigated afforestation of the Sahara and Australian Outback to end global warming”. Climatic Change, DOI 10.1007/s10584-009-9626-y

    Abstract: Each year, irrigated Saharan- and Australian-desert forests could sequester amounts of atmospheric CO2 at least equal to that from burning fossil fuels. Without any rain, to capture CO2 produced from gasoline requires adding about $1 to the per-gallon pump-price to cover irrigation costs, using reverse osmosis (RO), desalinated, sea water. Such mature technology is economically competitive with the currently favored, untested, power-plant Carbon Capture (and deep underground, or under-ocean) Sequestration (CCS). Afforestation sequesters CO2, mostly as easily stored wood, both from distributed sources (automotive, aviation, etc., that CCS cannot address) and from power plants. Climatological feasibility and sustainability of such irrigated forests, and their potential global impacts are explored using a general circulation model (GCM). Biogeophysical feedback (Charney 1975) is shown to stimulate considerable rainfall over these forests, reducing desalination and irrigation costs; economic value of marketed, renewable, forest biomass, further reduces costs; and separately, energy conservation also reduces the size of the required forests and therefore their total capital and operating costs. The few negative climate impacts outside of the forests are discussed, with caveats. If confirmed with other GCMs, such irrigated, subtropical afforestation probably provides the best, near term route to complete control of green-house-gas-induced, global warming.

    and

     Ornstein L. 2009: “Replacing coal with wood: sustainable, eco-neutral, conservation harvest of natural tree-fall in old-growth forests” Climatic Change DOI 10.1007/s10584-009-9625-z 

    Abstract: When a tree falls in a tropical old-growth forest, the above ground biomass decays fairly rapidly and its carbon is returned to the atmosphere as CO2. If the trunk of that tree were to be harvested, before decay, and were stored anoxically, or burned in place of coal, a net of about 2/3 of that amount of CO2 would be prevented from entering the atmosphere. If the ash-equivalent of each tree trunk (about 1% of dry mass) were recycled to the site of harvest, the process would be indefinitely sustainable and eco-neutral. Such harvest of the undisturbed old-growth forests of Amazonia and Equatorial Africa could effectively remove about 0.88 to 1.54 GtC/yr from the atmosphere. With care, additional harvest of adjacent live trees, equaling up to two times the mass of the fallen trees, might be similarly collected, just as sustainably, and with almost as little ecological impact. This very large contribution to the mitigation of global warming is discussed – with caveats. It could substantially and ‘immediately’ ‘cancel’ a good deal of coal emissions, but without closing down many presently coal-fired power plants – and at much lower cost and lead-time than carbon capture and sequestration (CCS).

    What’s the relevance?

    The two strategies together could yield about 8 to 13 GtC/yr (8 to 13 “wedges”) of new bio-sequestration; enough to easily ‘stop’ the current 8.8 GtC/yr increase in atmospheric CO2. By comparison, all other mitigation proposals provide 1 or 2 wedges each. 8.8 GtC/yr dumped into the atmosphere as CO2 creates a problem of enormous scale, and can only be dealt with – successfully – with equally enormous expenditures of resources. That said, because of the induced rainfall, the typical cost will drop to something like $0.50/gallon of gasoline burned ($0.50/2.3kgC), so the total global cost would be about $1.9 trillion/yr (compared to a 2008 US asset price deflation of about $25 trillion). When that’s compared to figures like estimates of $800 billion/yr for CCS, my ‘plan’ looks like a loser. But CCS can address only about 20% of the 8.8GtC/yr problem at the $800 billion/yr price. My two solutions address the whole thing! CCS would involve a network of dangerous high-pressure pipelines coursing through the most developed neighborhoods of our civilizations – compared to relatively benign water aqueducts in what are presently virtually uninhabited deserts. CCS also requires deep and risky sequestration of CO2, whereas the sequestration in forests is much safer. And although the pressurized CO2 has virtually no value, the forest wood represents a ‘bonus’ of a ‘forever’ sustainable, eco-neutral, conservation harvestable (SENCH) supply of wood and wood products to serve in place of non-renewable fossil carbon – with near zero CO2 footprint!
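    As a back-of-envelope check (an illustrative sketch, not a figure from the papers), the $1.9 trillion/yr estimate follows directly from the two numbers quoted above: $0.50 per 2.3 kgC, applied to 8.8 GtC/yr.

        # Back-of-envelope check of the cost figure quoted above.
        cost_per_kgC = 0.50 / 2.3      # dollars per kg of carbon sequestered
        annual_emissions_kgC = 8.8e12  # 8.8 GtC/yr expressed in kg of carbon
        annual_cost = cost_per_kgC * annual_emissions_kgC
        print(f"~${annual_cost / 1e12:.1f} trillion per year")  # ~ $1.9 trillion/yr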

    Better management of forests provides the most practical way to begin to ‘immediately’ reduce atmospheric CO2 and buy enough time to permit energy conservation and the development and testing of new technology to then make even more of a difference. Using wood in place of coal – and at the same time, preventing further deforestation – unfortunately, are intimately tied together. And since deforestation is responsible for the release of about 1 – 2 GtC/yr, it has a large effect on how practical any proposed targets and menu for conservation might be.

    The decisions that must be made to accomplish enhanced bio-sequestration often are just those that are also likely to encourage more destructive deforestation by ‘other actors’ – as discussed in my SENCH paper.

    When bio-fuels begin to compete seriously with fossil fuels – for example, if ‘effective taxes’ on fossil fuels seem to make wood and corn ethanol attractive alternates, it’s then that market pressures to harvest bio-fuels will become overwhelming and will exert enormous pressures to increase deforestation.

    And this is why, except for bio-harvests that result in zero or negative CO2 footprint (as carefully defined in my SENCH paper), such harvest must be subject to ‘an effective carbon tax’ in proportion to its positive CO2 footprint – just as has been proposed for fossil fuels – otherwise catastrophic deforestation may proceed with a vengeance! This point isn’t generally understood – even by many climatologists and environmentalists!

    [As this weblog was being prepared (10/23/2009), the following paper, which makes exactly this point, was published:

    TD Searchinger, SP Hamburg, J Melillo, W Chameides, P Havlik, DM Kammen, GE Likens, RN Lubowski, M Obersteiner, M Oppenheimer, GP Robertson, WH Schlesinger, GD Tilman “Fixing a Critical Climate Accounting Error” Science 326, 527-528 [see http://www.princeton.edu/~tsearchi/writings.html].

    Abstract: Rules for applying the Kyoto Protocol and national cap-and-trade laws contain a major, but fixable, carbon accounting flaw in assessing bioenergy.]

    With such ‘taxes’ on fossil fuels in place, the cost of transporting wood and wood products globally becomes no more onerous than the present global transport of petroleum.

    SkyHook

    We begin with the harvest of fallen trees in the Amazon and Congo, probably using lighter-than-air-ships, like the Boeing SkyHook, to move the harvested logs to a nearest river, and then use river transport to coastal ports. (No access roads or skid trails means surreptitious raping of those old-growth forests remains quite difficult.) This requires a modest investment and no new technologies. And the wood can be stored for a sufficient period for a large fraction of coal-burning plants to be modestly retrofitted to burn wood chip, pellets and charcoal. Nonetheless almost ‘immediate’ CO2 drawdown of up to 4.5 GtC/yr results. During that same period, wood-processing plants are developed in Brazil and Equatorial Africa to process at least part of the wood into charcoal and syngas etc. Some ‘experiments’ in irrigated afforestation of the Sahara, Outback, Thar desert and Saudi Arabia also begin ‘immediately’.

    If (when?) other conservation efforts (including wind, solar, electric vehicles, etc.) don’t ramp up fast enough to also begin to result in a significant reduction of atmospheric CO2, then the large afforestation projects commence – and only in proportion to the net projected need – so probably at a cost of (considerably?) less than 1.9 trillion dollars/yr ;-)

    Of course ‘effective taxing’ of fossil fuels and those bio-fuels with positive CO2 footprints would have to be negotiated worldwide with some dispatch. Wood and wood products would also have to carry international encryption tags, indicating site and date of harvest and associated CO2 footprint, to assure proper ‘taxation’ and to inhibit the kind of gaming of the system that could lead to deforestation. I see that as the biggest problem.

    And effective monitoring and policing agreements and infrastructure would have to be put in place in the Amazon and Congo areas to protect the old growth forests. That’s the second biggest problem.

     These kinds of considerations ought to be important parts of the December agenda in Copenhagen.

     Len Ornstein

    Comments Off on Guest Weblog By Len Ornstein “How To Quickly Lower Climate Risks, At ‘Tolerable’ Costs?”

    Filed under Guest Weblogs

    Is The Human Input Of CO2 A First Order Climate Forcing?

    In response to my post Erroneous Claim in an AP News Article, I have been asked whether I consider the human addition of CO2 to be a first order climate forcing. The answer, of course, as I have consistently emphasized in my research papers and presentations, and on my weblog, is a categorical YES (e.g. see, see, see and see). 

     The human addition of CO2 is a positive radiative forcing as well as a biogeochemical forcing.  It is a first order human climate forcing.

    The AP statement itself has two parts:

    1. “the vast majority of scientists agree that global warming is occurring”

    2.  “that the primary cause is a buildup of greenhouse gases in the atmosphere from the burning of fossil fuels, such as oil and coal.”

    Item 1 is correct if the time scale is over the last century. Global warming since mid-2003, however, based on the diagnosis of the upper ocean heat content, has halted, at least up through mid 2009.

    Item 2 is the “myth”.  Even with respect to global warming during the last 100 years, the addition of CO2 is just one of a number of positive radiative forcings (e.g. see), and natural forcings appear to be more significant than previously understood (e.g. see).  The statement that the “primary cause” of global warming is a buildup of greenhouse gases is incomplete and, therefore, incorrect.

    Thus, while I agree that the human addition of CO2 is a first order climate forcing, the claim that it is the primary human climate forcing is not supported by the science. This means that attempts to “control” the climate system, and to prevent a “dangerous intervention” into the climate system by humans, that focus just on CO2 and a few other greenhouse gases will necessarily be significantly incomplete, unless all of the other first order climate forcings are also considered.

     Moreover, as I have written on extensively, climate change is much more than global warming and cooling (e.g. see  and see).  Human caused climate change can occur even in the absence of global warming (such as from land use change).  This makes attempts to mitigate climate change a much more daunting problem than assuming that all we need to do is control the human emissions of CO2 from fossil fuel combustion into the atmosphere.

    For the summary overview of my perspective see Main Conclusions.

    Comments Off on Is The Human Input Of CO2 A First Order Climate Forcing?

    Filed under Climate Change Forcings & Feedbacks, Climate Science Misconceptions

    Erroneous Claim in an AP News Article

    UPDATE #2 October 24 2009: If Dina Cappiello, Seth Borenstein and/or Kevin Freking choose to reply in order to refute my criticism of their statement in the news article, we would be glad to post their response as a guest weblog.

    UPDATE Oct 24 2009: To make sure my text is clear, I repeated “the primary cause” in the text below. As I weblogged this morning, the human addition of CO2 from fossil fuel emissions is a first order global warming forcing and, more generally, a first order climate change forcing. Efforts to reduce the magnitude of the human intervention into the climate system must include mitigation approaches with respect to CO2 emissions. However, by itself, this is only a part of the issue, as other human climate forcings are also of first order importance.

    There is an Associated Press [AP] news article today by Dina Cappiello, Seth Borenstein and Kevin Freking titled “Poll: US belief in global warming is cooling”.  In this article the reporters perpetuate the myth that

    “Though there are exceptions, the vast majority of scientists agree that global warming is occurring and that the primary cause is a buildup of greenhouse gases in the atmosphere from the burning of fossil fuels, such as oil and coal.”

    This is not true and is a case of the media seeking to make up news.

    We have already documented that a significant minority of climate scientists do not consider greenhouse gases as the primary cause for global warming, and, more generally, [as the primary] cause [of] climate change; e.g.  see

    Brown, F., J. Annan, and R.A. Pielke Sr., 2008: Is there agreement amongst climate scientists on the IPCC AR4 WG1?

    and

    National Research Council, 2005: Radiative forcing of climate change: Expanding the concept and addressing uncertainties. Committee on Radiative Forcing Effects on Climate Change, Climate Research Committee, Board on Atmospheric Sciences and Climate, Division on Earth and Life Studies, The National Academies Press, Washington, D.C., 208 pp.

    In the coming month, we will be presenting another article that documents that the AP authors are erroneous in their claim “that the vast majority of scientists agree that global warming is occurring and that the primary cause is a buildup of greenhouse gases in the atmosphere from the burning of fossil fuels, such as oil and coal.”

    If the reporters wanted to be balanced in their presentations, rather than acting as lobbyists and advocates, they would pursue the validity of their claim. So far, however, they have failed in this journalistic role.

     

    Comments Off on Erroneous Claim in an AP News Article

    Filed under Bias In News Media Reports

    Comments On Roy Spencer’s Excellent Post “IPCC Crushes Scientific Objectivity, 91-0”

    Roy Spencer published an excellent post on October 18 2009 titled “IPCC Crushes Scientific Objectivity, 91-0”.

    His post includes the statements

    “The most glaring example of this bias [that of the IPCC] has been the lack of interest on the IPCC’s part in figuring out to what extent climate change is simply the result of natural, internal cycles in the climate system….”

    “The IPCC is totally obsessed with external forcing, that is, energy imbalances imposed upon the climate system that are NOT the result of the natural, internal workings of the system…”

    “Admittedly, we really do not understand internal sources of climate change. Weather AND climate involves chaotic processes, most of which we may never understand, let alone predict. While chaos in weather is exhibited on time scales of days to weeks, chaotic changes in the ocean circulation could have time scales as long as hundreds of years, and we know that cloud formation – providing the Earth’s natural sun shade – is strongly influenced by the ocean….”

    “Thus, small changes in ocean circulation can lead to small changes in the Earth’s albedo (how much sunlight is reflected back to space), which in turn can lead to global warming or cooling. The IPCC’s view (which is never explicitly stated) that such changes in the climate system do not occur is little more than faith on their part….”

    The identification by Roy of a much more significant role for internal climate variability in altering even the global average radiative heating over multi-year and longer time scales is a major research finding. This hypothesis was not tested by the IPCC. Of course, none of the IPCC models can skillfully predict, even in retrospect, the multi-year variations that Roy has identified. Thus the IPCC simply chose to essentially ignore this issue.
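    To illustrate why unforced internal variability is so easy to misread as a forced trend, here is a toy sketch (emphatically not a climate model): a simple red-noise process with no external forcing at all still wanders into multi-year warming and cooling excursions.

        import random

        # Toy red-noise (AR(1)) process with no external forcing.
        random.seed(1)
        phi = 0.9    # persistence of the hypothetical internal state
        state = 0.0
        series = []
        for year in range(100):
            state = phi * state + random.gauss(0.0, 0.1)
            series.append(state)

        # Least-squares trend over the final 10 "years"
        last = series[-10:]
        n = len(last)
        xbar = (n - 1) / 2.0
        ybar = sum(last) / n
        slope = sum((i - xbar) * (y - ybar) for i, y in enumerate(last))
        slope /= sum((i - xbar) ** 2 for i in range(n))
        print(f"Apparent 10-year trend from unforced noise: {slope:+.3f} units/yr")

    Any such trend in the toy series arises purely from the internal persistence of the process, which is the kind of behavior Roy argues has not been adequately examined for the real climate system.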

    We presented this perspective of the climate system as a chaotic system in our papers; e.g. see

    Pielke, R.A., 1998: Climate prediction as an initial value problem. Bull. Amer. Meteor. Soc., 79, 2743-2746

    Rial, J., R.A. Pielke Sr., M. Beniston, M. Claussen, J. Canadell, P. Cox, H. Held, N. de Noblet-Ducoudre, R. Prinn, J. Reynolds, and J.D. Salas, 2004: Nonlinearities, feedbacks and critical thresholds within the Earth’s climate system. Climatic Change, 65, 11-38,

    but these also were ignored by the IPCC.

    We look forward to Roy’s seminal publication of “On the Diagnosis of Radiative Feedback in the Presence of Unknown Radiative Forcing”. Of course, it first needs to clear the hurdle posed by the inappropriate role of some reviewers, and even editors, as gatekeepers of the IPCC dogma.

    Comments Off on Comments On Roy Spencer’s Excellent Post “IPCC Crushes Scientific Objectivity, 91-0”

    Filed under Climate Change Forcings & Feedbacks