Monthly Archives: August 2008

A Coupled MM5-NOAH Land Surface Model-based Assessment of Sensitivity of Planetary Boundary Layer Variables to Anomalous Soil Moisture Conditions by Quintanar et al 2008

There is an excellent new paper that effectively illustrates the role of soil moisture in weather and climate.

It is

Quintanar, A., R. Mahmood, J. Loughrin, and N. C. Lovanh, 2008: A coupled MM5-Noah land surface model-based assessment of sensitivity of planetary boundary layer variables to anomalous soil moisture conditions. Physical Geography, 29, 54-78, doi:10.2747/0272-3646.29.1.54 (subscription required)

The abstract reads

“The sensitivity of the near-surface weather variables and small-scale convection to soil moisture for Western Kentucky was investigated with the aide of the MM5 Penn State/NCAR mesoscale atmospheric model for three different synoptic conditions in June 2006. The model was initialized with FNL reanalysis from NCEP containing soil moisture data calculated with the Noah land surface model. Dry and wet experiments were performed in order to find the influence of soil moisture specification on boundary layer atmospheric variables. Dry experiments showed less available atmospheric moisture (between 2 and 6 g kg-1) at near-surface levels during all synoptic events consistent with slightly deeper boundary layers, higher lifting condensation levels and a larger Bowen ratio. As expected, precipitation rates were generally smaller than those of the control simulation. However, during a moderately strong synoptic event in early June, the dry experiments displayed larger precipitation rates compared to the control experiment (up to 5 mm in 3 hr) as the soil volumetric fraction was decreased from 0.05 to 0.15 (m3 m-3) with respect to the control simulation. Precipitation rates in wet experiments were also modulated by characteristics of synoptic conditions. In early June, precipitation rates slightly were larger than the control run (from 0.2 mm 3 h-1 to 1.4 mm 3 h-1) while in the other periods precipitation was reduced significantly. Both dry and wet anomaly experiments experienced reduced precipitation for different reasons. It was found, lifting condensation level, CAPE and low Bowen ratio were not sensitive markers of changes in soil moisture. Equivalent potential temperature was a better indicator of precipitation changes among all experiments. The controlling factor in these responses was the soil moisture content forcing vertical velocities. Thermodynamic conditions such as local stability played a less substantial role in controlling the precipitation processes. It was found that the response of planetary boundary layer variables under a variety of soil moisture conditions can be modified due to degree of synoptic forcing. Weak-to-moderate forcing favored convection while strong synoptic forcing tended to suppress it under dry soil moisture conditions. Wetter soils did not produce a response in horizontal wind fields as large as under the drier soils.”

The evidence continues to accumulate on the major role of land surface processes in weather and climate.
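For readers unfamiliar with the boundary-layer diagnostics examined in the paper, the short sketch below shows how two of them, the Bowen ratio and an approximate lifting condensation level (LCL), are commonly computed from surface fluxes and near-surface temperature and dewpoint. The numbers are illustrative assumptions only, not values from Quintanar et al.

```python
# Illustrative calculation of two boundary-layer diagnostics discussed in
# Quintanar et al. (2008): the Bowen ratio and an approximate lifting
# condensation level (LCL). All numbers are hypothetical, not from the paper.

def bowen_ratio(sensible_heat_wm2: float, latent_heat_wm2: float) -> float:
    """Bowen ratio = sensible heat flux / latent heat flux."""
    return sensible_heat_wm2 / latent_heat_wm2

def lcl_height_m(temp_c: float, dewpoint_c: float) -> float:
    """Approximate LCL height (m) from the dewpoint depression, using the
    common ~125 m per degree C rule of thumb (Espy's formula)."""
    return 125.0 * (temp_c - dewpoint_c)

# Hypothetical "wet soil" vs "dry soil" surface conditions (W m-2 and deg C).
wet = {"SH": 100.0, "LH": 300.0, "T": 30.0, "Td": 22.0}
dry = {"SH": 250.0, "LH": 120.0, "T": 33.0, "Td": 16.0}

for name, s in [("wet", wet), ("dry", dry)]:
    print(f"{name}: Bowen ratio = {bowen_ratio(s['SH'], s['LH']):.2f}, "
          f"LCL ~ {lcl_height_m(s['T'], s['Td']):.0f} m")
```

With these assumed inputs, the drier surface yields a larger Bowen ratio and a higher LCL, which is the qualitative behavior described in the abstract.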

Comments Off on A Coupled MM5-NOAH Land Surface Model-based Assessment of Sensitivity of Planetary Boundary Layer Variables to Anomalous Soil Moisture Conditions by Quintanar et al 2008

Filed under Climate Change Forcings & Feedbacks

Detecting Urbanization Effects on Surface and Subsurface Thermal Environment – A Case Study of Osaka by Huang et al. 2008

We would like to thank Tobias Rothenberger for alerting us to this paper!

Huang, S., M. Taniguchi, M. Yamano, and C.H. Wang, 2008: Detecting urbanization effects on surface and subsurface thermal environment – A case study of Osaka. Sci. Total Environ., doi:10.1016/j.scitotenv.2008.04.019

The abstract reads

“Tremendous efforts have been devoted to improve our understanding of the anthropogenic effects on the atmospheric temperature change. In comparison, little has been done in the study of the human impacts on the subsurface thermal environment. The objective of this study is to analyze surface air temperature records and borehole subsurface temperature records for a better understanding of the urban heat island effects across the ground surface. The annual surface air temperature time series from six meteorological stations and six deep borehole temperature profiles of high qualities show that Osaka has been undergoing excess warming since late 19th century. The mean warming rate in Osaka surface air temperature is about 2.0 °C/100a over the period from 1883 to 2006, at least half of which can be attributed to the urban heat island effects. However, this surface air temperature warming is not as strong as the ground warming recorded in the subsurface temperature profiles. The surface temperature anomaly from the Osaka meteorological record can only account for part of the temperature anomaly recorded in the borehole temperature profiles. Surface air temperature is conventionally measured around 1.5 m above the ground; whereas borehole temperatures are measured from rocks in the subsurface. Heat conduction in the subsurface is much less efficient than the heat convection of the air above the ground surface. Therefore, the anthropogenic thermal impacts on the subsurface can be more persistent and profound than the impacts on the atmosphere. This study suggests that the surface air temperature records alone might underestimate the full extent of urban heat island effects on the subsurface environment.”

The paper includes the text

“The Osaka station shows a warming trend of 1.99 °C/100a over the 124 year period from 1883 to 2006, more than triple the 20th century global warming rate 0.6 °C/100a (IPCC, 2001). The anomalous urban warming is consistently recorded in the records from the nearby urban/suburban stations, of which the warming rates are 2.24 °C/100a for Kyoto, 1.45 °C/100a for Kobe, and 1.96 °C/100a for Nara, respectively. In comparison, the warming rates recorded in the two rural stations are more diverse. Over its 55-year life span, the Tsurugisan station showed a warming rate of 0.47 °C/100a which is slightly lower than the global average; whereas the 82-year Ibukiyama record showed a 1.60 °C/100a warming rate that is much greater than the global average.”

and

“The JMA (JMA, 2006) cautions that its regional estimate might be not entirely free of urbanization perturbation. Based on the records from the urban stations around Osaka and the JMA regional estimate, a conservative estimate of the urban heat island effects in Osaka would be in the range of 1–2 °C/ 100a. This estimate agrees in general with the early analysis of Kato (1996). Based on principal component score analysis of monthly mean temperature data for the period from 1920 to 1992 from 51 meteorological stations in Japan, Kato suggests that the maximum urban effects with a population of over 100,000 in 1993 were 1.0–2.5 °C/100a in Japan (Kato, 1996).”
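The warming rates quoted above are linear trends expressed per century (°C/100a). As a point of reference, the sketch below shows how such a rate can be estimated from an annual-mean temperature series with a simple least-squares fit; the series is synthetic, and this is not the procedure or the data used by Huang et al.

```python
import numpy as np

# Synthetic annual-mean temperature series (deg C), 1883-2006.
# These are made-up numbers for illustration, not the Osaka record.
rng = np.random.default_rng(0)
years = np.arange(1883, 2007)
temps = 15.0 + 0.020 * (years - years[0]) + rng.normal(0.0, 0.5, years.size)

# Least-squares linear trend in deg C per year, then per century (C/100a).
slope_per_year, intercept = np.polyfit(years, temps, deg=1)
print(f"Trend: {slope_per_year * 100.0:.2f} C/100a")
```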

What this paper communicates is that monitoring temperature at just one site and/or one level is an inadequate diagnostic of the role of urban areas in altering the heating that occurs due to this landscape change. This study also documents the variations in time of the urban and rural temperature changes, as well as the spatial heterogeneity of these trends. Claims by papers such as

Peterson, T. C., 2003: Assessment of urban versus rural in situ surface temperatures in the contiguous United States: No difference found. J. Climate, 16, 2941–2959,

and

Parker, D. E. (2004), Climate: Large-scale warming is not urban, Nature, 432, 290 (18 November 2004); doi:10.1038/432290a.

are inconsistent with the observations reported in the new Huang et al paper.

 

Comments Off on Detecting Urbanization Effects on Surface and Subsurface Thermal Environment – A Case Study of Osaka by Huang et al. 2008

Filed under Climate Change Metrics

An Integrated Approach To Environmental Assessments By Stohlgren et al.

Climate Science has been encouraging the adoption of a vulnerability perspective as a much more effective way to reduce risk from climate and other environmental and social threats than reliance on downscaling from multi-decadal global model predictions; e.g. see

Pielke, R.A. Sr., 2004: Discussion Forum: A broader perspective on climate change is needed. IGBP Newsletter, 59, 16-19.

Pielke Sr., R.A., J.O. Adegoke, T.N. Chase, C.H. Marshall, T. Matsui, and D. Niyogi, 2007: A new paradigm for assessing the role of agriculture in the climate system and in climate change. Agric. Forest Meteor., Special Issue, 132, 234-254.

There is an effective and very important publication by T. Stohlgren, C. Jarnevich and S. Kumar entitled “Forest legacies, climate change, altered disturbance regimes, invasive species and water” which provides more substance to this approach.

 The abstract reads

“Climate is a major driver of forest species distributions and the growth rate and structure of forests. Thus, climate change can potentially have significant effects on mountain forest hydrology, particularly the amount of water available downstream. However, many other factors influence forest biomass and mountain hydrology, and climate change effects cannot be viewed in isolation from previous land use histories (i.e. forest legacies), altered disturbance regimes (e.g. fire frequency, insect outbreaks, floods) and invasive species. Based on research from Colorado, United States, this article examines the many factors that must be considered in seeking to predict changes in water availability.”

Among the significant conclusions of this paper is that

“To develop a predictive science, water managers have a long way to go. Despite the general trends discussed above, site-specific predictions and models of stream flow have eluded scientists. For example, in 2002 precipitation in Denver, Colorado was below average, and newspapers at the time predicted continued drought and low runoff for the city’s water supply. However, subsequent years (through 2007) had much higher and even above-average runoff despite the regional trends of warmer temperatures (Denver Water, 2007). Unfortunately, scientists have yet to create accurate predictions of stream flow months to seasons in advance.”

This article effectively highlights the need for integrative assessments, and the lack of predictive skill that currently exists. It is an important, much-needed contribution to climate science, and to environmental science in general.

 

Comments Off on An Integrated Approach To Environmental Assessments By Stohlgren et al.

Filed under Vulnerability Paradigm

The Narrow Perspective On Climate Science Being Communicated To Physics Teachers

Students who are being taught climate science are being indoctrinated into a narrow viewpoint of the field [thanks to Ben Herman and Phil Krider for alerting us to this article]. The article below, published by the American Association of Physics Teachers, documents this bias.

Michael D. Mastrandrea and Stephen H. Schneider, 2008: Resource Letter GW-2: Global Warming. American Journal of Physics, Volume 76, Issue 7, pp. 608-614

This article is a “Resource Letter” whose mandate is described below: 

Resource Letters are guides for college and university physicists, astronomers, and other scientists to literature, websites, and other teaching aids. Each Resource Letter focuses on a particular topic and is intended to help teachers improve course content in a specific field of physics or to introduce nonspecialists to this field. The Resource Letters Editorial Board meets at the AAPT Winter Meeting to choose topics for which Resource Letters will be commissioned during the ensuing year. Items in the Resource Letter below are labeled with the letter E to indicate elementary level or material of general interest to persons seeking to become informed in the field, the letter I to indicate intermediate level or somewhat specialized material, or the letter A to indicate advanced or specialized material. No Resource Letter is meant to be exhaustive and complete; in time there may be more than one Resource Letter on a given subject. A complete list by field of all Resource Letters published to date is at the website http://www.kzoo.edu/ajp/letters.html. Suggestions for future Resource Letters, including those of high pedagogical value, are welcome and should be sent to Professor Roger H. Stuewer, Editor, AAPT Resource Letters, School of Physics and Astronomy, University of Minnesota, 116 Church Street SE, Minneapolis, MN 55455; e-mail: rstuewer@physics.umn.edu

with the abstract

This Resource Letter provides a guide to the literature on human-induced climate change, also known as global warming: Resource Letter GW-1: Global Warming, John W. Firor, Am. J. Phys. 62, 490–495 1994. After an introductory overview, journal articles, books, and websites are cited for the following topics: the greenhouse effect and radiative forcing, detection and attribution of human-induced climate change, carbon cycle feedbacks, paleoclimate, climate models and modeling uncertainties, projections of future climate change and climate impacts, and mitigation and adaptation policy options.

As an example of the exclusion of papers on climate issues, with respect to the 2003 western Europe heat wave, the authors include the papers

“Human contribution to the European heatwave of 2003,” P. A. Stott, D. A. Stone, and M. R. Allen, Nature (London) 432, 610–614 (2004)

“More intense, more frequent, and longer lasting heat waves in the 21st century,” G. A. Meehl and C. Tebaldi, Science 305, 994–997 (2004)

yet ignore papers that conflict with the conclusions of the above papers; e.g.

Chase, T.N., K. Wolter, R.A. Pielke Sr., and Ichtiaque Rasool, 2006: Was the 2003 European summer heat wave unusual in a global context? Geophys. Res. Lett., 33, L23709, doi:10.1029/2006GL027470

whose conclusions were independently confirmed in

Connolley, W.M., 2008: Comment on “Was the 2003 European summer heat wave unusual in a global context?” by Thomas N. Chase et al. Geophys. Res. Lett., 35, L02703, doi:10.1029/2007GL031171.

as discussed in

Chase, T.N., K. Wolter, R.A. Pielke Sr., and Ichtiaque Rasool, 2008: Reply to comment by W.M. Connolley on “Was the 2003 European summer heat wave unusual in a global context?” Geophys. Res. Lett., 35, L02704, doi:10.1029/2007GL031574.

Other climate topics were similarly presented selectively; for example, the neglect of the paper

Feddema, J.J., et al., 2005: The importance of land-cover change in simulating future climates. Science, 310, 1674-1678

where landscape change in this century was found in their model runs to be a first-order climate forcing. This viewpoint, however, was ignored.

The article does contain valuable references (they do include a citation to the 2005 National Research Council report, for example), but, except for that publication, it does not communicate the range of peer-reviewed papers and books that conflict with the authors’ viewpoint.

The article clearly misinforms the students and the physics teachers as to the actual diversity of issues with respect to the human role within the climate system, as well as the significance of natural climate forcings and feedbacks.  

Climate Science recommends that physics teachers read more widely than the list in the American Association of Physics Teachers resource list. 

While we, of course, also have our own biases, our book

Cotton, W.R. and R.A. Pielke, 2007: Human impacts on weather and climate, Cambridge University Press, 330 pp

does provide a more inclusive set of peer-reviewed papers and research summaries than is provided in the Mastrandrea and Schneider article.

Comments Off on The Narrow Perspective On Climate Science Being Communicated To Physics Teachers

Filed under Climate Science Misconceptions, Climate Science Reporting

Reply By Josh Willis To Climate Science Questions Of August 19 2008

Josh Willis has graciously answered the questions that were asked in the Climate Science weblog of August 19 2008.

 Question by Roger A. Pielke Sr.

“In terms of global warming, there is not a ‘bigger picture’ than the diagnosis and monitoring of ocean heat content changes.”

Reply by Josh Willis

I agree.  However, there is a 50 year record of ocean heat content that, despite its recent problems, still shows 50 years of ocean warming and implies a net positive radiative imbalance over that period.  Furthermore, the 100 year records of sea surface temperature:

http://www.ncdc.noaa.gov/oa/climate/research/anomalies/anomalies.html

and global sea level rise:

http://www.agu.org/pubs/crossref/2006/2005GL024826.shtml

strongly suggest that the net imbalance goes back even further.  Of course, I agree with you that other forcings besides CO2 are also important.  And I believe you that the relative strength of the different forcings is still uncertain.  But I think it would be very surprising to learn that CO2 doesn’t matter at all.  So, given its residence time and the likelihood that humans will continue to increase the amount in the atmosphere over the next century, it seems like a no-brainer that we are in for at least some additional anthropogenic warming from CO2 over the next 100 years.

Question by Roger A. Pielke Sr.

“A question for you is when will you be posting upper ocean heat content anomaly maps and long term trends in near real time? This would substantially elevate the scientific discussion of global warming.”

Reply by Josh Willis

Yes, I know I’m behind on this.  Issues with data biases continue to come up.  Although I don’t think there will be any problems as large as those with the XBT probes or the small group of bad Argo floats, researchers at CSIRO are continuing to make refinements to the pressure measurements returned by Argo floats.  And, if they discover problems that affect large numbers of floats in the same way, it could have an impact on estimates of globally-integrated ocean heat content.  Again, I think these problems will end up being small, but they are slowing progress on making real-time estimates of ocean heat content.

Question by Roger A. Pielke Sr.

“My question to you is what would have to occur in terms of the accumulation of upper ocean heat content for you to reject the IPCC climate model predictions of global warming? For instance, what accumulation in Joules must the upper ocean have for the 10 year period starting in 2004 for you to not reject the model predictions of this quantity?”

Reply by Josh Willis

I don’t think “when should we reject the model” is quite the right question to ask.  I think we should back up a bit and ask “what should we expect the model to get right?”  The problem is that we really don’t have a good idea of what the interannual to decadal variability in ocean heat content and radiative imbalance looks like.  That’s because until recently, the accuracy of ocean heat content estimates has not been adequate to resolve year-to-year fluctuations. Estimates like those from Levitus are good enough to say that the Earth has been out of radiative balance for much of the last 50 years, but they are not accurate enough to get the interannual fluctuations right, and the decadal variations were plagued by data problems.  As we fix these problems, we will get a better idea of what these interannual to decadal changes look like and we can begin to ask if the models are getting it right.

But suppose we find that there is internal ocean variability that causes year-to-year and decade-to-decade fluctuations in the radiative imbalance, and that the IPCC models don’t simulate this variability.  That doesn’t necessarily mean that the IPCC estimates of net heat content change after 50 to 100 years will be way off base, only that the envelope of possible future heat content changes is broader than expected because there is natural variability we didn’t know about.  In other words, we may find out that the radiative balance fluctuates naturally over periods of years to decades, but that the IPCC models are still getting the magnitude of the external forcing approximately right.  I’m not saying that’s definitely the case.  Only that we can’t rule it out solely on the basis of a few years of zero radiative imbalance.

I do agree with you that several years of zero or little radiative imbalance poses some very difficult questions for the modeling community. But I do not think it is grounds for outright rejection of all model results.

Response by Roger A. Pielke Sr.

With respect to anthropogenic CO2, I agree it is (and will continue to be) a warming forcing. However, it does not mean that the total radiative imbalance due to human activities will be positive over multi-decadal time scales, since a number of the forcings are negative. These negative effects include several of the aerosol forcings that we identified in the 2005 NRC report (e.g. see), as well as possible negative radiative effects from circulation changes (both from natural and human caused effects; e.g. see).

In terms of rejecting the global climate models as having forecast skill: if they cannot quantitatively simulate decadal global heat accumulation, they certainly should not be used for decadal regional climate predictions (which they are being used for; e.g. see).

The current lack of global warming over the last four years is too short a period to convincingly reject the models with respect to their prediction of the decadal radiative imbalance, but I agree with you that it certainly raises difficult questions for the modeling community. The tracking of ocean heat content in the coming years should be among the highest priorities in climate science.

Comments Off on Reply By Josh Willis To Climate Science Questions Of August 19 2008

Filed under Climate Change Forcings & Feedbacks, Climate Models, Guest Weblogs

Modeling the Effects of Historical Vegetation Change on Near-Surface Atmosphere in the Northern Chihuahuan Desert by Beltran-Przekurat et al.

The published version of our paper

Beltrán-Przekurat, A., R.A. Pielke Sr., D.P.C. Peters, K.A. Snyder, and A. Rango, 2008: Modelling the effects of historical vegetation change on near surface atmosphere in the northern Chihuahuan Desert. J. Arid Environments, 72:10, 1897-1910, doi:10.1016/j.jaridenv.2008.05.012

has appeared.

The abstract reads

“Our goal was to evaluate effects of broad-scale changes in vegetation from grasslands to shrublands over the past 150 years on near-surface atmosphere over the Jornada Experimental Range in the northern Chihuahuan Desert, using a regional climate model. Simulations were conducted using 1858 and 1998 vegetation maps, and data collected in the field. Overall, the vegetation shift led to small changes in sensible heat (SH) and an increase in latent heat (LH). The impacts of shrub encroachment depended on shrubland type: conversion from grass to mesquite cools the near-surface atmosphere and from grass to creosote bush warms it. Higher albedo of mesquite relative to grasses reduced available energy, which was dissipated mainly as LH due to the deeper root system in mesquite. In creosotebush-dominated areas, a decrease in albedo, an increase in roughness length and displacement height contributed to the SH increase and warmer temperatures. Sensitivity simulations showed that an increase in soil moisture content enhanced shrub LH and a reduction in mesquite cover enhanced the temperature differences. The observed shift in vegetation led to complex interactions between land and surface fluxes, demonstrating that vegetation itself is a weather and climate variable as it significantly influences temperature and humidity.”

Among the significant conclusions of this paper is that unless landscape changes are included in the assessment of surface and boundary-layer fluxes, the attribution of observed temperature changes will be misinterpreted.
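As a rough illustration of why the albedo and flux changes discussed in the abstract matter for near-surface temperature, the sketch below works through a simplified daytime surface energy balance (ground heat flux neglected). The albedo, radiation, and Bowen-ratio values are assumptions chosen for illustration, not results from Beltrán-Przekurat et al.

```python
# Simplified daytime surface energy balance (ground heat flux neglected):
#   available energy  A = (1 - albedo) * S_down + L_down - L_up
#   A is partitioned into sensible heat (SH) and latent heat (LH) via an
#   assumed Bowen ratio B = SH / LH, so LH = A / (1 + B).
# All numbers are illustrative assumptions, not values from the paper.

def partition(albedo: float, bowen: float,
              s_down: float = 800.0, l_down: float = 350.0,
              l_up: float = 450.0) -> tuple[float, float]:
    available = (1.0 - albedo) * s_down + l_down - l_up  # W m-2
    lh = available / (1.0 + bowen)
    sh = available - lh
    return sh, lh

# Hypothetical grass vs mesquite surfaces: a higher albedo reduces available
# energy; a deeper-rooted shrub is assumed to sustain a lower Bowen ratio.
for name, albedo, bowen in [("grass", 0.20, 1.0), ("mesquite", 0.30, 0.5)]:
    sh, lh = partition(albedo, bowen)
    print(f"{name}: SH = {sh:.0f} W m-2, LH = {lh:.0f} W m-2")
```

With these assumed numbers, the higher-albedo, lower-Bowen-ratio shrub surface dissipates less energy as sensible heat, consistent with the cooling reported for the grass-to-mesquite conversion.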

Comments Off on Modeling the Effects of Historical Vegetation Change on Near-Surface Atmosphere in the Northern Chihuahuan Desert by Beltran-Przekurat et al.

Filed under Climate Change Forcings & Feedbacks, Climate Models

A 1,000-year, Annually-Resolved Record of Hurricane Activity From Boston, Massachusetts by Besonen et al.

There is a new paper which uses paleo-data to extend the record of hurricane activity back before the historical record. The paper is

Besonen, M. R., R. S. Bradley, M. Mudelsee, M. B. Abbott, and P. Francus (2008), A 1,000-year, annually-resolved record of hurricane activity from Boston, Massachusetts, Geophys. Res. Lett., 35, L14705, doi:10.1029/2008GL033950.

with the abstract

“The annually-laminated (i.e., varved) sediment record from the Lower Mystic Lake (near Boston, MA), contains a series of anomalous graded beds deposited by strong flooding events that have affected the basin over the last millennium. From the historic portion of the record, 10 out of 11 of the most prominent graded beds correspond with years in which category 2-3 hurricanes are known to have struck the Boston area. Thus, we conclude that the graded beds represent deposition related to intense hurricane precipitation combined with wind-driven vegetation disturbance that exposes fresh, loose sediment. The hurricane signal shows strong, centennial-scale variations in frequency with a period of increased activity between the 12th-16th centuries, and decreased activity during the 11th and 17th-19th centuries. These frequency changes are consistent with other paleoclimate indicators from the tropical North Atlantic, in particular, sea surface temperature variations.”

The conclusion reads,

“The LML sedimentary record provides a well-controlled and annually-resolved record of category 2–3 hurricane activity in the Boston area over the last millennium. The hurricane signal shows centennial-scale variations in frequency with a period of increased activity between the 12th–16th centuries, and decreased activity during the 11th and 17th–19th centuries. We recognize that the LML record is a single point source record representative for the greater Boston area, and hurricanes that passed a few hundred km to the east or west may not have produced the very heavy rainfall amounts and vegetation disturbance in the lake watershed necessary to produce a strong signal within the LML sediments. Nevertheless, we also note that clear evidence of a secular change in hurricane frequency identified in the LML record is consistent with other lines of evidence that conditions for the development of hurricanes have changed on centennial timescales. Hence, it appears that hurricane activity was more frequent in the first half of the last millennium when tropical Atlantic SSTs were warmer and eastern equatorial Pacific SSTs were cooler than in subsequent centuries.”

This study adds to the clear documentation in the paleo-record that climate is not stationary, and never has been!

Climate Science discussed this recently in the weblog

The Value Of Paleoclimate Records In Assessing Vulnerability to Drought: A New Paper Meko et al 2008

This perspective is also presented in

Rial, J., R.A. Pielke Sr., M. Beniston, M. Claussen, J. Canadell, P. Cox, H. Held, N. de Noblet-Ducoudre, R. Prinn, J. Reynolds, and J.D. Salas, 2004: Nonlinearities, feedbacks and critical thresholds within the Earth’s climate system. Climatic Change, 65, 11-38.

Claims that we “need to stabilize the climate”  illustrate a complete lack of understanding of the actual large variations in time and space of the climate system from natural climate variability.

Comments Off on A 1,000-year, Annually-Resolved Record of Hurricane Activity From Boston, Massachusetts by Besonen et al.

Filed under Climate Change Forcings & Feedbacks, Definition of Climate

Circulation and Land Surface Influences on Convection in the Midwest U.S. “Corn Belt” during the Summers of 1999 and 2000 Parts I and II by Carleton et al. 2008

Two significant papers have appeared which provide additional demonstration of the role of the landscape as a first-order climate forcing. The papers are

Carleton, A.M., D.L. Arnold, D.J. Travis, S. Curran, and J.O. Adegoke, 2008: Synoptic Circulation and Land Surface Influences on Convection in the Midwest U.S. “Corn Belt” during the Summers of 1999 and 2000. Part I: Composite Synoptic Environments. J. Climate, 21, 3389–3415.

with the abstract

“In the Midwest U.S. Corn Belt, the 1999 and 2000 summer seasons (15 June–15 September) expressed contrasting spatial patterns and magnitudes of precipitation (1999: dry; 2000: normal to moist). Distinct from the numerical modeling approach often used in studies of land surface–climate interactions, a “synoptic climatological” (i.e., stratified composite) approach is applied to observation data (e.g., precipitation, radar, and atmospheric reanalyses) to determine the relative influences of “top-down” synoptic atmospheric circulation (Part I, this paper) and “bottom-up” land surface mesoscale conditions (Part II) on the predominantly convective precipitation variations. Because mesoscale modeling suggests that the free-atmosphere wind speed (“background wind”) regulates the land surface–atmosphere mesoscale interaction, each day’s spatial range of wind speed at 500 hPa [V(500)] over the Central Corn Belt (CCB) is classified into one of five categories ranging from “weak flow” to “jet maximum.” Deep convective activity (i.e., presence/absence and morphological signature type) is determined for each afternoon and early evening period from the Next Generation Weather Radar (NEXRAD) imagery. Frequencies of the resulting background wind–convection joint occurrence types for the 1999 and 2000 summer seasons are examined in the context of the statistics determined for summers in the longer period of 1996–2001, and also compose categories for which NCEP–NCAR reanalysis (NNR) fields are averaged to yield synoptic composite environments for the two study seasons. The latter composites are compared visually with high-resolution (spatial) composites of precipitation to help identify the influence of top-down climate controls.

The analysis confirms that reduced (increased) organization of radar-indicated deep convection tends to occur with weaker (stronger) background flow. The summers of 1999 and 2000 differ from one another in terms of background flow and convective activity, but more so with respect to the six-summer averages, indicating that a fuller explanation of the precipitation differences in the two summers must be sought in the analysis of additional synoptic meteorological variables. The composite synoptic conditions on convection (CV) days (no convection (NC) days) in 1999 and 2000 are generalized as follows: low pressure incoming from the west (high pressure or ridging), southerly (northerly) lower-tropospheric winds, positive (negative) anomalies of moisture in the lower troposphere, rising (sinking) air in the midtroposphere, and a location south of the upper-tropospheric jet maximum (absence of an upper-tropospheric jet or one located just south of the area). Features resembling the “northerly low-level jets” identified in previous studies for the Great Plains are present on some NC-day composites. On CV days the spatial synchronization of synoptic features implying baroclinity increases with increasing background wind speed. The CV and NC composites differ least on days of weaker flow, and there are small areas within the CCB having no obvious association between precipitation elevated amounts and synoptic circulation features favoring the upward motion of air. These spatial incongruities imply a contributory influence of “stationary” (i.e., climatic) land surface mesoscale processes in convective activity, which are examined in Part II.”

and

Carleton, A.M., D.J. Travis, J.O. Adegoke, D.L. Arnold, and S. Curran, 2008: Synoptic Circulation and Land Surface Influences on Convection in the Midwest U.S. “Corn Belt” during the Summers of 1999 and 2000. Part II: Role of Vegetation Boundaries. J. Climate, 21, 3617–3641.

with the abstract

“In Part I of this observational study inquiring into the relative influences of “top down” synoptic atmospheric conditions and “bottom up” land surface mesoscale conditions in deep convection for the humid lowlands of the Midwest U.S. Central Corn Belt (CCB), the composite atmospheric environments for afternoon and evening periods of convection (CV) versus no convection (NC) were determined for two recent summers (1999 and 2000) having contrasting precipitation patterns and amounts. A close spatial correspondence was noted between composite synoptic features representing baroclinity and upward vertical motion with the observed precipitation on CV days when the “background” (i.e., free atmosphere) wind speed exceeded approximately 10 m s-1 at 500 hPa (i.e., “stronger flow”). However, on CV days when wind speeds were <10 m s-1 (i.e., “weaker flow”), areas of increased precipitation can be associated with synoptic composites that are not so different from those for corresponding NC days. From these observations, the presence of a land surface mesoscale influence on deep convection and precipitation is inferred that is better expressed on weaker flow days. Climatically, a likely candidate for enhancing low-level moisture convergence to promote deep convection are the quasi-permanent vegetation boundaries (QPVBs) between the two major land use and land cover (LULC) types of crop and forest that characterize much of the CCB. Accordingly, in this paper the role of these boundaries on summer precipitation variations for the CCB is extracted in two complementary ways: 1) for contrasting flow day types in the summers 1999 and 2000, by determining the spatially and temporally aggregated land surface influence on deep convection from composites of thermodynamic variables [e.g., surface lifted index (SLI), level of free convection (LFC), and lifted condensation level (LCL)] that are obtained from mapped data of the 6-h NCEP-NCAR reanalyses (NNR), and 0000 UTC rawinsonde ascents; and 2) for summer seasons 1995-2001, from the statistical associations of satellite-retrieved LULC boundary attributes (i.e., length and width) and precipitation at high spatial resolutions.

For the 1999 and 2000 summers (item 1 above), thermodynamic composites determined for V(500) categories having minimal differences in synoptic meteorological fields on CV minus NC (CV – NC) days (i.e., weaker flow), show statistically significant increases in atmospheric moisture (e.g., greater precipitable water; lower LCL and LFC) and static instability [e.g., positive convective available potential energy (CAPE)] compared to NC days. Moreover, CV days for both weaker and stronger background flow have associated subregional-scale thermodynamic patterns indicating free convection at the earth’s surface, supported by a synoptic pattern of at least weakly upward motion of air in the midtroposphere in contrast to NC days.

The possibility that aerodynamic contrasts along QPVBs readily permit air to be lofted above the LFC when the lower atmosphere is moist, thereby assisting or enhancing deep convection on CV days, is supported by the multiyear analysis (item 2 above). In early summer when LULC boundaries are most evident, precipitation on weaker flow days is significantly greater within 20 km of boundaries than farther away, but there is no statistical difference on stronger flow days. Statistical relationships between boundary mean attributes and mean precipitation change sign between early summer (positive) and late summer (negative), in accord with shifts in the satellite-retrieved maximum radiances from forest to crop areas. These phenological changes appear related, primarily, to contrasting soil moisture and implied evapotranspiration differences. Incorporating LULC boundary locations and phenological status into reliable forecast fields of lower-to-midtropospheric humidity and wind speed should lead to improved short-term predictions of convective precipitation in the Corn Belt and also, potentially, better climate seasonal forecasts.”

Both of these papers should be added to the growing peer-reviewed documentation of how the human alteration of the landscape alters weather and climate patterns from what would occur otherwise.
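The “synoptic climatological (stratified composite)” approach used in these papers amounts to binning each day by its background-flow category and by the presence or absence of convection, and then averaging within each bin. The sketch below illustrates that bookkeeping with hypothetical daily data; the real analyses use NEXRAD imagery and NCEP–NCAR reanalysis fields, not the synthetic values generated here.

```python
import numpy as np
import pandas as pd

# Hypothetical daily records for one summer: a 500-hPa wind-speed category,
# a convection flag (CV/NC), and area-mean precipitation (mm). This is only
# a sketch of the compositing step, not the Carleton et al. datasets.
rng = np.random.default_rng(1)
n_days = 92
days = pd.DataFrame({
    "v500_category": rng.choice(
        ["weak flow", "moderate", "jet maximum"], size=n_days),
    "convection": rng.choice(["CV", "NC"], size=n_days),
    "precip_mm": rng.gamma(shape=1.2, scale=3.0, size=n_days),
})

# Composite (mean) precipitation for each background-flow / convection bin.
composite = (days
             .groupby(["v500_category", "convection"])["precip_mm"]
             .agg(["mean", "count"]))
print(composite)
```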


Comments Off on Circulation and Land Surface Influences on Convection in the Midwest U.S. “Corn Belt” during the Summers of 1999 and 2000 Parts I and II by Carleton et al. 2008

Filed under Climate Change Forcings & Feedbacks

Josh Willis’ Reply To My Weblog Of August 14 2008

Josh Willis graciously replied to the Climate Science weblog of August 14 2008 entitled “An Odd Weblog By Josh Willis” on the JPL weblog site.

Reply by Josh Willis

“Roger, thank you for the comment and the cross-link to my blog. I’m glad you enjoyed the gambling analogy, but I’m not gambling on the models or ignoring the Joules in the Great Ocean Heat Bank. I’m just looking at the bigger picture. Like the casino owners.

True, ocean heat content is the better metric for global warming, and the past few years of no warming are interesting. But tacked on to the 50-year-record of ocean warming before that, the last four years pretty much ARE just a wiggle. And yes, the estimates of global surface temperature do have errors and uncertainties. But the record of sea surface temperature also shows about 1 degree C of warming over the last 100 years. Remember, the oceans are 2/3 of the Earth’s surface and that record has fewer problems than the temperature data over land. Between the long-term records of ocean heat content, land and ocean surface warming, global sea level rise (about 20 cm over the last 100 years) and the increase in atmospheric CO2, you get a pretty simple, consistent picture of man-made warming. No models required.

Of course, the data are not perfect. Our understanding and our climate models are missing important pieces of the puzzle. But let’s not miss the forest for the trees. You don’t have to count every tree around before you realize you’re in the woods, just like a casino doesn’t have to win every bet to turn a profit.

Despite all the uncertainties, I think it is pretty clear that humans have already warmed the planet. And if we continue to add more CO2 to the atmosphere, we will warm it even further.”

Reply by Roger A. Pielke Sr.

Josh- Thank you for your reply. My response is given below:

1. In terms of global warming, there is not a “bigger picture” than the diagnosis and monitoring of ocean heat content changes. This has been shown effectively, for example, by

Barnett, T.P., D.W. Pierce, and R. Schnur, 2001: Detection of anthropogenic climate change in the world’s oceans. Science, 292, 270-274

and

Levitus, S., J.I. Antonov, J. Wang, T.L. Delworth, K.W. Dixon, and A.J. Broccoli, 2001: Anthropogenic warming of Earth’s climate system. Science, 292, 267-269.

Both studies found global warming using the change in upper ocean heat content, and both concluded that the ocean is by far the largest reservoir within the Earth’s climate system where this heat accumulates.

To emphasize this point, Barnett et al wrote in their conclusions that

“Perhaps the most important aspect of this work is that it establishes a strong constraint on the performance and veracity of anthropogenically forced climate models. For example, a climate model that reproduces the observed change in global air temperature over the last 50 years, but fails to quantitatively reproduce the observed changed in ocean heat content, cannot be correct. The PCM has a relatively low sensitivity (less anthropogenic impact on climate) and captures both the ocean- and air-temperature changes. It seems likely that models with higher sensitivity, those predicting the most drastic anthropogenic climate changes in the future, may have difficulty satisfying the ocean constraint. To our knowledge, the PCM is the only model currently able to do this and still accurately reflect the changes in surface air temperature over the last 50 years.”

Thus, there clearly are serious issues with using the global average surface temperature to diagnose global warming. Since the land portion of the data, particularly minimum temperatures, has made up much of the reported warming, the major issues that we have identified do significantly affect the global average trend. The warm biases we have found do not eliminate the increase in surface temperature, but they do significantly reduce the magnitude of the increase (e.g., see for just one example).

A question for you is when will you be posting upper ocean heat content anomaly maps and long term trends in near real time? This would substantially elevate the scientific discussion of global warming.

2. You conclude that the warming over the last 100 years is man-made and attribute this to “the increase in atmospheric CO2”. It is actually quite straightforward to show that no more than 30% of the human positive radiative forcing can be attributed to CO2 (see), and this assumes we understand the role of natural variability in the radiative forcing, which we clearly do not (e.g. see).

Models cannot, of course, by themselves be used to prove anything. They can only be used to generate hypotheses, which must then be tested against data.

As Jim Hansen stated

“The Willis et al. measured heat storage of 0.62 W/m2 refers to the decadal mean for the upper 750 m of the ocean. Our simulated 1993-2003 heat storage rate was 0.6 W/m2 in the upper 750 m of the ocean. The decadal mean planetary energy imbalance, 0.75 W/m2, includes heat storage in the deeper ocean and energy used to melt ice and warm the air and land. 0.85 W/m2 is the imbalance at the end of the decade.”

However, the imbalance since 2004, at least for the upper ocean, has been essentially zero.

My question to you is what would have to occur in terms of the accumulation of upper ocean heat content for you to reject the IPCC climate model predictions of global warming? For instance, what accumulation in Joules must the upper ocean have for the 10 year period starting in 2004 for you to not reject the model predictions of this quantity?
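To give this question a concrete scale, the sketch below converts a sustained planetary radiative imbalance into the heat accumulation it implies over a decade. The 0.6 W m-2 value is used only because it is close to the Hansen/Willis numbers quoted above, and the assumption that essentially all of the imbalance is stored in the upper ocean is a simplification.

```python
# Convert a sustained radiative imbalance (W m-2, averaged over the globe)
# into accumulated heat (Joules) over a given number of years.
# Assumes, for illustration, that essentially all of the imbalance is
# stored in the (upper) ocean.

EARTH_SURFACE_AREA_M2 = 5.1e14   # ~5.1 x 10^14 m^2
SECONDS_PER_YEAR = 3.156e7

def accumulated_heat_joules(imbalance_w_m2: float, years: float) -> float:
    return imbalance_w_m2 * EARTH_SURFACE_AREA_M2 * years * SECONDS_PER_YEAR

# Example: a 0.6 W m-2 imbalance sustained for the 10 years starting in 2004.
joules = accumulated_heat_joules(0.6, 10.0)
print(f"{joules:.2e} J")   # ~1 x 10^23 J
```

On this scale, a persistent imbalance of the magnitude quoted above implies roughly 10^23 Joules of upper-ocean heat accumulation over the 10 years starting in 2004, which is the kind of benchmark against which “essentially zero” accumulation can be judged.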

Comments Off on Josh Willis’ Reply To My Weblog Of August 14 2008

Filed under Climate Change Metrics

Comments On The Physics Today Article “Will Desperate Climates Call for Desperate Geoengineering Measures?” by Barbara Goss Levi

There is an article on geoengineering of the climate system;

Levi, B. G., 2008: Will desperate climates call for desperate geoengineering measures? Earth scientists ponder the wisdom of large-scale efforts to counter global warming. Physics Today, 61:8, 26-28.

Excerpts of the article read

“Concerned that Earth’s climate will change to an unacceptable degree or at an unacceptable rate before economies can shift significantly away from carbon-based energy sources, some scientists have begun casting their eyes in a previously shunned direction: geoengineering, or intentional and large-scale intervention to prevent or slow changes in the climate system.

Geoengineering sometimes refers strictly to techniques for increasing Earth’s albedo, or reflectivity, to lower its temperature and compensate for greenhouse warming. More broadly, the term can include efforts to accelerate some of the natural processes for removal of CO2 from the atmosphere. Many such ideas have been around for decades. In the past few years, however, the debate over their potential deployment has intensified.”

The figure below, reproduced from the article (unfortunately, the image quality is poor), presents different geoengineering approaches and their relative costs and risks.

 

 (Figure originally from Kurt House, Harvard University.)

Each of these proposals for geoengineering the climate, however, is fraught with very significant risk!

As discussed in an earlier Climate Science weblog, “What is the Importance to Climate of Heterogeneous Spatial Trends in Tropospheric Temperatures?”

“…..regional diabatic heating due to human activities represents a major, but under-recognized climate forcing, on long-term global weather patterns. Indeed, this heterogeneous climate forcing may be more important on the weather that we experience than changes in weather patterns associated with the more homogeneous spatial radiative forcing of the well-mixed greenhouse gases…”

The proposed options of geoengineering of climate illustrated in the figure would result in new human heterogeneous climate forcing. Since we do not even yet know the consequences of the existing inadvertent non-homogeneous human climate forcings such as land use/land cover change and aerosols (i.e. see), the introduction of deliberate heterogeneous human climate forcings is dangerous and irresponsible.

The claim in the Levi Physics Today article that geoengineering “intervention” [can] prevent or slow changes in the climate system is completely wrong. Geoengineering would itself cause changes in the climate system! The article’s almost exclusive focus on the role of the addition of carbon dioxide into the atmosphere is blind to the importance of altering the spatial pattern of climate forcing as a result of geoengineering.

Comments Off on Comments On The Physics Today Article “Will Desperate Climates Call for Desperate Geoengineering Measures?” by Barbara Goss Levi

Filed under Climate Change Forcings & Feedbacks, Climate Change Regulations, Climate Science Misconceptions, Uncategorized