Monthly Archives: March 2011

News Article “Aircraft Condensation Trails Criss-Crossing The Sky May Be Warming The Planet On A Normal Day More Than The Carbon Dioxide Emitted By All Planes Since The Wright Brothers’ First Flight In 1903, A Study Said On Tuesday”

There is a news article from Reuters on March 29, 2011 titled

Aircraft condensation trails criss-crossing the sky may be warming the planet on a normal day more than the carbon dioxide emitted by all planes since the Wright Brothers’ first flight in 1903, a study said on Tuesday.

The text begins with [highlight added]

“Aircraft condensation trails criss-crossing the sky may be warming the planet on a normal day more than the carbon dioxide emitted by all planes since the Wright Brothers’ first flight in 1903, a study said on Tuesday.”

Another excerpt reads

“The study, by experts at the DLR German Aerospace Center, estimated that the net warming effect for the Earth of contrails and related cirrus clouds at any one time was 31 milliwatts per square meter, more than the warming effect of accumulated CO2 from aviation of 28 milliwatts.”

If correct, this is a remarkable finding with respect to contrails as a climate forcing. It also shows that as we study the climate system, we find it is affected by a wider diversity of human climate forcings than concluded by the IPCC. The human effect on the climate system is not dominated by CO2 and a few other greenhouse gases.
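To put the two numbers in the excerpt on a common footing, here is a back-of-the-envelope calculation (not from the article; the Earth surface area and the conversion are my own assumptions) that turns each forcing into a globally integrated power:

# Rough comparison of the two forcings quoted above.
# The 31 and 28 mW/m^2 values are from the excerpt; the Earth surface area is my assumption.
EARTH_SURFACE_AREA_M2 = 5.1e14            # ~5.1 x 10^8 km^2

contrail_forcing_w_m2 = 31e-3             # contrails and related cirrus
aviation_co2_forcing_w_m2 = 28e-3         # accumulated CO2 from aviation

contrail_total_tw = contrail_forcing_w_m2 * EARTH_SURFACE_AREA_M2 / 1e12
co2_total_tw = aviation_co2_forcing_w_m2 * EARTH_SURFACE_AREA_M2 / 1e12

print(f"Contrails/cirrus: {contrail_total_tw:.1f} TW globally")    # ~15.8 TW
print(f"Aviation CO2:     {co2_total_tw:.1f} TW globally")         # ~14.3 TW
print(f"Ratio: {contrail_forcing_w_m2 / aviation_co2_forcing_w_m2:.2f}")  # ~1.11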

Comments Off

Filed under Climate Change Forcings & Feedbacks, Climate Science Reporting

News Article “Global Groundwater Depletion Leads To Sea Level Rise”

I was alerted to a news article (h/t to Emma Daniels) at Deltares titled

Global groundwater depletion leads to sea level rise

The news article starts with the text

“Large-scale abstraction of groundwater for irrigation of crops leads to a sea level rise of 0.8 mm per year, which is about one fourth of the current rate of sea level rise of 3.1 mm per year. This conclusion follows from a study by hydrologists from Utrecht University and the research institute Deltares. A paper about this study is currently in press with Geophysical Research Letters.”

The GRL article is

Wada, Y., L. P. H. van Beek, C. M. van Kempen, J. W. T. M. Reckman, S. Vasak, and M. F. P. Bierkens (2010), Global depletion of groundwater resources, Geophysical Research Letters, doi:10.1029/2010GL044571.

with the abstract

“In regions with frequent water stress and large aquifer systems groundwater is often used as an additional water source. If groundwater abstraction exceeds the natural groundwater recharge for extensive areas and long times, overexploitation or persistent groundwater depletion occurs. Here we provide a global overview of groundwater depletion (here defined as abstraction in excess of recharge) by assessing groundwater recharge with a global hydrological model and subtracting estimates of groundwater abstraction. Restricting our analysis to sub-humid to arid areas we estimate the total global groundwater depletion to have increased from 126 (±32) km³ a⁻¹ in 1960 to 283 (±40) km³ a⁻¹ in 2000. The latter equals 39 (±10)% of the global yearly groundwater abstraction, 2 (±0.6)% of the global yearly groundwater recharge, 0.8 (±0.1)% of the global yearly continental runoff and 0.4 (±0.06)% of the global yearly evaporation, contributing a considerable amount of 0.8 (±0.1) mm a⁻¹ to current sea-level rise.”
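A quick sanity check of the headline number (not from the paper; the ocean surface area is my own assumed value) shows how 283 km³ per year of depleted groundwater maps onto roughly 0.8 mm per year of sea level rise:

# Check that 283 km^3/yr of depleted groundwater corresponds to ~0.8 mm/yr of sea level rise.
# The depletion value is from the abstract; the ocean surface area is my assumption.
OCEAN_AREA_M2 = 3.61e14                   # ~3.61 x 10^8 km^2 of ocean surface

depletion_km3_per_yr = 283.0              # global groundwater depletion in 2000 (Wada et al., 2010)
depletion_m3_per_yr = depletion_km3_per_yr * 1e9

sea_level_rise_m_per_yr = depletion_m3_per_yr / OCEAN_AREA_M2
print(f"{sea_level_rise_m_per_yr * 1000:.2f} mm per year")   # ~0.78 mm/yr, consistent with the quoted 0.8 mm/yr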

Comments Off

Filed under Climate Change Metrics

Photographic Metadata For Surface Temperature Observing Sites In The Netherlands

As a result of a visit to the Netherlands, I was informed about an excellent site which provides photographic documentation of surface temperature observing sites in that country. Such metadata, of course, should be standard and readily available for all locations worldwide that are used in the construction of a global average surface temperature trend.

The site is (the information is in Dutch)

Stationslijst KNMI meteorologische stations

and includes the following observation sites

Nummer (number) – Stationsnaam (station name)

210 Valkenburg
235 De Kooy
240 Schiphol
242 Vlieland
249 Berkhout
251 Hoorn Terschelling
257 Wijk aan Zee
260 De Bilt
265 Soesterberg
267 Stavoren
269 Lelystad
270 Leeuwarden
273 Marknesse
275 Deelen
277 Lauwersoog
278 Heino
279 Hoogeveen
280 Eelde
283 Hupsel
286 Nieuw Beerta
290 Twenthe
310 Vlissingen
319 Westdorpe
323 Wilhelminadorp
330 Hoek van Holland
340 Woensdrecht
344 Rotterdam
348 Cabauw
350 Gilze-Rijen
356 Herwijnen
370 Eindhoven
375 Volkel
377 Ell
380 Maastricht
391 Arcen
My only suggestion is that they also need to present views looking out in the four cardinal directions from each site, as has been done by Anthony Watts and volunteers for http://www.surfacestations.org/.
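As an illustration of the kind of record that would make such metadata easy to standardize across networks, here is a minimal sketch; the field names and example values are my own assumptions, not taken from the KNMI site or from surfacestations.org:

# Minimal sketch of a per-station metadata record; field names and example values are hypothetical.
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class StationMetadata:
    number: int                       # KNMI station number
    name: str                         # station name
    latitude: float                   # decimal degrees
    longitude: float                  # decimal degrees
    site_photo_url: str               # overview photograph of the instrument enclosure
    cardinal_view_urls: Dict[str, str] = field(default_factory=dict)  # views looking out N, E, S, W

example = StationMetadata(
    number=260,
    name="De Bilt",
    latitude=52.10,                   # approximate, for illustration only
    longitude=5.18,
    site_photo_url="https://example.org/260/site.jpg",
    cardinal_view_urls={
        "N": "https://example.org/260/north.jpg",
        "E": "https://example.org/260/east.jpg",
        "S": "https://example.org/260/south.jpg",
        "W": "https://example.org/260/west.jpg",
    },
)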

Comments Off

Filed under Climate Change Metrics

Two Perspectives On Vulnerability – The IPCC View And The Bottom-Up Perspective

There are two approaches to assessing vulnerability, as presented in the figure below, taken from

O’Brien, K. L., S. Eriksen, L. Nygaard, and A. Schjolden, (2007), Why different interpretations of vulnerability matter in climate change discourses. Climate Policy 7 (1): 73–88

and

Füssel, H.-M. (2009), Review and quantitative analysis of indices of climate change exposure, adaptive capacity, sensitivity, and impacts. Development and Climate Change: Background Note to the World Development Report 2010, 35 pp. Available online at http://siteresources.worldbank.org/INTWDR2010/Resources/5287678-1255547194560/WDR2010_BG_Note_Fussel.pdf.

Figure caption: Framework depicting two interpretations of vulnerability to climate change: (a) outcome vulnerability and (b) contextual vulnerability. From: Füssel [2009] and O’Brien, K. L. et al. [2007].

We discuss these two types of vulnerability in our submitted paper

Pielke Sr., R.A., R. Wilby, D. Niyogi, F. Hossain, K. Dairaku, J. Adegoke, G. Kallos, T. Seastedt, and K. Suding, 2011: Dealing with complexity and extreme events using a bottom-up, resource-based vulnerability perspective. AGU Monograph on Complexity and Extreme Events in Geosciences, submitted.

Outcome vulnerability is the IPCC approach, in which “climate change” on the left side of the figure is typically obtained from IPCC-type multi-decadal global climate model projections. In contextual vulnerability, however, “climate variability and change” represents just one of the stressors.

We presented this perspective in 2004 in our book

Kabat, P., Claussen, M., Dirmeyer, P.A., J.H.C. Gash, L. Bravo de Guenni, M. Meybeck, R.A. Pielke Sr., C.J. Vorosmarty, R.W.A. Hutjes, and S. Lutkemeier, Editors, 2004: Vegetation, water, humans and the climate: A new perspective on an interactive system. Springer, Berlin, Global Change – The IGBP Series, 566 pp

where we used the term “scenario”, which corresponds to the “outcome vulnerability” approach, and the term “vulnerability”, which is the same as the “contextual vulnerability” of O’Brien et al. The summary table from Kabat et al. (2004) with respect to these two approaches is reproduced below (from Pielke and Guenni, 2004).

Assumed dominant stress
Scenario: Climate; recent greenhouse gas emissions to the atmosphere, ocean temperatures, aerosols, etc.
Vulnerability: Multiple stresses: climate (historical climate variability), land use and water use, altered disturbance regimes, invasive species, contaminants/pollutants, habitat loss, etc.

Usual timeframe of concern
Scenario: Long-term; doubled CO2, 30 to 100 years in the future.
Vulnerability: Short-term (0-30 years) and long-term research.

Usual scale of concern
Scenario: Global, sometimes regional. The local scale requires downscaling techniques; however, there is little evidence to suggest that present models provide realistic, accurate, or precise climate scenarios at local or regional scales.
Vulnerability: Local, regional, national, and global scales.

Major parameters of concern
Scenario: Spatially averaged changes in mean temperature and precipitation in fairly large grid cells, with some regional scenarios for drought.
Vulnerability: Potential extreme values in multiple parameters (temperature, precipitation, frost-free days), additional focus on extreme events (floods, fires, droughts, etc.), and measures of uncertainty.

Major limitations for developing coping strategies
Scenario: Focus on a single stress limits preparedness for other stresses. Results often show a gradual ramping of climate change, limiting preparedness for extreme events. Results represent only a limited subset of all likely future outcomes – usually unidirectional trends. Results are accepted by many scientists, the media, and the public as actual “predictions”. Lost in the translation of results is that all models of the distant future have unstated (presently unknowable) levels of certainty or probability.
Vulnerability: The approach requires detailed data on multiple stresses and their interactions at local, regional, national, and global scales – and many areas lack adequate information. Emphasis on short-term issues may limit preparedness for abrupt “threshold” changes in climate sometime in the short or long term. It requires preparedness for a far greater variation of possible futures, including abrupt changes in any direction – this is probably more realistic, yet difficult.

Table caption: Contrast between a top-down versus bottom-up assessment of the vulnerability of resources to climate variability and change [from Kabat, P., Claussen, M., Dirmeyer, P.A., J.H.C. Gash, L. Bravo de Guenni, M. Meybeck, R.A. Pielke Sr., C.J. Vorosmarty, R.W.A. Hutjes, and S. Lutkemeier, Editors, 2004: Vegetation, water, humans and the climate: A new perspective on an interactive system. Springer, Berlin, Global Change – The IGBP Series, 566 pp.]

The recent Nature article “Overstretching attribution” by Parmesan et al. (2011) clearly shows why the outcome vulnerability approach is inadequate and is actually misleading policymakers. As they state in the article’s headline,

“The biological world is responding rapidly to a changing climate, but attempts to attribute individual impacts to rising greenhouse gases are ill-advised.”

They conclude their article with

“We advocate striving for a richer understanding of interactions between multiple drivers of change through doing empirical research, emphasizing tractable questions and using model-based attribution approaches more as a tool for improving projections of biodiversity impacts than as an end in itself. To do so should clarify the dialogue between climate scientists, biologists and policymakers, and generate much-needed assessments of the current and future impacts of anthropogenic climate change on biota.”

The adoption of the contextual vulnerability approach means that, instead of the IPCC global climate models dominating the effort (as with the outcome vulnerability approach), the climate model projections are only one part of the assessment of threats to key social and environmental resources. This is going to be difficult for that community to accept, but it is required if we are to develop more inclusive and scientifically robust assessments of the risks we face in the coming decades.

Comments Off

Filed under Vulnerability Paradigm

Guest Post By Professor Kiminori Itoh On The Earthquake and Tsunami In Japan On March 11 2011

I am pleased to post an insightful communication from Kiminori Itoh regarding the March 11, 2011 earthquake and tsunami in Japan [see also Source Of Information On The Japan March 11 2011 Earthquake And Tsunami]

Guest Post by Professor Kiminori Itoh of Yokohama National University

[A report based on the e-mail from Kiminori Itoh to Roger Pielke Sr, March 23, 2011]

Thank you for your blog article of March 23 on the earthquake and tsunami in Japan. Your conclusion on vulnerability is quite right. Every official and policymaker in Japan should also know how to manage the vulnerability and resilience of our society after the present hour of trial.

As my residence is in Tokyo and my university (Yokohama National University) is in Yokohama (quite near and due south of Tokyo), we felt the big shaking of the earthquake to some extent. But it was much smaller than in places near the epicenter, such as Miyagi and Iwate prefectures. Aftershocks are still continuing even today, more than 10 days after the first big one (reported as M 9.0).

One of the worst results of the earthquake is, of course, the damage to the nuclear reactors located on the east coast of Fukushima prefecture, that is, the Fukushima Daiichi (i.e., No. 1) nuclear plant (hereafter, the FD1 plant).

Although the radioactive pollution from the damaged nuclear reactors is the heaviest problem, our problem right now in Tokyo and nearby areas is, as has been widely reported, the shortage of electricity due to the failure of the nuclear reactors. Of course, the collapsed nuclear reactors themselves are really a grand disaster which is still continuing, and no one knows when it will end or with what results. It reminds me of your PNAS paper (Peters et al., Cross-scale interactions, nonlinearities, and forecasting catastrophic events, Proc. Natl. Acad. Sci., 101, 15130–15135 (2004)) on how forest fires grow and become uncontrollable.

One of the biggest mistakes of the Tokyo Electric Power Company (TEPCO) is that they could not decide to abandon (I mean “decommission”) the troubled reactors at the first stage of the event caused by the tsunami. Because of several past accidents at their nuclear reactors at other sites (in 2005 and 2007, in particular) and the resultant financial problems, they did not want to close the FD1 plant, I guess.

Moreover, there were several “human errors” by TEPCO. One of them is that they did not suitably prepare for the risk of tsunamis. In fact, the designer of the oldest reactor of the FD1 plant reportedly did not consider tsunamis when he applied the design provided by GE to the plant. This is a totally different way of working from the bottom-up approach that is necessary for local management.

It may be true that the designer could not be blamed at that time, because there was little knowledge about past big tsunamis. But, according to the most recent information (Asahi Shimbun newspaper, March 25, 2011), TEPCO ignored findings about a big tsunami around 1000 years ago, findings that were obtained after planning for the FD1 plant had started. Thus, what they chose to do was to forget the suggestion of a big tsunami, or to convince themselves that there would be no big tsunamis during the lifetime of the reactors (or their own lifetimes).

This type of conviction has been called the “legend of nuclear power” in Japan; that is, the belief that there will be no earthquakes and/or tsunamis of a magnitude larger than anticipated by designers and companies. Frankly speaking, this is a psychologically interesting phenomenon worth investigating further in the field of risk management. One of the reasons behind it may be that the technology in question was an imported one.

They placed the oil tanks for the diesel pumps (which provide power in an emergency) right at the coast, between the ocean and the reactors, the weakest point in the plant against tsunamis. Hence, the tanks were washed away by the tsunami. I can imagine how they felt when they saw their oil tanks swept into the ocean, but this is their fault, I think. After learning there was a chance of a big tsunami “beyond imagination,” they could have changed the position of the tanks even though the design of the reactor itself was unchangeable. But for them, doing this would have meant that their conviction was wrong.

One of the specialists who explained the accident on TV claimed, “There is absolutely no possibility that the diesel pumps did not work due to the tsunami.” His claim was based on the fact that the pumps were placed in the basement of the reactor buildings and thus should have been shielded from tsunamis. But his claim was incorrect; the pumps were in fact damaged by the tsunami. This is because the tsunami produced several big waves, not one. During the first wave the pumps were safe, and they were moved from the basement to near the reactor to keep the coolant (water) circulating. Then the second big wave came and made the pumps ineffective.

Thus, even the specialist could not imagine what actually happened. He may be a specialist in nuclear plants, but not a specialist in tsunamis; he did not know that a tsunami can consist of several big waves separated by rather long intervals. Your point in the PNAS paper that understanding and controlling a disaster requires different scientific fields is thus, unfortunately, verified in the present event as well.

I now intend to propose a resilient electric power system for Japan. For instance, the shortage of electricity is, so to speak, a historical result of our power system; that is, a mixture of 50 Hz and 60 Hz. The former came from Germany and the latter from the U.S.A. in the Meiji era (around 120 years ago). If TEPCO (50 Hz) could buy electricity from other companies operating at 60 Hz, there would be no shortage, but it cannot because of the difference in frequency and the small capacity for frequency conversion (only about 1 GW, while 10 GW is required). This is incredibly stupid. What a non-resilient system!

There are more points I could write about, for instance the future of our energy system and its relation to the global warming issue, but I shall finish now. Consuming electricity by working for a long time is not good at the moment. Thank you for reading this.

 Sincerely,

Kiminori Itoh, Professor, Yokohama National University Tel. & Fax. +81-45-339-4354 E-mail: itohkimi@ynu.ac.jp

Comments Off

Filed under Guest Weblogs, Vulnerability Paradigm

Repost Of “Emissions of Black Carbon by Rockets – Can You Spot The Irony?” By Darin Toohey

There is a quite interesting post by Darin Toohey of the University of Colorado at Boulder titled

Emissions of Black Carbon by Rockets – can you spot the irony?

where it is clear that his message on the climate risks of these launches is not being heeded.

 Darin’s research on this topic is presented in the posts

Guest Weblog By Professor Darin W. Toohey Of The University Of Colorado At Boulder

“Limits On The Space Launch Market Related To Stratospheric Ozone Depletion” By Ross et al. 2009

News Article “Boulder scientists: Space tourism could contribute to climate change”

The current post by Darin is quite interesting, as is evident from the title he chose

Emissions of Black Carbon by Rockets – can you spot the irony?

“The Sierra Nevada Corporation (SNC) Space Systems Group announces the successful completion of two critical milestones for NASA’s Commercial Crew Development (CCDev) Program. On September 21, 2010, SNC completed three successful test firings of a single hybrid rocket motor in one day.” (SNC Press Release, Oct. 11, 2010)

“If the space tourism industry matures to the point that 1,000 hybrid-powered suborbital flights depart annually, those trips would deposit roughly 600 metric tons of soot into the stratosphere each year. Over decades of launches, those emissions would form a persistent and asymmetric cloud over the northern hemisphere that could impact atmospheric circulation and regional temperatures far more than the greenhouse gases released into the stratosphere by those same flights.”  (Scientific American,  October 23, 2010)

“[T]he rocket fuel that most companies plan to use — a solid synthetic rubber that is burned with a liquid nitrous oxide — creates far more soot than more standard rocket fuels that mix together hydrogen and oxygen.” (Boulder Daily Camera, October 25, 2010)

“‘This is at the absolute forefront of the way you can do scientific research today,’ said SwRI’s Alan Stern, who is leading the institute’s suborbital program. ‘(SwRI) stepped out front, and we didn’t just put our toe in the water… I expect that these scientists will be the first of many to fly to space commercially[.] As the scientific community realizes that they can put payloads and people into space at unprecedented low costs, the floodgates will open even wider.’” (Boulder Daily Camera, February 28, 2011)

“U.S. Sen. Mark Udall, D-Colo., toured Sierra Nevada and the new space vehicle Tuesday as part of his Colorado Workforce tour, then led a town hall-style discussion with about 150 employees. ‘I’m here to be an advocate for the Colorado aerospace industry,’ Udall said.” (Boulder Daily Camera, March 23, 2011)
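As a rough illustration of the scale described in the Scientific American excerpt above, the flight count and soot total come from that excerpt; everything else below is my own back-of-the-envelope arithmetic:

# Back-of-the-envelope soot budget for the suborbital launch scenario quoted above.
# The 1,000 flights/yr and 600 metric tons/yr figures are from the excerpt; nothing else is.
flights_per_year = 1000
soot_tonnes_per_year = 600.0

soot_kg_per_flight = soot_tonnes_per_year * 1000.0 / flights_per_year
print(f"Soot per flight: {soot_kg_per_flight:.0f} kg")           # ~600 kg of black carbon per launch

# Over a decade of launches at this rate, the cumulative stratospheric deposit would be:
print(f"Soot per decade: {soot_tonnes_per_year * 10:.0f} t")      # ~6,000 metric tons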

Comments Off

Filed under Climate Change Forcings & Feedbacks, Climate Science Misconceptions, Climate Science Reporting, Vulnerability Paradigm

Important Paper On Aerosol Forcing Of The Climate System – “Lifting Potential Of Solar‐Heated Aerosol Layers”

As we learn more about aerosols, their role within the climate system is increasingly recognized as very complex. There is an important new paper that further documents this complexity. It is

Boers, R., A. T. de Laat, D. C. Stein Zweers, and R. J. Dirksen (2010), Lifting potential of solar‐heated aerosol layers, Geophys. Res. Lett., 37, L24802, doi:10.1029/2010GL045171.

The abstract reads [highlight added]

“Absorption of shortwave solar radiation can potentially heat aerosol layers and create buoyancy that can result in the ascent of the aerosol layer over several kilometres altitude within 24–48 hours. Such heating is seasonally dependent with the summer pole region producing the largest lifting in solstice because aerosol layers are exposed to sunshine for close to 24 hours a day. The smaller the Angstrøm parameter, the larger the lifting potential. An important enhancement to lifting is the diffuse illumination of the base of the aerosol layer when it is located above highly reflective cloud layers. It is estimated that aerosol layers residing in the boundary layer with optical properties typical for biomass burning aerosols can reach the extra tropical tropopause within 3–4 day entirely due to diabatic heating as a result of solar shortwave absorption and cross‐latitudinal transport. It is hypothesized that this mechanism can explain the presence and persistence of upper tropospheric/lower stratospheric aerosol layers.”

The conclusion contains the text

“In this paper we explored the potential of lifting of aerosol layers to the upper troposphere/lower stratosphere by means of solar heating. To this end we decoupled the diabatic heating due to solar heating from the adiabatic cooling due to altitude gain. Knowing the latitudinal variation of the potential temperature lapse rate, the daily diabatic heating rate of aerosol layers is translated into a tropospheric altitude gain. For optical properties typical of biomass burning events aerosol layers are lifted by 3–5 km in the course of 3 days. With additional solar heating by diffuse radiation from reflected radiation at the base of the aerosol layer and cross-latitudinal flow these gains can be doubled so that aerosol layers can quickly reach tropopause levels. Neither the presence of pyro-convection nor the onset of complex synoptic scale dynamical systems is required to allow this mechanism to achieve lift. If the aerosol layer decays as a result of sedimentation and/or chemical changes this method of vertical transport will cease to be effective. Sedimentation will remove mostly the larger particles (>10 μm) within a matter of days because of the magnitude of their fall speeds. However for the remaining particles the residence time is much longer, and the higher the particles are lifted, the less the risk will be that they fall out due to precipitation, which is not so relevant at higher altitudes. Thus, solar absorption can be a powerful mechanism to transport aerosol layers towards higher tropospheric altitudes.”

The importance of this paper is that aerosols can be lofted into the stratosphere, where they will persist much longer than when they are confined to the troposphere, where they can be scavenged out by precipitation. As with volcanic emissions into the stratosphere, these aerosols can influence weather for weeks and longer. Persistent input of aerosols into the stratosphere by this mechanism can affect climate for years.
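A simple way to see the magnitude of the effect is to translate a diabatic heating rate into an altitude gain using the potential temperature lapse rate, as the authors describe in the conclusion quoted above. The sketch below is my own illustration with assumed values, not the paper's calculation:

# Translate a solar diabatic heating rate of an aerosol layer into an approximate altitude gain,
# following the reasoning quoted above. The numerical values are illustrative assumptions only.
heating_rate_K_per_day = 5.0       # assumed net diabatic heating of the layer (K/day)
dtheta_dz_K_per_km = 3.5           # assumed tropospheric potential temperature lapse rate (K/km)
days = 3

# A gain in potential temperature translates into an altitude gain along the ambient theta profile.
delta_theta = heating_rate_K_per_day * days                 # K
altitude_gain_km = delta_theta / dtheta_dz_K_per_km         # km

print(f"Approximate lift: {altitude_gain_km:.1f} km in {days} days")   # ~4.3 km, comparable to the quoted 3-5 km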

Comments Off

Filed under Climate Change Forcings & Feedbacks, Research Papers