AGW Observer

Observations of anthropogenic global warming

New research from last week 49/2010

Posted by Ari Jokimäki on December 13, 2010

Here is the new research published last week. I’m not including everything that was published, just some papers that caught my attention. Those who follow my Facebook page (and/or Twitter) have already seen most of these, as I post them there as soon as they are published. Here, I’ll just put them out in one batch. Sometimes I might also point out some other news, but new research will be the focus here. Here’s the archive for the news of previous weeks. By the way, if this sort of thing interests you, be sure to check out A Few Things Illconsidered; they have a weekly posting with lots of links to new research and other climate-related news. Planet 3.0 also reports new research.

Published last week:

On the two degree limit

Three views of two degrees – Jaeger & Jaeger (2010) “Limiting global warming to 2°C above pre-industrial global mean temperature has become a widely endorsed goal for climate policy. It has also been severely criticized. We show how the limit emerged out of a marginal remark in an early paper about climate policy and distinguish three possible views of it. The catastrophe view sees it as the threshold separating a domain of safety from a domain of catastrophe. The cost-benefit view sees it as a strategy to optimize the relation between the costs and benefits of climate policy. The focal point view sees it as a solution to a complex coordination problem. We argue that the focal point view is the most appropriate. It leads to an emphasis on implementing effective steps toward a near-zero emissions economy, without panicking in the face of a possible temporary overshooting. After several decades of practical experiences, the focal point may or may not be redefined on the basis of knowledge gathered thanks to these experiences.” Carlo C. Jaeger and Julia Jaeger, Regional Environmental Change
DOI: 10.1007/s10113-010-0190-9. [full text]

Human activity shows in European temperatures in all seasons

Human activity and anomalously warm seasons in Europe – Christidis et al. (2010) “Seasonal mean temperatures averaged over the European region have warmed at a rate of 0.35–0.52 K/decade since 1980. The last decade has seen record-breaking seasonal temperatures in Europe including the summer of 2003 and the spring, autumn, and winter of 2007. Previous studies have established that European summer warming since the early twentieth century can be attributed to the effects of human influence. The attribution analysis described here employs temperature data from observations and experiments with two climate models and uses optimal fingerprinting to partition the climate response between its anthropogenic and natural components. These responses are subsequently combined with estimates of unforced climate variability to construct distributions of the annual values of seasonal mean temperatures with and without the effect of human activity. We find that in all seasons, anthropogenic forcings have shifted the temperature distributions towards higher values. We compute the associated change in the likelihood of having seasons whose temperatures exceed a pre-specified threshold. We first set the threshold equal to the seasonal temperature observed in a particular year to assess the effect of anthropogenic influences in past seasons. We find that in the last decade (1999–2008) it is extremely likely (probability greater than 95%) that the probability has more than doubled under the influence of human activity in spring and autumn, while for summer it is extremely likely that the probability has at least quadrupled. One of the two models employed in the analysis indicates it is extremely likely the probability has more than doubled in winter too. We also compute the change in probability over a range of temperature thresholds which enables us to provide updates on the likely change in probability attributable to human influence as soon as observations become available. Such near-real time information could be very useful for adaptation planning.” Nikolaos Christidis, Peter A. Stott, Gareth S. Jones, Hideo Shiogama, Toru Nozawa, Jürg Luterbacher, International Journal of Climatology, 2010, DOI: 10.1002/joc.2262.
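For readers wondering what lies behind statements like “the probability has more than doubled”, the core quantity is a risk ratio: the probability of exceeding a temperature threshold under the climate with all forcings divided by the probability under natural forcings alone. Below is a minimal Python sketch of that idea only; it is not the paper’s optimal fingerprinting analysis, and the Gaussian parameters and threshold are made-up placeholders.

```python
from scipy.stats import norm

# Hypothetical seasonal-mean temperature distributions (deg C) for one European
# season: "natural forcings only" vs "natural + anthropogenic" forcings.
# These Gaussian parameters are illustrative placeholders, not values from the paper.
mu_nat, sigma_nat = 16.0, 0.8   # climate without human influence
mu_all, sigma_all = 16.9, 0.8   # climate including human influence

threshold = 17.5                # e.g. an observed warm seasonal-mean temperature

# Probability of exceeding the threshold under each climate.
p_nat = norm.sf(threshold, loc=mu_nat, scale=sigma_nat)
p_all = norm.sf(threshold, loc=mu_all, scale=sigma_all)

# Risk ratio: how much more likely the warm season becomes with human influence.
print(f"P(natural) = {p_nat:.3f}, P(all forcings) = {p_all:.3f}, "
      f"risk ratio = {p_all / p_nat:.1f}")
```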

Black carbon in Arctic snow probably has not contributed to rapid sea ice decline

Light-absorbing impurities in Arctic snow – Doherty et al. (2010) “Absorption of radiation by ice is extremely weak at visible and near-ultraviolet wavelengths, so small amounts of light-absorbing impurities in snow can dominate the absorption of solar radiation at these wavelengths, reducing the albedo relative to that of pure snow, contributing to the surface energy budget and leading to earlier snowmelt. In this study Arctic snow is surveyed for its content of light-absorbing impurities, expanding and updating the 1983–1984 survey of Clarke and Noone. Samples were collected in Alaska, Canada, Greenland, Svalbard, Norway, Russia, and the Arctic Ocean during 1998 and 2005–2009, on tundra, glaciers, ice caps, sea ice, frozen lakes, and in boreal forests. Snow was collected mostly in spring, when the entire winter snowpack is accessible for sampling. Sampling was carried out in summer on the Greenland Ice Sheet and on the Arctic Ocean, of melting glacier snow and sea ice as well as cold snow. About 1200 snow samples have been analyzed for this study. The snow is melted and filtered; the filters are analyzed in a specially designed spectrophotometer system to infer the concentration of black carbon (BC), the fraction of absorption due to non-BC light-absorbing constituents and the absorption Ångstrom exponent of all particles. This is done using BC calibration standards having a mass absorption efficiency of 6.0 m2 g−1 at 550 nm and by making an assumption that the absorption Angstrom exponent for BC is 1.0 and for non-BC light-absorbing aerosol is 5.0. The reduction of snow albedo is primarily due to BC, but other impurities, principally brown (organic) carbon, are typically responsible for ~40% of the visible and ultraviolet absorption. The meltwater from selected snow samples was saved for chemical analysis to identify sources of the impurities. Median BC amounts in surface snow are as follows (nanograms of carbon per gram of snow): Greenland 3, Arctic Ocean snow 7, melting sea ice 8, Arctic Canada 8, subarctic Canada 14, Svalbard 13, Northern Norway 21, western Arctic Russia 27, northeastern Siberia 34. Concentrations are more variable in the European Arctic than in Arctic Canada or the Arctic Ocean, probably because of the proximity to BC sources. Individual samples of falling snow were collected on Svalbard, documenting the springtime decline of BC from March through May. Absorption Ångstrom exponents are 1.5–1.7 in Norway, Svalbard, and western Russia, 2.1–2.3 elsewhere in the Arctic, and 2.5 in Greenland. Correspondingly, the estimated contribution to absorption by non-BC constituents in these regions is ~25%, 40%, and 50% respectively. It has been hypothesized that when the snow surface layer melts some of the BC is left at the top of the snowpack rather than being carried away in meltwater. This process was observed in a few locations and would cause a positive feedback on snowmelt. The BC content of the Arctic atmosphere has declined markedly since 1989, according to the continuous measurements of near-surface air at Alert (Canada), Barrow (Alaska), and Ny-Ålesund (Svalbard). Correspondingly, the new BC concentrations for Arctic snow are somewhat lower than those reported by Clarke and Noone for 1983–1984, but because of methodological differences it is not clear that the differences are significant. 
Nevertheless, the BC content of Arctic snow appears to be no higher now than in 1984, so it is doubtful that BC in Arctic snow has contributed to the rapid decline of Arctic sea ice in recent years.” Doherty, S. J., Warren, S. G., Grenfell, T. C., Clarke, A. D., and Brandt, R. E.: Light-absorbing impurities in Arctic snow, Atmos. Chem. Phys., 10, 11647-11680, doi:10.5194/acp-10-11647-2010, 2010. [full text]
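The abstract spells out the assumptions behind the black carbon estimates: a mass absorption efficiency of 6.0 m2 g−1 at 550 nm and absorption Ångström exponents of 1.0 for BC and 5.0 for non-BC absorbers. Two assumed spectral slopes and absorption measured at two wavelengths give two equations in two unknowns, so the absorption can be split between BC and non-BC and the BC part converted to a concentration in snow. The sketch below illustrates that calculation with made-up measurement values and a hypothetical filter geometry; it is not the authors’ spectrophotometer processing.

```python
import numpy as np

# Assumed constants taken from the abstract's description of the method.
MAC_BC = 6.0          # BC mass absorption efficiency at 550 nm (m^2 per g C)
AAE_BC = 1.0          # absorption Angstrom exponent assumed for BC
AAE_NONBC = 5.0       # absorption Angstrom exponent assumed for non-BC aerosol
LAMBDA_REF = 550.0    # reference wavelength (nm)

def apportion_bc(tau1, tau2, lam1=450.0, lam2=600.0):
    """Split absorption optical depth measured at lam1 and lam2 (nm) into
    BC and non-BC contributions at the reference wavelength, assuming
    tau(lam) = tau_bc*(lam/LAMBDA_REF)**-AAE_BC + tau_nonbc*(lam/LAMBDA_REF)**-AAE_NONBC.
    """
    A = np.array([
        [(lam1 / LAMBDA_REF) ** -AAE_BC, (lam1 / LAMBDA_REF) ** -AAE_NONBC],
        [(lam2 / LAMBDA_REF) ** -AAE_BC, (lam2 / LAMBDA_REF) ** -AAE_NONBC],
    ])
    tau_bc, tau_nonbc = np.linalg.solve(A, np.array([tau1, tau2]))
    return tau_bc, tau_nonbc

# Hypothetical filter measurement: absorption optical depths at 450 and 600 nm.
tau_bc, tau_nonbc = apportion_bc(tau1=0.30, tau2=0.135)

# Convert BC absorption to a snow concentration (ng of C per g of snow),
# assuming a hypothetical exposed filter area and mass of meltwater filtered.
filter_area_m2 = 3.0e-4        # exposed filter area (assumed)
meltwater_g = 500.0            # mass of melted snow passed through the filter (assumed)

bc_loading_g_per_m2 = tau_bc / MAC_BC                       # g C per m^2 of filter
bc_ng_per_g = bc_loading_g_per_m2 * filter_area_m2 / meltwater_g * 1e9

non_bc_fraction = tau_nonbc / (tau_bc + tau_nonbc)
print(f"BC in snow ~ {bc_ng_per_g:.0f} ng/g, "
      f"non-BC share of absorption at 550 nm ~ {non_bc_fraction:.0%}")
```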

On anthropogenic changes in upper ocean temperature

Can oceanic reanalyses be used to assess recent anthropogenic changes and low-frequency internal variability of upper ocean temperature? – Corre et al. (2010) “A multivariate analysis of the upper ocean thermal structure is used to examine the recent long-term changes and decadal variability in the upper ocean heat content as represented by model-based ocean reanalyses and a model-independent objective analysis. The three variables used are the mean temperature above the 14°C isotherm, its depth and a fixed depth mean temperature (250 m mean temperature). The mean temperature above the 14°C isotherm is a convenient, albeit simple, way to isolate thermodynamical changes by filtering out dynamical changes related to thermocline vertical displacements. The global upper ocean observations and reanalyses exhibit very similar warming trends (0.045°C per decade) over the period 1965–2005, superimposed with marked decadal variability in the 1970s and 1980s. The spatial patterns of the regression between indices (representative of anthropogenic changes and known modes of internal decadal variability), and the three variables associated with the ocean heat content are used as fingerprint to separate out the different contributions. The choice of variables provides information about the local heat absorption, vertical distribution and horizontal redistribution of heat, this latter being suggestive of changes in ocean circulation. The discrepancy between the objective analysis and the reanalyses, as well as the spread among the different reanalyses, are used as a simple estimate of ocean state uncertainties. Two robust findings result from this analysis: (1) the signature of anthropogenic changes is qualitatively different from those of the internal decadal variability associated to the Pacific Interdecadal Oscillation and the Atlantic Meridional Oscillation, and (2) the anthropogenic changes in ocean heat content do not only consist of local heat absorption, but are likely related with changes in the ocean circulation, with a clear shallowing of the tropical thermocline in the Pacific and Indian oceans.” L. Corre, L. Terray, M. Balmaseda, A. Ribes and A. Weaver, Climate Dynamics, DOI: 10.1007/s00382-010-0950-8.
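The three variables used in the analysis are straightforward to compute from a temperature profile: the depth of the 14°C isotherm, the mean temperature of the water above it, and the 0–250 m mean temperature. The sketch below computes them for a single synthetic profile; the profile is invented for illustration and none of the paper’s reanalysis data are involved.

```python
import numpy as np

# Hypothetical upper-ocean temperature profile (not from the paper's data):
# depth in metres on a uniform 10 m grid, temperature in degrees C.
depth = np.arange(0.0, 501.0, 10.0)
temp = 10.0 + 18.0 * np.exp(-depth / 150.0)   # warm surface, cooling with depth

def isotherm_depth(depth, temp, t_iso=14.0):
    """Depth at which the profile first crosses t_iso (linear interpolation)."""
    below = np.where(temp < t_iso)[0][0]       # first level colder than t_iso
    d0, d1 = depth[below - 1], depth[below]
    t0, t1 = temp[below - 1], temp[below]
    return d0 + (t_iso - t0) * (d1 - d0) / (t1 - t0)

def layer_mean_temp(depth, temp, z_max):
    """Mean temperature between the surface and z_max (uniform grid assumed)."""
    return temp[depth <= z_max].mean()

d14 = isotherm_depth(depth, temp)                 # depth of the 14 C isotherm
t_above_14 = layer_mean_temp(depth, temp, d14)    # mean temperature above it
t_250 = layer_mean_temp(depth, temp, 250.0)       # 0-250 m mean temperature

print(f"14C isotherm depth ~ {d14:.0f} m, mean T above it ~ {t_above_14:.2f} C, "
      f"0-250 m mean T ~ {t_250:.2f} C")
```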

Flood exposure could grow to 3x the people and 10x the assets by the 2070s

A global ranking of port cities with high exposure to climate extremes – Hanson et al. (2010) “This paper presents a first estimate of the exposure of the world’s large port cities (population exceeding one million inhabitants in 2005) to coastal flooding due to sea-level rise and storm surge now and in the 2070s, taking into account scenarios of socio-economic and climate changes. The analysis suggests that about 40 million people (0.6% of the global population or roughly 1 in 10 of the total port city population in the cities considered) are currently exposed to a 1 in 100 year coastal flood event. For assets, the total value exposed in 2005 across all cities considered is estimated to be US$3,000 billion; corresponding to around 5% of global GDP in 2005 (both measured in international USD) with USA, Japan and the Netherlands being the countries with the highest values. By the 2070s, total population exposed could grow more than threefold due to the combined effects of sea-level rise, subsidence, population growth and urbanisation with asset exposure increasing to more than ten times current levels or approximately 9% of projected global GDP in this period. On the global-scale, population growth, socio-economic growth and urbanization are the most important drivers of the overall increase in exposure particularly in developing countries, as low-lying areas are urbanized. Climate change and subsidence can significantly exacerbate this increase in exposure. Exposure is concentrated in a few cities: collectively Asia dominates population exposure now and in the future and also dominates asset exposure by the 2070s. Importantly, even if the environmental or socio-economic changes were smaller than assumed here the underlying trends would remain. This research shows the high potential benefits from risk-reduction planning and policies at the city scale to address the issues raised by the possible growth in exposure.” Susan Hanson, Robert Nicholls, N. Ranger, S. Hallegatte, J. Corfee-Morlot, C. Herweijer and J. Chateau, Climatic Change, DOI: 10.1007/s10584-010-9977-4.
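As a rough back-of-the-envelope reading of the headline numbers, the snippet below simply scales the 2005 exposure figures quoted in the abstract by its round “more than threefold” and “more than ten times” factors; these are not the paper’s scenario-specific projections.

```python
# 2005 baseline exposure from the abstract.
people_2005 = 40e6      # people exposed to a 1-in-100-year coastal flood
assets_2005 = 3000e9    # exposed assets in 2005 US$ (about 5% of 2005 global GDP)

# Abstract's rough growth factors for the 2070s (population >3x, assets >10x).
pop_factor, asset_factor = 3.0, 10.0

print(f"People exposed by the 2070s: > {people_2005 * pop_factor / 1e6:.0f} million")
print(f"Assets exposed by the 2070s: > US$ {assets_2005 * asset_factor / 1e12:.0f} trillion")
```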

Vegetation might cause stronger negative feedback than previously thought

Quantifying the negative feedback of vegetation to greenhouse warming: A modeling approach – Bounoua et al. (2010) “Several climate models indicate that in a 2 × CO2 environment, temperature and precipitation would increase and runoff would increase faster than precipitation. These models, however, did not allow the vegetation to increase its leaf density as a response to the physiological effects of increased CO2 and consequent changes in climate. Other assessments included these interactions but did not account for the vegetation down-regulation to reduce plant’s photosynthetic activity and as such resulted in a weak vegetation negative response. When we combine these interactions in climate simulations with 2 × CO2, the associated increase in precipitation contributes primarily to increase evapotranspiration rather than surface runoff, consistent with observations, and results in an additional cooling effect not fully accounted for in previous simulations with elevated CO2. By accelerating the water cycle, this feedback slows but does not alleviate the projected warming, reducing the land surface warming by 0.6°C. Compared to previous studies, these results imply that long term negative feedback from CO2-induced increases in vegetation density could reduce temperature following a stabilization of CO2 concentration.” Bounoua, L., F. G. Hall, P. J. Sellers, A. Kumar, G. J. Collatz, C. J. Tucker, and M. L. Imhoff (2010), Geophys. Res. Lett., 37, L23701, doi:10.1029/2010GL045338.
