Interdecadal variability/long-term changes in global precipitation patterns during the past three decades: global warming and/or Pacific decadal variability?

2012 ◽  
Vol 40 (11-12) ◽  
pp. 3009-3022 ◽  
Author(s):  
Guojun Gu ◽  
Robert F. Adler

2009 ◽  
Vol 22 (11) ◽  
pp. 3156-3166 ◽  
Author(s):  
Beate G. Liepert ◽  
Michael Previdi

Abstract. Recently analyzed satellite-derived global precipitation datasets from 1987 to 2006 indicate an increase in global-mean precipitation of 1.1%–1.4% decade⁻¹. This trend corresponds to a hydrological sensitivity (HS) of 7% K⁻¹ of global warming, which is close to the Clausius–Clapeyron (CC) rate expected from the increase in saturation water vapor pressure with temperature. Analysis of two available global ocean evaporation datasets confirms this observed intensification of the atmospheric water cycle. The observed hydrological sensitivity over the past 20-yr period is higher by a factor of 5 than the average HS of 1.4% K⁻¹ simulated in state-of-the-art coupled atmosphere–ocean climate models for the twentieth and twenty-first centuries. However, the analysis shows that the interdecadal variability in HS in the models is high, in particular in the twentieth-century runs, which are forced by both increasing greenhouse gas (GHG) and tropospheric aerosol concentrations. About 12% of the 20-yr time intervals of eight twentieth-century climate simulations from the third phase of the Coupled Model Intercomparison Project (CMIP3) have an HS magnitude greater than the CC rate of 6.5% K⁻¹. The analysis further indicates different HS characteristics for GHG and tropospheric aerosol forcing agents. Aerosol-forced HS is a factor of 2 greater, on average, and its interdecadal variability is significantly larger, with about 23% of the 20-yr sensitivities being above the CC rate. By thermodynamically constraining global precipitation changes, it is shown that such changes are linearly related to the difference between the radiative imbalance at the top of the atmosphere (TOA) and that at the surface (i.e., the atmospheric radiative energy imbalance). The strength of this relationship is controlled by the modified Bowen ratio (here, global sensible heat flux change divided by latent heat flux change). Hydrological sensitivity to aerosols is greater than the sensitivity to GHG because the former have a stronger effect on the shortwave transmissivity of the atmosphere, and thus produce a larger change in the atmospheric radiative energy imbalance. It is found that the observed global precipitation increase of 13 mm yr⁻¹ decade⁻¹ from 1987 to 2006 would require a trend in the atmospheric radiative imbalance (TOA minus surface) of 0.7 W m⁻² decade⁻¹. The recovery from the El Chichón and Mount Pinatubo volcanic aerosol injections in 1982 and 1991, the satellite-observed reductions in cloudiness during the phase of increasing ENSO events in the 1990s, and presumably the observed reduction of anthropogenic aerosol concentrations could have caused such a radiative imbalance trend over the past 20 years. Observational evidence, however, is currently inconclusive, and it will require more detailed investigations and longer satellite time series to answer this question.
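The quoted numbers can be cross-checked with simple energy-budget arithmetic. The sketch below is not from the paper: the latent heat of vaporization and the year length are assumed standard constants, and only figures quoted in the abstract are used.

```python
# Back-of-envelope check of the numbers quoted in the abstract.

L_V = 2.5e6             # J/kg, latent heat of vaporization (assumed standard value)
SEC_PER_YEAR = 3.156e7  # s

# Latent-heat equivalent of the observed precipitation trend:
# 1 mm/yr of water over 1 m^2 weighs 1 kg/yr.
dP = 13.0  # mm yr^-1 per decade (from the abstract)
latent_flux_trend = dP * L_V / SEC_PER_YEAR
print(f"{latent_flux_trend:.2f} W m^-2 per decade")  # ~1.0
# The abstract's required atmospheric radiative-imbalance trend is 0.7 W m^-2
# per decade; the gap between the two is where the modified Bowen ratio
# (sensible/latent heat flux change) enters the thermodynamic constraint.

# Warming rate implied by the quoted hydrological sensitivity:
hs = 7.0             # % per K (from the abstract)
precip_trend = 1.25  # % per decade (midpoint of the quoted 1.1%-1.4%)
print(f"{precip_trend / hs:.2f} K per decade")  # ~0.18
```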


2002 ◽  
Vol 29 (19) ◽  
pp. 24-1-24-4 ◽  
Author(s):  
Amy J. Bratcher ◽  
Benjamin S. Giese

2020 ◽  
Vol 143 (1-2) ◽  
pp. 177-191
Author(s):  
Peter Hoffmann ◽  
Arne Spekat

Abstract. This study examines the extent to which long-term change patterns of observed temperature and rainfall over Europe can be attributed to dynamical causes; in other words: are the observed changes due to a change in the frequency of the patterns, or have the patterns' dynamical properties changed? Using a combination of daily meteorological data and a European weather-type classification, the long-term monthly mean temperature and precipitation were calculated for each weather type. Subsequently, the observed weather-type sequences were used to construct analogue time series for temperature and precipitation which include only the dynamical component of the long-term variability since 1961. The results show that only a fraction of about 20% of the temperature rise since 1990, which amounted, for example, to 1 °C at the Potsdam Climate Station, can be explained by dynamical changes; that is, most of the weather types have simply become warmer. For long-term changes in seasonal rainfall patterns, the dynamical fraction is considerably higher, at more than 60%. Moreover, the results indicate that, for rainfall, the decadal variability and trends of the dynamical component follow the observed ones much more closely than they do for temperature. Consequently, most of the explained seasonal rainfall variance can be linked to changes in weather-type sequences in Potsdam and over Europe. The dynamical contribution to long-term changes in annual and seasonal rainfall patterns dominates because the alternation of wet and dry weather types (e.g., the types Trough over Central Europe and High Pressure over Central Europe), together with their frequencies and durations, has changed significantly in recent decades.
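A minimal sketch of how such an analogue series can be built, based on my reading of the description above rather than on the authors' code; the DataFrame layout and column names are assumptions:

```python
import pandas as pd

# Assumes a daily DataFrame `df` with a datetime column 'date', a weather-type
# label column 'wtype', and observed columns 'temp' and 'precip'.

def analogue_series(df: pd.DataFrame, var: str) -> pd.Series:
    """Replace each day's value by the long-term mean of its weather type
    (per calendar month), so the resulting series varies only through the
    observed weather-type sequence, i.e. the dynamical component."""
    month = df['date'].dt.month
    return df.groupby([month, 'wtype'])[var].transform('mean')

# The dynamically explained fraction is then the ratio of long-term trends,
# e.g. trend(analogue_series(df, 'temp')) / trend(df['temp']) computed on
# annual means: ~0.2 for temperature and >0.6 for rainfall per the abstract.
```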


2018 ◽  
Vol 22 (12) ◽  
pp. 6399-6414 ◽  
Author(s):  
Lanying Zhang ◽  
George Kuczera ◽  
Anthony S. Kiem ◽  
Garry Willgoose

Abstract. The duration of dry or wet hydrological epochs (run lengths) associated with positive or negative Inter-decadal Pacific Oscillation (IPO) or Pacific Decadal Oscillation (PDO) phases, termed Pacific decadal variability (PDV), is an essential statistical property for understanding, assessing and managing hydroclimatic risk. Numerous IPO and PDO paleoclimate reconstructions provide a valuable opportunity to study the statistical signatures of PDV, including run lengths. However, disparities exist between these reconstructions, making it problematic to determine which reconstruction(s) to use to investigate pre-instrumental PDV and run length. Variability and persistence on centennial scales are also present in some millennium-long reconstructions, making consistent run length extraction difficult. Thus, a robust method to extract meaningful and consistent run lengths from multiple reconstructions is required. In this study, a dynamic threshold framework to account for centennial trends in PDV reconstructions is proposed. The dynamic threshold framework is shown to extract meaningful run length information from multiple reconstructions. Two hydrologically important aspects of the statistical signatures associated with the PDV are explored: (i) whether persistence (i.e. run lengths) during positive epochs is different to persistence during negative epochs and (ii) whether the reconstructed run lengths have been stationary during the past millennium. Results suggest that there is no significant difference between run lengths in positive and negative phases of PDV and that it is more likely than not that the PDV run length has been non-stationary in the past millennium. This raises concerns about whether variability seen in the instrumental record (the last ∼100 years), or even in the shorter 300–400-year paleoclimate reconstructions, is representative of the full range of variability.
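The core of the run-length extraction can be sketched as follows. The centered moving-average baseline here is a stand-in assumption for the paper's dynamic threshold, which may differ in detail:

```python
import numpy as np

def run_lengths(index: np.ndarray, window: int = 101) -> dict:
    """Extract positive/negative run lengths relative to a slowly varying
    baseline that absorbs centennial trends (edge effects at the series
    ends are ignored in this sketch)."""
    kernel = np.ones(window) / window
    baseline = np.convolve(index, kernel, mode='same')  # centennial-scale trend
    sign = np.sign(index - baseline)                    # +1 / -1 phase each year
    runs = {1: [], -1: []}
    length, current = 1, sign[0]
    for s in sign[1:]:
        if s == current:
            length += 1
        else:
            if current != 0:
                runs[current].append(length)
            length, current = 1, s
    if current != 0:
        runs[current].append(length)
    return runs  # compare the runs[1] and runs[-1] distributions for asymmetry
```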


1975 ◽  
Vol 14 (70) ◽  
pp. 49-56 ◽  
Author(s):  
J. F. Nye

The displacement of the surface of an ice sheet and of markers set in its top layers can be measured geodetically, and also, it is expected, by radio-echo methods. The paper discusses how such measurements could be interpreted as showing long-term changes in the thickness of the ice sheet; in particular it discusses how one might design an experiment so as to avoid unwanted effects due to short-term changes in rate of accumulation. The analysis is similar to that of Federer and others (1970), but it corrects an error, so that when applied to their results for central Greenland it gives a different result for the lowering of the surface. Federer and others have already concluded that the average accumulation rates during the past 100 years have been below those needed to keep in balance with the velocity of the ice sheet as a whole. Using a particular model, it is found that this has resulted in the surface lowering at a mean rate of 0.050 m a⁻¹ between 1871 and 1968, and a mean rate of 0.140 m a⁻¹ between 1959 and 1968.
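For context, interpretations of this kind rest on the standard kinematic condition at the ice-sheet surface S(x, t), which ties the measured surface displacement to accumulation and ice motion; this is textbook glaciology, not a formula quoted from the paper:

```latex
% Kinematic boundary condition at the upper surface S(x,t):
% \dot{b} = accumulation rate (ice equivalent);
% u_s, w_s = horizontal and vertical ice velocity at the surface.
\frac{\partial S}{\partial t} = \dot{b} + w_s - u_s\,\frac{\partial S}{\partial x}
```

When the accumulation rate falls below the balance value that offsets the ice-motion terms, the surface lowers at the deficit rate; separating this long-term signal from short-term accumulation fluctuations is the experimental-design question the paper addresses.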


2020 ◽  
Vol 95 (sp1) ◽  
pp. 1416
Author(s):  
Norman Dreier ◽  
Rain Männikus ◽  
Peter Fröhle

Author(s):  
Robert Pool

The past couple of decades have been a confusing, frustrating period for engineers. With their creations making the world an ever richer, healthier, more comfortable place, it should have been a time of triumph and congratulation for them. Instead, it has been an era of discontent. Even as people have come to rely on technology more and more, they have liked it less. They distrust the machines that are supposedly their servants. Sometimes they fear them. And they worry about the sort of world they are leaving to their children.

Engineers, too, have begun to wonder if something is wrong. It is not simply that the public doesn't love them. They can live with that. But some of the long-term costs of technology have been higher than anyone expected: air and water pollution, hazardous wastes, the threat to the Earth's ozone layer, the possibility of global warming. And the drumbeat of sudden technological disaster over the past twenty years is enough to give anyone pause: Three Mile Island, Bhopal, the Challenger, Chernobyl, the Exxon Valdez, the downing of a commercial airliner by a missile from the U.S.S. Vincennes.

Is it time to rethink our approach to technology? Some engineers believe that it is. In one specialty after another, a few prophets have emerged who argue for doing things in a fundamentally new way. And surprisingly, although these visionaries have focused on problems and concerns unique to their own particular areas of engineering, a single underlying theme appears in their messages again and again: Engineers should pay more attention to the larger world in which their devices will function, and they should consciously take that world into account in their designs.

Although this may sound like a simple, even a self-evident, bit of advice, it is actually quite a revolutionary one for engineering. Traditionally, engineers have aimed at perfecting their machines as machines. This can be seen in the traditional measures of machines: how fast they are, how much they can produce, the quality of their output, how easy they are to use, how much they cost, how long they last.


2020 ◽  
Author(s):  
Jessica Neu ◽  
Kazuyuki Miyazaki ◽  
Kevin Bowman ◽  
Gregory Osterman

Given the importance of tropospheric ozone as a greenhouse gas and a hazardous pollutant that impacts human health and ecosystems, it is critical to quantify and understand long-term changes in its abundance. Satellite records are beginning to approach the length needed to assess variability and trends in tropospheric ozone, yet an intercomparison of time series from different instruments shows substantial differences in the net change in ozone over the past decade. We discuss our efforts to produce Earth Science Data Records of tropospheric ozone and quantify uncertainties and biases in these records. We also discuss the role of changes in the magnitude and distribution of precursor emissions and in downward transport of ozone from the stratosphere in determining tropospheric ozone abundances over the past 15 years.

