Reconstructing the NH Mean Temperature: Can Underestimation of Trends and Variability Be Avoided?

2011 ◽  
Vol 24 (3) ◽  
pp. 674-692 ◽  
Author(s):  
Bo Christiansen

Abstract There are indications that hemispheric-mean climate reconstructions seriously underestimate the amplitude of low-frequency variability and trends. Some of the theory of linear regression and errors-in-variables models is reviewed to identify the sources of this problem. On the basis of the insight gained, a reconstruction method designed to minimize this underestimation is formulated. The method consists of reconstructing the local temperatures at the geographical locations of the proxies and then calculating the hemispheric average. The method is tested by applying it to an ensemble of surrogate temperature fields based on two climate simulations covering the last 500 and 1000 yr. Compared to the regularized expectation maximization (RegEM) truncated total least squares (TTLS) method and a composite-plus-scale method (two methods recently used in the literature), the new method substantially improves the reproduction of low-frequency variability and trends. The potential importance in real-world situations is demonstrated by applying the methods to a set of 14 decadally smoothed proxies. Here the new method shows much larger low-frequency variability and a much colder preindustrial temperature level than the other reconstruction methods. However, this should mainly be seen as a demonstration of the potential losses and gains of variability, as reconstructions based on only 14 decadally smoothed proxies are not very robust.
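A minimal sketch of this local-first strategy, assuming a simple inverse linear calibration at each proxy site (the function and variable names are hypothetical, and the paper's exact regression choices are not reproduced here):

```python
import numpy as np

def loc_style_reconstruction(proxies, local_temps, calib):
    """Calibrate each proxy against the instrumental temperature at
    its own location, reconstruct the local temperature series, and
    only then average the local reconstructions into a hemispheric
    mean.

    proxies     : array (n_years, n_proxies) of proxy values
    local_temps : array (n_calib_years, n_proxies) of instrumental
                  temperatures at the proxy locations
    calib       : slice selecting the calibration years in `proxies`
    """
    n_years, n_proxies = proxies.shape
    recon_local = np.empty((n_years, n_proxies))
    for j in range(n_proxies):
        p_cal, t_cal = proxies[calib, j], local_temps[:, j]
        # Regress proxy on temperature and invert the fit; this
        # "indirect" direction preserves variance better than
        # regressing temperature on the proxy, which attenuates it.
        slope, intercept = np.polyfit(t_cal, p_cal, 1)
        recon_local[:, j] = (proxies[:, j] - intercept) / slope
    # Hemispheric mean: unweighted average over proxy locations
    return recon_local.mean(axis=1)
```

Averaging only after each local series has been calibrated distinguishes this approach from regressing the proxies directly onto the hemispheric mean, which is where the attenuation of low-frequency amplitude arises.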

2009 ◽  
Vol 22 (4) ◽  
pp. 951-976 ◽  
Author(s):  
Bo Christiansen ◽  
T. Schmith ◽  
P. Thejll

Abstract Reconstruction of the earth’s surface temperature from proxy data is an important task because of the need to compare recent changes with past variability. However, the statistical properties and robustness of climate reconstruction methods are not well known, which has led to a heated discussion about the quality of published reconstructions. In this paper a systematic study of the properties of reconstruction methods is presented. The methods include both direct hemispheric-mean reconstructions and field reconstructions, including reconstructions based on canonical regression and regularized expectation maximization algorithms. The study is based on temperature fields where the target of the reconstructions is known. In particular, the focus is on how well the reconstructions reproduce low-frequency variability, biases, and trends. A climate simulation from an ocean–atmosphere general circulation model of the period AD 1500–1999, including both natural and anthropogenic forcings, is used. However, reconstructions include a large element of stochasticity, and to draw robust statistical inferences, reconstructions of a large ensemble of realistic temperature fields are needed. To this end a novel technique has been developed to generate surrogate fields with the same temporal and spatial characteristics as the original surface temperature field from the climate model. Pseudoproxies are generated by degrading a number of gridbox time series. The number of pseudoproxies and the relation between the pseudoproxies and the underlying temperature field are chosen realistically, following Mann et al. It is found that all reconstruction methods contain a large element of stochasticity, so it is not possible to compare the methods and draw conclusions from a single or a few realizations. This means that very different results can be obtained using the same reconstruction method on different surrogate fields, which might explain some of the recently published divergent results. It is also found that all methods systematically give large biases and underestimate both trends and the amplitude of the low-frequency variability, typically by 20%–50%. The shape of the low-frequency variability, however, is in general well reconstructed. Some potential for validating the methods on independent data is found. However, to gain information about the reconstructions’ ability to capture the preindustrial level it is necessary to consider the average level in the validation period and not the year-to-year correlations. The influence on the reconstructions of the number of proxies, the type of noise used to generate the proxies, and the strength of the variability, as well as the effect of detrending the data prior to calibration, is also reported.
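The pseudoproxy construction described above can be illustrated with a short sketch: a gridbox temperature series is degraded by adding noise scaled to a target proxy–temperature correlation. Everything here (white vs. AR(1) noise, the example correlation) is an illustrative assumption, not the paper's exact configuration.

```python
import numpy as np

def make_pseudoproxy(temp_series, target_corr, rng, red_noise_a=0.0):
    """Degrade a gridbox temperature series into a pseudoproxy by
    adding noise scaled so the expected correlation with the original
    series equals target_corr. With red_noise_a > 0 the noise is
    AR(1) rather than white."""
    n = len(temp_series)
    eps = rng.standard_normal(n)
    noise = np.empty(n)
    noise[0] = eps[0]
    for t in range(1, n):
        noise[t] = red_noise_a * noise[t - 1] + eps[t]
    noise /= noise.std()
    # A correlation r with the signal implies a noise amplitude of
    # sigma_noise = sigma_signal * sqrt(1/r**2 - 1)
    sigma = temp_series.std() * np.sqrt(1.0 / target_corr**2 - 1.0)
    return temp_series + sigma * noise

rng = np.random.default_rng(42)
# e.g. a pseudoproxy correlating ~0.5 with its gridbox temperature:
# proxy = make_pseudoproxy(gridbox_temp, 0.5, rng, red_noise_a=0.3)
```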


2020 ◽  
Vol 40 (12) ◽  
pp. 5154-5169
Author(s):  
Mohammad Reza Khazaei ◽  
Bagher Zahabiyoun ◽  
Mehraveh Hasirchian

2011 ◽  
Vol 24 (23) ◽  
pp. 6013-6034 ◽  
Author(s):  
Bo Christiansen ◽  
Fredrik Charpentier Ljungqvist

Abstract A new multiproxy reconstruction of the Northern Hemisphere extratropical mean temperature over the last millennium is presented. The reconstruction is performed with a novel method designed to avoid the underestimation of low-frequency variability that has been a general problem for regression-based reconstruction methods. The disadvantage of this method is an exaggerated high-frequency variability. The reconstruction is based on a set of 40 proxies of annual to decadal resolution that have been shown to relate to the local temperature. The new reconstruction shows a very cold Little Ice Age centered on the 17th century, with a cold extremum (for 50-yr smoothing) of about 1.1 K below the temperature of the calibration period, AD 1880–1960. This cooling is about twice as large as the corresponding numbers reported by most other reconstructions. At the beginning of the millennium the new reconstruction shows small anomalies, in agreement with previous studies. However, the new temperature reconstruction decreases faster than previous reconstructions over the first 600 years of the millennium and has stronger variability. The salient features of the new reconstruction are shown to be robust to changes in the calibration period, the source of the local temperatures, the spatial averaging procedure, and the screening process applied to the proxies. An ensemble pseudoproxy approach is applied to estimate the confidence intervals of the 50-yr smoothed reconstruction, showing that the period AD 1500–1850 is significantly colder than the calibration period.
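A sketch of how such an ensemble pseudoproxy approach can yield confidence intervals for a smoothed reconstruction: apply the method to many surrogate fields whose target is known, smooth both reconstruction and target, and take quantiles of the errors. The moving-average filter and the quantile levels below are illustrative assumptions; the abstract does not specify them.

```python
import numpy as np

def smooth_50yr(x):
    """50-yr moving average (one common smoothing choice; the paper's
    exact filter is not specified in the abstract)."""
    w = np.ones(50) / 50.0
    return np.convolve(x, w, mode="same")

def ensemble_confidence(recons, targets, q=(0.025, 0.975)):
    """Given pseudoproxy-ensemble reconstructions and their known
    targets (arrays of shape n_members x n_years), return quantiles
    of the smoothed reconstruction error, which can then be attached
    to the real-proxy reconstruction as a confidence band."""
    errors = np.array([smooth_50yr(r) - smooth_50yr(t)
                       for r, t in zip(recons, targets)])
    return np.quantile(errors, q, axis=0)
```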


2016 ◽  
Author(s):  
Juhani Rinne ◽  
Mikko Alestalo ◽  
Jörg Franke

Abstract. Recently it has been shown that climate estimates derived from tree rings often tend to show erroneous long-term oscillations, i.e. there are spectral biases at low frequencies. The result is independent of the parameter studied (precipitation or temperature) and of the measured proxy (tree-ring widths or maximum latewood densities). In order to find reasons for such universal errors, a new reconstruction method is introduced in which no age dependence of the tree rings is determined. The aim, however, is not to generate better reconstructions but to study the error variances of long-term oscillations. It is shown that paucities and data gaps due to missing trees increase the risk of erroneous low-frequency variability. A general approximate formula is introduced in order to estimate the presence of such a risk. A case study using Torneträsk data from northern Sweden illustrates how longer periods with missing trees cause paucities and gaps leading to erroneous climatic oscillations. In this case study, the systematic underestimation of the temperature around AD 1600 and after 1950 (“divergence”) is explained by such data gaps and paucities.
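The paper's approximate risk formula itself is not reproduced here, but the underlying point (fewer overlapping trees means a less reliable low-frequency signal) is conventionally quantified by the expressed population signal (EPS) of Wigley et al. (1984), sketched below as a stand-in illustration:

```python
import numpy as np

def eps(n_trees, rbar):
    """Expressed population signal: how well a mean chronology built
    from n_trees series tracks the common underlying signal, given a
    mean inter-series correlation rbar. This is the standard dendro
    diagnostic, not the approximate formula introduced in the paper."""
    n = np.asarray(n_trees, dtype=float)
    return n * rbar / (1.0 + (n - 1.0) * rbar)

# Sample depth dropping from 20 trees to 3, as across a data gap,
# pushes EPS well below the conventional 0.85 threshold:
print(eps([20, 10, 5, 3], rbar=0.3))  # approx. [0.90, 0.81, 0.68, 0.56]
```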


2011 ◽  
Vol 7 (6) ◽  
pp. 3991-4035
Author(s):  
B. Christiansen ◽  
F. C. Ljungqvist

Abstract. We present two new multi-proxy reconstructions of the extra-tropical Northern Hemisphere (30–90° N) mean temperature: a two-millennia-long reconstruction reaching back to AD 1 based on 32 proxies and a 500-yr-long reconstruction reaching back to AD 1500 based on 91 proxies. The proxies are of different types and of different resolutions (annual, annual-to-decadal, and decadal) but all have previously been shown to relate to local or regional temperature. We use a reconstruction method, LOC, that has recently been shown to confidently reproduce low-frequency variability. Confidence intervals are obtained by an ensemble pseudo-proxy method that estimates both the variance and the bias of the reconstructions. The two-millennia-long reconstruction shows a well-defined Medieval Warm Period with a peak warming ca. AD 950–1050 reaching 0.7 °C relative to the reference period AD 1880–1960. The 500-yr-long reconstruction confirms previous results obtained with the LOC method applied to a smaller proxy compilation; in particular it shows the Little Ice Age culminating in AD 1580–1720 with a temperature minimum of 1.1 °C below the reference period. The reconstructed local temperatures, the magnitudes of which are subject to wide confidence intervals, show a rather geographically homogeneous Little Ice Age, while more geographical inhomogeneities are found for the Medieval Warm Period. Reconstructions based on different numbers of proxies show only small differences, suggesting that LOC reconstructs 50-yr smoothed extra-tropical NH mean temperatures well and that low-frequency noise in the proxies is a relatively small problem.
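A compact sketch of how an ensemble pseudo-proxy method can estimate both quantities at once: the mean error across ensemble members estimates the bias, and the spread estimates the variance. The array names and shapes are assumptions for illustration.

```python
import numpy as np

def bias_and_spread(recons, targets):
    """Decompose pseudo-proxy ensemble errors (n_members x n_years)
    into a systematic bias (mean error per year) and a spread
    (standard deviation per year). The bias can be subtracted from
    the real reconstruction; the spread widens its confidence band."""
    errors = np.asarray(recons) - np.asarray(targets)
    return errors.mean(axis=0), errors.std(axis=0, ddof=1)
```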


2020 ◽  
Vol 6 (3) ◽  
pp. 36-39
Author(s):  
Rongqing Chen ◽  
Knut Möller

Abstract Purpose: To evaluate a novel structural-functional DCT-based EIT lung imaging method against the classical EIT reconstruction. Method: EIT data, taken retrospectively from a former study, were evaluated using both reconstruction methods. For different phases of ventilation, the EIT images were analyzed with respect to the global inhomogeneity (GI) index for comparison. Results: A significantly less variable GI index was observed with the DCT-based method compared to the index from the classical method. Conclusion: The DCT-based method generates a more accurate lung contour yet reduces essential information in the image, which affects the GI index. These preliminary results must be consolidated with more patient data in different breathing states.
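For reference, the GI index used for this comparison is commonly computed as in Zhao et al.: the summed absolute deviation of each lung pixel's tidal impedance change from the median, normalized by the total. A minimal sketch with assumed array names:

```python
import numpy as np

def global_inhomogeneity(tidal_image, lung_mask):
    """GI index: sum of absolute deviations of lung-pixel tidal
    impedance values from their median, normalized by their total.
    Lower GI indicates more homogeneous ventilation."""
    di = tidal_image[lung_mask]  # tidal impedance variation per lung pixel
    return np.abs(di - np.median(di)).sum() / di.sum()
```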


Water ◽  
2020 ◽  
Vol 12 (7) ◽  
pp. 2058 ◽  
Author(s):  
Larissa Rolim ◽  
Francisco de Souza Filho

Improved water resource management relies on accurate analyses of the past dynamics of hydrological variables. The presence of low-frequency structures in hydrologic time series is an important feature: it can modify the probability of extreme events occurring at different time scales, which makes the risk associated with extreme events dynamic, changing from one decade to another. This article proposes a methodology capable of dynamically detecting and predicting the low-frequency component of streamflow (16–32 years) that shows significance in the wavelet power spectrum. The Standardized Runoff Index (SRI), the Pruned Exact Linear Time (PELT) algorithm, the breaks for additive seasonal and trend (BFAST) method, and the hidden Markov model (HMM) were used to identify shifts in the low frequency; the HMM was also used to forecast it. Among the results, the regime shifts detected by the BFAST approach are not entirely consistent with those from the other methods. A common shift occurs in the mid-1980s and can be attributed to the construction of the reservoir. Climate variability modulates the low-frequency variability of streamflow, and anthropogenic activities and climate change can modify this modulation. The identification of shifts reveals the impact of the low-frequency component on the streamflow time series, showing that low-frequency variability conditions the flows of a given year.
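As an illustration of the HMM step, a Gaussian HMM can be fitted to the SRI series and its Viterbi path read as a sequence of persistent wet/dry regimes. This sketch assumes the third-party hmmlearn package and a two-state model; neither is necessarily the paper's configuration.

```python
import numpy as np
from hmmlearn import hmm  # assumed dependency, not the authors' code

def detect_regimes(sri_series, n_states=2, seed=0):
    """Fit a Gaussian HMM to a standardized runoff index series and
    return the most likely hidden state per time step (Viterbi path).
    Persistent states act as low-frequency wet/dry regimes."""
    X = np.asarray(sri_series, dtype=float).reshape(-1, 1)
    model = hmm.GaussianHMM(n_components=n_states,
                            covariance_type="full",
                            n_iter=200, random_state=seed)
    model.fit(X)
    return model.predict(X)
```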


2008 ◽  
Vol 21 (9) ◽  
pp. 1948-1962 ◽  
Author(s):  
R. Garcia-Herrera ◽  
D. Barriopedro ◽  
E. Hernández ◽  
H. F. Diaz ◽  
R. R. Garcia ◽  
...  

Abstract The authors present a chronology of El Niño (EN) events based on documentary records from northern Peru. The chronology, which covers the period 1550–1900, is constructed mainly from primary sources from the city of Trujillo (Peru), the Archivo General de Indias in Seville (Spain), and the Archivo General de la Nación in Lima (Peru), supplemented by a reassessment of documentary evidence included in previously published literature. The archive in Trujillo has never been systematically evaluated for information related to the occurrence of El Niño–Southern Oscillation (ENSO). Abundant rainfall and river discharge correlate well with EN events in the area around Trujillo, which is very dry during most other years. Thus, rain and flooding descriptors, together with reports of failure of the local fishery, are the main indicators of EN occurrence that the authors have searched for in the documents. A total of 59 EN years are identified in this work. This chronology is compared with the two main previous documentary EN chronologies and with ENSO indicators derived from proxy data other than documentary sources. Overall, the seventeenth century appears to be the least active EN period, while the 1620s, 1720s, 1810s, and 1870s are the most active decades. The results herein reveal long-term fluctuations in warm ENSO activity that compare reasonably well with low-frequency variability deduced from other proxy data.

