A Surrogate Ensemble Study of Climate Reconstruction Methods: Stochasticity and Robustness

2009 ◽  
Vol 22 (4) ◽  
pp. 951-976 ◽  
Author(s):  
Bo Christiansen ◽  
T. Schmith ◽  
P. Thejll

Abstract Reconstruction of the earth’s surface temperature from proxy data is an important task because of the need to compare recent changes with past variability. However, the statistical properties and robustness of climate reconstruction methods are not well known, which has led to a heated discussion about the quality of published reconstructions. In this paper a systematic study of the properties of reconstruction methods is presented. The methods include both direct hemispheric-mean reconstructions and field reconstructions, including reconstructions based on canonical regression and regularized expectation maximization algorithms. The study is based on temperature fields where the target of the reconstructions is known. In particular, the focus is on how well the reconstructions reproduce low-frequency variability, biases, and trends. A climate simulation from an ocean–atmosphere general circulation model of the period AD 1500–1999, including both natural and anthropogenic forcings, is used. However, reconstructions include a large element of stochasticity, and to draw robust statistical inferences, reconstructions of a large ensemble of realistic temperature fields are needed. To this end a novel technique has been developed to generate surrogate fields with the same temporal and spatial characteristics as the original surface temperature field from the climate model. Pseudoproxies are generated by degrading a number of gridbox time series. The number of pseudoproxies and the relation between the pseudoproxies and the underlying temperature field are determined realistically from Mann et al. It is found that all reconstruction methods contain a large element of stochasticity, and it is not possible to compare the methods and draw conclusions from a single or a few realizations. This means that very different results can be obtained using the same reconstruction method on different surrogate fields. This might explain some of the recently published divergent results. It is also found that the amplitude of the low-frequency variability is in general underestimated. All methods systematically give large biases and underestimate both trends and the amplitude of the low-frequency variability. The underestimation is typically 20%–50%. The shape of the low-frequency variability, however, is well reconstructed in general. Some potential in validating the methods on independent data is found. However, to gain information about the reconstructions’ ability to capture the preindustrial level it is necessary to consider the average level in the validation period and not the year-to-year correlations. The influence on the reconstructions of the number of proxies, the type of noise used to generate the proxies, the strength of the variability, as well as the effect of detrending the data prior to the calibration is also reported.
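The pseudoproxy construction described above (degrading gridbox series from the surrogate field) can be sketched as follows. This is a minimal illustration, assuming additive white noise at a fixed signal-to-noise ratio; the paper's actual noise models and proxy network are taken realistically from Mann et al. and are richer than this.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_pseudoproxies(gridbox_series, snr=0.5, rng=rng):
    """Degrade gridbox temperature series into noisy pseudoproxies.

    gridbox_series: array of shape (n_years, n_proxies), the surrogate
    temperature field sampled at the proxy locations.
    snr: assumed signal-to-noise ratio (ratio of standard deviations).
    """
    signal_std = gridbox_series.std(axis=0, keepdims=True)
    noise = rng.normal(size=gridbox_series.shape) * signal_std / snr
    return gridbox_series + noise

# 500 years, 15 hypothetical proxy sites of synthetic "temperature"
field = rng.normal(size=(500, 15)).cumsum(axis=0) * 0.01
proxies = make_pseudoproxies(field, snr=0.5)
```

Repeating this over an ensemble of surrogate fields is what allows the stochastic spread of a reconstruction method, rather than a single realization, to be assessed.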

2011 ◽  
Vol 24 (3) ◽  
pp. 674-692 ◽  
Author(s):  
Bo Christiansen

Abstract There are indications that hemispheric-mean climate reconstructions seriously underestimate the amplitude of low-frequency variability and trends. Some of the theory of linear regression and error-in-variables models is reviewed to identify the sources of this problem. On the basis of the insight gained, a reconstruction method that is supposed to minimize the underestimation is formulated. The method consists of reconstructing the local temperatures at the geographical locations of the proxies, followed by calculating the hemispheric average. The method is tested by applying it to an ensemble of surrogate temperature fields based on two climate simulations covering the last 500 and 1000 yr. Compared to the regularized expectation maximization (RegEM) truncated total least squares (TTLS) method and a composite-plus-scale method—two methods recently used in the literature—the new method strongly improves the behavior regarding low-frequency variability and trends. The potential importance in real-world situations is demonstrated by applying the methods to a set of 14 decadally smoothed proxies. Here the new method shows much larger low-frequency variability and a much colder preindustrial temperature level than the other reconstruction methods. However, this should mainly be seen as a demonstration of the potential losses and gains of variability, as the reconstructions based on the 14 decadally smoothed proxies are not very robust.
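The two-step method described above (local reconstruction at the proxy sites, then hemispheric averaging) can be sketched as below. The regression direction shown, fitting proxy = a + b*T in the calibration period and inverting the fit, is one reading of how the underestimation of variance is avoided; array shapes and the simple inversion are illustrative assumptions.

```python
import numpy as np

def loc_reconstruct(proxies_cal, temps_cal, proxies_full):
    """Local reconstruction followed by hemispheric averaging.

    Each proxy is regressed on its local temperature over the
    calibration period (proxy = a + b*T); the fit is inverted to give
    local temperature estimates, which are then averaged.
    proxies_cal, temps_cal: (n_cal, n_proxies); proxies_full: (n_all, n_proxies).
    """
    n_proxies = proxies_cal.shape[1]
    local = np.empty_like(proxies_full, dtype=float)
    for j in range(n_proxies):
        b, a = np.polyfit(temps_cal[:, j], proxies_cal[:, j], 1)
        local[:, j] = (proxies_full[:, j] - a) / b
    return local.mean(axis=1)  # hemispheric mean of local estimates
```

Because each local fit is inverted rather than used as a direct predictor, the local estimates retain the full variance of the proxies, at the price of the exaggerated high-frequency variability noted in later work.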


2018 ◽  
Vol 14 (6) ◽  
pp. 901-922 ◽  
Author(s):  
Mari F. Jensen ◽  
Aleksi Nummelin ◽  
Søren B. Nielsen ◽  
Henrik Sadatzki ◽  
Evangeline Sessford ◽  
...  

Abstract. Here, we establish a spatiotemporal evolution of the sea-surface temperatures in the North Atlantic over Dansgaard–Oeschger (DO) events 5–8 (approximately 30–40 kyr) using the proxy surrogate reconstruction method. Proxy data suggest a large variability in North Atlantic sea-surface temperatures during the DO events of the last glacial period. However, proxy data availability is limited and cannot provide a full spatial picture of the oceanic changes. Therefore, we combine fully coupled, general circulation model simulations with planktic foraminifera-based sea-surface temperature reconstructions to obtain a broader spatial picture of the ocean state during DO events 5–8. The resulting spatial sea-surface temperature patterns agree over a number of different general circulation models and simulations. We find that sea-surface temperature variability over the DO events is characterized by colder conditions in the subpolar North Atlantic during stadials than during interstadials, and the variability is linked to changes in the Atlantic Meridional Overturning Circulation and in the sea-ice cover. Forced simulations are needed to capture the strength of the temperature variability and to reconstruct the variability in other climatic records not directly linked to the sea-surface temperature reconstructions. This is the first time the proxy surrogate reconstruction method has been applied to oceanic variability during MIS3. Our results remain robust, even when age uncertainties of proxy data, the number of available temperature reconstructions, and different climate models are considered. However, we also highlight shortcomings of the methodology that should be addressed in future implementations.
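The core of the proxy surrogate reconstruction method is analog matching: for each time step, the model snapshot whose simulated values at the proxy sites best match the proxy values is selected, and that snapshot's full field serves as the reconstruction. A minimal sketch, assuming a simple RMS distance and a single best analog per time step (implementations often blend several analogs):

```python
import numpy as np

def proxy_surrogate_reconstruction(proxy_values, model_at_sites, model_fields):
    """Analog (proxy surrogate) reconstruction.

    proxy_values: (n_times, n_sites) proxy-based SST estimates.
    model_at_sites: (n_snapshots, n_sites) simulated SSTs at proxy sites.
    model_fields: (n_snapshots, n_grid) full simulated SST fields.
    Returns the (n_times, n_grid) reconstructed fields.
    """
    # squared distance between every proxy state and every model snapshot
    d2 = ((proxy_values[:, None, :] - model_at_sites[None, :, :]) ** 2).sum(axis=2)
    best = d2.argmin(axis=1)          # index of closest analog per time step
    return model_fields[best]
```

The pool of snapshots can be drawn from several models and simulations, which is how agreement across different general circulation models can be checked.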


2005 ◽  
Vol 127 (5) ◽  
pp. 486-498 ◽  
Author(s):  
Mayank Tyagi ◽  
Sumanta Acharya

Large eddy simulations are performed in a periodic domain of a rotating square duct with normal rib turbulators. Both the Coriolis force and the centrifugal buoyancy force are included in this study. A direct approach is presented for the unsteady calculation of the nondimensional temperature field in the periodic domain. The calculations are performed at a Reynolds number (Re) of 12,500, a rotation number (Ro) of 0.12, and an inlet coolant-to-wall density ratio Δρ/ρ of 0.13. The predicted time- and space-averaged Nusselt numbers are shown to compare satisfactorily with the published experimental data. Time sequences of the vorticity components and the temperature fields are presented to understand the flow physics and the unsteady heat transfer behavior. Large-scale coherent structures are seen to play an important role in the mixing and heat transfer. The temperature field appears to contain a low-frequency mode that extends beyond a single inter-rib geometric module, which indicates the necessity of using at least two inter-rib modules for streamwise periodicity to be satisfied. Proper orthogonal decomposition (POD) of the flowfield indicates a low dimensionality of this system, with almost 99% of the turbulent energy in the first 80 POD modes.
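The POD energy statement above (almost 99% of turbulent energy in the first 80 modes) can be computed from a snapshot matrix via the singular value decomposition. A minimal sketch, assuming snapshots are stored as rows and energy is measured by the squared singular values of the mean-removed fluctuation matrix:

```python
import numpy as np

def pod_energy_fraction(snapshots, n_modes):
    """Fraction of turbulent energy captured by the leading POD modes.

    snapshots: (n_snapshots, n_points) field samples; the temporal mean
    is removed so the decomposition acts on the fluctuations.
    """
    fluct = snapshots - snapshots.mean(axis=0)
    s = np.linalg.svd(fluct, compute_uv=False)  # singular values
    energy = s ** 2                             # modal energies
    return energy[:n_modes].sum() / energy.sum()
```

A low-dimensional flow is one where this fraction approaches 1 for a small number of modes, as reported here for the first 80 modes.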


2011 ◽  
Vol 24 (14) ◽  
pp. 3609-3623 ◽  
Author(s):  
Fiona Johnson ◽  
Seth Westra ◽  
Ashish Sharma ◽  
Andrew J. Pitman

Abstract Climate change impact studies for water resource applications, such as the development of projections of reservoir yields or the assessment of likely frequency and amplitude of drought under a future climate, require that the year-to-year persistence in a range of hydrological variables such as catchment average rainfall be properly represented. This persistence is often attributable to low-frequency variability in the global sea surface temperature (SST) field and other large-scale climate variables through a complex sequence of teleconnections. To evaluate the capacity of general circulation models (GCMs) to accurately represent this low-frequency variability, a set of wavelet-based skill measures has been developed to compare GCM performance in representing interannual variability with the observed global SST data, as well as to assess the extent to which this variability is imparted in precipitation and surface pressure anomaly fields. A validation of the derived skill measures is performed using GCM precipitation as an input in a reservoir storage context, with the accuracy of reservoir storage estimates shown to be improved by using GCM outputs that correctly represent the observed low-frequency variability. Significant differences in the performance of different GCMs are demonstrated, suggesting that judicious selection of models is required if the climate impact assessment is sensitive to low-frequency variability. The two GCMs that were found to exhibit the most appropriate representation of global low-frequency variability for the individual variables assessed were the Istituto Nazionale di Geofisica e Vulcanologia (INGV) ECHAM4 and L’Institut Pierre-Simon Laplace Coupled Model, version 4 (IPSL CM4); when considering all three variables, the Max Planck Institute (MPI) ECHAM5 performed well. Importantly, models that represented interannual variability well for SST also performed well for the other two variables, while models that performed poorly for SST also had consistently low skill across the remaining variables.
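The idea behind such skill measures, comparing how much of a model's variance sits at low frequencies against the same quantity for observations, can be illustrated with a simplified Fourier-based analog. The paper's actual measures are wavelet based; the function names, the FFT approach, and the simple absolute-difference skill score below are illustrative assumptions only.

```python
import numpy as np

def lowfreq_variance_fraction(series, cutoff_years=5, dt_years=1.0):
    """Fraction of a series' variance at periods longer than cutoff_years.

    A Fourier-based stand-in for a low-frequency variability measure:
    power is summed over frequencies below 1/cutoff_years.
    """
    x = np.asarray(series, float)
    x = x - x.mean()
    spec = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size, d=dt_years)
    low = freqs[1:] < 1.0 / cutoff_years   # skip the zero frequency
    return spec[1:][low].sum() / spec[1:].sum()

def lowfreq_skill(model_series, obs_series, cutoff_years=5):
    """Skill in [0, 1]: 1 when model and observed series put the same
    fraction of their variance at low frequencies."""
    m = lowfreq_variance_fraction(model_series, cutoff_years)
    o = lowfreq_variance_fraction(obs_series, cutoff_years)
    return 1.0 - abs(m - o)
```

Applied per variable (SST, precipitation, surface pressure anomalies), such scores allow GCMs to be ranked before their output is fed into a reservoir storage model.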


Author(s):  
Mayank Tyagi ◽  
Sumanta Acharya

Large eddy simulations are performed in a periodic domain of a rotating square duct with normal rib turbulators. Both the Coriolis force and the centrifugal buoyancy force are included in this study. A direct approach is presented for the unsteady calculation of the non-dimensional temperature field in the periodic domain. The calculations are performed at a Reynolds number (Re) of 12,500, a rotation number (Ro) of 0.12, and an inlet coolant-to-wall density ratio (Δρ/ρ) of 0.13. The time-averaged Nusselt numbers compare satisfactorily with the data of Wagner et al. (J. Turbomachinery, Vol. 114, pp. 847–857). Time sequences of the vorticity components and the temperature fields are presented to understand the flow physics and the unsteady heat transfer processes. Large-scale coherent structures are seen to play an important role in the mixing and heat transfer. The temperature field appears to contain a low-frequency mode that extends beyond a single inter-rib geometric module, which indicates the necessity of using at least two inter-rib modules for streamwise periodicity to be satisfied. Proper orthogonal decomposition (POD) of 200 snapshots indicates a low dimensionality of this system, with almost 99% of the turbulent energy in the first 80 POD modes.


2011 ◽  
Vol 24 (23) ◽  
pp. 6013-6034 ◽  
Author(s):  
Bo Christiansen ◽  
Fredrik Charpentier Ljungqvist

Abstract A new multiproxy reconstruction of the Northern Hemisphere extratropical mean temperature over the last millennium is presented. The reconstruction is performed with a novel method designed to avoid the underestimation of low-frequency variability that has been a general problem for regression-based reconstruction methods. The disadvantage of this method is an exaggerated high-frequency variability. The reconstruction is based on a set of 40 proxies of annual to decadal resolution that have been shown to relate to the local temperature. The new reconstruction shows a very cold Little Ice Age centered around the 17th century with a cold extremum (for 50-yr smoothing) of about 1.1 K below the temperature of the calibration period, AD 1880–1960. This cooling is about twice as large as corresponding numbers reported by most other reconstructions. In the beginning of the millennium the new reconstruction shows small anomalies in agreement with previous studies. However, the new temperature reconstruction decreases faster than previous reconstructions in the first 600 years of the millennium and has a stronger variability. The salient features of the new reconstruction are shown to be robust to changes in the calibration period, the source of the local temperatures, the spatial averaging procedure, and the screening process applied to the proxies. An ensemble pseudoproxy approach is applied to estimate the confidence intervals of the 50-yr smoothed reconstruction showing that the period AD 1500–1850 is significantly colder than the calibration period.
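The 50-yr smoothing and the ensemble pseudoproxy confidence intervals mentioned above can be sketched as below. The running-mean smoother and the pointwise quantile band are illustrative assumptions; the paper's exact smoothing filter and interval construction may differ.

```python
import numpy as np

def smooth_50yr(series):
    """Centered 50-yr running mean (edges trimmed, 'valid' mode)."""
    kernel = np.ones(50) / 50.0
    return np.convolve(series, kernel, mode="valid")

def ensemble_ci(reconstructions, alpha=0.05):
    """Pointwise (1 - alpha) confidence band for a smoothed reconstruction.

    reconstructions: (n_members, n_years) ensemble of pseudoproxy
    reconstructions; each member is smoothed before the quantiles
    are taken across the ensemble.
    """
    smoothed = np.array([smooth_50yr(r) for r in reconstructions])
    lo = np.quantile(smoothed, alpha / 2, axis=0)
    hi = np.quantile(smoothed, 1 - alpha / 2, axis=0)
    return lo, hi
```

Comparing such a band for a pre-1850 window against the calibration-period level is one way to assess whether a period like AD 1500–1850 is significantly colder than the calibration period.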

