Joint inversion of proxy system models to reconstruct paleoenvironmental time series from heterogeneous data

2020 ◽  
Vol 16 (1) ◽  
pp. 65-78 ◽  
Author(s):  
Gabriel J. Bowen ◽  
Brenden Fischer-Femal ◽  
Gert-Jan Reichart ◽  
Appy Sluijs ◽  
Caroline H. Lear

Abstract. Paleoclimatic and paleoenvironmental reconstructions are fundamentally uncertain because no proxy is a direct record of a single environmental variable of interest; all proxies are indirect and sensitive to multiple forcing factors. One productive approach to reducing proxy uncertainty is the integration of information from multiple proxy systems with complementary, overlapping sensitivity. Most such analyses are conducted in an ad hoc fashion, either through qualitative comparison to assess the similarity of single-proxy reconstructions or through step-wise quantitative interpretations where one proxy is used to constrain a variable relevant to the interpretation of a second proxy. Here we propose the integration of multiple proxies via the joint inversion of proxy system and paleoenvironmental time series models in a Bayesian hierarchical framework. The “Joint Proxy Inversion” (JPI) method provides a statistically robust approach to producing self-consistent interpretations of multi-proxy datasets, allowing full and simultaneous assessment of all proxy and model uncertainties to obtain quantitative estimates of past environmental conditions. Other benefits of the method include the ability to use independent information on climate and environmental systems to inform the interpretation of proxy data, to fully leverage information from unevenly and differently sampled proxy records, and to obtain refined estimates of proxy model parameters that are conditioned on paleo-archive data. Application of JPI to the marine Mg∕Ca and δ18O proxy systems at two distinct timescales demonstrates many of the key properties, benefits, and sensitivities of the method, and it produces new, statistically grounded reconstructions of Neogene ocean temperature and chemistry from previously published data. We suggest that JPI is a universally applicable method that can be implemented using proxy models of wide-ranging complexity to generate more robust, quantitative understanding of past climatic and environmental change.
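
The abstract describes joint inversion only at a conceptual level; the sketch below illustrates the core idea in miniature, assuming a single paired Mg∕Ca and δ18O observation, placeholder calibration constants, and a plain random-walk Metropolis sampler rather than the hierarchical time series model used in the paper. Every numerical value and prior is invented for illustration.

```python
# Minimal sketch of the joint-inversion idea: infer temperature and seawater
# d18O jointly from paired Mg/Ca and d18O measurements with a random-walk
# Metropolis sampler.  Calibration constants, priors, and error terms below
# are illustrative placeholders, not the values used in the paper.
import numpy as np

rng = np.random.default_rng(0)

# One paired observation (hypothetical values)
mgca_obs, mgca_sd = 3.2, 0.15      # mmol/mol
d18o_obs, d18o_sd = -0.8, 0.08     # per mil

def fwd_mgca(T):                   # exponential Mg/Ca calibration (placeholder A, B)
    return 0.38 * np.exp(0.09 * T)

def fwd_d18o(T, d18o_sw):          # linear d18O paleotemperature relation (placeholder slope)
    return d18o_sw + (16.9 - T) / 4.0

def log_post(T, d18o_sw):
    lp = -0.5 * ((T - 15.0) / 10.0) ** 2           # weak prior on temperature (deg C)
    lp += -0.5 * (d18o_sw / 1.0) ** 2              # weak prior on seawater d18O
    lp += -0.5 * ((mgca_obs - fwd_mgca(T)) / mgca_sd) ** 2
    lp += -0.5 * ((d18o_obs - fwd_d18o(T, d18o_sw)) / d18o_sd) ** 2
    return lp

theta = np.array([15.0, 0.0])
samples = []
for _ in range(20000):
    prop = theta + rng.normal(scale=[0.5, 0.1])    # random-walk proposal
    if np.log(rng.uniform()) < log_post(*prop) - log_post(*theta):
        theta = prop
    samples.append(theta)
samples = np.array(samples[5000:])                 # drop burn-in
print("posterior mean T, d18O_sw:", samples.mean(axis=0))
```

Because both proxies enter a single posterior, the δ18O measurement constrains seawater δ18O while Mg∕Ca constrains temperature, and their uncertainties propagate jointly rather than in a step-wise fashion.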


2011 ◽  
Vol 23 (1) ◽  
pp. 97-123 ◽  
Author(s):  
Arta A. Jamshidi ◽  
Michael J. Kirby

We present an approach for constructing nonlinear empirical mappings from high-dimensional domains to multivariate ranges. We employ radial basis functions and skew radial basis functions for constructing a model using data that are potentially scattered or sparse. The algorithm progresses iteratively, adding a new function at each step to refine the model. The placement of the functions is driven by a statistical hypothesis test that accounts for correlation in the multivariate range variables. The test is applied on training and validation data and reveals nonstatistical or geometric structure when it fails. At each step, the added function is fit to data contained in a spatiotemporally defined local region to determine the parameters—in particular, the scale of the local model. The scale of the function is determined by the zero crossings of the autocorrelation function of the residuals. The model parameters and the number of basis functions are determined automatically from the given data, and there is no need to initialize any ad hoc parameters save for the selection of the skew radial basis functions. Compactly supported skew radial basis functions are employed to improve model accuracy, order, and convergence properties. The extension of the algorithm to higher-dimensional ranges produces reduced-order models by exploiting the existence of correlation in the range variable data. Structure is tested not just in a single time series but between all pairs of time series. We demonstrate the new methodologies on several illustrative problems, including modeling data on manifolds and the prediction of chaotic time series.
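
As a rough illustration of the iterative model-building loop described above, the sketch below grows a Gaussian RBF model one function at a time on synthetic 1-D data. The paper's multivariate hypothesis test, skew RBFs, and autocorrelation-based scale selection are replaced by a simple residual-variance stopping rule and a fixed width, so this is a schematic of the growth strategy only, not the authors' algorithm.

```python
# Schematic sketch of growing an RBF model one basis function at a time.
# The paper's statistical test and skew RBFs are replaced here by a simple
# residual-variance stopping rule and plain Gaussian bumps.
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(200, 1))             # scattered 1-D training data
y = np.sin(2 * X[:, 0]) + 0.05 * rng.normal(size=200)

centers, widths, weights = [], [], []

def predict(Xq):
    out = np.zeros(len(Xq))
    for c, s, w in zip(centers, widths, weights):
        out += w * np.exp(-np.sum((Xq - c) ** 2, axis=1) / (2 * s ** 2))
    return out

for step in range(30):
    resid = y - predict(X)
    if resid.var() < 0.01:                        # crude stand-in for the hypothesis test
        break
    i = np.argmax(np.abs(resid))                  # place a new function at the worst residual
    c = X[i]
    s = 0.5                                       # fixed scale; the paper derives it from zero
                                                  # crossings of the residual autocorrelation
    phi = np.exp(-np.sum((X - c) ** 2, axis=1) / (2 * s ** 2))
    w = phi @ resid / (phi @ phi)                 # least-squares weight for the new function
    centers.append(c); widths.append(s); weights.append(w)

print(f"model uses {len(centers)} basis functions, "
      f"residual std {np.std(y - predict(X)):.3f}")
```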


2019 ◽  
Vol 23 (10) ◽  
pp. 4323-4331 ◽  
Author(s):  
Wouter J. M. Knoben ◽  
Jim E. Freer ◽  
Ross A. Woods

Abstract. A traditional metric used in hydrology to summarize model performance is the Nash–Sutcliffe efficiency (NSE). Increasingly an alternative metric, the Kling–Gupta efficiency (KGE), is used instead. When NSE is used, NSE = 0 corresponds to using the mean flow as a benchmark predictor. The same reasoning is applied in various studies that use KGE as a metric: negative KGE values are viewed as bad model performance, and only positive values are seen as good model performance. Here we show that using the mean flow as a predictor does not result in KGE = 0, but instead KGE = 1 − √2 ≈ −0.41. Thus, KGE values greater than −0.41 indicate that a model improves upon the mean flow benchmark – even if the model's KGE value is negative. NSE and KGE values cannot be directly compared, because their relationship is non-unique and depends in part on the coefficient of variation of the observed time series. Therefore, modellers who use the KGE metric should not let their understanding of NSE values guide them in interpreting KGE values and instead develop new understanding based on the constitutive parts of the KGE metric and the explicit use of benchmark values to compare KGE scores against. More generally, a strong case can be made for moving away from ad hoc use of aggregated efficiency metrics and towards a framework based on purpose-dependent evaluation metrics and benchmarks that allows for more robust model adequacy assessment.
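
The central claim is easy to verify numerically. The snippet below scores a mean-flow prediction with the standard NSE and the original 2009 KGE formulation; a vanishingly small amount of noise is added to the constant benchmark so that the correlation term in KGE is defined. The synthetic streamflow series is arbitrary.

```python
# Numerical check of the paper's key point: a "mean flow" prediction scores
# NSE = 0 but KGE = 1 - sqrt(2) ~ -0.41.
import numpy as np

rng = np.random.default_rng(42)
obs = np.exp(rng.normal(size=1000))               # synthetic streamflow-like series

def nse(sim, obs):
    return 1 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def kge(sim, obs):
    r = np.corrcoef(sim, obs)[0, 1]               # linear correlation
    alpha = sim.std() / obs.std()                 # variability ratio
    beta = sim.mean() / obs.mean()                # bias ratio
    return 1 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

# mean-flow benchmark, plus tiny noise so the correlation term is defined
bench = np.full_like(obs, obs.mean()) + 1e-8 * rng.normal(size=obs.size)
print(f"NSE of mean-flow benchmark: {nse(bench, obs):+.3f}")   # ~ 0.000
print(f"KGE of mean-flow benchmark: {kge(bench, obs):+.3f}")   # ~ -0.414
print(f"1 - sqrt(2)               : {1 - np.sqrt(2):+.3f}")
```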


2007 ◽  
Vol 73 (8) ◽  
pp. 2468-2478 ◽  
Author(s):  
Bernadette Klotz ◽  
D. Leo Pyle ◽  
Bernard M. Mackey

ABSTRACT A new primary model based on a thermodynamically consistent first-order kinetic approach was constructed to describe non-log-linear inactivation kinetics of pressure-treated bacteria. The model assumes a first-order process in which the specific inactivation rate changes inversely with the square root of time. The model gave reasonable fits to experimental data over six to seven orders of magnitude. It was also tested on 138 published data sets and provided good fits in about 70% of cases in which the shape of the curve followed the typical convex upward form. In the remainder of published examples, curves contained additional shoulder regions or extended tail regions. Curves with shoulders could be accommodated by including an additional time delay parameter, and curves with tails could be accommodated by omitting points in the tail beyond the point at which survival levels remained more or less constant. The model parameters varied regularly with pressure, which may reflect a genuine mechanistic basis for the model. This property also allowed the calculation of (a) parameters analogous to the decimal reduction time D and to z (the temperature increase needed to change the D value by a factor of 10) in thermal processing, and hence the processing conditions needed to attain a desired level of inactivation; and (b) the apparent thermodynamic volumes of activation associated with the lethal events. The hypothesis that inactivation rates changed as a function of the square root of time would be consistent with a diffusion-limited process.
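
The kinetic form described here integrates to a survival curve of the type log10(N/N0) = −b√t, optionally delayed by a shoulder time t0. The sketch below fits that form to made-up survival data with scipy; the data, parameter names, and starting values are illustrative only and are not taken from the paper.

```python
# Hedged sketch of the abstract's survival model: a first-order process whose
# specific rate varies inversely with sqrt(time) integrates to
#   log10(N/N0) = -b * sqrt(t),
# optionally shifted by a shoulder delay t0.  The data below are made up.
import numpy as np
from scipy.optimize import curve_fit

def log_survival(t, b, t0):
    return -b * np.sqrt(np.clip(t - t0, 0.0, None))   # shoulder, then sqrt(t) decline

# Hypothetical pressure-treatment survival data: time (min), log10(N/N0)
t = np.array([0, 1, 2, 4, 8, 12, 16, 20], dtype=float)
logs = np.array([0.0, -0.1, -1.0, -2.1, -3.4, -4.3, -5.1, -5.7])

(b, t0), _ = curve_fit(log_survival, t, logs, p0=[1.0, 0.5])
print(f"fitted rate parameter b = {b:.2f}, shoulder delay t0 = {t0:.2f} min")
```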


Author(s):  
Arnaud Dufays ◽  
Elysee Aristide Houndetoungan ◽  
Alain Coën

Abstract Change-point (CP) processes are a flexible approach to modelling long time series. We propose a method to uncover which model parameters truly vary when a CP is detected. Given a set of breakpoints, we use a penalized likelihood approach to select the best set of parameters that changes over time and we prove that the penalty function leads to a consistent selection of the true model. Estimation is carried out via the deterministic annealing expectation-maximization algorithm. Our method accounts for model selection uncertainty and associates a probability to all the possible time-varying parameter specifications. Monte Carlo simulations highlight that the method works well for many time series models, including heteroskedastic processes. For a sample of fourteen hedge fund (HF) strategies, using an asset-based style pricing model, we shed light on the promising ability of our method to detect the time-varying dynamics of risk exposures as well as to forecast HF returns.
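
A toy version of the selection step can make the idea concrete. Given a single known break date in a Gaussian series, the sketch below scores four specifications (nothing changes, only the mean changes, only the variance changes, both change) with a BIC-style penalized likelihood; the paper's deterministic annealing EM estimation and its specific penalty function are not reproduced.

```python
# Toy sketch of choosing WHICH parameters change at a known break point by
# penalized likelihood; the best-scoring specification should be "mean only".
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
y = np.concatenate([rng.normal(0.0, 1.0, 300),    # regime 1
                    rng.normal(1.5, 1.0, 300)])   # regime 2: only the mean shifts
brk = 300
segs = [y[:brk], y[brk:]]

def penalized_ll(segments, vary_mean, vary_var):
    mu_all, sd_all = y.mean(), y.std()
    ll, k = 0.0, 2 + vary_mean + vary_var         # number of free parameters
    for s in segments:
        mu = s.mean() if vary_mean else mu_all
        sd = s.std() if vary_var else sd_all
        ll += norm.logpdf(s, mu, sd).sum()
    return ll - 0.5 * k * np.log(len(y))          # BIC-style penalized log-likelihood

for vm, vv in [(0, 0), (1, 0), (0, 1), (1, 1)]:
    print(f"mean varies={bool(vm)}, var varies={bool(vv)}: "
          f"penalized ll = {penalized_ll(segs, vm, vv):.1f}")
```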


2018 ◽  
Vol 2018 ◽  
pp. 1-11
Author(s):  
Mostafa Ali ◽  
Yasser Mohamed

3D visualization provides a means of communicating different construction activities to diverse audiences. The scope, level of detail, and time resolution of the 3D visualization process are determined based on the targeted audiences. Developing the 3D visualization requires obtaining and merging heterogeneous data from different sources (such as a BIM model and a CPM schedule). The data merging process is usually carried out on an ad hoc basis for a specific visualization case, which limits the reusability of the process. This paper discusses a framework for automatic merging of heterogeneous data to create a visualization. The paper describes the development of an ontology that captures concepts related to the visualization process. Heterogeneous data sources that are commonly used in construction are then fed into the ontology, which can be queried to produce different visualization scenarios. The potential of this approach has been demonstrated by providing multiple visualization scenarios that cover different audiences, levels of detail, and time resolutions.
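
A stripped-down version of the merging step is sketched below: BIM elements and CPM activities are joined on a shared activity code to produce a per-day visualization state. In the paper this mapping is mediated by an ontology and queries; here a plain Python join stands in for that machinery, and every field name and date is hypothetical.

```python
# Minimal sketch of merging heterogeneous sources (a BIM element list and a CPM
# schedule) into a time-stamped visualization state.
from datetime import date, timedelta

bim_elements = [                                   # from the BIM model (hypothetical)
    {"guid": "W-001", "type": "Wall",   "activity": "A100"},
    {"guid": "S-010", "type": "Slab",   "activity": "A200"},
    {"guid": "C-003", "type": "Column", "activity": "A100"},
]
schedule = {                                       # from the CPM schedule (hypothetical)
    "A100": {"start": date(2018, 5, 1), "finish": date(2018, 5, 10)},
    "A200": {"start": date(2018, 5, 8), "finish": date(2018, 5, 20)},
}

def state_on(day):
    """Return each element's visualization state on a given day."""
    states = {}
    for el in bim_elements:
        act = schedule[el["activity"]]
        if day < act["start"]:
            states[el["guid"]] = "not started"
        elif day <= act["finish"]:
            states[el["guid"]] = "in progress"
        else:
            states[el["guid"]] = "complete"
    return states

for offset in (0, 7, 25):                          # sample three points in time
    day = date(2018, 5, 1) + timedelta(days=offset)
    print(day, state_on(day))
```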


2013 ◽  
Vol 20 (6) ◽  
pp. 1071-1078 ◽  
Author(s):  
E. Piegari ◽  
R. Di Maio ◽  
A. Avella

Abstract. Reasonable prediction of landslide occurrences in a given area requires the choice of an appropriate probability distribution of recurrence time intervals. Although landslides are widespread and frequent in many parts of the world, complete databases of landslide occurrences over long periods are missing, and such natural disasters are often treated as processes uncorrelated in time and, therefore, Poisson distributed. In this paper, we examine the recurrence time statistics of landslide events simulated by a cellular automaton model that reproduces well the actual frequency-size statistics of landslide catalogues. The complex time series are analysed by varying both the threshold above which the time between events is recorded and the values of the key model parameters. The synthetic recurrence time probability distribution is shown to be strongly dependent on the rate at which instability is approached, providing a smooth crossover from a power-law regime to a Weibull regime. Moreover, a Fano factor analysis shows a clear indication of different degrees of correlation in landslide time series. Such a finding supports, at least in part, a recent analysis, performed for the first time, of a historical landslide time series spanning a time window of fifty years.
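
The two diagnostics named here, recurrence times above a size threshold and the Fano factor of event counts, are straightforward to compute. The sketch below applies them to a synthetic, uncorrelated event catalogue (so its Fano factor stays near the Poisson value of 1), which merely stands in for the cellular-automaton output discussed in the paper.

```python
# Sketch of the two diagnostics named in the abstract: recurrence times between
# above-threshold events, and the Fano factor (variance/mean of event counts per
# window), which is ~1 for a Poisson process and exceeds 1 when events cluster.
import numpy as np

rng = np.random.default_rng(7)
n_steps = 100_000
sizes = rng.pareto(1.5, size=n_steps)             # heavy-tailed "event size" series

threshold = 5.0
event_times = np.flatnonzero(sizes > threshold)   # time steps of large events
recurrence = np.diff(event_times)                 # inter-event (recurrence) times
print("mean recurrence time:", recurrence.mean())

def fano(times, window):
    counts = np.bincount(times // window, minlength=n_steps // window)
    return counts.var() / counts.mean()

for w in (100, 1000, 10000):
    print(f"Fano factor, window {w}: {fano(event_times, w):.2f}")
```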


2017 ◽  
Vol 17 (6) ◽  
pp. 401-422 ◽  
Author(s):  
Buu-Chau Truong ◽  
Cathy WS Chen ◽  
Songsak Sriboonchitta

This study proposes a new model for integer-valued time series—the hysteretic Poisson integer-valued generalized autoregressive conditionally heteroskedastic (INGARCH) model—which has an integrated hysteresis zone in the switching mechanism of the conditional expectation. Our modelling framework provides a parsimonious representation of the salient features of integer-valued time series, such as discreteness, over-dispersion, asymmetry and structural change. We adopt Bayesian methods with a Markov chain Monte Carlo sampling scheme to estimate model parameters and utilize the Bayesian information criterion for model comparison. We then apply the proposed model to five real time series of criminal incidents recorded by the New South Wales Police Force in Australia. Simulation results and empirical analysis highlight the better performance of the hysteretic model in describing these integer-valued time series.
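
For readers unfamiliar with the hysteresis mechanism, the sketch below simulates a two-regime Poisson INGARCH(1,1) process in which the regime changes only when the previous count leaves a band (rL, rU] and is otherwise left unchanged. All parameter values are invented, and the paper's Bayesian MCMC estimation and model comparison are not shown.

```python
# Minimal simulation of a hysteretic two-regime Poisson INGARCH(1,1) process:
# the regime switches to 1 when the previous count drops to rL or below, to 2
# when it exceeds rU, and is left unchanged inside the hysteresis zone (rL, rU].
import numpy as np

rng = np.random.default_rng(11)
T = 2000
rL, rU = 2, 6                                     # hysteresis zone boundaries
omega = [1.0, 3.0]                                # regime-specific INGARCH parameters
alpha = [0.3, 0.4]
beta  = [0.2, 0.3]

y = np.zeros(T, dtype=int)
lam = np.zeros(T)
regime = 0
lam[0], y[0] = 1.0, 1
for t in range(1, T):
    if y[t - 1] <= rL:
        regime = 0
    elif y[t - 1] > rU:
        regime = 1
    # inside (rL, rU]: keep the previous regime (the hysteresis effect)
    lam[t] = omega[regime] + alpha[regime] * y[t - 1] + beta[regime] * lam[t - 1]
    y[t] = rng.poisson(lam[t])

print("sample mean:", y.mean(), " sample variance:", y.var())   # over-dispersion check
```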

