Automatic estimation of large residual statics corrections

Geophysics ◽  
1986 ◽  
Vol 51 (2) ◽  
pp. 332-346 ◽  
Author(s):  
Daniel H. Rothman

Conventional approaches to residual statics estimation obtain solutions by performing linear inversion of observed traveltime deviations. A crucial component of these procedures is picking time delays; gross errors in these picks are known as “cycle skips” or “leg jumps” and are the bane of linear traveltime inversion schemes. This paper augments Rothman (1985), which demonstrated that the estimation of large statics in noise‐contaminated data is posed better as a nonlinear, rather than as a linear, inverse problem. Cycle skips then appear as local (secondary) minima of the resulting nonlinear optimization problem. In the earlier paper, a Monte Carlo technique from statistical mechanics was adapted to perform global optimization, and the technique was applied to synthetic data. Here I present an application of a similar Monte Carlo method to field data from the Wyoming Overthrust belt. Key changes, however, have led to a more efficient and practical algorithm. The new technique performs explicit crosscorrelation of traces. Instead of picking the peaks of these crosscorrelation functions, the method transforms the crosscorrelation functions to probability distributions and then draws random numbers from the distributions. Estimates of statics are now iteratively updated by this procedure until convergence to the optimal stack is achieved. Here I also derive several theoretical properties of the algorithm. The method is expressed as a Markov chain, in which the equilibrium (steady‐state) distribution is the Gibbs distribution of statistical mechanics.
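
As an illustration of the core update step described above, here is a minimal Python sketch (not Rothman's published code): it crosscorrelates one trace against a pilot (stack) trace, converts the crosscorrelation over candidate lags into a Gibbs-style probability distribution, and draws the new static shift at random instead of picking the peak. The function name, the temperature parameter, and the circular shift via np.roll are simplifying assumptions.

```python
import numpy as np

def update_static(trace, pilot, max_lag, temperature, rng):
    """One stochastic update of a single trace's residual static.

    Crosscorrelates the trace with a pilot (stack) trace, maps the
    crosscorrelation over candidate lags to a Gibbs distribution, and
    samples the new static from it rather than picking the peak.
    """
    lags = np.arange(-max_lag, max_lag + 1)
    # Crosscorrelation value for each candidate lag (circular shift
    # used here purely for brevity).
    xcorr = np.array([np.dot(trace, np.roll(pilot, lag)) for lag in lags])
    # Gibbs (softmax) weights: exp(xcorr / T), normalized to sum to 1.
    w = np.exp((xcorr - xcorr.max()) / temperature)
    p = w / w.sum()
    return rng.choice(lags, p=p)
```

Iterating such updates over all traces, re-stacking after each sweep, yields a Markov chain whose equilibrium distribution is the Gibbs distribution, which is the property the paper derives.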

Entropy ◽  
2021 ◽  
Vol 23 (6) ◽  
pp. 662
Author(s):  
Mateu Sbert ◽  
Jordi Poch ◽  
Shuning Chen ◽  
Víctor Elvira

In this paper, we present order invariance theoretical results for weighted quasi-arithmetic means of a monotonic series of numbers. The quasi-arithmetic mean, or Kolmogorov–Nagumo mean, generalizes the classical mean and appears in many disciplines, from information theory to physics, from economics to traffic flow. Stochastic orders are defined on weights (or equivalently, discrete probability distributions). They were introduced to study risk in economics and decision theory, and have recently found utility in Monte Carlo techniques and in image processing. We show in this paper that, if two distributions of weights are ordered under first stochastic order, then for any monotonic series of numbers their weighted quasi-arithmetic means share the same order. This means, for instance, that the arithmetic and harmonic means for two different distributions of weights always have to be aligned if the weights are stochastically ordered; that is, either both means increase or both decrease. We explore the invariance properties when convex (concave) functions define both the quasi-arithmetic mean and the series of numbers, show their relationship with the increasing concave and increasing convex orders, and observe the important role played by a newly defined mirror property of stochastic orders. We also give some applications to entropy and cross-entropy and present an example of the multiple importance sampling Monte Carlo technique that illustrates the usefulness and transversality of our approach. Invariance theorems are useful when a system is represented by a set of quasi-arithmetic means and we want to change the distribution of weights so that all means evolve in the same direction.
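
To make the first-order result concrete, the following Python sketch (illustrative only; the series and weight vectors are invented) checks first stochastic order between two weight distributions and confirms that the weighted arithmetic and harmonic means of a monotonic series move in the same direction.

```python
import numpy as np

def quasi_arithmetic_mean(x, w, f, f_inv):
    """Weighted quasi-arithmetic (Kolmogorov-Nagumo) mean: f^{-1}(sum_i w_i f(x_i))."""
    return f_inv(np.dot(w, f(x)))

def first_stochastic_order(p, q):
    """True if p dominates q under first stochastic order (p shifts weight
    toward higher indices): cumulative sums of p never exceed those of q."""
    return np.all(np.cumsum(p) <= np.cumsum(q))

x = np.array([1.0, 2.0, 4.0, 8.0])   # monotonic (increasing) series
q = np.array([0.4, 0.3, 0.2, 0.1])   # weights
p = np.array([0.1, 0.2, 0.3, 0.4])   # stochastically larger weights

assert first_stochastic_order(p, q)
for name, f, f_inv in [("arithmetic", lambda t: t, lambda t: t),
                       ("harmonic", lambda t: 1.0 / t, lambda t: 1.0 / t)]:
    m_p = quasi_arithmetic_mean(x, p, f, f_inv)
    m_q = quasi_arithmetic_mean(x, q, f, f_inv)
    print(name, m_p > m_q)   # both print True: the two means are aligned
```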


2003 ◽  
Vol 66 (10) ◽  
pp. 1900-1910 ◽  
Author(s):  
VALERIE J. DAVIDSON ◽  
JOANNE RYKS

The objective of food safety risk assessment is to quantify levels of risk for consumers as well as to design improved processing, distribution, and preparation systems that reduce exposure to acceptable limits. Monte Carlo simulation tools have been used to deal with the inherent variability in food systems, but these tools require substantial data to estimate probability distributions. The objective of this study was to evaluate the use of fuzzy values to represent uncertainty. Fuzzy mathematics and Monte Carlo simulations were compared to analyze the propagation of uncertainty through a number of sequential calculations in two different applications: estimation of biological impacts and economic cost in a general framework, and survival of Campylobacter jejuni in a sequence of five poultry processing operations. Estimates of the proportion of a population requiring hospitalization were comparable, but using fuzzy values and interval arithmetic resulted in more conservative estimates of mortality and cost, in terms of the intervals of possible values and mean values, compared to Monte Carlo calculations. In the second application, the two approaches predicted the same reduction in mean concentration (−4 log CFU/ml of rinse), but the limits of the final concentration distribution were wider for the fuzzy estimate (−3.3 to 5.6 log CFU/ml of rinse) than for the probability estimate (−2.2 to 4.3 log CFU/ml of rinse). Interval arithmetic with fuzzy values considered all possible combinations of values in the calculations and assigned the maximum membership grade to each possible result. Consequently, fuzzy results fully included the distributions estimated by Monte Carlo simulations but extended to broader limits. When limited data define the probability distributions for all inputs, fuzzy mathematics is a more conservative approach for risk assessment than Monte Carlo simulation.
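
A minimal sketch of the contrast between the two propagation approaches, using invented triangular inputs (the numbers and variable names are hypothetical, not the paper's data): interval arithmetic on the support of the fuzzy inputs versus Monte Carlo sampling from triangular distributions through the same calculation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Triangular fuzzy number as (low, mode, high); the alpha-cut at level a
# is the interval [low + a*(mode - low), high - a*(high - mode)].
def alpha_cut(tfn, a):
    lo, m, hi = tfn
    return lo + a * (m - lo), hi - a * (hi - m)

def interval_mul(i1, i2):
    """Interval product: evaluate all endpoint combinations, keep extremes."""
    prods = [a * b for a in i1 for b in i2]
    return min(prods), max(prods)

exposure = (10.0, 100.0, 500.0)    # hypothetical dose ingested
p_illness = (1e-4, 1e-3, 5e-3)     # hypothetical dose-response factor

# Fuzzy propagation at alpha = 0 (the support): widest possible interval.
fuzzy_support = interval_mul(alpha_cut(exposure, 0.0), alpha_cut(p_illness, 0.0))

# Monte Carlo propagation with triangular input distributions.
mc = rng.triangular(*exposure, 100_000) * rng.triangular(*p_illness, 100_000)
print(fuzzy_support, (mc.min(), mc.max()))
# The fuzzy support fully contains the simulated range but is wider,
# mirroring the more conservative fuzzy limits reported above.
```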


2014 ◽  
Author(s):  
Andreas Tuerk ◽  
Gregor Wiktorin ◽  
Serhat Güler

Quantification of RNA transcripts with RNA-Seq is inaccurate due to positional fragment bias, which is not represented appropriately by current statistical models of RNA-Seq data. This article introduces the Mix2 (read "mix square") model, which uses a mixture of probability distributions to model the transcript-specific positional fragment bias. The parameters of the Mix2 model can be trained efficiently with the Expectation Maximization (EM) algorithm, resulting in simultaneous estimates of the transcript abundances and transcript-specific positional biases. Experiments are conducted on synthetic data and on the Universal Human Reference (UHR) and Human Brain Reference (HBR) samples from the MicroArray Quality Control (MAQC) data set. Comparing the correlation between qPCR and FPKM values with the state-of-the-art methods Cufflinks and PennSeq, we obtain an increase in R2 value from 0.44 to 0.6 and from 0.34 to 0.54. In the detection of differential expression between UHR and HBR, the true positive rate increases from 0.44 to 0.71 at a false positive rate of 0.1. Finally, the Mix2 model is used to investigate biases present in the MAQC data. This reveals 5 dominant biases which deviate from the common assumption of a uniform fragment distribution. The Mix2 software is available at http://www.lexogen.com/fileadmin/uploads/bioinfo/mix2model.tgz.
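
The abstract does not give the Mix2 likelihood, so the sketch below is a generic stand-in rather than the published model: a toy EM fit of a Gaussian mixture over relative fragment positions in [0, 1], showing the E-step/M-step structure with which such a positional-bias mixture would be trained. The component count, initialization, and Gaussian component choice are all assumptions.

```python
import numpy as np

def em_positional_mixture(pos, K=5, iters=100):
    """Toy EM for a K-component Gaussian mixture over relative fragment
    positions in [0, 1] -- a simplified stand-in for a positional-bias
    mixture, not the published Mix2 model itself."""
    n = len(pos)
    pi = np.full(K, 1.0 / K)          # mixture weights
    mu = np.linspace(0.1, 0.9, K)     # component centers
    sigma = np.full(K, 0.1)           # component widths
    for _ in range(iters):
        # E-step: responsibilities r[i, k] proportional to
        # pi_k * N(pos_i | mu_k, sigma_k).
        d = (pos[:, None] - mu) / sigma
        r = pi * np.exp(-0.5 * d**2) / sigma
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and widths from the
        # responsibility-weighted data.
        nk = r.sum(axis=0)
        pi = nk / n
        mu = (r * pos[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((r * (pos[:, None] - mu)**2).sum(axis=0) / nk) + 1e-6
    return pi, mu, sigma
```

Running this on observed relative fragment positions would expose non-uniform bias as components with unequal weights, analogous to the dominant biases the paper reports in the MAQC data.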


2020 ◽  
Vol 41 (2) ◽  
pp. 219-229 ◽  
Author(s):  
Ricardo Hideaki Miyajima ◽  
Paulo Torres Fenner ◽  
Gislaine Cristina Batistela ◽  
Danilo Simões

The processing of Eucalyptus logs is a stage that follows the full-tree system in mechanized forest harvesting, commonly performed by a grapple saw. This activity presents some associated uncertainties, especially regarding technical and silvicultural factors that can affect productivity and production costs. To address this problem, Monte Carlo simulation can be applied: a technique that measures the probabilities of values of factors under conditions of uncertainty, to which probability distributions are attributed. The objective of this study was to apply the Monte Carlo method to determine the probabilistic technical-economic coefficients of log processing using two different grapple saw models. Field data were obtained from a planted Eucalyptus forest located in the State of São Paulo, Brazil. For the technical analysis, the time study protocol was applied by the method of continuous reading of the operational cycle elements, from which productivity was derived. The estimated cost of the programmed hour followed the methods recommended by the Food and Agriculture Organization of the United Nations. Uncertainties were incorporated by applying the Monte Carlo simulation method, with which 100,000 random values were generated. The results showed that the empty crane movement is the operational element with the greatest impact on the total time for processing the logs; the variables that most influence productivity are specific to each grapple saw model; and a difference of USD 0.04 per m3 in production costs was observed between processors with gripping areas of 0.58 m2 and 0.85 m2. The Monte Carlo method proved to be an applicable tool for mechanized wood harvesting, as it presents a range of probabilities of occurrence for the operational elements and for the production cost.
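
A sketch of the simulation structure described above, with hypothetical distributions (the study fits its distributions to the observed time-study data; every parameter below is invented): 100,000 random draws propagate cycle-element times and hourly cost into productivity and unit-cost distributions.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # number of random values generated, as in the study

# Hypothetical distributions for two operational-cycle elements (seconds),
# log volume per cycle (m3), and scheduled-hour cost (USD).
crane_empty   = rng.lognormal(mean=2.5, sigma=0.3, size=N)
grip_and_saw  = rng.lognormal(mean=3.0, sigma=0.2, size=N)
volume_m3     = rng.normal(loc=0.45, scale=0.05, size=N)
cost_per_hour = rng.triangular(110.0, 120.0, 135.0, N)

cycle_s = crane_empty + grip_and_saw
productivity = volume_m3 * 3600.0 / cycle_s   # m3 per programmed hour
unit_cost = cost_per_hour / productivity      # USD per m3

for name, v in [("productivity (m3/h)", productivity),
                ("unit cost (USD/m3)", unit_cost)]:
    lo, hi = np.percentile(v, [5, 95])
    print(f"{name}: mean={v.mean():.2f}, 90% interval=({lo:.2f}, {hi:.2f})")
```

The resulting percentile intervals play the role of the "range of probabilities of occurrence" the study reports for the operational elements and the production cost.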


2019 ◽  
Vol 623 ◽  
pp. A156 ◽  
Author(s):  
H. E. Delgado ◽  
L. M. Sarro ◽  
G. Clementini ◽  
T. Muraveva ◽  
A. Garofalo

In a recent study we analysed period–luminosity–metallicity (PLZ) relations for RR Lyrae stars using the Gaia Data Release 2 (DR2) parallaxes. It built on a previous work that was based on the first Gaia Data Release (DR1), and also included period–luminosity (PL) relations for Cepheids and RR Lyrae stars. The method used to infer the relations from Gaia DR2 data, and one of the methods used for Gaia DR1 data, was based on a Bayesian model, the full description of which was deferred to a subsequent publication. This paper presents the Bayesian method for the inference of the parameters of PL(Z) relations used in those studies, the main feature of which is to manage the uncertainties on observables in a rigorous and well-founded way. The method encodes the probability relationships between the variables of the problem in a hierarchical Bayesian model and infers the posterior probability distributions of the PL(Z) relationship coefficients using Markov chain Monte Carlo simulation techniques. We evaluate the method with several semi-synthetic data sets and apply it to a sample of 200 fundamental and first-overtone RR Lyrae stars for which Gaia DR1 parallaxes and literature Ks-band mean magnitudes are available. We define and test several hyperprior probabilities to verify their adequacy and check the sensitivity of the solution with respect to the prior choice. The main conclusion of this work, based on the test with semi-synthetic Gaia DR1 parallaxes, is the absolute necessity of incorporating the existing correlations between the period, metallicity, and parallax measurements in the form of model priors in order to avoid systematically biased results, especially in the case of non-negligible uncertainties in the parallaxes. The relation coefficients obtained here have been superseded by those presented in our recent paper that incorporates the findings of this work and the more recent Gaia DR2 measurements.
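
As a rough, non-hierarchical sketch of the inference (the published model is hierarchical and encodes the measurement correlations in its priors, which this toy omits), the Python below scores PLZ coefficients against observed parallaxes directly in parallax space, so noisy or negative parallaxes never need to be inverted, and explores the posterior with a simple Metropolis sampler. The priors, step size, and variable names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def predicted_parallax(theta, logP, feh, m_ks):
    """Parallax (mas) implied by PLZ coefficients theta = (a, b, c):
    M = a + b*logP + c*[Fe/H]; mu = m - M; varpi = 10**((10 - mu) / 5)."""
    a, b, c = theta
    M = a + b * logP + c * feh
    return 10.0 ** ((10.0 - (m_ks - M)) / 5.0)

def log_posterior(theta, logP, feh, m_ks, plx, plx_err):
    # Weak zero-centered Gaussian priors on the coefficients (a choice
    # made for this sketch, not the paper's hyperpriors).
    lp = -0.5 * np.sum((np.asarray(theta) / 10.0) ** 2)
    resid = (plx - predicted_parallax(theta, logP, feh, m_ks)) / plx_err
    return lp - 0.5 * np.sum(resid ** 2)

def metropolis(logp, theta0, steps=20_000, scale=0.01):
    """Random-walk Metropolis: accept a proposal with probability
    min(1, posterior ratio); the chain samples the posterior."""
    theta = np.array(theta0, float)
    lp = logp(theta)
    chain = []
    for _ in range(steps):
        prop = theta + scale * rng.standard_normal(theta.size)
        lp_prop = logp(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain.append(theta.copy())
    return np.array(chain)
```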


Minerals ◽  
2018 ◽  
Vol 8 (12) ◽  
pp. 579 ◽  
Author(s):  
Ryosuke Oyanagi ◽  
Atsushi Okamoto ◽  
Noriyoshi Tsuchiya

Water–rock interaction in surface and subsurface environments occurs in complex multicomponent systems and involves several reactions, including element transfer. Such kinetic information is obtained by fitting a forward model to the temporal evolution of solution chemistry or to the spatial pattern recorded in rock samples, although geochemical and petrological data are essentially sparse and noisy. The optimization of kinetic parameters therefore sometimes fails to converge toward the global minimum because it becomes trapped in a local minimum. In this study, we present a novel framework to simultaneously estimate multiple reaction-rate constants and the diffusivity of aqueous species from the mineral distribution pattern in a rock, using a reactive transport model coupled with the exchange Monte Carlo method. Our approach can estimate both the maximum likelihood value and the error of each parameter. We applied the method to synthetic data produced using a model of silica metasomatism and hydration in the olivine–quartz–H2O system. We tested the robustness and accuracy of our method over a wide range of noise intensities. This methodology can be widely applied to kinetic analyses of various kinds of water–rock interactions.
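
A compact sketch of the exchange Monte Carlo (replica exchange, also called parallel tempering) idea used here, with the reactive-transport forward model abstracted into a generic misfit function: several Metropolis chains run at different temperatures and periodically swap states, so the cold chain can escape the local minima that trap plain optimization. The temperature ladder, step size, and swap schedule are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

def replica_exchange(misfit, theta0, temps, sweeps=5000, step=0.05):
    """Exchange Monte Carlo over kinetic parameters theta.

    `misfit` would measure the mismatch between the observed mineral
    distribution pattern and the reactive-transport forward model;
    `temps` is an ascending temperature ladder (temps[0] is coldest).
    """
    n_rep = len(temps)
    thetas = [np.array(theta0, float) for _ in range(n_rep)]
    E = [misfit(t) for t in thetas]
    for _ in range(sweeps):
        # One Metropolis update per replica at its own temperature.
        for i, T in enumerate(temps):
            prop = thetas[i] + step * rng.standard_normal(thetas[i].size)
            dE = misfit(prop) - E[i]
            if dE < 0 or rng.uniform() < np.exp(-dE / T):
                thetas[i], E[i] = prop, E[i] + dE
        # Attempt state swaps between neighbouring temperatures.
        for i in range(n_rep - 1):
            delta = (1.0 / temps[i] - 1.0 / temps[i + 1]) * (E[i] - E[i + 1])
            if np.log(rng.uniform()) < delta:
                thetas[i], thetas[i + 1] = thetas[i + 1], thetas[i]
                E[i], E[i + 1] = E[i + 1], E[i]
    return thetas[0], E[0]   # coldest replica approximates the global minimum
```

The spread of cold-chain samples around the best parameters is what allows the method to report both a maximum likelihood estimate and an error for each parameter.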

