Joint stochastic constraint of a large data set from a salt dome

Geophysics ◽  
2016 ◽  
Vol 81 (2) ◽  
pp. ID1-ID24 ◽  
Author(s):  
Alan W. Roberts ◽  
Richard W. Hobbs ◽  
Michael Goldstein ◽  
Max Moorkamp ◽  
Marion Jegen ◽  
...  

Understanding the uncertainty associated with large joint geophysical surveys, such as 3D seismic, gravity, and magnetotelluric (MT) studies, is a challenge, conceptually and practically. By demonstrating the use of emulators, we have adopted a Monte Carlo forward screening scheme to globally test a prior model space for plausibility. This methodology makes the incorporation of all types of uncertainty conceptually straightforward: one designs an appropriate prior model space, on which the results depend, from which candidate models are drawn. We have tested the approach on a salt dome target, over which three data sets had been obtained: wide-angle seismic refraction, MT, and gravity data. We have considered the data sets together using an empirically measured uncertain physical relationship connecting the three different model parameters: seismic velocity, density, and resistivity, and we have indicated the value of a joint approach, rather than considering individual parameter models. The results were probability density functions over the model parameters, together with a halite probability map. The emulators give a considerable speed advantage over running the full simulator codes, and we consider their use to have great potential in the development of geophysical statistical constraint methods.
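The forward-screening idea above can be sketched in a few lines: draw candidate models from the prior, evaluate a cheap emulator in place of the full simulator, and keep only models the data cannot rule out. Everything here is a hypothetical 1-D stand-in (the quadratic "emulator", the threshold of 3, and the numbers are illustrative assumptions, not the paper's setup):

```python
import random, math

# Hypothetical 1-D stand-in: the emulator is a cheap surrogate for the
# expensive simulator, mapping one model parameter m (say, a seismic
# velocity) to a predicted datum, and reporting its own uncertainty.
def emulator(m):
    return m ** 2 + 0.5 * m + 0.01, 0.05   # (mean, emulator std)

def implausibility(m, d_obs, sigma_d):
    # Standardized mismatch between observation and emulator prediction,
    # combining data uncertainty and emulator uncertainty.
    mu, sigma_e = emulator(m)
    return abs(d_obs - mu) / math.sqrt(sigma_d ** 2 + sigma_e ** 2)

random.seed(0)
d_obs, sigma_d = 2.0, 0.1                  # observed datum and its std

# Monte Carlo draws from a uniform prior model space:
prior_draws = [random.uniform(0.0, 3.0) for _ in range(10000)]

# Keep only candidate models the emulator cannot rule out:
plausible = [m for m in prior_draws if implausibility(m, d_obs, sigma_d) < 3.0]
```

The surviving `plausible` set, histogrammed, plays the role of the probability density functions over model parameters mentioned in the abstract.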

2020 ◽  
Vol 70 (1) ◽  
pp. 145-161 ◽  
Author(s):  
Marnus Stoltz ◽  
Boris Baeumer ◽  
Remco Bouckaert ◽  
Colin Fox ◽  
Gordon Hiscott ◽  
...  

Abstract We describe a new and computationally efficient Bayesian methodology for inferring species trees and demographics from unlinked binary markers. Likelihood calculations are carried out using diffusion models of allele frequency dynamics combined with novel numerical algorithms. The diffusion approach allows for analysis of data sets containing hundreds or thousands of individuals. The method, which we call Snapper, has been implemented as part of the BEAST2 package. We conducted simulation experiments to assess numerical error, computational requirements, and accuracy recovering known model parameters. A reanalysis of soybean SNP data demonstrates that the models implemented in Snapp and Snapper can be difficult to distinguish in practice, a characteristic which we tested with further simulations. We demonstrate the scale of analysis possible using a SNP data set sampled from 399 freshwater turtles in 41 populations. [Bayesian inference; diffusion models; multi-species coalescent; SNP data; species trees; spectral methods.]
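The diffusion approach approximates discrete Wright-Fisher allele-frequency dynamics by a continuous process. The toy check below compares direct binomial simulation against the classic drift-variance prediction Var[x_t] = x0(1-x0)(1-(1-1/(2N))^t); it is a sanity check of the underlying approximation, not Snapper's spectral solver, and the parameter values are invented:

```python
import random

rng = random.Random(5)
N, x0, t, reps = 100, 0.3, 20, 1000    # diploid size N, initial freq x0, generations t

def binom(n, p):
    # Binomial draw via Bernoulli sum (stdlib only).
    return sum(rng.random() < p for _ in range(n))

# Simulate Wright-Fisher drift: resample 2N gene copies each generation.
finals = []
for _ in range(reps):
    x = x0
    for _ in range(t):
        x = binom(2 * N, x) / (2 * N)
    finals.append(x)

mean_f = sum(finals) / reps
var_f = sum((f - mean_f) ** 2 for f in finals) / reps

# Diffusion-style prediction for the variance after t generations:
var_pred = x0 * (1 - x0) * (1 - (1 - 1 / (2 * N)) ** t)
```

The simulated mean stays at x0 (drift is unbiased) while the variance grows as predicted, which is the behaviour a diffusion likelihood exploits.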


Geophysics ◽  
2005 ◽  
Vol 70 (1) ◽  
pp. J1-J12 ◽  
Author(s):  
Lopamudra Roy ◽  
Mrinal K. Sen ◽  
Donald D. Blankenship ◽  
Paul L. Stoffa ◽  
Thomas G. Richter

Interpretation of gravity data warrants uncertainty estimation because of its inherent nonuniqueness. Although the uncertainties in model parameters cannot be completely reduced, they can aid in the meaningful interpretation of results. Here we have employed a simulated annealing (SA)–based technique in the inversion of gravity data to derive multilayered earth models consisting of two- and three-dimensional bodies. In our approach, we assume that the density contrast is known, and we solve for the coordinates or shapes of the causative bodies, resulting in a nonlinear inverse problem. We attempt to sample the model space extensively so as to estimate several equally likely models. We then use all the models sampled by SA to construct an approximate, marginal posterior probability density function (PPD) in model space and several orders of moments. The correlation matrix clearly shows the interdependence of different model parameters and the corresponding trade-offs. Such correlation plots are used to study the effect of a priori information in reducing the uncertainty in the solutions. We also investigate the use of derivative information to obtain better depth resolution and to reduce underlying uncertainties. We applied the technique to two synthetic data sets and an airborne-gravity data set collected over Lake Vostok, East Antarctica, for which a priori constraints were derived from available seismic and radar profiles. The inversion results produced depths of the lake in the survey area along with the thickness of sediments. The resulting uncertainties are interpreted in terms of the experimental geometry and data error.
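The core loop, SA sampling followed by building an approximate marginal PPD from all sampled models, can be sketched for a deliberately simplified problem: one depth parameter, a point-mass gravity response, and a known density contrast folded into a lumped constant. Forward model, cooling schedule, and all numbers are illustrative assumptions:

```python
import random, math

def forward(z):
    # Toy gravity response of a buried point mass at depth z; the density
    # contrast (and thus the lumped constant) is assumed known, as in the paper.
    return 1.0 / z ** 2

def misfit(z, g_obs):
    return (forward(z) - g_obs) ** 2

random.seed(1)
g_obs = forward(2.0) + 0.001           # noisy observation; true depth is 2.0

z, samples = 1.0, []
for it in range(20000):
    T = max(0.001, 0.9995 ** it)       # geometric cooling schedule
    z_new = min(5.0, max(0.5, z + random.gauss(0, 0.1)))  # bounded perturbation
    dE = misfit(z_new, g_obs) - misfit(z, g_obs)
    if dE < 0 or random.random() < math.exp(-dE / T):     # Metropolis acceptance
        z = z_new
    samples.append(z)

# Use the models sampled after burn-in to approximate the marginal PPD:
burned = samples[10000:]
mean_z = sum(burned) / len(burned)
z_map = min(burned, key=lambda zz: misfit(zz, g_obs))
```

A histogram of `burned` is the 1-D analogue of the marginal PPD; its moments (mean, variance) quantify the depth uncertainty.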


2020 ◽  
Vol 224 (1) ◽  
pp. 40-68 ◽  
Author(s):  
Thibaut Astic ◽  
Lindsey J Heagy ◽  
Douglas W Oldenburg

SUMMARY In a previous paper, we introduced a framework for carrying out petrophysically and geologically guided geophysical inversions. In that framework, petrophysical and geological information is modelled with a Gaussian mixture model (GMM). In the inversion, the GMM serves as a prior for the geophysical model. The formulation and applications were confined to problems in which a single physical property model was sought, and a single geophysical data set was available. In this paper, we extend that framework to jointly invert multiple geophysical data sets that depend on multiple physical properties. The petrophysical and geological information is used to couple geophysical surveys that, otherwise, rely on independent physics. This requires advancements in two areas. First, an extension from a univariate to a multivariate analysis of the petrophysical data, and their inclusion within the inverse problem, is necessary. Secondly, we address the practical issues of simultaneously inverting data from multiple surveys and finding a solution that acceptably reproduces each one, along with the petrophysical and geological information. To illustrate the efficacy of our approach and the advantages of carrying out multi-physics inversions coupled with petrophysical and geological information, we invert synthetic gravity and magnetic data associated with a kimberlite deposit. The kimberlite pipe contains two distinct facies embedded in a host rock. Inverting the data sets individually, even with petrophysical information, leads to a binary geological model: background or undetermined kimberlite. A multi-physics inversion, with petrophysical information, differentiates between the two main kimberlite facies of the pipe. Through this example, we also highlight the capabilities of our framework to work with interpretive geological assumptions when minimal quantitative information is available. In those cases, the dynamic updates of the GMM allow us to perform multi-physics inversions by learning a petrophysical model.
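The coupling mechanism can be illustrated with a minimal multivariate GMM: each geological unit is a Gaussian component over two physical properties, and a cell is attributed to the unit with the highest responsibility. The units, means, and (diagonal) covariances below are invented for illustration, not the paper's petrophysical values:

```python
import math

# Hypothetical petrophysical GMM over (density contrast [g/cc],
# magnetic susceptibility [SI]): a host rock and two kimberlite facies.
units = {
    "host":    {"w": 0.8, "mu": (0.0,  0.000), "sd": (0.05, 0.0005)},
    "facies1": {"w": 0.1, "mu": (-0.2, 0.006), "sd": (0.05, 0.0010)},
    "facies2": {"w": 0.1, "mu": (-0.1, 0.001), "sd": (0.05, 0.0005)},
}

def gauss(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def responsibilities(rho, chi):
    # Posterior membership probabilities of a cell, treating the two
    # properties jointly (diagonal covariance for simplicity).
    scores = {k: u["w"] * gauss(rho, u["mu"][0], u["sd"][0])
                       * gauss(chi, u["mu"][1], u["sd"][1])
              for k, u in units.items()}
    total = sum(scores.values())
    return {k: s / total for k, s in scores.items()}

# A cell with both a negative density contrast AND high susceptibility is
# only explained by facies1 when the two properties are considered jointly;
# either property alone would be ambiguous between the two facies.
r = responsibilities(-0.18, 0.0055)
best = max(r, key=r.get)
```

This is the sense in which the GMM couples surveys with otherwise independent physics: the joint density over both properties discriminates units that each single-property marginal cannot.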


2017 ◽  
Vol 5 (4) ◽  
pp. 1
Author(s):  
I. E. Okorie ◽  
A. C. Akpanta ◽  
J. Ohakwe ◽  
D. C. Chikezie ◽  
C. U. Onyemachi ◽  
...  

This paper introduces a new generator of probability distributions, the adjusted log-logistic generalized (ALLoG) distribution, and a new extension of the standard one-parameter exponential distribution called the adjusted log-logistic generalized exponential (ALLoGExp) distribution. The ALLoGExp distribution is a special case of the ALLoG distribution, and we have provided some of its statistical and reliability properties. Notably, the failure rate can be monotonically decreasing, increasing, or upside-down bathtub shaped depending on the values of the parameters $\delta$ and $\theta$. The method of maximum likelihood estimation was proposed to estimate the model parameters. The importance and flexibility of the ALLoGExp distribution was demonstrated with a real and uncensored lifetime data set, and its fit was compared with five other exponential-related distributions. The results obtained from the model fittings show that the ALLoGExp distribution provides a reasonably better fit than the other fitted distributions. The ALLoGExp distribution is therefore recommended for effective modelling of lifetime data sets.
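The abstract does not reproduce the ALLoGExp density itself, so as a hedged illustration of the shape flexibility it claims, the snippet below shows the same qualitative behaviour for the plain log-logistic hazard, a building block of the log-logistic generator: monotonically decreasing for shape beta <= 1, upside-down bathtub (rise then fall) for beta > 1:

```python
import math

def loglogistic_hazard(t, alpha, beta):
    # Hazard of the log-logistic distribution with scale alpha, shape beta:
    # h(t) = (beta/alpha) * (t/alpha)**(beta-1) / (1 + (t/alpha)**beta)
    u = (t / alpha) ** beta
    return (beta / alpha) * (t / alpha) ** (beta - 1) / (1 + u)

ts = [0.1 * i for i in range(1, 60)]
h_dec = [loglogistic_hazard(t, 1.0, 0.8) for t in ts]   # beta < 1: decreasing
h_ubt = [loglogistic_hazard(t, 1.0, 2.0) for t in ts]   # beta > 1: rises, then falls

peak = h_ubt.index(max(h_ubt))   # interior maximum => upside-down bathtub
```

The ALLoGExp parameters $\delta$ and $\theta$ play an analogous shape-controlling role in the paper's distribution.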


2021 ◽  
Vol 37 (3) ◽  
pp. 481-490
Author(s):  
Chenyong Song ◽  
Dongwei Wang ◽  
Haoran Bai ◽  
Weihao Sun

Highlights:
The proposed data enhancement method can be used for small-scale data sets with rich sample image features.
The accuracy of the new model reaches 98.5%, which is better than the traditional CNN method.

Abstract: GoogLeNet offers far better performance in identifying apple disease compared to traditional methods. However, the complexity of GoogLeNet is relatively high, and for small volumes of data it does not achieve the same performance as it does with large-scale data. We propose a new apple disease identification model using GoogLeNet’s inception module. The model adopts a variety of methods to optimize its generalization ability. First, the data set is amplified using geometric-transformation and image-modification enhancement methods (including rotation, scaling, noise interference, random elimination, and color space enhancement), applied with random probabilities and in appropriate combinations. Second, we employ a deep convolutional generative adversarial network (DCGAN) to enhance the richness of generated images by increasing the diversity of the generator’s noise distribution. Finally, we optimize the GoogLeNet model structure to reduce model complexity and the number of model parameters, making it more suitable for identifying apple tree diseases. The experimental results show that our approach quickly detects and classifies apple diseases, including rust, spotted leaf disease, and anthrax. It outperforms the original GoogLeNet in recognition accuracy and model size, with identification accuracy reaching 98.5%, making it a feasible method for apple disease classification.

Keywords: Apple disease identification, Data enhancement, DCGAN, GoogLeNet.
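The probabilistic-combination augmentation stage can be sketched without any deep-learning dependencies: each transform fires with its own probability, so every pass over an image yields a different combination. The tiny nested-list "image", the transform set, and the probabilities below are hypothetical stand-ins for the paper's pipeline:

```python
import random

def rot90(img):
    # Rotate a rectangular nested-list image 90 degrees clockwise.
    return [list(row) for row in zip(*img[::-1])]

def hflip(img):
    return [row[::-1] for row in img]

def add_noise(img, rng, amp=10):
    # Additive integer noise, clamped to valid 8-bit pixel range.
    return [[min(255, max(0, p + rng.randint(-amp, amp))) for p in row]
            for row in img]

def random_erase(img, rng):
    # Zero out one random pixel (a minimal "random elimination").
    out = [row[:] for row in img]
    r, c = rng.randrange(len(out)), rng.randrange(len(out[0]))
    out[r][c] = 0
    return out

def augment(img, rng):
    # Each transform is applied with its own probability, so repeated
    # calls produce varied combinations - the amplification effect.
    for p, op in [(0.5, rot90), (0.5, hflip),
                  (0.5, lambda im: add_noise(im, rng)),
                  (0.3, lambda im: random_erase(im, rng))]:
        if rng.random() < p:
            img = op(img)
    return img

rng = random.Random(42)
base = [[10 * (r + c) for c in range(4)] for r in range(4)]   # 4x4 gradient "image"
augmented = [augment(base, rng) for _ in range(8)]            # 8 variants from 1 image
```

A real pipeline would operate on image tensors and add the DCGAN-generated samples on top; the control flow is the same.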


Author(s):  
Rajendra Prasad ◽  
Lalit Kumar Gupta ◽  
A. Beesham ◽  
G. K. Goswami ◽  
Anil Kumar Yadav

In this paper, we investigate a Bianchi type I exact Universe by taking into account the cosmological constant as the source of energy at the present epoch. We have performed a [Formula: see text] test to obtain the best fit values of the model parameters of the Universe in the derived model. We have used two types of data sets, viz., (i) 31 values of the Hubble parameter and (ii) the 1048 Pantheon data set of various supernovae distance moduli and apparent magnitudes. From both the data sets, we have estimated the current values of the Hubble constant, density parameters [Formula: see text] and [Formula: see text]. The dynamics of the deceleration parameter shows that the Universe was in a decelerating phase for redshift [Formula: see text]. At a transition redshift [Formula: see text], the present Universe entered an accelerating phase of expansion. The current age of the Universe is obtained as [Formula: see text] Gyr. This is in good agreement with the value of [Formula: see text] calculated from the Planck collaboration results and WMAP observations.
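A chi-square fit of the Hubble constant to H(z) measurements, the kind of test the abstract describes for its first data set, can be sketched with a grid scan. The flat-LCDM-like expansion law, the fixed matter density, and the three mock data points are invented for illustration (they are not the 31-point compilation used in the paper):

```python
import math

# Mock (z, H, sigma) triples, invented for illustration only:
data = [(0.1, 69.0, 2.0), (0.5, 88.0, 3.0), (1.0, 120.0, 5.0)]
Om = 0.3                                   # matter density, held fixed here

def H_model(z, H0):
    # Flat LCDM-like expansion history: H(z) = H0*sqrt(Om*(1+z)^3 + 1-Om)
    return H0 * math.sqrt(Om * (1 + z) ** 3 + 1 - Om)

def chi2(H0):
    # Standard chi-square: sum of squared, sigma-weighted residuals.
    return sum(((H - H_model(z, H0)) / s) ** 2 for z, H, s in data)

grid = [60 + 0.1 * i for i in range(200)]  # scan H0 from 60 to 79.9
H0_best = min(grid, key=chi2)
```

In practice one would minimize over all model parameters jointly and map out confidence regions from the chi-square surface rather than scanning a single parameter.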


2021 ◽  
Author(s):  
Gah-Yi Ban ◽  
N. Bora Keskin

We consider a seller who can dynamically adjust the price of a product at the individual customer level, by utilizing information about customers’ characteristics encoded as a d-dimensional feature vector. We assume a personalized demand model, parameters of which depend on s out of the d features. The seller initially does not know the relationship between the customer features and the product demand but learns this through sales observations over a selling horizon of T periods. We prove that the seller’s expected regret, that is, the revenue loss against a clairvoyant who knows the underlying demand relationship, is at least of order [Formula: see text] under any admissible policy. We then design a near-optimal pricing policy for a semiclairvoyant seller (who knows which s of the d features are in the demand model) who achieves an expected regret of order [Formula: see text]. We extend this policy to a more realistic setting, where the seller does not know the true demand predictors, and show that this policy has an expected regret of order [Formula: see text], which is also near-optimal. Finally, we test our theory on simulated data and on a data set from an online auto loan company in the United States. On both data sets, our experimentation-based pricing policy is superior to intuitive and/or widely-practiced customized pricing methods, such as myopic pricing and segment-then-optimize policies. Furthermore, our policy improves upon the loan company’s historical pricing decisions by 47% in expected revenue over a six-month period. This paper was accepted by Noah Gans, stochastic models and simulation.
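The experimentation-then-exploitation logic can be sketched in a stripped-down form: randomized prices in an exploration phase, a least-squares demand estimate, then the revenue-maximizing price. The linear demand model, the phase lengths, and all numbers are illustrative assumptions; the feature-dependent parameters and sparsity structure from the paper are omitted:

```python
import random

# Hypothetical linear demand d = a - b*p + noise; the seller must learn (a, b).
a_true, b_true = 10.0, 2.0
rng = random.Random(7)

prices, demands = [], []
for _ in range(500):                       # exploration: randomized prices
    p = rng.uniform(0.5, 4.5)
    d = a_true - b_true * p + rng.gauss(0, 0.5)
    prices.append(p)
    demands.append(d)

# Ordinary least squares for (a, b) via sample moments:
n = len(prices)
mp = sum(prices) / n
md = sum(demands) / n
cov = sum((p - d_mp) * (d - md) for p, d_mp, d in
          [(pp, mp, dd) for pp, dd in zip(prices, demands)]) / n
var = sum((p - mp) ** 2 for p in prices) / n
b_hat = -cov / var
a_hat = md + b_hat * mp

# Exploitation: revenue p*(a - b*p) is maximized at p* = a/(2b).
p_star = a_hat / (2 * b_hat)
```

The regret analysis in the paper concerns how long to explore and how estimation error propagates into revenue loss; this sketch shows only the two phases.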


Geophysics ◽  
2012 ◽  
Vol 77 (4) ◽  
pp. E251-E263 ◽  
Author(s):  
Anna Avdeeva ◽  
Dmitry Avdeev ◽  
Marion Jegen

Detecting a salt dome overhang is known to be problematic by seismic methods alone. We used magnetotellurics (MT) as a complementary method to seismics to investigate the detectability of a salt dome overhang. A comparison of MT responses for 3D synthetic salt models with and without overhang shows that MT is very sensitive to shallow salt structures and suggests that it should be possible to detect an overhang. To further investigate the resolution capability of MT for a salt dome overhang, we performed a 3D MT inversion study and investigated the impact of model parametrization and regularization. We showed that using the logarithms of the conductivities as model parameters is crucial for inverting data from resistive salt structures because, in this case, commonly used Tikhonov-type stabilizers work more equally for smoothing the resistive and conductive structures. The use of a logarithmic parametrization also accelerated the convergence and produced better inversion results. When the Laplace operator was used as a regularization functional, we still observed that the inversion algorithm allows spatial resistivity gradients. These spatial gradients are reduced if a regularization based on first derivatives in contrast to the Laplace operator is introduced. To demonstrate the favorable performance when logarithmic parametrization and gradient-based regularization are employed, we first inverted a data set simulated for a simple model of two adjacent blocks. Subsequently, we applied the code to a more realistic salt dome overhang detectability study. The results from the detectability study are encouraging and suggest that 3D MT inversion can be applied to decide whether the overhang is present in the shallow salt structure even in the case when only profile data are available. However, to resolve the overhang, a dense MT site coverage above the flanks of the salt dome is required.
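The argument for logarithmic parametrization can be made concrete with a two-cell toy model: under a first-difference (gradient) regularizer, a 100x conductive contrast and a 100x resistive contrast incur wildly different penalties in linear conductivity but identical penalties in log conductivity, so resistive salt is no longer over-smoothed. The values below are illustrative, not from the paper's models:

```python
import math

def roughness(model):
    # First-difference (gradient-based) Tikhonov penalty on a 1-D model.
    return sum((model[i + 1] - model[i]) ** 2 for i in range(len(model) - 1))

background = 1.0                        # S/m, hypothetical host conductivity
conductive = [background, 100.0]        # 100x more conductive block
resistive  = [background, 0.01]         # 100x more resistive block (salt-like)

# Linear parametrization: penalties differ by orders of magnitude.
lin_ratio = roughness(conductive) / roughness(resistive)

# Log parametrization: both 100x contrasts are penalized identically.
log_ratio = (roughness([math.log10(m) for m in conductive])
             / roughness([math.log10(m) for m in resistive]))
```

With `lin_ratio` near 10^4 and `log_ratio` equal to 1, the stabilizer "works more equally for smoothing the resistive and conductive structures", which is the behaviour the inversion study relies on.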


2020 ◽  
Vol 36 (1) ◽  
pp. 89-115 ◽  
Author(s):  
Harvey Goldstein ◽  
Natalie Shlomo

Abstract The requirement to anonymise data sets that are to be released for secondary analysis should be balanced by the need to allow their analysis to provide efficient and consistent parameter estimates. The proposal in this article is to integrate the process of anonymisation and data analysis. The first stage uses the addition of random noise with known distributional properties to some or all variables in a released (already pseudonymised) data set, in which the values of some identifying and sensitive variables for data subjects of interest are also available to an external ‘attacker’ who wishes to identify those data subjects in order to interrogate their records in the data set. The second stage of the analysis consists of specifying the model of interest so that parameter estimation accounts for the added noise. Where the characteristics of the noise are made available to the analyst by the data provider, we propose a new method that allows a valid analysis. This is formally a measurement error model and we describe a Bayesian MCMC algorithm that recovers consistent estimates of the true model parameters. A new method for handling categorical data is presented. The article shows how an appropriate noise distribution can be determined.
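The two-stage idea can be sketched with the simplest measurement-error correction: the provider releases a predictor with added noise of known variance, the naive regression slope is attenuated toward zero, and knowing the noise variance lets the analyst correct it. This moment-based correction is a stand-in for the Bayesian MCMC treatment described in the article; all parameter values are illustrative:

```python
import random

rng = random.Random(3)
beta_true, sigma_noise = 2.0, 0.5

# Stage 1 (provider): generate data, then release x with added noise
# of KNOWN standard deviation sigma_noise.
x = [rng.gauss(0, 1) for _ in range(20000)]
y = [beta_true * xi + rng.gauss(0, 0.2) for xi in x]
x_rel = [xi + rng.gauss(0, sigma_noise) for xi in x]    # released, anonymised

# Stage 2 (analyst): regression of y on the noisy x_rel.
n = len(x_rel)
mx = sum(x_rel) / n
my = sum(y) / n
cov_xy = sum((a - mx) * (b - my) for a, b in zip(x_rel, y)) / n
var_x = sum((a - mx) ** 2 for a in x_rel) / n

beta_naive = cov_xy / var_x                       # attenuated toward zero
beta_corr = cov_xy / (var_x - sigma_noise ** 2)   # corrected using known noise variance
```

The correction is only possible because the data provider discloses the noise distribution, which is exactly the integration of anonymisation and analysis the article proposes.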


2021 ◽  
Author(s):  
Shekoofeh Hedayati ◽  
Brad Wyble

Previous research has shown that attentional blink (AB) data can differ between tasks or subjects, and it can be challenging to interpret these differences. In this paper, we provide a ready-to-use tool that allows researchers to map their data onto the episodic Simultaneous Type, Serial Token (eSTST) model. This tool uses a Markov chain Monte Carlo algorithm to find the best set of 3 model parameters to simulate a given AB pattern. These 3 parameters have cognitive interpretations, such that differences in these parameters between different paradigms can be used for inferences about the timing of attentional deployment or the encoding of memory. Additionally, our tool allows for a combination of quantitative fitting against the overall pattern of data points and qualitative fitting for theoretically important features. We demonstrate the algorithm using several data sets, showing that it can find cognitively interpretable parameter sets for some of them, but fails to find a good fit for one data set. This indicates an explanatory boundary of the eSTST model. Finally, we provide a feature to avoid overfitting of individual data points with high uncertainty, such as in the case of individual participant data.
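The MCMC fitting step can be sketched with a toy Metropolis sampler: fit the parameters of a dip-shaped accuracy-vs-lag curve, a crude stand-in for an AB pattern. The curve, its two parameters, and the synthetic data are illustrative assumptions; the real eSTST model and its 3 parameters are not reproduced here:

```python
import random, math

def curve(lag, depth, width):
    # Dip-shaped accuracy curve centred at lag 3 (AB-pattern stand-in).
    return 1.0 - depth * math.exp(-((lag - 3) / width) ** 2)

rng = random.Random(11)
true_depth, true_width = 0.5, 1.5
lags = list(range(1, 9))
data = [curve(l, true_depth, true_width) + rng.gauss(0, 0.02) for l in lags]

def loglike(depth, width):
    # Gaussian log-likelihood (up to a constant) with noise sd 0.02.
    return -sum((d - curve(l, depth, width)) ** 2
                for l, d in zip(lags, data)) / (2 * 0.02 ** 2)

theta = (0.3, 1.0)                       # deliberately wrong starting point
ll = loglike(*theta)
samples = []
for _ in range(20000):
    cand = (theta[0] + rng.gauss(0, 0.02), theta[1] + rng.gauss(0, 0.05))
    if cand[0] > 0 and cand[1] > 0:      # keep parameters in a sensible range
        ll_c = loglike(*cand)
        if ll_c > ll or rng.random() < math.exp(ll_c - ll):  # Metropolis step
            theta, ll = cand, ll_c
    samples.append(theta)

post = samples[5000:]                    # discard burn-in
depth_mean = sum(s[0] for s in post) / len(post)
width_mean = sum(s[1] for s in post) / len(post)
```

The posterior spread of the samples is what allows the tool's anti-overfitting feature: data points with high uncertainty simply constrain the parameters less.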

