Seismic Tomography Using Variational Inference Methods

Author(s):  
Xin Zhang ◽  
Andrew Curtis

In a variety of geoscientific applications we require maps of subsurface properties, together with corresponding maps of uncertainties to assess their reliability. Seismic tomography is a method that is widely used to generate those maps. Since tomography is significantly nonlinear, Monte Carlo sampling methods are often used for this purpose, but they are generally computationally intractable for large data sets and high-dimensional parameter spaces. To extend uncertainty analysis to larger systems, we introduce variational inference methods to conduct seismic tomography. In contrast to Monte Carlo sampling, variational methods solve the Bayesian inference problem as an optimization problem, yet still provide fully nonlinear, probabilistic results. This is achieved by minimizing the Kullback-Leibler (KL) divergence between approximate and target probability distributions within a predefined family of probability distributions.

We introduce two variational inference methods: automatic differentiation variational inference (ADVI) and Stein variational gradient descent (SVGD). In ADVI a Gaussian probability distribution is assumed and optimized to approximate the posterior probability distribution. In SVGD a smooth transform is iteratively applied to an initial probability distribution to obtain an approximation to the posterior probability distribution. At each iteration the transform is determined by seeking the steepest-descent direction that minimizes the KL divergence.

We apply the two variational inference methods to 2D travel time tomography using both synthetic and real data, and compare the results to those obtained from two different Monte Carlo sampling methods: Metropolis-Hastings Markov chain Monte Carlo (MH-McMC) and reversible-jump Markov chain Monte Carlo (rj-McMC). The results show that ADVI provides a biased approximation because of its Gaussian assumption, whereas SVGD produces more accurate approximations to the results of MH-McMC. In comparison, rj-McMC produces smoother mean velocity models and lower standard deviations because the parameterization used in rj-McMC (Voronoi cells) imposes prior restrictions on the pixelated form of models: all pixels within each Voronoi cell have identical velocities. This suggests that the results of rj-McMC need to be interpreted in the light of the specific prior information imposed by the parameterization. Both variational methods estimate the posterior distribution at significantly lower computational cost, provided that gradients of the data with respect to the parameters can be calculated efficiently. We therefore expect that the methods can be applied fruitfully to many other types of geophysical inverse problems.
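
To make the SVGD update concrete, the following toy sketch (our own illustration in Python/NumPy, not the authors' code) iteratively applies the kernelized steepest-descent transform to a set of particles so that they approximate a simple 2D Gaussian target. The target density, RBF kernel, median bandwidth heuristic and step size are all illustrative assumptions; a tomography application would replace the toy gradient with gradients of the travel-time likelihood.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy target "posterior": a 2D Gaussian with known mean and covariance.
mu = np.array([1.0, -1.0])
cov = np.array([[1.0, 0.6], [0.6, 1.0]])
cov_inv = np.linalg.inv(cov)

def grad_log_p(x):
    """Gradient of the log target density at each particle, shape (n, d)."""
    return -(x - mu) @ cov_inv

def svgd_update(x, step=0.1):
    """One SVGD iteration: move particles along the steepest-descent
    direction of the KL divergence, estimated with an RBF kernel."""
    n = x.shape[0]
    diff = x[:, None, :] - x[None, :, :]        # x_i - x_j, shape (n, n, d)
    sq = np.sum(diff**2, axis=-1)               # pairwise squared distances
    h = np.median(sq) / np.log(n + 1)           # median bandwidth heuristic
    k = np.exp(-sq / h)                         # kernel matrix k(x_j, x_i)
    # Attractive term k(x_j, x_i) * grad log p(x_j), plus the repulsive
    # term grad_{x_j} k(x_j, x_i) that keeps the particles spread out.
    repulse = 2.0 / h * np.sum(k[:, :, None] * diff, axis=1)
    phi = (k @ grad_log_p(x) + repulse) / n
    return x + step * phi

particles = rng.normal(size=(200, 2))           # initial distribution
for _ in range(500):
    particles = svgd_update(particles)

print("sample mean:", particles.mean(axis=0))   # should approach mu
print("sample cov:\n", np.cov(particles.T))     # should approach cov
```

Because every particle is updated with simple matrix operations, the iteration parallelizes readily, which is one reason such methods scale to larger problems than random-walk sampling.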

Author(s):  
Andreas Raue ◽  
Clemens Kreutz ◽  
Fabian Joachim Theis ◽  
Jens Timmer

Increasingly complex applications involve large datasets in combination with nonlinear and high-dimensional mathematical models. In this context, statistical inference is a challenging issue that calls for pragmatic approaches which take advantage of both Bayesian and frequentist methods. The elegance of Bayesian methodology is founded in the propagation of the information content provided by experimental data and prior assumptions to the posterior probability distribution of model predictions. However, for complex applications, experimental data and prior assumptions may constrain the posterior probability distribution insufficiently. In these situations, Bayesian Markov chain Monte Carlo sampling can be infeasible. From a frequentist point of view, insufficient experimental data and prior assumptions can be interpreted as non-identifiability. The profile-likelihood approach can detect and resolve non-identifiability iteratively through experimental design, and therefore allows one to constrain the posterior probability distribution progressively until Markov chain Monte Carlo sampling can be used reliably. Using an application from cell biology, we compare both methods and show that a successive application of the two facilitates a realistic assessment of uncertainty in model predictions.
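
The profile-likelihood construction can be illustrated with a toy model: for each fixed value of the parameter of interest, all remaining parameters are re-optimized, and the shape of the resulting profile reveals whether the parameter is identifiable (a flat profile signals non-identifiability). The sketch below is our own example, not the authors' cell-biology application; the model, data and thresholds are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Toy model: y = A * exp(-k t) with Gaussian noise. We profile the decay
# rate k by re-optimizing the nuisance amplitude A at each fixed k.
t = np.linspace(0, 10, 20)
rng = np.random.default_rng(1)
sigma = 0.05
y_obs = 2.0 * np.exp(-0.3 * t) + sigma * rng.normal(size=t.size)

def neg_log_lik(params):
    A, k = params
    resid = (y_obs - A * np.exp(-k * t)) / sigma
    return 0.5 * np.sum(resid**2)

def profile(k_fixed):
    """Minimize over the nuisance parameter A with k held fixed."""
    res = minimize(lambda a: neg_log_lik([a[0], k_fixed]), x0=[1.0])
    return res.fun

k_grid = np.linspace(0.1, 0.6, 26)
pl = np.array([profile(k) for k in k_grid])

# A rise of 0.5 * chi2(1, 0.95) ~= 1.92 above the minimum bounds an
# approximate 95% pointwise confidence interval for k.
ci = k_grid[pl <= pl.min() + 1.92]
print(f"95% profile-likelihood interval for k: [{ci.min():.3f}, {ci.max():.3f}]")
```

If the interval ran to the edge of the grid, k would be practically non-identifiable and new experiments would be designed to constrain it, as the abstract describes.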


2021 ◽  
Author(s):  
Nicola Piana Agostinetti ◽  
Christina Dahnér-Lindkvist ◽  
Savka Dineva

Rock elasticity in the subsurface can change in response to natural phenomena (e.g. massive precipitation, magmatic processes) and human activities (e.g. water injection in geothermal wells, ore-body exploitation). However, understanding and monitoring the evolution of physical properties of the crust is challenging, due to the limited possibility of reaching such depths and making direct measurements of the state of the rocks. Indirect measurements, like seismic tomography, can give some insights, but are generally biased by the uneven distribution (in space and time) of the information collected from seismic observations (travel times and/or waveforms). Here we apply a Bayesian approach to overcome such limitations, so that data uncertainties and data distribution are fully accounted for in the reconstruction of the posterior probability distribution of the rock elasticity. We compute a full 4D local earthquake tomography based on trans-dimensional Markov chain Monte Carlo sampling of 4D elastic models, where the resolution in space and time is fully data-driven. To test our workflow, we make use of a "controlled laboratory": we record seismic data during one month of mining activities across an 800 × 700 × 600 m volume of the Kiruna mine (LKAB, Sweden). During this period we obtain about 260 000 P-wave and 240 000 S-wave travel times from about 36 000 seismic events. We perform a preliminary selection of the well-located events using a Monte Carlo search. Arrival times of about 19 000 best-located events (location errors less than 20 m) are used as input to the tomography workflow. Preliminary results indicate that: (1) short-term (a few hours) evolution of the elastic field is mainly driven by seismic activation, i.e. the occurrence of a seismic swarm close to the mine ore-passes, and such phenomena partially mask the effects of explosions; (2) long-term (2-3 days) evolution of the elastic field closely matches local measurements of the stress field at a co-located stress cell.
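
The Markov chain Monte Carlo machinery underlying such a tomography can be illustrated with a deliberately simplified sketch: a fixed-dimension Metropolis-Hastings sampler inferring two cell slownesses from noisy straight-ray travel times. The geometry, noise level and proposal scale below are assumptions for illustration only; the authors' trans-dimensional sampler additionally proposes adding and removing cells in space and time, with an acceptance rule that accounts for the dimension change.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic data: 100 rays, each with a path length through two cells.
path_lengths = rng.uniform(50, 300, size=(100, 2))   # metres per cell
true_slowness = np.array([1/3500.0, 1/3200.0])       # s/m (vP ~ 3.2-3.5 km/s)
sigma = 2e-3                                         # travel-time noise (s)
t_obs = path_lengths @ true_slowness + sigma * rng.normal(size=100)

def log_post(s):
    """Gaussian travel-time likelihood with a positivity prior."""
    if np.any(s <= 0):
        return -np.inf
    resid = (t_obs - path_lengths @ s) / sigma
    return -0.5 * np.sum(resid**2)

s = np.array([1/3000.0, 1/3000.0])                   # starting model
lp = log_post(s)
samples = []
for _ in range(20000):
    prop = s + 1e-6 * rng.normal(size=2)             # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:         # Metropolis acceptance rule
        s, lp = prop, lp_prop
    samples.append(s)

samples = np.array(samples[5000:])                   # discard burn-in
print("posterior mean velocities (m/s):", 1 / samples.mean(axis=0))
print("posterior std of slowness (s/m):", samples.std(axis=0))
```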


2021 ◽  
Author(s):  
Xuebin Zhao ◽  
Andrew Curtis ◽  
Xin Zhang

Seismic travel time tomography is used widely to image the Earth's interior structure and to infer subsurface properties. Tomography is an inverse problem, and computationally expensive nonlinear inverse methods are often deployed in order to understand uncertainties in the tomographic results. Monte Carlo sampling methods estimate the posterior probability distribution which describes the solution to Bayesian tomographic problems, but they are computationally expensive and often intractable for high-dimensional model spaces and large data sets due to the curse of dimensionality. We therefore introduce a new method of variational inference to solve Bayesian seismic tomography problems using optimization methods, while still providing fully nonlinear, probabilistic results. The new method, known as normalizing flows, warps a simple and known distribution (for example a Uniform or Gaussian distribution) into an optimal approximation to the posterior distribution through a chain of invertible transforms. These transforms are selected from a library of suitable functions, some of which invoke neural networks internally. We test the method using both synthetic and field data. The results show that normalizing flows can produce similar mean and uncertainty maps to those obtained from both Monte Carlo and another variational method (Stein variational gradient descent), at significantly decreased computational cost. In our tomographic tests, normalizing flows improve both accuracy and efficiency, producing maps of UK surface wave speeds and their uncertainties at the finest resolution and the lowest computational cost to date, allowing the results to be interrogated efficiently and quantitatively for subsurface structure.
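
The change-of-variables bookkeeping at the heart of normalizing flows can be sketched with planar flows, one of the simplest invertible transforms. The sketch below is illustrative only: the paper's library of transforms differs, and in a real application the flow parameters would be optimized to minimize the KL divergence to the posterior, whereas here they are fixed at random values purely to show the mechanics.

```python
import numpy as np

rng = np.random.default_rng(3)
dim, n_flows, n_samples = 2, 4, 1000

# Random parameters (u, w, b) for each planar flow f(z) = z + u*tanh(w.z + b).
params = [(0.5 * rng.normal(size=dim),
           0.5 * rng.normal(size=dim),
           0.1 * rng.normal()) for _ in range(n_flows)]

# Base distribution: z0 ~ N(0, I), with its exact log-density.
z = rng.normal(size=(n_samples, dim))
log_q = -0.5 * np.sum(z**2, axis=1) - 0.5 * dim * np.log(2 * np.pi)

for u, w, b in params:
    a = np.tanh(z @ w + b)                 # (n_samples,)
    z = z + np.outer(a, u)                 # apply the planar flow
    psi = (1 - a**2)[:, None] * w          # derivative of tanh(w.z + b) w.r.t. z
    det = 1 + psi @ u                      # Jacobian determinant of the flow
    log_q -= np.log(np.abs(det) + 1e-12)   # change-of-variables update

print("transformed sample mean:", z.mean(axis=0))
print("mean log-density of flow samples:", log_q.mean())
```

Because the transformed samples carry exact log-densities, the KL divergence between the flow and an unnormalized posterior can be estimated by Monte Carlo averaging over the samples and minimized with gradient-based optimizers.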


2012 ◽  
Vol 20 (4) ◽  
pp. 257-263
Author(s):  
Hiroyuki Okazaki

Summary. In [14] we formalized probability and probability distributions on a finite sample space. In this article we first propose a formalization of the class of finite sample spaces whose elements' probability distributions are equivalent to each other. Next, we formalize the probability measure of the class of sample spaces formalized above. Finally, we formalize sampling and posterior probability.
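
In informal terms, the posterior probability being formalized is Bayes' rule on a finite sample space, as in this small illustrative computation (written in Python for concreteness, not in Mizar; the hypotheses and likelihoods are assumptions):

```python
from fractions import Fraction

# Two hypotheses about a die, a uniform prior, and one observation ("six").
prior = {"fair": Fraction(1, 2), "loaded": Fraction(1, 2)}
likelihood = {"fair": Fraction(1, 6), "loaded": Fraction(1, 2)}

# Bayes' rule on a finite space: posterior = prior * likelihood / evidence.
evidence = sum(prior[h] * likelihood[h] for h in prior)
posterior = {h: prior[h] * likelihood[h] / evidence for h in prior}
print(posterior)   # {'fair': Fraction(1, 4), 'loaded': Fraction(3, 4)}
```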


Author(s):  
Munir S Pathan ◽  
S M Pradhan ◽  
T Palani Selvam

Abstract. In this study, a Bayesian probabilistic approach is applied to estimate actual doses from the personnel monitoring dose records of occupational workers. To implement the Bayesian approach, the probability distribution of the uncertainty in the reported dose is derived as a function of the actual dose. Using this uncertainty distribution of the reported dose and prior knowledge of the dose levels generally observed in a monitoring period, the posterior probability distribution of the actual dose is estimated. The posterior distributions of all monitoring periods in a year are convolved to arrive at the annual actual-dose distribution. The estimated actual-dose distributions deviate significantly from the reported annual doses, particularly for low annual doses.
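
The convolution step can be sketched on a discrete dose grid: each monitoring period's posterior follows from Bayes' rule, and the annual dose, being a sum of per-period doses, has a distribution given by the convolution of the per-period posteriors. The grid, prior shape, reported values and uncertainty below are illustrative assumptions, not the authors' derived uncertainty function.

```python
import numpy as np

dose = np.arange(0, 5.0, 0.01)                    # dose grid (mSv), 0.01 mSv bins

def posterior(reported, sigma, prior):
    """Grid posterior: Gaussian reported-dose likelihood times prior."""
    like = np.exp(-0.5 * ((reported - dose) / sigma) ** 2)
    post = like * prior
    return post / post.sum()

prior = np.exp(-dose / 0.5)                       # assumed exponential prior
prior /= prior.sum()

reported_doses = [0.12, 0.08, 0.20, 0.15]         # one value per monitoring period
posts = [posterior(r, sigma=0.05, prior=prior) for r in reported_doses]

# The distribution of a sum of independent doses is the convolution of
# their distributions; chain the convolutions over the year's periods.
annual = posts[0]
for p in posts[1:]:
    annual = np.convolve(annual, p)
annual_grid = np.arange(annual.size) * 0.01       # grid for the summed dose
mean = np.sum(annual_grid * annual)
print(f"annual dose posterior mean: {mean:.3f} mSv")
```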


2000 ◽  
Vol 16 (5) ◽  
pp. 1487-1522 ◽  
Author(s):  
Jari P Kaipio ◽  
Ville Kolehmainen ◽  
Erkki Somersalo ◽  
Marko Vauhkonen

2010 ◽  
Vol 27 (6) ◽  
pp. 795-803 ◽  
Author(s):  
Raymond R. Hill ◽  
Derek A. Leggio ◽  
Shay R. Capehart ◽  
August G. Roesener

2020 ◽  
Vol 09 (04) ◽  
pp. 2050017
Author(s):  
Benjamin D. Donovan ◽  
Randall L. McEntaffer ◽  
Casey T. DeRoo ◽  
James H. Tutt ◽  
Fabien Grisé ◽  
...  

The soft X-ray grating spectrometer on board the Off-plane Grating Rocket Experiment (OGRE) hopes to achieve the highest-resolution soft X-ray spectrum of an astrophysical object when it is launched via suborbital rocket. Paramount to the success of the spectrometer is the performance of the reflection gratings populating its reflection grating assembly. To test current grating fabrication capabilities, a grating prototype for the payload was fabricated via electron-beam lithography at The Pennsylvania State University's Materials Research Institute and was subsequently tested for performance at the Max Planck Institute for Extraterrestrial Physics' PANTER X-ray Test Facility. Bayesian modeling of the resulting data via Markov chain Monte Carlo (MCMC) sampling indicated that the grating achieved the OGRE single-grating resolution requirement at the 94% confidence level. The resulting posterior probability distribution of the resolution R = λ/Δλ suggests that this confidence level is likely a conservative estimate, though, since only a finite region of R parameter space was sampled and the model could not constrain the upper bound of R to less than infinity. Raytrace simulations of the tested system found that the observed data can be reproduced with a grating performing at R = ∞. It is therefore postulated that the behavior of the obtained R posterior probability distribution can be explained by a finite measurement limit of the test system, and not by a finite limit on R. Implications of these results and improvements to the test setup are discussed.
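
A posterior that is bounded from below but not from above can be reproduced with a toy Metropolis sampler (our own sketch, not the paper's model): the observed line width is modelled as the quadrature sum of an instrumental width and a grating term c/R, so when the measured width is consistent with the instrumental width alone, the data constrain R from below while the upper tail runs to the edge of the prior. All numbers below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
w_inst, c = 1.000, 1000.0                # instrumental width, grating constant (arb. units)
w_obs, w_err = 1.005, 0.010              # measured width and its 1-sigma uncertainty

def log_post(x):
    """Posterior in x = ln R with a flat prior on ln R over a finite range."""
    if not (np.log(1e2) < x < np.log(1e7)):
        return -np.inf
    w_model = np.hypot(w_inst, c * np.exp(-x))   # sqrt(w_inst^2 + (c/R)^2)
    return -0.5 * ((w_obs - w_model) / w_err) ** 2

x = np.log(1e4)
lp = log_post(x)
chain = []
for _ in range(100000):
    prop = x + 0.3 * rng.normal()                # symmetric random walk in ln R
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:     # Metropolis acceptance rule
        x, lp = prop, lp_prop
    chain.append(np.exp(x))

chain = np.array(chain[20000:])                  # discard burn-in
print("R 2.5th percentile:", np.percentile(chain, 2.5))    # data-driven lower bound
print("R 97.5th percentile:", np.percentile(chain, 97.5))  # runs to the prior edge
```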

