Inversion and uncertainty estimation of gravity data using simulated annealing: An application over Lake Vostok, East Antarctica

Geophysics ◽  
2005 ◽  
Vol 70 (1) ◽  
pp. J1-J12 ◽  
Author(s):  
Lopamudra Roy ◽  
Mrinal K. Sen ◽  
Donald D. Blankenship ◽  
Paul L. Stoffa ◽  
Thomas G. Richter

Interpretation of gravity data warrants uncertainty estimation because of its inherent nonuniqueness. Although the uncertainties in model parameters cannot be completely eliminated, estimates of them can aid in the meaningful interpretation of results. Here we have employed a simulated annealing (SA)–based technique in the inversion of gravity data to derive multilayered earth models consisting of two- and three-dimensional bodies. In our approach, we assume that the density contrast is known, and we solve for the coordinates or shapes of the causative bodies, resulting in a nonlinear inverse problem. We attempt to sample the model space extensively so as to estimate several equally likely models. We then use all the models sampled by SA to construct an approximate marginal posterior probability density function (PPD) in model space and several orders of moments. The correlation matrix clearly shows the interdependence of different model parameters and the corresponding trade-offs. Such correlation plots are used to study the effect of a priori information in reducing the uncertainty in the solutions. We also investigate the use of derivative information to obtain better depth resolution and to reduce the underlying uncertainties. We applied the technique to two synthetic data sets and an airborne-gravity data set collected over Lake Vostok, East Antarctica, for which a priori constraints were derived from available seismic and radar profiles. The inversion results produced depths of the lake in the survey area along with the thickness of sediments. The resulting uncertainties are interpreted in terms of the experimental geometry and data error.
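The sampling idea in this abstract can be illustrated with a minimal, self-contained sketch (our construction, not the authors' code): Metropolis-style simulated annealing over a single shape parameter, keeping every visited model so that the ensemble approximates the marginal PPD and its moments. The buried-sphere forward model and all numerical values are illustrative assumptions.

```python
import math
import random

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2
MGAL = 1.0e5   # conversion m/s^2 -> mGal

def sphere_gravity(x, depth, radius=100.0, drho=500.0):
    """Vertical gravity anomaly (mGal) of a buried sphere at horizontal offset x (m)."""
    mass_excess = (4.0 / 3.0) * math.pi * radius ** 3 * drho
    return MGAL * G * mass_excess * depth / (x ** 2 + depth ** 2) ** 1.5

def misfit(depth, xs, obs):
    """Sum-of-squares data misfit used as the SA energy function."""
    return sum((sphere_gravity(x, depth) - g) ** 2 for x, g in zip(xs, obs))

def sa_invert(xs, obs, bounds=(50.0, 1000.0), n_iter=5000, t0=1e-2, seed=1):
    """Metropolis-style SA over source depth; all sampled models are kept
    so the ensemble can be used to estimate an approximate PPD."""
    rng = random.Random(seed)
    depth = rng.uniform(*bounds)
    e = misfit(depth, xs, obs)
    best, best_e = depth, e
    samples = []
    for k in range(n_iter):
        temp = t0 / (1.0 + k)  # slow cooling schedule (illustrative choice)
        cand = min(max(depth + rng.gauss(0.0, 20.0), bounds[0]), bounds[1])
        ce = misfit(cand, xs, obs)
        if ce < e or rng.random() < math.exp(-(ce - e) / temp):
            depth, e = cand, ce
        if e < best_e:
            best, best_e = depth, e
        samples.append(depth)  # every visited model contributes to the PPD estimate
    return best, samples
```

With noise-free synthetic data the sampled depths cluster around the true value, and moments (mean, variance) of `samples` quantify the depth uncertainty.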

Geophysics ◽  
2006 ◽  
Vol 71 (3) ◽  
pp. J41-J50 ◽  
Author(s):  
Tim van Zon ◽  
Kabir Roy-Chowdhury

Structural inversion of gravity data — deriving robust images of the subsurface by delineating lithotype boundaries using density anomalies — is an important goal in a range of exploration settings (e.g., ore bodies, salt flanks). Application of conventional inversion techniques in such cases, using L2-norms and regularization, produces smooth results and is thus suboptimal. We investigate an L1-norm-based approach which yields structural images without the need for explicit regularization. The density distribution of the subsurface is modeled with a uniform grid of cells. The density of each cell is inverted by minimizing the L1-norm of the data misfit using linear programming (LP) while satisfying a priori density constraints. The estimate of the noise level in a given data set is used to qualitatively determine an appropriate parameterization. The 2.5D and 3D synthetic tests adequately reconstruct the structure of the test models. The quality of the inversion depends upon a good prior estimate of the minimum depth of the anomalous body. A comparison of our results with those from truncated singular value decomposition (TSVD) on a noisy synthetic data set favors the LP-based method. There are two advantages in using LP for structural inversion of gravity data. First, it offers a natural way to incorporate a priori information regarding the model parameters. Second, it produces subsurface images with sharp boundaries (structure).
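Minimizing an L1 data misfit with LP rests on a standard reformulation: introduce slack variables t with t >= |G m - d| and minimize the sum of the slacks. A minimal sketch of that reformulation (our construction, not the authors' code), using SciPy's `linprog` as the LP solver; the kernel matrix and density bounds in any use are toy assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def l1_invert(G, d, lb=0.0, ub=1.0):
    """Minimize ||G m - d||_1 subject to lb <= m <= ub via linear programming.

    Reformulation: variables z = [m; t], minimize sum(t) subject to
    G m - d <= t and -(G m - d) <= t, with box bounds on the densities m.
    """
    n_data, n_cells = G.shape
    # objective: zero weight on the densities m, unit weight on the slacks t
    c = np.concatenate([np.zeros(n_cells), np.ones(n_data)])
    A_ub = np.block([[G, -np.eye(n_data)],
                     [-G, -np.eye(n_data)]])
    b_ub = np.concatenate([d, -d])
    bounds = [(lb, ub)] * n_cells + [(0, None)] * n_data
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    return res.x[:n_cells], res.fun  # density estimate and residual L1 norm
```

The box bounds are the natural place for a priori density constraints, which is the first advantage the abstract cites for LP.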


1997 ◽  
Vol 43 (143) ◽  
pp. 180-191 ◽  
Author(s):  
E. M. Morris ◽  
H.-P. Bader ◽  
P. Weilenmann

A physics-based snow model has been calibrated using data collected at Halley Bay, Antarctica, during the International Geophysical Year. Variations in snow temperature and density are well-simulated using values for the model parameters within the range reported from other polar field experiments. The effect of uncertainty in the parameter values on the accuracy of the predictions is no greater than the effect of instrumental error in the input data. Thus, this model can be used with parameters determined a priori rather than by optimization. The model has been validated using an independent data set from Halley Bay and then used to estimate 10 m temperatures on the Antarctic Peninsula plateau over the last half-century.


Geophysics ◽  
2016 ◽  
Vol 81 (4) ◽  
pp. ID59-ID71 ◽  
Author(s):  
Kyle Basler-Reeder ◽  
John Louie ◽  
Satish Pullammanappallil ◽  
Graham Kent

Joint seismic and gravity analyses of the San Emidio geothermal field in the northwest Basin and Range province of Nevada demonstrate that joint optimization changes interpretation outcomes. The prior 0.3–0.5 km deep basin interpretation gives way to a deeper than 1.3 km basin model. Kirchhoff prestack depth migrations reveal that joint optimization ameliorates shallow velocity artifacts, flattening antiformal reflectors that could have been interpreted as folds. Furthermore, joint optimization provides a clearer picture of the rangefront fault by increasing the depth of constrained velocities, which improves reflector coherency at depth. This technique provides new insight when applied to existing data sets and could replace the existing strategy of forward modeling to match gravity data. We have achieved stable joint optimization through simulated annealing, a global optimization algorithm that does not require an accurate initial model. Balancing the combined seismic-gravity objective function is accomplished by a new approach based on analysis of Pareto charts. Gravity modeling uses an efficient convolution model, and the basis of seismic modeling is the highly efficient Vidale eikonal equation traveltime generation technique. Synthetic tests found that joint optimization improves velocity model accuracy and provides velocity control below the deepest headwave raypath. Restricted offset-range migration analysis provides insights into precritical and gradient reflections in the data set.
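The weight-balancing idea behind Pareto-chart analysis can be illustrated with a toy sketch (our construction, not the authors' code): sweep the trade-off weight in a combined seismic-plus-gravity objective, record the misfit pair of each winning model, and keep the non-dominated pairs as a Pareto front. The candidate models and misfit functions here are hypothetical stand-ins.

```python
def combined(w, e_seis, e_grav):
    """Weighted combined objective; w trades seismic against gravity misfit."""
    return w * e_seis + (1.0 - w) * e_grav

def sweep_tradeoff(models, seis_misfit, grav_misfit, weights):
    """For each trade-off weight, pick the model minimizing the combined
    objective and record its (seismic, gravity) misfit pair."""
    pairs = []
    for w in weights:
        best = min(models,
                   key=lambda m: combined(w, seis_misfit(m), grav_misfit(m)))
        pairs.append((seis_misfit(best), grav_misfit(best)))
    return pairs

def nondominated(pairs):
    """Pareto front: keep pairs not beaten on both misfits by another pair."""
    return [p for p in pairs
            if not any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in pairs)]
```

Plotting the front reveals the "knee" weight at which neither data set is being fit at the expense of the other.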


Geophysics ◽  
2001 ◽  
Vol 66 (5) ◽  
pp. 1438-1449 ◽  
Author(s):  
Seiichi Nagihara ◽  
Stuart A. Hall

In the northern continental slope of the Gulf of Mexico, large oil and gas reservoirs are often found beneath sheetlike, allochthonous salt structures that are laterally extensive. Some of these salt structures retain their diapiric feeders or roots beneath them. These hidden roots are difficult to image seismically. In this study, we develop a method to locate and constrain the geometry of such roots through 3‐D inverse modeling of the gravity anomalies observed over the salt structures. This inversion method utilizes a priori information such as the upper surface topography of the salt, which can be delineated by a limited coverage of 2‐D seismic data; the sediment compaction curve in the region; and the continuity of the salt body. The inversion computation is based on the simulated annealing (SA) global optimization algorithm. The SA‐based gravity inversion has some advantages over the approach based on damped least‐squares inversion. It is computationally efficient, can solve underdetermined inverse problems, can more easily implement complex a priori information, and does not introduce smoothing effects in the final density structure model. We test this inversion method using synthetic gravity data for a type of salt geometry that is common among the allochthonous salt structures in the Gulf of Mexico and show that it is highly effective in constraining the diapiric root. We also show that carrying out multiple inversion runs helps reduce the uncertainty in the final density model.


Geophysics ◽  
2016 ◽  
Vol 81 (2) ◽  
pp. ID1-ID24 ◽  
Author(s):  
Alan W. Roberts ◽  
Richard W. Hobbs ◽  
Michael Goldstein ◽  
Max Moorkamp ◽  
Marion Jegen ◽  
...  

Understanding the uncertainty associated with large joint geophysical surveys, such as 3D seismic, gravity, and magnetotelluric (MT) studies, is a challenge, conceptually and practically. Demonstrating the use of emulators, we have adopted a Monte Carlo forward screening scheme to globally test a prior model space for plausibility. This methodology makes the incorporation of all types of uncertainty conceptually straightforward: one designs an appropriate prior model space, upon which the results depend, from which to draw candidate models. We have tested the approach on a salt dome target, over which three data sets had been obtained: wide-angle seismic refraction, MT, and gravity. We have considered the data sets together using an empirically measured, uncertain physical relationship connecting the three model parameters (seismic velocity, density, and resistivity), and we have indicated the value of a joint approach rather than considering individual parameter models. The results were probability density functions over the model parameters, together with a halite probability map. The emulators give a considerable speed advantage over running the full simulator codes, and we consider their use to have great potential in the development of geophysical statistical constraint methods.
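The forward-screening idea can be sketched in one parameter (a toy of our own, not the authors' implementation): train a cheap emulator on a few runs of an expensive simulator, then draw candidate models from the prior and keep those whose emulated datum is plausible given the lumped data-plus-emulator uncertainty. The stand-in simulator, piecewise-linear emulator, and 3-sigma cut are all illustrative assumptions.

```python
import bisect
import random

def simulator(m):
    """Stand-in for an expensive forward code (hypothetical one-parameter case)."""
    return m ** 2 + 2.0 * m

def make_emulator(grid):
    """Cheap piecewise-linear emulator built from a handful of simulator runs."""
    xs = list(grid)
    ys = [simulator(m) for m in xs]  # the only expensive evaluations
    def emulate(m):
        i = min(max(bisect.bisect_left(xs, m), 1), len(xs) - 1)
        x0, x1, y0, y1 = xs[i - 1], xs[i], ys[i - 1], ys[i]
        return y0 + (y1 - y0) * (m - x0) / (x1 - x0)
    return emulate

def screen_prior(emulate, obs, sigma, n=10000, lo=0.0, hi=4.0, seed=0):
    """Monte Carlo forward screening: keep prior draws whose emulated datum
    lies within 3*sigma of the observation (sigma lumps data and
    emulator uncertainty)."""
    rng = random.Random(seed)
    return [m for m in (rng.uniform(lo, hi) for _ in range(n))
            if abs(emulate(m) - obs) <= 3.0 * sigma]
```

The surviving draws form the plausible set; normalizing their density gives the kind of posterior summary the abstract describes, at emulator rather than simulator cost.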


2006 ◽  
Vol 7 (11) ◽  
pp. n/a-n/a ◽  
Author(s):  
John W. Holt ◽  
Thomas G. Richter ◽  
Scott D. Kempf ◽  
David L. Morse ◽  
Donald D. Blankenship


Geophysics ◽  
1993 ◽  
Vol 58 (4) ◽  
pp. 496-507 ◽  
Author(s):  
Mrinal K. Sen ◽  
Bimalendu B. Bhattacharya ◽  
Paul L. Stoffa

The resistivity interpretation problem involves the estimation of resistivity as a function of depth from the apparent resistivity values measured in the field as a function of electrode separation. This is commonly done either by curve matching using master curves or by more formal linearized inversion methods. The problems with linearized inversion schemes are fairly well known; they require that the starting model be close to the true solution. In this paper, we report the results from the application of a nonlinear global optimization method known as simulated annealing (SA) in the direct interpretation of resistivity sounding data. This method does not require a good starting model but is computationally more expensive. We used the heat bath algorithm of simulated annealing, in which the mean square error (difference between observed and synthetic data) is used as the energy function that we attempt to minimize. Samples are drawn from the Gibbs probability distribution while the control parameter, the temperature, is slowly lowered, finally resulting in models that are very close to the globally optimal solutions. This method is also described in the framework of Bayesian statistics, in which the Gibbs distribution is identified as the a posteriori probability density function in model space. Computation of the true posterior distribution requires evaluation of the energy function at each point in model space. However, a fairly good estimate of the most significant portion(s) of the function can be obtained from simulated annealing runs in a reasonable computation time. This can be achieved by making several repeat runs of SA, each time starting with a new random number seed, so that the most significant portion of the model space is adequately sampled. Once the posterior density function is known, many measures of dispersion can be made. In particular, we compute a mean model and the a posteriori covariance matrix.
We have applied this method successfully to synthetic and field data. The resulting correlation and covariance matrices indicate how the model parameters affect one another and are very useful in relating geology to the resulting resistivity values.
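The heat bath update differs from Metropolis SA: each parameter in turn is redrawn from the full Gibbs distribution over a discrete grid of candidate values. A minimal sketch (our construction, not the authors' code); the blend-style forward model below is a toy stand-in, NOT the true Schlumberger resistivity kernel, and the grid and schedule are illustrative.

```python
import math
import random

def toy_apparent_res(rho1, rho2, spacing, h=10.0):
    """Toy stand-in for the resistivity forward problem: apparent resistivity
    as a spacing-dependent blend of two layer resistivities."""
    w = math.exp(-spacing / h)
    return w * rho1 + (1.0 - w) * rho2

def energy(model, spacings, obs):
    """Mean-square-error-style energy function (sum of squared residuals)."""
    return sum((toy_apparent_res(model[0], model[1], s) - d) ** 2
               for s, d in zip(spacings, obs))

def heat_bath_sa(spacings, obs, grid, n_sweeps=200, t0=100.0, seed=0):
    """Heat bath SA: each parameter is redrawn from the Gibbs distribution
    over the candidate grid while the temperature is slowly lowered."""
    rng = random.Random(seed)
    model = [rng.choice(grid), rng.choice(grid)]
    samples = []
    for k in range(n_sweeps):
        temp = t0 / (1.0 + k)  # cooling schedule (illustrative choice)
        for i in range(len(model)):
            energies = []
            for v in grid:
                trial = list(model)
                trial[i] = v
                energies.append(energy(trial, spacings, obs))
            e_min = min(energies)  # subtract the minimum for numerical stability
            weights = [math.exp(-(e - e_min) / temp) for e in energies]
            model[i] = rng.choices(grid, weights=weights)[0]
        samples.append(list(model))
    return model, samples
```

Averaging the late-stage entries of `samples` gives a mean model, and their scatter yields the a posteriori covariance the abstract computes.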


Geophysics ◽  
2014 ◽  
Vol 79 (4) ◽  
pp. EN49-EN59 ◽  
Author(s):  
Daniele Boiero ◽  
Laura Valentina Socco

We implemented a joint inversion method to build P- and S-wave velocity models from Rayleigh-wave and P-wave refraction data, specifically designed to deal with laterally varying layered environments. A priori information available over the site and any physical law linking model parameters can also be incorporated. We tested and applied the algorithm behind the method. The results from a field data set revealed advantages with respect to individual surface-wave analysis (SWA) and body-wave tomography (BWT). The algorithm imposed internal consistency for all the model parameters, relaxing the required a priori assumptions (e.g., the level of confidence in Poisson's ratio for SWA) and the inherent limitations of the two methods (e.g., velocity decreases for BWT).


Geophysics ◽  
2013 ◽  
Vol 78 (3) ◽  
pp. WB3-WB15 ◽  
Author(s):  
Shashi Prakash Sharma ◽  
Arkoprovo Biswas

A very fast simulated-annealing (VFSA) global optimization procedure is developed for the interpretation of self-potential (SP) anomalies measured over a 2D inclined sheet-type structure. Model parameters such as the electric current dipole density, the horizontal and vertical locations of the center of the causative body, the half-width, and the polarization/inclination angle of the sheet are optimized. VFSA optimization yields a large number of well-fitting solutions in a vast model space. Even when the assumed model space (minimum and maximum limits for each model parameter) is appropriate, models obtained by the VFSA process in the predefined model space can still be geologically erroneous. This offers new insight into the interpretation of self-potential data. Our optimization results indicate that there exist at least two sets of solutions that fit the observed data equally well. The first set represents a local optimum and is geologically inappropriate. The second set represents the actual subsurface structure. The mean model estimated from the latter set represents the global solution. The efficacy of the developed approach has been demonstrated using various synthetic examples. Field data from the Surda area of Rakha Mines, India, and the Bavarian woods, Germany, are also interpreted. The computation time for finding this versatile solution is very short (52 s on a simple PC), and the proposed approach is found to be more advantageous than other approaches.
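What makes VFSA "very fast" is its move generator: a temperature-dependent Cauchy-like draw that explores widely when hot and locally when cold. The sketch below (our construction, not the authors' code) uses Ingber's VFSA move but swaps the original schedule T_k = T0 exp(-c k^(1/N)) for a simple exponential decay; the quadratic test objective and all constants are illustrative assumptions.

```python
import math
import random

def vfsa_step(x, lo, hi, temp, rng):
    """Ingber-style VFSA move: draw y in (-1, 1) from a temperature-dependent
    Cauchy-like distribution, scale by the parameter range, retry if out of bounds."""
    while True:
        u = rng.random()
        y = math.copysign(
            temp * ((1.0 + 1.0 / temp) ** abs(2.0 * u - 1.0) - 1.0), u - 0.5)
        x_new = x + y * (hi - lo)
        if lo <= x_new <= hi:
            return x_new

def vfsa(objective, bounds, n_iter=2000, t0=1.0, decay=0.99, seed=7):
    """VFSA loop with Metropolis acceptance and exponential cooling
    (a common practical simplification of the original schedule)."""
    rng = random.Random(seed)
    x = [rng.uniform(lo, hi) for lo, hi in bounds]
    e = objective(x)
    best, best_e = list(x), e
    temp = t0
    for _ in range(n_iter):
        cand = [vfsa_step(xi, lo, hi, temp, rng)
                for xi, (lo, hi) in zip(x, bounds)]
        ce = objective(cand)
        if ce < e or rng.random() < math.exp(-(ce - e) / temp):
            x, e = cand, ce
            if e < best_e:
                best, best_e = list(x), e
        temp *= decay
    return best, best_e
```

Collecting all well-fitting accepted models, rather than just `best`, is how the clustered solution sets described in the abstract would be identified.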

