Compilation and Evaluation of a Consistent Marine Gravity Data Set Surrounding Europe

Author(s):  
Heiner Denker ◽  
Markus Roland
Keyword(s):  
Data Set

Geophysics ◽  
1994 ◽  
Vol 59 (5) ◽  
pp. 722-732 ◽  
Author(s):  
Carlos Alberto Mendonça ◽  
João B. C. Silva

The equivalent-layer calculation becomes more efficient by first converting the observed potential-field data set to a much smaller equivalent data set, saving considerable CPU time. This makes the equivalent-source method of data interpolation very competitive with traditional gridding techniques that ignore the fact that potential-field anomalies are harmonic functions. The equivalent data set is obtained with an iterative least-squares algorithm that, at each iteration, solves an underdetermined system fitting all observations selected in previous iterations plus the observation with the largest residual from the preceding iteration. The residuals are obtained by computing a set of "predicted observations" from the parameters estimated at the current iteration and subtracting them from the observations. Implementing the algorithm with a Cholesky decomposition leads to an efficient solution update every time a new datum is processed. In addition, when applied to interpolation problems using equivalent layers, the method is optimized by approximating dot products with the discrete form of an analytic integration that can be evaluated at much lower computational cost. Finally, the technique is applied to gravity data in a 2° × 2° area containing 3137 observations from the Equant-2 marine gravity survey offshore northern Brazil. Only 294 equivalent data are selected and used to interpolate the anomalies onto a regular grid with the equivalent-layer technique. For comparison, an interpolation using the minimum-curvature method was also computed, producing equivalent results. The number of equivalent observations is usually one order of magnitude smaller than the total number of observations. As a result, the savings in computer time and memory are at least two orders of magnitude compared with equivalent-layer interpolation using all observations.
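As a rough illustration of the selection scheme described above, the following NumPy/SciPy sketch grows the equivalent data set one observation at a time, keeping a Cholesky factor of the small Gram matrix so that appending a datum costs only a rank-one update. It is a minimal sketch, not the authors' implementation: the dot products are computed directly here (the paper approximates them by a discrete analytic integration), and `G`, `d`, and `tol` are placeholders.

```python
import numpy as np
from scipy.linalg import solve_triangular

def select_equivalent_data(G, d, tol, max_pts=None):
    """Greedy selection of an equivalent data subset (sketch).

    G   : (N, M) sensitivity of N observations w.r.t. M equivalent sources.
    d   : (N,) observed potential-field data.
    tol : stop once the largest absolute residual falls below tol.
    """
    N = G.shape[0]
    max_pts = max_pts if max_pts is not None else N
    sel = [int(np.argmax(np.abs(d)))]                 # seed with the largest datum
    L = np.sqrt(G[sel[0]] @ G[sel[0]]) * np.ones((1, 1))  # chol(Gs Gs^T), 1x1
    while True:
        Gs, ds = G[sel], d[sel]
        # Minimum-norm solution m = Gs^T (Gs Gs^T)^{-1} ds via the factor L.
        z = solve_triangular(L, ds, lower=True)
        m = Gs.T @ solve_triangular(L.T, z, lower=False)
        r = d - G @ m                                  # residuals at ALL observations
        k = int(np.argmax(np.abs(r)))
        if abs(r[k]) < tol or len(sel) >= max_pts:
            return np.array(sel), m
        # Append the worst-fit observation: rank-one Cholesky update,
        # no refactorization of the growing Gram matrix.
        b = Gs @ G[k]
        w = solve_triangular(L, b, lower=True)
        c = np.sqrt(G[k] @ G[k] - w @ w)
        L = np.block([[L, np.zeros((len(sel), 1))],
                      [w[None, :], np.array([[c]])]])
        sel.append(k)
```

The returned index set plays the role of the 294 "equivalent data" in the example above: the final `m` fits only those observations, yet predicts the full data set to within `tol`.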



2014 ◽  
Vol 37 (4) ◽  
pp. 419-439 ◽  
Author(s):  
Wenjin Chen ◽  
Robert Tenzer ◽  
Xiang Gu

Geophysics ◽  
2005 ◽  
Vol 70 (1) ◽  
pp. J1-J12 ◽  
Author(s):  
Lopamudra Roy ◽  
Mrinal K. Sen ◽  
Donald D. Blankenship ◽  
Paul L. Stoffa ◽  
Thomas G. Richter

Interpretation of gravity data warrants uncertainty estimation because of its inherent nonuniqueness. Although the uncertainties in model parameters cannot be completely removed, estimating them aids meaningful interpretation of the results. Here we employ a simulated annealing (SA)-based technique in the inversion of gravity data to derive multilayered earth models consisting of two- and three-dimensional bodies. In our approach, we assume that the density contrast is known and solve for the coordinates or shapes of the causative bodies, resulting in a nonlinear inverse problem. We attempt to sample the model space extensively so as to estimate several equally likely models. We then use all the models sampled by SA to construct an approximate marginal posterior probability density function (PPD) in model space, together with moments of several orders. The correlation matrix clearly shows the interdependence of different model parameters and the corresponding trade-offs. Such correlation plots are used to study the effect of a priori information in reducing the uncertainty in the solutions. We also investigate the use of derivative information to obtain better depth resolution and to reduce the underlying uncertainties. We applied the technique to two synthetic data sets and to an airborne gravity data set collected over Lake Vostok, East Antarctica, for which a priori constraints were derived from available seismic and radar profiles. The inversion produced depths of the lake in the survey area along with the thickness of sediments. The resulting uncertainties are interpreted in terms of the experimental geometry and the data error.
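The sampling strategy can be sketched generically: Metropolis moves under a cooling temperature, with every sampled model retained so that the ensemble approximates the marginal PPD and its moments. This is a minimal sketch under assumed Gaussian perturbations, not the authors' exact algorithm; `misfit`, `step`, and the cooling schedule are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def anneal_and_sample(misfit, m0, step, n_iter=20000, t0=1.0, t_min=0.01):
    """Simulated-annealing sampler (sketch).

    misfit : callable returning the data misfit E(m) for a model vector m
             (e.g., body-corner coordinates with the density contrast fixed).
    Returns the full chain so marginal PPDs and moments can be estimated.
    """
    m, E = m0.copy(), misfit(m0)
    samples = []
    for i in range(n_iter):
        T = max(t_min, t0 * 0.999 ** i)                  # geometric cooling
        trial = m + step * rng.standard_normal(m.size)   # perturb the model
        E_t = misfit(trial)
        if E_t < E or rng.random() < np.exp((E - E_t) / T):  # Metropolis rule
            m, E = trial, E_t
        samples.append(m.copy())
    return np.array(samples)

# Marginal PPD of parameter j and its first two moments from the ensemble:
# hist, edges = np.histogram(samples[:, j], bins=50, density=True)
# mean, var = samples[:, j].mean(), samples[:, j].var()
```

Cross-moments of the same ensemble yield the parameter correlation matrix discussed above, from which the trade-offs between parameters can be read off.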



2021 ◽  
Author(s):  
Hélène Le Mével ◽  
Craig A. Miller ◽  
Yan Zhan

In May 2018, a submarine eruption started offshore Mayotte (Comoros archipelago, Indian Ocean) and was first detected as a series of earthquake swarms. Since then, at least 6.4 km³ of lava has erupted from a newly mapped volcanic edifice (MAYOBS campaigns), about 50 km east of Mayotte island. Since the onset of the eruption, GNSS stations on the island have recorded subsidence (up to 17 cm) and eastward displacement (up to 23 cm). We combine marine gravity data derived from satellite altimetry with finite element models to examine the magmatic system structure and its dynamics. First, we calculate the Mantle Bouguer Anomaly (MBA) by taking into account the gravitational effect of the bathymetry and the Moho interfaces, assuming a crust of constant thickness of 17.5 km and correction densities of 2.8 g/cm³ and 3.3 g/cm³ for the crust and mantle, respectively. We then invert the MBA to determine the anomalous density structures within the lithosphere, using the mixed Lp-norm inversion and Gauss-Newton optimization implemented in the SimPEG framework. The gravity inversion reveals two zones of low density east of Mayotte island. The first is located NE of Petite Terre island between ~15 and 35 km depth, and the second is located further east, south of La Jumelle seamounts, and extends from ~25 to 35 km depth. We interpret these low-density regions as regions of partial melt stored in the lithosphere and estimate the volume of stored magma. Finally, we use the newly imaged low-density bodies to constrain the magma reservoir geometry and simulate magma flow from this reservoir to the eruptive vent in a 3D, time-dependent numerical model. The model parameters are adjusted by minimizing the misfit between the modeled surface displacement and that measured at the 6 GPS sites between May 2018 and 2020. The deformation modeling reveals the temporal evolution of the magma flux during the eruption, and the resulting stress distribution in the crust explains the patterns of recorded seismicity. Together with the existing seismic and geodetic studies, the gravity data analysis and FEM models bring new constraints on the architecture of the magma plumbing system and the magmatic processes behind the largest submarine eruption ever documented.
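The MBA computation amounts to removing, from the free-air anomaly, the predicted gravity effects of the bathymetry and of the Moho of a constant-thickness crust. A common way to forward-model such interface effects on a regular grid is Parker's (1973) FFT expansion; the sketch below assumes that approach (the abstract does not state which forward operator was used), with the crust and mantle densities quoted above and an assumed seawater density of 1030 kg/m³. All grid variables are placeholders.

```python
import numpy as np

G_SI = 6.674e-11  # gravitational constant (SI units)

def parker_interface_effect(h, drho, z0, dx, dy, n_terms=4):
    """Gravity effect (mGal) of an undulating density interface, after
    Parker (1973): F[dg] = 2*pi*G*drho * exp(-k*z0) * sum_n k^(n-1)/n! F[h^n].

    h    : 2D undulation of the interface about its mean depth (m)
    drho : density contrast across the interface (kg/m^3)
    z0   : mean depth of the interface below the observation level (m)
    """
    ny, nx = h.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, dx)
    ky = 2 * np.pi * np.fft.fftfreq(ny, dy)
    k = np.sqrt(kx[None, :] ** 2 + ky[:, None] ** 2)
    spec = np.zeros_like(h, dtype=complex)
    fact = 1.0
    for n in range(1, n_terms + 1):
        fact *= n                                  # running n!
        spec += k ** (n - 1) / fact * np.fft.fft2(h ** n)
    spec *= 2 * np.pi * G_SI * drho * np.exp(-k * z0)
    return np.real(np.fft.ifft2(spec)) * 1e5       # m/s^2 -> mGal

# Mantle Bouguer anomaly: strip bathymetry and constant-thickness-crust Moho
# effects from the free-air anomaly (contrasts in kg/m^3; z_* are assumed
# mean depths):
# mba = faa - parker_interface_effect(bathy, 2800 - 1030, z_bathy, dx, dy) \
#           - parker_interface_effect(moho,  3300 - 2800, z_moho,  dx, dy)
```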



2021 ◽  
Author(s):  
Mirko Scheinert ◽  
Philipp Zingerle ◽  
Theresa Schaller ◽  
Roland Pail ◽  
Martin Willberg

In the frame of the IAG Subcommission 2.4f "Gravity and Geoid in Antarctica" (AntGG), a first Antarctic-wide grid of ground-based gravity anomalies was released in 2016 (Scheinert et al. 2016). That data set was provided with a grid spacing of 10 km and covered about 73% of the Antarctic continent. Since then, a considerable amount of new data has been made available, mainly collected by means of airborne gravimetry. Newly surveyed regions that formerly lacked any terrestrial gravity observations include, in particular, the polar data gap left by GOCE satellite gravimetry. It is therefore timely to produce an updated and enhanced regional gravity field solution for Antarctica. Compared with the AntGG 2016 solution we aim at several improvements: the grid spacing will be refined to 5 km; instead of covering only parts of Antarctica, gravity anomalies will be provided for the entire continent; and, in addition to the gravity anomaly, a regional geoid solution will be provided along with further desirable functionals (e.g., gravity anomaly vs. disturbance, different height levels).

We will discuss the expanded AntGG data base, which now includes terrestrial gravity data from Antarctic surveys conducted over the past 40 years. The methodology applied in the analysis is based on the remove-compute-restore technique. Here we utilize the newly developed combined spherical-harmonic gravity field model SATOP1 (Zingerle et al. 2019), which is based on the satellite-only global model GOCO05s and the high-resolution topographic model EARTH2014. We will demonstrate the feasibility of adequately reducing the original gravity data and thus of cross-validating and evaluating the accuracy of the data, especially where different data sets overlap. For the compute step the recently developed partition-enhanced least-squares collocation (PE-LSC) has been used (Zingerle et al. 2021, in review; cf. the contribution of Zingerle et al. in the same session). This method allows all data available in Antarctica to be treated in one single computation step in an efficient and fast way. It thus becomes feasible to iterate the computations within a short time once any input data or parameters are changed, and to predict the desired functionals also in regions void of terrestrial measurements as well as at any height level (e.g., gravity anomalies at the surface or gravity disturbances at constant height).

We will discuss the results and give an outlook on the data products that shall finally be provided as the new regional gravity field solution for Antarctica. Furthermore, implications for further applications will be discussed, e.g., with respect to geophysical modelling of the Earth's interior (cf. the contribution of Schaller et al. in session G4.3).
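The remove-compute-restore workflow can be illustrated with a deliberately reduced sketch: long-wavelength and topographic signals are removed using a global model (a placeholder for SATOP1 here), the residuals are gridded by least-squares collocation, and the model values are restored at the grid nodes. This toy version uses plain LSC with an isotropic Gaussian covariance, not the PE-LSC actually employed; the covariance parameters and the `g_model` callable are assumptions.

```python
import numpy as np

def lsc_grid(x_obs, r_obs, x_grid, c0, L, noise_var):
    """Least-squares collocation of residual gravity (very reduced sketch).

    Predicts residuals at grid points from scattered residuals r_obs using
    the covariance C(d) = c0 * exp(-d^2 / L^2); noise_var is the assumed
    observation noise variance.  x_obs, x_grid are (N, 2) coordinate arrays.
    """
    def cov(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return c0 * np.exp(-d2 / L ** 2)
    C_oo = cov(x_obs, x_obs) + noise_var * np.eye(len(x_obs))
    C_go = cov(x_grid, x_obs)
    return C_go @ np.linalg.solve(C_oo, r_obs)

# Remove-compute-restore (g_model = reference-model synthesis, assumed):
# r_obs  = g_obs - g_model(x_obs)                               # remove
# r_grid = lsc_grid(x_obs, r_obs, x_grid, c0, L, noise_var)     # compute
# g_grid = r_grid + g_model(x_grid)                             # restore
```

Because the collocation step also predicts at points far from any observation (with correspondingly larger formal error), the same machinery delivers values in data voids and at arbitrary height levels, which is the property highlighted above.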



2020 ◽  
Vol 8 (2) ◽  
pp. SH1-SH17 ◽  
Author(s):  
J. Kim Welford ◽  
Deric Cameron ◽  
Erin Gillis ◽  
Victoria Mitchell ◽  
Richard Wright

A regional long-offset 2D seismic reflection program undertaken along the Labrador margin of the Labrador Sea, Canada, complemented by the acquisition of coincident gravity data, has provided an extensive data set with which to image and model the sparsely investigated outer shelf, slope, and deepwater regions. Previous interpretation of the seismic data revealed the extent of Mesozoic and Cenozoic basins and resulted in the remapping of the basin configuration for the entire margin. To map the synrift package and improve understanding of the geometry and extent of these basins, we have undertaken joint seismic interpretation and gravity forward modeling to reduce uncertainty in the identification of the prerift basement, which varies between Paleozoic shelfal deposits and Precambrian crystalline rocks of similar density characteristics. With this iterative approach, we have obtained new depth-to-basement constraints and deduced further constraints on crustal thickness variations along the Labrador margin. At the crustal scale, extreme localized crustal thinning is revealed along the southern and central portions of the margin, whereas a broad, margin-parallel zone of thicker crust is detected outboard of the continental shelf along the northern Labrador margin. Our final gravity models suggest that Late Cretaceous rift packages from further south extend along the entire Labrador margin, opening the possibility of a Late Cretaceous source-rock fairway extending into the Labrador basins.



Geophysics ◽  
2017 ◽  
Vol 82 (1) ◽  
pp. G1-G21 ◽  
Author(s):  
William J. Titus ◽  
Sarah J. Titus ◽  
Joshua R. Davis

We apply a Bayesian Markov chain Monte Carlo formalism to the gravity inversion of a single localized 2D subsurface object. The object is modeled as a polygon described by five parameters: the number of vertices, a density contrast, a shape-limiting factor, and the width and depth of an encompassing container. We first constrain these parameters with an interactive forward model and explicit geologic information. Then, we generate an approximate probability distribution of polygons for a given set of parameter values. From these, we determine statistical distributions such as the variance between the observed and modeled fields, the area, the center of area, and the occupancy probability (the probability that a spatial point lies within the subsurface object). We introduce replica exchange to mitigate trapping in local optima and to compute model probabilities and their uncertainties. We apply our techniques to synthetic data sets and to a natural data set collected across the Rio Grande Gorge Bridge in New Mexico. On the basis of our examples, we find that the occupancy probability is useful in visualizing the results, giving a "hazy" cross section of the object. We also find that the role of the container is important in making predictions about the subsurface object.
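Replica exchange (parallel tempering) can be sketched independently of the polygon parameterization: several Metropolis chains run at different temperatures and occasionally swap states, so the target chain at T = 1 escapes local optima. This is a generic sketch under assumed Gaussian proposals, not the authors' implementation; `log_post`, the temperature ladder, and `step` are placeholders.

```python
import numpy as np

rng = np.random.default_rng(1)

def replica_exchange(log_post, m_init, temps, step, n_iter=10000):
    """Replica-exchange Metropolis sampler (sketch).

    log_post : callable, log of the (unnormalized) posterior for a model m
    temps    : increasing temperature ladder with temps[0] = 1 (target chain)
    Hot replicas cross barriers between local optima; accepted swaps feed
    those states back down to the T = 1 chain.
    """
    K = len(temps)
    ms = [m_init.copy() for _ in range(K)]
    lps = [log_post(m) for m in ms]
    samples = []
    for _ in range(n_iter):
        for k in range(K):                        # within-chain Metropolis moves
            trial = ms[k] + step * temps[k] ** 0.5 * rng.standard_normal(ms[k].size)
            lp_t = log_post(trial)
            if np.log(rng.random()) < (lp_t - lps[k]) / temps[k]:
                ms[k], lps[k] = trial, lp_t
        k = rng.integers(K - 1)                   # propose one neighbour swap
        log_a = (lps[k + 1] - lps[k]) * (1 / temps[k] - 1 / temps[k + 1])
        if np.log(rng.random()) < log_a:
            ms[k], ms[k + 1] = ms[k + 1], ms[k]
            lps[k], lps[k + 1] = lps[k + 1], lps[k]
        samples.append(ms[0].copy())              # keep only the target chain
    return np.array(samples)
```

From such an ensemble of polygons, an occupancy map follows by rasterizing each sampled polygon and averaging the resulting masks, which is one way to obtain the "hazy" cross section described above.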



Geophysics ◽  
2020 ◽  
Vol 85 (6) ◽  
pp. G129-G141
Author(s):  
Diego Takahashi ◽  
Vanderlei C. Oliveira Jr. ◽  
Valéria C. F. Barbosa

We have developed an efficient and very fast equivalent-layer technique for gravity data processing by modifying an iterative method, grounded on an excess-mass constraint, that does not require the solution of linear systems. Taking advantage of the symmetric block-Toeplitz Toeplitz-block (BTTB) structure of the sensitivity matrix, which arises when regular grids of observation points and equivalent sources (point masses) are used to set up a fictitious equivalent layer, we develop an algorithm that greatly reduces the computational complexity and the RAM needed to estimate a 2D mass distribution over the equivalent layer. The symmetric BTTB matrix is fully defined by the elements of the first column of the sensitivity matrix, which, in turn, can be embedded into a symmetric block circulant with circulant blocks (BCCB) matrix. Likewise, only the first column of the BCCB matrix is needed to reconstruct the full sensitivity matrix completely. From the first column of the BCCB matrix, its eigenvalues can be calculated with the 2D fast Fourier transform (2D FFT) and then used to readily compute the matrix-vector product of the forward modeling in the fast equivalent-layer technique. As a result, our method is efficient for processing very large data sets. Tests with synthetic data demonstrate the ability of our method to satisfactorily upward- and downward-continue gravity data. Our results show very small border effects and noise amplification compared with those produced by the classic approach in the Fourier domain. In addition, they show that, whereas the running time of our method is [Formula: see text] s for processing [Formula: see text] observations, the fast equivalent-layer technique used [Formula: see text] s with [Formula: see text]. A test with field data from the Carajás Province, Brazil, illustrates the low computational cost of our method when processing a large data set composed of [Formula: see text] observations.
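The core trick, a fast matrix-vector product with the BTTB sensitivity via its BCCB embedding and the 2D FFT, can be sketched in a few lines of NumPy. The kernel function, grid spacing, and layer depth in the usage comment are hypothetical, and physical constants are omitted.

```python
import numpy as np

def bttb_matvec(kernel, v2d):
    """Fast product of a BTTB matrix with a vector via BCCB embedding (sketch).

    kernel(dy, dx) returns the matrix entry for two grid nodes separated by
    (dy, dx) grid steps; v2d is the vector reshaped onto the (ny, nx) grid.
    Cost is O(N log N) instead of the O(N^2) dense product.
    """
    ny, nx = v2d.shape
    dy = np.arange(2 * ny); dy[dy >= ny] -= 2 * ny   # signed lags 0..ny-1, -ny..-1
    dx = np.arange(2 * nx); dx[dx >= nx] -= 2 * nx
    c = kernel(dy[:, None], dx[None, :])             # first column of the BCCB
    eig = np.fft.fft2(c)                             # BCCB eigenvalues via 2D FFT
    v_pad = np.zeros((2 * ny, 2 * nx))
    v_pad[:ny, :nx] = v2d                            # zero-pad to the BCCB size
    out = np.fft.ifft2(eig * np.fft.fft2(v_pad))     # circular convolution
    return np.real(out[:ny, :nx])                    # extract the BTTB block

# Example (hypothetical): point-mass kernel for an equivalent layer 300 m
# below the observation plane on a 100 m grid, constants dropped:
# kern = lambda iy, ix: 300.0 / ((100.0 * ix) ** 2 + (100.0 * iy) ** 2
#                                + 300.0 ** 2) ** 1.5
# g = bttb_matvec(kern, m2d)   # forward modelling: sensitivity times masses
```

Only the (2ny, 2nx) kernel array and a few FFT buffers are ever stored, which is where the quoted savings in RAM relative to holding the dense sensitivity matrix come from.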


