Tomographic filtering via the generalized inverse: a way to account for seismic data uncertainty

2020 ◽  
Vol 223 (1) ◽  
pp. 254-269
Author(s):  
Roman Freissler ◽  
Christophe Zaroli ◽  
Sophie Lambotte ◽  
Bernhard S A Schuberth

SUMMARY Tomographic-geodynamic model comparisons are a key component in studies of the present-day state and evolution of Earth’s mantle. To account for the limited seismic resolution, ‘tomographic filtering’ of the geodynamically predicted mantle structures is a standard processing step in this context. The filtered model provides valuable information on how heterogeneities are smeared and modified in amplitude given the available seismic data and underlying inversion strategy. An important aspect that has so far not been taken into account is the effect of data uncertainties. We present a new method for ‘tomographic filtering’ in which it is possible to include the effects of random and systematic errors in the seismic measurements and to analyse the associated uncertainties in the tomographic model space. The ‘imaged’ model is constructed by computing the generalized-inverse projection (GIP) of synthetic data calculated in an earth model of choice. An advantage of this approach is that a reparametrization onto the tomographic grid can be avoided, depending on how the synthetic data are calculated. To demonstrate the viability of the method, we compute traveltimes in an existing mantle circulation model (MCM), add specific realizations of random seismic ‘noise’ to the synthetic data and apply the generalized inverse operator of a recent Backus–Gilbert-type global S-wave tomography. GIP models based on different noise realizations show a significant variability of the shape and amplitude of seismic anomalies. This highlights the importance of interpreting tomographic images in a prudent and cautious manner. Systematic errors, such as event mislocation or imperfect crustal corrections, can be investigated by introducing an additional term to the noise component so that the resulting noise distributions are biased. In contrast to Gaussian zero-mean noise, this leads to a bias in model space; that is, the mean of all GIP realizations is also non-zero.
Knowledge of the statistical properties of model uncertainties together with tomographic resolution is crucial for obtaining meaningful estimates of Earth’s present-day thermodynamic state. A practicable treatment of error propagation and uncertainty quantification will therefore be increasingly important, especially in view of geodynamic inversions that aim at ‘retrodicting’ past mantle evolution based on tomographic images.
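The noise-propagation idea in this summary can be sketched numerically. The following minimal numpy toy is an assumption-laden stand-in, not the paper's operators: a random matrix plays the role of the tomographic operator and its Moore-Penrose pseudoinverse plays the role of the generalized inverse (the actual G-dagger comes from a SOLA Backus–Gilbert inversion). Because the toy operator has full column rank, the noise-free GIP model coincides with the input model, which a real, blurring resolution operator would not.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for the tomographic operators (an assumption: the real
# G-dagger comes from a SOLA Backus-Gilbert inversion, not a pseudoinverse).
n_data, n_model = 200, 50
G = rng.standard_normal((n_data, n_model))
Gdag = np.linalg.pinv(G)                  # generalized inverse (n_model x n_data)

m_true = rng.standard_normal(n_model)     # 'geodynamic' input model
d_clean = G @ m_true                      # synthetic noise-free traveltime data

# GIP of the noise-free data: the 'tomographically filtered' model
m_filtered = Gdag @ d_clean

# Random zero-mean noise propagates into a spread of GIP models ...
sigma = 0.1
realizations = np.array([
    Gdag @ (d_clean + sigma * rng.standard_normal(n_data))
    for _ in range(500)
])

# ... whereas a systematic (biased) error shifts the mean in model space
bias = 0.05 * np.ones(n_data)             # e.g. imperfect crustal corrections
m_biased = Gdag @ (d_clean + bias)
```

The mean of the 500 zero-mean realizations converges to the noise-free GIP model, while the biased data yield a GIP model shifted away from it, mirroring the zero-mean versus biased-noise distinction drawn in the summary.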

2020 ◽  
Author(s):  
Bernhard S.A. Schuberth ◽  
Roman Freissler ◽  
Christophe Zaroli ◽  
Sophie Lambotte

For a comprehensive link between seismic tomography and geodynamic models, uncertainties in the seismic model space play a non-negligible role. More specifically, knowledge of the tomographic uncertainties is important for obtaining meaningful estimates of the present-day thermodynamic state of Earth's mantle, which form the basis of retrodictions of past mantle evolution using the geodynamic adjoint method. A standard tool in tomographic-geodynamic model comparisons nowadays is tomographic filtering of mantle circulation models using the resolution operator R associated with the particular seismic inversion of interest. However, in this classical approach it is not possible to consider tomographic uncertainties and their impact on the geodynamic interpretation.

Here, we present a new method for 'filtering' synthetic Earth models, which makes use of the generalised inverse operator G†, instead of using R. In our case, G† is taken from a recent global SOLA Backus–Gilbert S-wave tomography. In contrast to classical tomographic filtering, the 'imaged' model is constructed by computing the Generalised-Inverse Projection (GIP) of synthetic data calculated in an Earth model of choice. This way, it is possible to include the effects of noise in the seismic data and thus to analyse uncertainties in the resulting model parameters. In order to demonstrate the viability of the method, we compute a set of travel times in an existing mantle circulation model, add specific realisations of Gaussian, zero-mean seismic noise to the synthetic data and apply G†.

Our results show that the resulting GIP model without noise is equivalent to the mean model of all GIP realisations from the suite of synthetic 'noisy' data and also closely resembles the model tomographically filtered using R. Most importantly, GIP models that include noise in the data show a significant variability of the shape and amplitude of seismic anomalies in the mantle. The significant differences between the various GIP realisations highlight the importance of interpreting and assessing tomographic images in a prudent and cautious manner. With the GIP approach, we can moreover investigate the effect of systematic errors in the data, which we demonstrate by adding an extra term to the noise component that aims at mimicking the effects of uncertain crustal corrections. In our presentation, we will finally discuss ways to construct the model covariance matrix based on the GIP approach and point out possible research directions on how to make use of this information in future geodynamic modelling efforts.


Geophysics ◽  
2016 ◽  
Vol 81 (6) ◽  
pp. A17-A21 ◽  
Author(s):  
Juan I. Sabbione ◽  
Mauricio D. Sacchi

The coefficients that synthesize seismic data via the hyperbolic Radon transform (HRT) are estimated by solving a linear-inverse problem. In the classical HRT, the computational cost of the inverse problem is proportional to the size of the data and the number of Radon coefficients. We have developed a strategy that significantly speeds up the implementation of time-domain HRTs. For this purpose, we have defined a restricted model space of coefficients applying hard thresholding to an initial low-resolution Radon gather. Then, an iterative solver that operated on the restricted model space was used to estimate the group of coefficients that synthesized the data. The method is illustrated with synthetic data and tested with a marine data example.
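The restricted-model-space strategy can be illustrated with a toy linear operator in place of the actual hyperbolic Radon transform (an assumption; the real operator sums along hyperbolic trajectories): a cheap low-resolution estimate is hard-thresholded, and the solver then works only on the surviving coefficients.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in linear operator L mapping Radon coefficients to data, d = L m
# (an assumption: the real HRT operator integrates along hyperbolic paths).
n_data, n_coef = 300, 100
L = rng.standard_normal((n_data, n_coef))

m_true = np.zeros(n_coef)
m_true[[5, 40, 77]] = [2.0, -1.5, 1.8]    # a sparse Radon gather
d = L @ m_true

# 1) cheap low-resolution estimate (scaled adjoint), 2) hard thresholding
m_lowres = L.T @ d / n_data
keep = np.abs(m_lowres) > 0.3 * np.abs(m_lowres).max()

# 3) solve the inverse problem restricted to the surviving coefficients only
m_rest = np.zeros(n_coef)
m_rest[keep], *_ = np.linalg.lstsq(L[:, keep], d, rcond=None)
```

The restricted solve involves only the thresholded columns of the operator, which is where the speed-up comes from: the cost scales with the number of surviving coefficients rather than the full model space.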


Geosciences ◽  
2019 ◽  
Vol 9 (1) ◽  
pp. 45
Author(s):  
Marwan Charara ◽  
Christophe Barnes

Full-waveform inversion for borehole seismic data is an ill-posed problem, and constraining the problem is crucial. Constraints can be imposed on the data and model space through covariance matrices. Usually, they are set to a diagonal matrix. For the data space, signal polarization information can be used to evaluate the data uncertainties. The inversion forces the synthetic data to fit the polarization of the observed data. A synthetic inversion of 2D-2C data estimating a 1D elastic model shows a clear improvement, especially at the level of the receivers. For the model space, horizontal and vertical spatial correlations using a Laplace distribution can be used to fill the model space covariance matrix. This approach reduces the degrees of freedom of the inverse problem, which can be quantitatively evaluated. Strong horizontal spatial correlation distances favor a tabular geological model whenever it does not contradict the data. Relaxing the spatial correlation distances from large to small during the iterative inversion process allows the recovery of geological objects of correspondingly smaller size, which regularizes the inverse problem. Synthetic constrained and unconstrained inversions for 2D-2C crosswell data show a clear improvement of the inversion results when constraints are used.
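The Laplace-correlated model covariance described here is straightforward to construct; the depth grid, standard deviation and correlation distance below are illustrative assumptions:

```python
import numpy as np

# Model-space covariance for a 1-D depth grid with Laplace (exponential)
# spatial correlation: C_ij = sigma^2 * exp(-|z_i - z_j| / L_corr).
z = np.arange(0.0, 1000.0, 10.0)          # depths in metres (assumed grid)
sigma = 150.0                             # prior standard deviation, m/s
L_corr = 200.0                            # correlation distance, m

C_m = sigma**2 * np.exp(-np.abs(z[:, None] - z[None, :]) / L_corr)

# A large L_corr favours tabular (layered) models; shrinking it during the
# iterations admits progressively smaller geological objects. The reduction
# in degrees of freedom can be quantified from the eigenvalue spectrum,
# e.g. via the participation ratio:
eigvals = np.linalg.eigvalsh(C_m)
n_eff = eigvals.sum()**2 / (eigvals**2).sum()
```

With strong correlation, `n_eff` is far below the nominal number of grid points, which is one concrete way to read the abstract's claim that the constraint "reduces the degrees of freedom" in a quantifiable manner.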


Geophysics ◽  
1993 ◽  
Vol 58 (6) ◽  
pp. 873-882 ◽  
Author(s):  
Roelof Jan Versteeg

To get a correct earth image from seismic data acquired over complex structures, it is essential to use prestack depth migration. A necessary condition for obtaining a correct image is that the prestack depth migration is done with an accurate velocity model. In cases where we need to use prestack depth migration, determination of such a model using conventional methods does not give satisfactory results. Thus, new iterative methods for velocity model determination have been developed. The convergence of these methods can be accelerated by defining constraints on the model in such a way that the method only looks for those components of the true earth velocity field that influence the migrated image. In order to determine these components, the sensitivity of the prestack depth migration result to the velocity model is examined using a complex synthetic data set (the Marmousi data set) for which the exact model is known. The images obtained with increasingly smoothed versions of the true model are compared, and it is shown that the minimal spatial wavelength that needs to be in the model to obtain an accurate depth image from the data set is of the order of 200 m. The model space that has to be examined to find an accurate velocity model from complex seismic data can thus be constrained. This will increase the speed and probability of convergence of iterative velocity model determination methods.
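The notion of a minimal spatial wavelength in the velocity model can be made concrete with a spectral low-pass filter. The 1-D profile and its two sinusoidal components below are illustrative assumptions (Marmousi itself is 2-D and far more complex); only the 200 m cutoff is taken from the abstract:

```python
import numpy as np

# Synthetic 1-D velocity profile with 75 m and 600 m wavelength components
# (illustrative assumption, not the Marmousi model).
dz = 4.0                                   # grid spacing, m
z = np.arange(0.0, 3000.0, dz)
v = 1500 + 100 * np.sin(2 * np.pi * z / 75) + 50 * np.sin(2 * np.pi * z / 600)

# Keep only spatial wavelengths longer than the ~200 m threshold
cutoff = 200.0
V = np.fft.rfft(v)
k = np.fft.rfftfreq(len(v), d=dz)          # spatial frequency, cycles/m
V[k > 1.0 / cutoff] = 0.0                  # remove wavelengths shorter than 200 m
v_smooth = np.fft.irfft(V, n=len(v))
```

The smoothed profile retains the 600 m structure and discards the 75 m detail, mimicking the "increasingly smoothed versions of the true model" compared in the study.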


Geophysics ◽  
2001 ◽  
Vol 66 (3) ◽  
pp. 871-882 ◽  
Author(s):  
D. Lebrun ◽  
V. Richard ◽  
D. Mace ◽  
M. Cuer

Acquisition of the full elastic response (compressional and shear) of the subsurface is an important technology in the seismic industry because of its potential to improve the quality of seismic data and to infer accurate information about rock properties (fluid type and rock lithology). In the framework of 3-D propagation in 1-D media, we propose a computational tool to analyze the information about elastic parameters contained in the amplitudes of reflected waves with offset. The approach is based on singular value decomposition (SVD) analysis of the linearized elastic inversion problem and can be applied to any particular seismic data. We applied this tool to examine the type of information in the model space that can be retrieved from sea‐bottom multicomponent measurements. The results are compared with those obtained from conventional streamer acquisition techniques. We also present multiparameter linearized inversion results obtained from synthetic data that illustrate the resolution of elastic parameters. This approach allows us to investigate the reliability of the elastic parameters estimated for different offset ranges, wave modes, data types, and noise levels involved in data space.
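The SVD machinery behind such an analysis fits in a few lines. The operator below is a random stand-in whose columns are scaled to mimic unequal sensitivity to the three elastic contrasts; all numbers are assumptions, not the paper's linearized reflection operator:

```python
import numpy as np

rng = np.random.default_rng(2)

# Linearized operator G (data = G m) for three elastic parameters, with
# columns scaled to mimic unequal sensitivities (illustrative assumption).
n_data, n_param = 60, 3                    # e.g. Vp, Vs, density contrasts
G = rng.standard_normal((n_data, n_param)) * np.array([1.0, 0.3, 0.05])

U, s, Vt = np.linalg.svd(G, full_matrices=False)

# The condition number measures how data noise amplifies into the model;
# the rows of Vt are the best- to worst-resolved parameter combinations.
cond = s[0] / s[-1]

# Truncated resolution matrix R = V_k V_k^T, keeping singular values above
# a noise-dependent cutoff
k = int(np.sum(s > 0.1 * s[0]))
R = Vt[:k].T @ Vt[:k]
```

Comparing the singular spectra of operators built for different acquisition geometries (sea-bottom multicomponent versus streamer) is precisely the kind of comparison the abstract describes: better-conditioned operators retrieve more independent combinations of elastic parameters at a given noise level.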


Geophysics ◽  
2006 ◽  
Vol 71 (5) ◽  
pp. U67-U76 ◽  
Author(s):  
Robert J. Ferguson

The possibility of improving regularization/datuming of seismic data is investigated by treating wavefield extrapolation as an inversion problem. Weighted, damped least squares is then used to produce the regularized/datumed wavefield. Regularization/datuming is extremely costly because it requires computing the Hessian, so an efficient approximation is introduced. Approximation is achieved by computing a limited number of diagonals in the operators involved. Real and synthetic data examples demonstrate the utility of this approach. For synthetic data, regularization/datuming is demonstrated for large extrapolation distances using a highly irregular recording array. Without approximation, regularization/datuming returns a regularized wavefield with reduced operator artifacts when compared to a nonregularizing method such as generalized phase shift plus interpolation (PSPI). Approximate regularization/datuming returns a regularized wavefield at approximately two orders of magnitude less cost, but it is dip limited, though in a controllable way, compared to the full method. The Foothills structural data set, a freely available data set from the Rocky Mountains of Canada, demonstrates application to real data. The data have highly irregular sampling along the shot coordinate, and they suffer from significant near-surface effects. Approximate regularization/datuming returns common receiver data that are superior in appearance compared to conventional datuming.
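The weighted, damped least-squares step and the cost-saving idea of truncating the Hessian can be sketched as follows. The operator, weights and the one-diagonal (Jacobi-like) truncation below are illustrative assumptions, cruder than the paper's limited-number-of-diagonals scheme:

```python
import numpy as np

rng = np.random.default_rng(3)

# Weighted, damped least squares: m = (A^T W A + eps I)^{-1} A^T W d.
# A and W are random stand-ins for the extrapolation operator and data
# weights (illustrative assumptions).
n_data, n_model = 120, 40
A = rng.standard_normal((n_data, n_model))
W = np.diag(rng.uniform(0.5, 1.5, n_data))  # data weights
d = A @ rng.standard_normal(n_model)
eps = 1e-2                                  # damping

H = A.T @ W @ A                             # full Hessian: the expensive part
rhs = A.T @ W @ d
m_full = np.linalg.solve(H + eps * np.eye(n_model), rhs)

# One-diagonal truncation: O(n) storage and solve, at reduced accuracy
# (the paper keeps several diagonals; one is the crudest case)
m_diag = rhs / (np.diag(H) + eps)
```

The trade-off in the abstract appears directly: the truncated solve is far cheaper but fits the data less well, analogous to the dip-limited behaviour of the approximate scheme.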


2014 ◽  
Vol 172 (2) ◽  
pp. 389-413 ◽  
Author(s):  
Juan Zhao ◽  
Laurent Moretti ◽  
Anne Mangeney ◽  
Eléonore Stutzmann ◽  
Hiroo Kanamori ◽  
...  

Geophysics ◽  
2006 ◽  
Vol 71 (5) ◽  
pp. C81-C92 ◽  
Author(s):  
Helene Hafslund Veire ◽  
Hilde Grude Borgos ◽  
Martin Landrø

Effects of pressure and fluid saturation can have the same degree of impact on seismic amplitudes and differential traveltimes in the reservoir interval; thus, they are often inseparable by analysis of a single stacked seismic data set. In such cases, time-lapse AVO analysis offers an opportunity to discriminate between the two effects. We quantify the uncertainty in the estimates in order to utilize information about pressure- and saturation-related changes in reservoir modeling and simulation. One way of analyzing uncertainties is to formulate the problem in a Bayesian framework. Here, the solution of the problem is represented by a probability density function (PDF), providing estimates of the uncertainties as well as direct estimates of the properties. A stochastic model for estimation of pressure and saturation changes from time-lapse seismic AVO data is investigated within a Bayesian framework. Well-known rock physical relationships are used to set up a prior stochastic model. PP reflection coefficient differences are used to establish a likelihood model for linking reservoir variables and time-lapse seismic data. The methodology incorporates correlation between different variables of the model as well as spatial dependencies for each of the variables. In addition, information about possible bottlenecks causing large uncertainties in the estimates can be identified through sensitivity analysis of the system. The method has been tested on 1D synthetic data and on field time-lapse seismic AVO data from the Gullfaks Field in the North Sea.
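For the linear-Gaussian special case, the Bayesian posterior invoked here has a closed form. The two-parameter model (pressure change, saturation change) and the operator G below are illustrative assumptions rather than the paper's rock-physics relations:

```python
import numpy as np

rng = np.random.default_rng(4)

# Linear-Gaussian Bayesian update: prior m ~ N(m0, Cm), data d = G m + e
# with e ~ N(0, Cd). The posterior is Gaussian with
#   C_post = (G^T Cd^{-1} G + Cm^{-1})^{-1}
#   m_post = m0 + C_post G^T Cd^{-1} (d - G m0)
# G, Cm, Cd and the noise level are illustrative assumptions.
G = rng.standard_normal((30, 2))           # maps (dP, dS) to AVO differences
m0 = np.zeros(2)
Cm = np.diag([1.0, 1.0])                   # prior covariance
Cd = 0.2**2 * np.eye(30)                   # data covariance

m_true = np.array([0.8, -0.3])             # 'true' pressure/saturation change
d = G @ m_true + 0.2 * rng.standard_normal(30)

Cd_inv = np.linalg.inv(Cd)
C_post = np.linalg.inv(G.T @ Cd_inv @ G + np.linalg.inv(Cm))
m_post = m0 + C_post @ G.T @ Cd_inv @ (d - G @ m0)

# Posterior standard deviations quantify the estimation uncertainty directly
sigmas = np.sqrt(np.diag(C_post))
```

The posterior variances shrink relative to the prior wherever the data are informative, which is the quantitative sense in which the PDF "provides estimates of the uncertainties as well as direct estimates of the properties".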


Geophysics ◽  
2006 ◽  
Vol 71 (3) ◽  
pp. V79-V86 ◽  
Author(s):  
Hakan Karsli ◽  
Derman Dondurur ◽  
Günay Çifçi

Time-dependent amplitude and phase information of stacked seismic data are processed independently using complex trace analysis in order to facilitate interpretation by improving resolution and decreasing random noise. We represent seismic traces using their envelopes and instantaneous phases obtained by the Hilbert transform. The proposed method reduces the amplitudes of the low-frequency components of the envelope, while preserving the phase information. Several tests are performed in order to investigate the behavior of the present method for resolution improvement and noise suppression. Applications on both 1D and 2D synthetic data show that the method is capable of reducing the amplitudes and temporal widths of the side lobes of the input wavelets, and hence, the spectral bandwidth of the input seismic data is enhanced, resulting in an improvement in the signal-to-noise ratio. The bright-spot anomalies observed on the stacked sections become clearer because the output seismic traces have a simplified appearance allowing an easier data interpretation. We recommend applying this simple signal processing for signal enhancement prior to interpretation, especially for single channel and low-fold seismic data.
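The complex-trace attributes used here follow directly from the analytic signal. The sketch below builds it with an FFT (a standard discrete equivalent of the Hilbert-transform construction) and applies it to an assumed Ricker-like test trace:

```python
import numpy as np

# Analytic signal z = trace + i * Hilbert(trace): envelope = |z(t)| is the
# instantaneous amplitude, arg z(t) the instantaneous phase.
def analytic_signal(x):
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0                # double the positive frequencies
    if n % 2 == 0:
        h[n // 2] = 1.0                    # keep Nyquist unscaled (even n)
    return np.fft.ifft(X * h)

# Illustrative 30 Hz Ricker wavelet as a stand-in seismic trace
t = np.linspace(-0.1, 0.1, 501)
trace = (1 - 2 * (np.pi * 30 * t)**2) * np.exp(-(np.pi * 30 * t)**2)

z = analytic_signal(trace)
envelope = np.abs(z)                       # instantaneous amplitude
phase = np.angle(z)                        # instantaneous phase
```

The proposed method then operates on `envelope` (attenuating its low-frequency components) while carrying `phase` through unchanged, before recombining the two into an output trace.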


Geophysics ◽  
2019 ◽  
Vol 84 (2) ◽  
pp. N29-N40
Author(s):  
Modeste Irakarama ◽  
Paul Cupillard ◽  
Guillaume Caumon ◽  
Paul Sava ◽  
Jonathan Edwards

Structural interpretation of seismic images can be highly subjective, especially in complex geologic settings. A single seismic image will often support multiple geologically valid interpretations. However, it is usually difficult to determine which of those interpretations are more likely than others. We have referred to this problem as structural model appraisal. We have developed the use of misfit functions to rank and appraise multiple interpretations of a given seismic image. Given a set of possible interpretations, we compute synthetic data for each structural interpretation, and then we compare these synthetic data against observed seismic data; this allows us to assign a data-misfit value to each structural interpretation. Our aim is to find data-misfit functions that enable a ranking of interpretations. To do so, we formalize the problem of appraising structural interpretations using seismic data and we derive a set of conditions to be satisfied by the data-misfit function for a successful appraisal. We investigate vertical seismic profiling (VSP) and surface seismic configurations. An application of the proposed method to a realistic synthetic model shows promising results for appraising structural interpretations using VSP data, provided that the target region is well-illuminated. However, we find appraising structural interpretations using surface seismic data to be more challenging, mainly due to the difficulty of computing phase-shift data misfits.
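The ranking step can be sketched with a generic linear forward model standing in for the VSP/surface-seismic simulations (an assumption; the paper computes actual synthetic waveforms per interpretation):

```python
import numpy as np

rng = np.random.default_rng(5)

# Generic linear 'forward model' standing in for the seismic synthetics
# (illustrative assumption).
def forward(model, G):
    return G @ model

G = rng.standard_normal((80, 10))
m_obs = rng.standard_normal(10)            # the 'true' structure
d_obs = forward(m_obs, G) + 0.05 * rng.standard_normal(80)

# Candidate interpretations: small to large perturbations of the truth
candidates = [m_obs + eps * rng.standard_normal(10) for eps in (0.05, 0.3, 1.0)]

# Assign each interpretation an L2 data-misfit value and rank (best first)
misfits = [np.linalg.norm(forward(m, G) - d_obs) - 0.0 for m in candidates]
ranking = np.argsort(misfits)
```

For a successful appraisal, the misfit function must order interpretations by their geological plausibility, which is exactly the set of conditions the paper derives; the phase-shift difficulty with surface data arises because waveform misfits are not this well behaved.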

