Localized anisotropic tomography with well information in VTI media

Geophysics ◽  
2010 ◽  
Vol 75 (5) ◽  
pp. D37-D45 ◽  
Author(s):  
Andrey Bakulin ◽  
Marta Woodward ◽  
Dave Nichols ◽  
Konstantin Osypov ◽  
Olga Zdraveva

We develop a concept of localized seismic grid tomography constrained by well information and apply it to building vertically transversely isotropic (VTI) velocity models in depth. The goal is to use a highly automated migration velocity analysis to build anisotropic models that combine optimal image focusing with accurate depth positioning in one step. We localize tomography to a limited volume around the well and jointly invert the surface seismic and well data. Well information is propagated into the local volume by using the method of preconditioning, whereby model updates are shaped to follow geologic layers with spatial smoothing constraints. We analyze our concept with a synthetic data example of anisotropic tomography applied to a 1D VTI model. We demonstrate four cases of introducing additional information. In the first case, vertical velocity is assumed to be known, and the tomography inverts only for Thomsen’s δ and ε profiles using surface seismic data alone. In the second case, tomography simultaneously inverts for all three VTI parameters, including vertical velocity, using a joint data set that consists of surface seismic data and vertical check-shot traveltimes. In the third and fourth cases, sparse depth markers and walkaway vertical seismic profiling (VSP) are used, respectively, to supplement the seismic data. For all four examples, tomography reliably recovers the anisotropic velocity field up to a vertical resolution comparable to that of the well data. Even though walkaway VSP has the additional dimension of angle or offset, it offers no further increase in this resolution limit. Anisotropic tomography with well constraints has multiple advantages over other approaches and deserves a place in the portfolio of model-building tools.
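
As background for the Thomsen parameterization these VTI models use, here is a minimal sketch of the weak-anisotropy P-wave phase velocity (Thomsen, 1986); the numerical values of the vertical velocity, δ, and ε are illustrative stand-ins, not values from the paper:

```python
import numpy as np

def vti_phase_velocity(theta, v0, delta, epsilon):
    """Weak-anisotropy P-wave phase velocity for VTI media (Thomsen, 1986).

    theta is the phase angle measured from the (vertical) symmetry axis.
    """
    s = np.sin(theta)
    c = np.cos(theta)
    return v0 * (1.0 + delta * s**2 * c**2 + epsilon * s**4)

# Vertical propagation recovers the vertical velocity exactly, which is
# why check-shot traveltimes constrain v0; horizontal propagation is
# controlled by epsilon.
v0, delta, epsilon = 2000.0, 0.05, 0.15          # illustrative values
v_vert = vti_phase_velocity(0.0, v0, delta, epsilon)        # -> 2000.0
v_horz = vti_phase_velocity(np.pi / 2, v0, delta, epsilon)  # -> 2300.0
```

This separation of roles is what makes the four cases above differ: without well data, only the moveout-sensitive combinations of δ and ε are constrained.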

Geophysics ◽  
2010 ◽  
Vol 75 (4) ◽  
pp. D27-D36 ◽  
Author(s):  
Andrey Bakulin ◽  
Marta Woodward ◽  
Dave Nichols ◽  
Konstantin Osypov ◽  
Olga Zdraveva

Tilted transverse isotropy (TTI) is increasingly recognized as a more geologically plausible description of anisotropy in sedimentary formations than vertical transverse isotropy (VTI). Although model-building approaches for VTI media are well understood, similar approaches for TTI media are in their infancy, even when the symmetry-axis direction is assumed known. We describe a tomographic approach that builds localized anisotropic models by jointly inverting surface-seismic and well data. We present a synthetic data example of anisotropic tomography applied to a layered TTI model with a symmetry-axis tilt of 45 degrees. We demonstrate three scenarios for constraining the solution. In the first scenario, velocity along the symmetry axis is known and tomography inverts for Thomsen’s δ and ε parameters. In the second scenario, tomography inverts for δ, ε, and velocity, using surface-seismic data and vertical check-shot traveltimes. In contrast to the VTI case, both these inversions are nonunique. To combat nonuniqueness, in the third scenario, we supplement check-shot and seismic data with the δ profile from an offset well. This allows recovery of the correct profiles for velocity along the symmetry axis and ε. We conclude that TTI is more ambiguous than VTI for model building. Additional well data or rock-physics assumptions may be required to constrain the tomography and arrive at geologically plausible TTI models. Furthermore, we demonstrate that VTI models with atypical Thomsen parameters can also fit the same joint seismic and check-shot data set. In this case, although imaging with VTI models can focus the TTI data and match vertical event depths, it leads to substantial lateral mispositioning of the reflections.


2019 ◽  
Vol 38 (11) ◽  
pp. 872a1-872a9 ◽  
Author(s):  
Mauricio Araya-Polo ◽  
Stuart Farris ◽  
Manuel Florez

Exploration seismic data are heavily manipulated before human interpreters are able to extract meaningful information regarding subsurface structures. This manipulation adds modeling and human biases and is limited by methodological shortcomings. Alternatively, using seismic data directly is becoming possible thanks to deep learning (DL) techniques. A DL-based workflow is introduced that uses analog velocity models and realistic raw seismic waveforms as input and produces subsurface velocity models as output. When insufficient data are used for training, DL algorithms tend to overfit or fail. Gathering large amounts of labeled and standardized seismic data sets is not straightforward. This shortage of quality data is addressed by building a generative adversarial network (GAN) to augment the original training data set, which is then used by DL-driven seismic tomography as input. The DL tomographic operator predicts velocity models with high statistical and structural accuracy after being trained with GAN-generated velocity models. Beyond the field of exploration geophysics, the use of machine learning in earth science is challenged by the lack of labeled data or properly interpreted ground truth, since we seldom know what truly exists beneath the earth's surface. The unsupervised approach (using GANs to generate labeled data) illustrates a way to mitigate this problem and opens geology, geophysics, and planetary sciences to more DL applications.
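
The augmentation step can be caricatured as follows. This sketch substitutes a cheap random layered-model sampler for the trained GAN generator the authors use; `make_layered_model` and all sizes are hypothetical stand-ins chosen only to show the shape of the workflow:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_layered_model(n_depth=100, n_layers=5):
    """Random layered 1D velocity model -- a stand-in for samples that a
    trained GAN generator would produce in the paper's workflow."""
    boundaries = np.sort(
        rng.choice(np.arange(1, n_depth), n_layers - 1, replace=False))
    velocities = rng.uniform(1500.0, 4500.0, n_layers)  # m/s, plausible range
    model = np.empty(n_depth)
    for seg, v in zip(np.split(np.arange(n_depth), boundaries), velocities):
        model[seg] = v
    return model

# Augment a small labeled set with generated models before training the
# DL tomographic operator on (waveform, velocity-model) pairs.
originals = np.stack([make_layered_model() for _ in range(8)])
generated = np.stack([make_layered_model() for _ in range(32)])
training_set = np.concatenate([originals, generated])   # 40 models total
```

In the actual workflow the generator is adversarially trained so that its samples are statistically and structurally consistent with the original models, which a naive sampler like this cannot guarantee.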


Geophysics ◽  
2006 ◽  
Vol 71 (5) ◽  
pp. U67-U76 ◽  
Author(s):  
Robert J. Ferguson

The possibility of improving regularization/datuming of seismic data is investigated by treating wavefield extrapolation as an inversion problem. Weighted, damped least squares is then used to produce the regularized/datumed wavefield. Regularization/datuming is extremely costly because of computing the Hessian, so an efficient approximation is introduced. Approximation is achieved by computing a limited number of diagonals in the operators involved. Real and synthetic data examples demonstrate the utility of this approach. For synthetic data, regularization/datuming is demonstrated for large extrapolation distances using a highly irregular recording array. Without approximation, regularization/datuming returns a regularized wavefield with reduced operator artifacts when compared to a nonregularizing method such as generalized phase shift plus interpolation (PSPI). Approximate regularization/datuming returns a regularized wavefield at approximately two orders of magnitude lower cost, but it is dip limited, though in a controllable way, compared with the full method. The Foothills structural data set, a freely available data set from the Rocky Mountains of Canada, demonstrates application to real data. The data have highly irregular sampling along the shot coordinate, and they suffer from significant near-surface effects. Approximate regularization/datuming returns common receiver data that are superior in appearance compared to conventional datuming.
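
The core weighted, damped least-squares step can be sketched on a toy linear system; `G`, `d`, `W`, and the damping `lam` here are illustrative stand-ins, not the paper's extrapolation operators:

```python
import numpy as np

# Toy weighted, damped least squares: m = (G'WG + lam*I)^{-1} G'W d.
G = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
d = np.array([1.0, 2.0, 3.1])      # slightly inconsistent "recorded" data
W = np.eye(3)                      # data weights
lam = 0.01                         # damping

H = G.T @ W @ G                    # the Hessian -- the expensive part
m = np.linalg.solve(H + lam * np.eye(2), G.T @ W @ d)

# The paper's cost saving comes from keeping only a limited number of
# diagonals of H; the crudest version retains just the main diagonal,
# which is cheaper but less accurate (analogous to the dip limitation).
m_cheap = (G.T @ W @ d) / (np.diag(H) + lam)
```

For realistic extrapolation operators, `H` is far too large to form densely, which is why the diagonal-band approximation matters.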


Geophysics ◽  
2006 ◽  
Vol 71 (5) ◽  
pp. C81-C92 ◽  
Author(s):  
Helene Hafslund Veire ◽  
Hilde Grude Borgos ◽  
Martin Landrø

Effects of pressure and fluid saturation can have the same degree of impact on seismic amplitudes and differential traveltimes in the reservoir interval; thus, they are often inseparable by analysis of a single stacked seismic data set. In such cases, time-lapse AVO analysis offers an opportunity to discriminate between the two effects. We quantify the uncertainty in these estimations so that information about pressure- and saturation-related changes can be used reliably in reservoir modeling and simulation. One way of analyzing uncertainties is to formulate the problem in a Bayesian framework. Here, the solution of the problem will be represented by a probability density function (PDF), providing estimations of uncertainties as well as direct estimations of the properties. A stochastic model for estimation of pressure and saturation changes from time-lapse seismic AVO data is investigated within a Bayesian framework. Well-known rock physical relationships are used to set up a prior stochastic model. PP reflection coefficient differences are used to establish a likelihood model for linking reservoir variables and time-lapse seismic data. The methodology incorporates correlation between different variables of the model as well as spatial dependencies for each of the variables. In addition, information about possible bottlenecks causing large uncertainties in the estimations can be identified through sensitivity analysis of the system. The method has been tested on 1D synthetic data and on field time-lapse seismic AVO data from the Gullfaks Field in the North Sea.
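
A minimal caricature of the Bayesian machinery, assuming a single scalar parameter on a grid rather than the paper's multivariate, spatially coupled model; the prior width, observation, and noise level are all invented for illustration:

```python
import numpy as np

# Grid-based Bayesian update: posterior PDF = prior * likelihood, normalized.
x = np.linspace(-1.0, 1.0, 401)            # candidate saturation change
dx = x[1] - x[0]
prior = np.exp(-0.5 * (x / 0.5) ** 2)      # rock-physics-motivated prior
obs, sigma = 0.3, 0.1                      # "observed" AVO attribute, noise
likelihood = np.exp(-0.5 * ((x - obs) / sigma) ** 2)
posterior = prior * likelihood
posterior /= posterior.sum() * dx          # normalize to a PDF

# The PDF delivers both a point estimate and its uncertainty, which is
# the selling point of the Bayesian formulation in the abstract.
map_estimate = x[np.argmax(posterior)]
```

The posterior mode sits between the prior mean and the observation, weighted by their respective uncertainties, exactly the behavior a sensitivity analysis would probe.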


Geophysics ◽  
2017 ◽  
Vol 82 (3) ◽  
pp. R199-R217 ◽  
Author(s):  
Xintao Chai ◽  
Shangxu Wang ◽  
Genyang Tang

Seismic data are nonstationary due to subsurface anelastic attenuation and dispersion effects. These effects, also referred to as the earth’s Q-filtering effects, can diminish seismic resolution. We previously developed a method of nonstationary sparse reflectivity inversion (NSRI) for resolution enhancement, which avoids the intrinsic instability associated with inverse Q filtering and generates superior Q-compensation results. Applying NSRI to data sets that contain multiples (addressing surface-related multiples only) requires a demultiple preprocessing step because NSRI cannot distinguish primaries from multiples and will treat them as interference convolved with incorrect Q values. However, multiples contain information about subsurface properties. To use the information carried by multiples, with the feedback model and NSRI theory, we adapt NSRI to the context of nonstationary seismic data with surface-related multiples. Consequently, not only are the benefits of NSRI (e.g., circumventing the intrinsic instability associated with inverse Q filtering) extended, but multiples are also considered. Our method is limited to a 1D implementation. Theoretical and numerical analyses verify that, given a wavelet, the input Q values primarily affect the inverted reflectivities and exert little effect on the estimated multiples; i.e., multiple estimation need not consider Q-filtering effects explicitly. However, there are benefits to NSRI considering multiples. The periodicity and amplitude of the multiples imply the position of the reflectivities and the amplitude of the wavelet. Multiples assist in overcoming the scaling and shifting ambiguities of conventional problems in which multiples are not considered. Experiments using a 1D algorithm on a synthetic data set, the publicly available Pluto 1.5 data set, and a marine data set support the aforementioned findings and reveal the stability, capabilities, and limitations of the proposed method.
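
For intuition about the instability that inversion-based compensation avoids, here is a sketch of the standard constant-Q amplitude term and the gain a naive inverse-Q filter would have to apply; the frequency, traveltime, and Q value are illustrative:

```python
import numpy as np

# Constant-Q attenuation scales amplitude by exp(-pi * f * t / Q):
# high frequencies decay fastest, which is what lowers resolution.
def q_attenuation(f, t, Q):
    return np.exp(-np.pi * f * t / Q)

a = q_attenuation(50.0, 1.0, 100.0)   # amplitude left after 1 s at 50 Hz
# A naive inverse-Q filter multiplies by 1/a, which amplifies noise at
# that frequency by the same factor -- the instability grows without
# bound as f*t/Q increases, motivating inversion-based approaches.
gain = 1.0 / a
```

At higher frequencies or later times the required gain grows exponentially, so direct inverse filtering must be regularized or band-limited, whereas sparse reflectivity inversion sidesteps the explicit inverse operator.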


Geophysics ◽  
2018 ◽  
Vol 83 (4) ◽  
pp. M41-M48 ◽  
Author(s):  
Hongwei Liu ◽  
Mustafa Naser Al-Ali

The ideal approach for continuous reservoir monitoring allows generation of fast and accurate images to cope with the massive data sets acquired for such a task. Conventionally, rigorous depth-oriented velocity-estimation methods are performed to produce sufficiently accurate velocity models. Unlike the traditional way, the target-oriented imaging technology based on the common-focus point (CFP) theory can be an alternative for continuous reservoir monitoring. The solution is based on a robust data-driven iterative operator updating strategy without deriving a detailed velocity model. The same focusing operator is applied on successive 3D seismic data sets for the first time to generate efficient and accurate 4D target-oriented seismic stacked images from time-lapse field seismic data sets acquired in a CO₂ injection project in Saudi Arabia. Using the focusing operator, target-oriented prestack angle domain common-image gathers (ADCIGs) could be derived to perform amplitude-versus-angle analysis. To preserve the amplitude information in the ADCIGs, an amplitude-balancing factor is applied by embedding a synthetic data set using the real acquisition geometry to remove the geometry imprint artifact. Applying the CFP-based target-oriented imaging to time-lapse data sets revealed changes at the reservoir level in the poststack and prestack time-lapse signals, which is consistent with the CO₂ injection history and rock physics.


Geophysics ◽  
2014 ◽  
Vol 79 (4) ◽  
pp. EN77-EN90 ◽  
Author(s):  
Paolo Bergamo ◽  
Laura Valentina Socco

Surface-wave (SW) techniques are mainly used to retrieve 1D velocity models and are therefore characterized by a 1D approach, which might prove unsatisfactory when relevant 2D effects are present in the investigated subsurface. In the case of sharp and sudden lateral heterogeneities in the subsurface, a strategy to tackle this limitation is to estimate the location of the discontinuities and to separately process seismic traces belonging to quasi-1D subsurface portions. We have turned our attention to methods aimed at locating discontinuities by identifying anomalies in SW propagation and attenuation. The considered methods are the autospectrum computation and the attenuation analysis of Rayleigh waves (AARW). These methods were developed for purposes and/or scales of analysis that are different from those of this work, which aims at detecting and characterizing sharp subvertical discontinuities in the shallow subsurface. We applied both methods to two data sets, synthetic data from a finite-element method simulation and a field data set acquired over a fault system, both presenting an abrupt lateral variation perpendicularly crossing the acquisition line. We also extended the AARW method to the detection of sharp discontinuities from large and multifold data sets, and we tested these novel procedures on the field case. Both methods proved effective for the detection of the discontinuity, by portraying propagation phenomena linked to the presence of the heterogeneity, such as the interference between incident and reflected wavetrains, and energy concentration as well as subsequent decay at the fault location. The procedures we developed for processing multifold seismic data sets proved to be reliable tools for locating and characterizing subvertical sharp heterogeneities.
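
A toy version of the underlying idea, locating a lateral discontinuity from an anomaly in energy along the line; the synthetic energy profile (spike of concentration at the fault, decay beyond it) is an invented stand-in for the propagation phenomena the two methods actually measure:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "SW energy vs. receiver" profile: roughly constant energy,
# with concentration at a fault near receiver 30 (interference of
# incident and reflected wavetrains) and decay past it (transmission
# loss), mimicking the anomalies described in the abstract.
n_rx = 60
fault = 30
energy = np.ones(n_rx) + 0.05 * rng.standard_normal(n_rx)
energy[fault] += 1.0                      # energy concentration spike
energy[fault + 1:] *= 0.4                 # decay beyond the fault

# The largest lateral change in energy flags the discontinuity.
detected = int(np.argmax(np.abs(np.diff(energy))))
```

Real autospectrum or AARW analysis works on full wavefields rather than a scalar energy curve, but the detection logic, finding where propagation attributes change abruptly along the line, is the same.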


2017 ◽  
Vol 5 (3) ◽  
pp. SJ81-SJ90 ◽  
Author(s):  
Kainan Wang ◽  
Jesse Lomask ◽  
Felix Segovia

Well-log-to-seismic tying is a key step in many interpretation workflows for oil and gas exploration. Synthetic seismic traces from the wells are often manually tied to seismic data; this process can be very time consuming and, in some cases, inaccurate. Automatic methods, such as dynamic time warping (DTW), can match synthetic traces to seismic data. Although these methods are extremely fast, they tend to create interval velocities that are not geologically realistic. We have described the modification of DTW to create a blocked dynamic warping (BDW) method. BDW generates an automatic, optimal well tie that honors geologically consistent velocity constraints. Consequently, it results in updated velocities that are more realistic than other methods. BDW constrains the updated velocity to be constant or linearly variable inside each geologic layer. With an optimal correlation between synthetic seismograms and surface seismic data, this algorithm returns an automatically updated time-depth curve and an updated interval velocity model that still retains the original geologic velocity boundaries. In other words, the algorithm finds the optimal solution for tying the synthetic to the seismic data while restricting the interval velocity changes to coincide with the initial input blocking. We demonstrate the application of the BDW technique on a synthetic data example and a field data set.
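
A minimal unconstrained DTW distance, the starting point that BDW then restricts with geologic-layer blocking; the integer sequences are toy stand-ins for a synthetic trace and a seismic trace:

```python
import numpy as np

def dtw_distance(a, b):
    """Minimal dynamic-time-warping distance between two 1D sequences,
    via the standard dynamic-programming recursion."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Each cell extends the cheapest of: step in a, step in b,
            # or a diagonal step in both.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])

d_same = dtw_distance([1, 2, 3], [1, 2, 3])    # identical traces -> 0.0
d_shift = dtw_distance([1, 2, 3], [2, 3, 4])   # shifted copy -> residual cost
```

Plain DTW like this is free to warp anywhere, which is exactly what produces the geologically unrealistic interval velocities; BDW's contribution is to confine the warp so velocity updates stay constant or linear within each input block.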


Geophysics ◽  
1993 ◽  
Vol 58 (1) ◽  
pp. 91-100 ◽  
Author(s):  
Claude F. Lafond ◽  
Alan R. Levander

Prestack depth migration still suffers from the problems associated with building appropriate velocity models. The two main after‐migration, before‐stack velocity analysis techniques currently used, depth focusing and residual moveout correction, have found good use in many applications but have also shown their limitations in the case of very complex structures. To address this issue, we have extended the residual moveout analysis technique to the general case of heterogeneous velocity fields and steep dips, while keeping the algorithm robust enough to be of practical use on real data. Our method is not based on analytic expressions for the moveouts and requires no a priori knowledge of the model, but instead uses geometrical ray tracing in heterogeneous media, layer‐stripping migration, and local wavefront analysis to compute residual velocity corrections. These corrections are back projected into the velocity model along raypaths in a way that is similar to tomographic reconstruction. While this approach is more general than existing migration velocity analysis implementations, it is also much more computer intensive and is best used locally around a particularly complex structure. We demonstrate the technique using synthetic data from a model with strong velocity gradients and then apply it to a marine data set to improve the positioning of a major fault.


Geophysics ◽  
2005 ◽  
Vol 70 (5) ◽  
pp. U51-U65 ◽  
Author(s):  
Stig-Kyrre Foss ◽  
Bjørn Ursin ◽  
Maarten V. de Hoop

We present a method of reflection tomography for anisotropic elastic parameters from PP and PS reflection seismic data. The method is based upon the differential semblance misfit functional in scattering angle and azimuth (DSA) acting on common-image-point gathers (CIGs) to find fitting velocity models. The CIGs are amplitude corrected using a generalized Radon transform applied to the data. Depth consistency between the PP and PS images is enforced by penalizing any mis-tie between imaged key reflectors. The mis-tie is evaluated by means of map migration-demigration applied to the geometric information (times and slopes) contained in the data. In our implementation, we simplify the codepthing approach to zero-scattering-angle data only. The resulting measure is incorporated as a regularization in the DSA misfit functional. We then resort to an optimization procedure, restricting ourselves to transversely isotropic (TI) velocity models. In principle, depending on the available surface-offset range and orientation of reflectors in the subsurface, by combining the DSA with codepthing, the anisotropic parameters for TI models can be determined, provided the orientation of the symmetry axis is known. A proposed strategy is applied to an ocean-bottom-seismic field data set from the North Sea.
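
A sketch of a differential-semblance-style flatness measure on an angle-domain common-image gather; the gather construction is synthetic and the misfit form is a simplification of the DSA functional (no amplitude correction, no azimuth dimension, no codepthing term):

```python
import numpy as np

def dsa_misfit(cig):
    """Differential-semblance-style misfit on a common-image gather of
    shape (n_angles, n_depth): penalize differences between neighboring
    angle traces. A correct velocity model flattens events across angle,
    driving this misfit toward zero."""
    return float(np.sum(np.diff(cig, axis=0) ** 2))

depth = np.arange(100)
event = np.exp(-0.5 * ((depth - 50) / 3.0) ** 2)

# Flat gather: the event images at the same depth for every angle.
flat = np.tile(event, (11, 1))
# Curved gather: the event drifts with angle, as with a wrong model.
curved = np.stack([np.exp(-0.5 * ((depth - 50 - a) / 3.0) ** 2)
                   for a in range(11)])

m_flat = dsa_misfit(flat)      # -> 0.0
m_curved = dsa_misfit(curved)  # > 0, drives the model update
```

Minimizing such a misfit over the anisotropic model parameters, with the codepthing penalty tying PP and PS reflector depths together, is the optimization the abstract describes.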

