3D ray+Born migration/inversion—Part 2: Application to the SEG/EAGE overthrust experiment

Geophysics ◽  
2003 ◽  
Vol 68 (4) ◽  
pp. 1357-1370 ◽  
Author(s):  
Stéphane Operto ◽  
Gilles Lambaré ◽  
Pascal Podvin ◽  
Philippe Thierry

The SEG/EAGE overthrust model is a synthetic onshore velocity model that was used to generate several large synthetic seismic data sets using acoustic finite‐difference modeling. From this database, several realistic subdata sets were extracted and made available for testing 3D processing methods. For example, classic onshore‐type data‐acquisition geometries are available, such as a swath acquisition, which is characterized by a nonuniform distribution of long offsets with azimuth and midpoints. In this paper, we present an application of 2.5D and 3D ray‐Born migration/inversion to several classical data sets from the SEG/EAGE overthrust experiment. The method is formulated as a linearized inversion of the scattered wavefield and provides quantitative estimates of the short-wavelength components of the velocity model. First, we apply a 3D migration/inversion formula formerly developed for marine acquisitions to the swath data set. The migrated sections exhibit significant amplitude artifacts and acquisition footprints, also revealed by the shape of the local spatial resolution filters. From the analysis of these spatial resolution filters, we propose a new formula that significantly improves the migrated dip section. We also present 3D migrated results for the strike section and a small 3D target containing a channel. Finally, the applications demonstrate that the ray+Born migration formula must be adapted to the acquisition geometry to obtain reliable estimates of the true amplitude of the model perturbations. This adaptation is relatively straightforward within the ray+Born formalism and can be guided by the analysis of the resolution operator.
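At its core, ray+Born migration/inversion of this kind reduces to a weighted diffraction stack: for every image point, the scattered data are summed along source-plus-receiver ray traveltimes with weights that compensate for geometrical spreading and the acquisition geometry. The sketch below (Python/NumPy) only illustrates that structure and is not the authors' operator; the array names (`data`, `t_src`, `t_rec`, `weights`) and the nearest-sample interpolation are assumptions.

```python
import numpy as np

def ray_born_image(data, t_src, t_rec, weights, dt):
    """Minimal diffraction-stack sketch of a ray+Born-type migration/inversion.

    data    : (n_shots, n_recs, n_samples) scattered wavefield
    t_src   : (n_shots, nx, nz) ray traveltimes from each source to image points
    t_rec   : (n_recs, nx, nz) ray traveltimes from each receiver to image points
    weights : (n_shots, n_recs, nx, nz) true-amplitude weights (spreading,
              obliquity, acquisition-dependent normalization; assumed given)
    dt      : sample interval (s)
    """
    n_shots, n_recs, n_samp = data.shape
    _, nx, nz = t_src.shape
    image = np.zeros((nx, nz))
    for s in range(n_shots):
        for r in range(n_recs):
            # two-way traveltime to every image point, converted to a sample index
            it = np.rint((t_src[s] + t_rec[r]) / dt).astype(int)
            valid = it < n_samp
            # weighted summation of the scattered data along diffraction surfaces
            image[valid] += weights[s, r][valid] * data[s, r, it[valid]]
    return image
```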

Geophysics ◽  
2009 ◽  
Vol 74 (6) ◽  
pp. WCC79-WCC89 ◽  
Author(s):  
Hansruedi Maurer ◽  
Stewart Greenhalgh ◽  
Sabine Latzel

Analyses of synthetic frequency-domain acoustic waveform data provide new insights into the design and imaging capability of crosshole surveys. The full complex Fourier spectral data offer significantly more information than other data representations such as the amplitude, phase, or Hartley spectrum. Extensive eigenvalue analyses are used for further inspection of the information content offered by the seismic data. The goodness of different experimental configurations is investigated by varying the choice of (1) the frequencies, (2) the source and receiver spacings along the boreholes, and (3) the borehole separation. With only a few carefully chosen frequencies, a similar amount of information can be extracted from the seismic data as with a much larger suite of equally spaced frequencies. Optimized data sets should include at least one very low frequency component. The remaining frequencies should be chosen from the upper end of the available spectrum. This strategy proved to be applicable to both a simple homogeneous velocity model and a very complex one. Further tests are required, but the available evidence suggests it is model independent. Source and receiver spacings also affect the goodness of an experimental setup, but there are only minor benefits to denser sampling when the increment is much smaller than the shortest wavelength included in a data set. If the borehole separation becomes unfavorably large, the information content of the data is degraded, even when many frequencies and small source and receiver spacings are considered. The findings are based on eigenvalue analyses using the true velocity models. Because under realistic conditions the true model is not known, it is shown that the optimized data sets are sufficiently robust to allow the iterative inversion schemes to converge to the global minimum. This is demonstrated by means of tomographic inversions of several optimized data sets.
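The experimental-design comparison described above rests on the spectrum of the sensitivity (Jacobian) matrix for each candidate configuration. A minimal sketch of such a comparison follows; the normalization and the threshold used to count well-resolved model directions are assumptions, not the authors' exact criterion.

```python
import numpy as np

def relative_eigenvalue_count(jacobian, threshold=1e-3):
    """Compare experimental designs via the singular-value spectrum of the
    sensitivity (Jacobian) matrix. The thresholding rule below is an assumption."""
    s = np.linalg.svd(jacobian, compute_uv=False)
    s_rel = s / s.max()                     # normalize by the largest singular value
    return int(np.sum(s_rel > threshold))   # number of "well-resolved" model directions

# Usage idea: build Jacobians for candidate frequency subsets and source/receiver
# spacings, then prefer the design that maximizes relative_eigenvalue_count(J).
```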


Geophysics ◽  
2017 ◽  
Vol 82 (3) ◽  
pp. R199-R217 ◽  
Author(s):  
Xintao Chai ◽  
Shangxu Wang ◽  
Genyang Tang

Seismic data are nonstationary due to subsurface anelastic attenuation and dispersion effects. These effects, also referred to as the earth's Q-filtering effects, can diminish seismic resolution. We previously developed a method of nonstationary sparse reflectivity inversion (NSRI) for resolution enhancement, which avoids the intrinsic instability associated with inverse Q filtering and generates superior Q-compensation results. Applying NSRI to data sets that contain multiples (addressing surface-related multiples only) requires a demultiple preprocessing step because NSRI cannot distinguish primaries from multiples and will treat them as interference convolved with incorrect Q values. However, multiples contain information about subsurface properties. To use the information carried by multiples, we adapt NSRI, using the feedback model and NSRI theory, to the context of nonstationary seismic data with surface-related multiples. Consequently, not only are the benefits of NSRI (e.g., circumventing the intrinsic instability associated with inverse Q filtering) extended, but multiples are also taken into account. Our method is limited to a 1D implementation. Theoretical and numerical analyses verify that, given a wavelet, the input Q values primarily affect the inverted reflectivities and exert little effect on the estimated multiples; i.e., multiple estimation need not consider Q-filtering effects explicitly. However, NSRI benefits from considering multiples: their periodicity and amplitude constrain the positions of the reflectivities and the amplitude of the wavelet, which helps overcome the scaling and shifting ambiguities of conventional formulations in which multiples are not considered. Experiments using a 1D algorithm on a synthetic data set, the publicly available Pluto 1.5 data set, and a marine data set support these findings and reveal the stability, capabilities, and limitations of the proposed method.
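The forward problem behind NSRI can be pictured as a nonstationary convolution in which each reflectivity sample is convolved with a progressively more attenuated wavelet. The sketch below builds such an operator with a simple constant-Q amplitude term and inverts it with a generic soft-thresholding (ISTA) solver; it ignores dispersion and the surface-related-multiple (feedback model) term, and it is not the authors' NSRI algorithm.

```python
import numpy as np

def q_filtered_wavelet(wavelet, t_travel, q, dt):
    """Apply constant-Q amplitude attenuation to a wavelet for a given traveltime
    (dispersion is ignored here for brevity)."""
    n = len(wavelet)
    freqs = np.fft.rfftfreq(n, dt)
    atten = np.exp(-np.pi * freqs * t_travel / q)
    return np.fft.irfft(np.fft.rfft(wavelet) * atten, n)

def nonstationary_operator(wavelet, nt, q, dt):
    """Columns are Q-filtered, time-shifted wavelets: d = G r for reflectivity r."""
    G = np.zeros((nt, nt))
    for j in range(nt):
        w = q_filtered_wavelet(wavelet, j * dt, q, dt)
        n = min(len(w), nt - j)
        G[j:j + n, j] = w[:n]          # wavelet posted at its reflection time
    return G

def sparse_reflectivity(d, G, lam=0.05, n_iter=200):
    """Plain ISTA (soft-thresholded gradient descent) as a stand-in for the
    sparse inversion step; not the authors' solver."""
    L = np.linalg.norm(G, 2) ** 2      # Lipschitz constant of the gradient
    r = np.zeros(G.shape[1])
    for _ in range(n_iter):
        r = r + G.T @ (d - G @ r) / L
        r = np.sign(r) * np.maximum(np.abs(r) - lam / L, 0.0)
    return r
```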


Geophysics ◽  
2003 ◽  
Vol 68 (6) ◽  
pp. 1782-1791 ◽  
Author(s):  
M. Graziella Kirtland Grech ◽  
Don C. Lawton ◽  
Scott Cheadle

We have developed an anisotropic prestack depth migration code that can migrate either vertical seismic profile (VSP) or surface seismic data. We use this migration code in a new method for integrated VSP and surface seismic depth imaging. Instead of splicing the VSP image into the section derived from surface seismic data, we use the same migration algorithm and a single velocity model to migrate both data sets to a common output grid. We then scale and sum the two images to yield one integrated depth‐migrated section. After testing this method on synthetic surface seismic and VSP data, we applied it to field data from a 2D surface seismic line and a multioffset VSP from the Rocky Mountain Foothills of southern Alberta, Canada. Our results show that the integrated image exhibits significant improvement over that obtained from (a) the migration of either data set alone or (b) the conventional splicing approach. The integrated image uses the broader frequency bandwidth of the VSP data to provide higher vertical resolution than the migration of the surface seismic data. The integrated image also shows enhanced structural detail, since no part of the surface seismic section is eliminated, and good event continuity through the use of a single migration velocity model, obtained by an integrated interpretation of borehole and surface seismic data. This enhanced migrated image enabled us to perform a more robust interpretation with good well ties.
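The "scale and sum" step of the integrated imaging can be illustrated simply: once both images sit on the same output grid, the VSP image is scaled to match the surface-seismic amplitudes where the two overlap and then added. The sketch below uses a least-squares scalar for that match; the authors' actual scaling strategy is not specified here, so this is an assumption.

```python
import numpy as np

def integrate_images(img_surface, img_vsp, vsp_mask):
    """Scale and sum two depth-migrated images defined on the same output grid.

    img_surface : surface-seismic depth image (any shape)
    img_vsp     : VSP depth image on the same grid
    vsp_mask    : boolean array marking where the VSP image is live
    """
    live = vsp_mask & (np.abs(img_surface) > 0)
    # least-squares amplitude match of the VSP image to the surface image
    scale = np.dot(img_surface[live], img_vsp[live]) / (np.dot(img_vsp[live], img_vsp[live]) + 1e-12)
    return img_surface + scale * img_vsp
```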


Geophysics ◽  
2018 ◽  
Vol 83 (4) ◽  
pp. M41-M48 ◽  
Author(s):  
Hongwei Liu ◽  
Mustafa Naser Al-Ali

The ideal approach for continuous reservoir monitoring allows generation of fast and accurate images to cope with the massive data sets acquired for such a task. Conventionally, rigorous depth-oriented velocity-estimation methods are performed to produce sufficiently accurate velocity models. Unlike the traditional way, the target-oriented imaging technology based on the common-focus-point (CFP) theory can be an alternative for continuous reservoir monitoring. The solution is based on a robust, data-driven, iterative operator-updating strategy that does not require deriving a detailed velocity model. The same focusing operator is applied to successive 3D seismic data sets for the first time to generate efficient and accurate 4D target-oriented seismic stacked images from time-lapse field seismic data sets acquired in a CO2 injection project in Saudi Arabia. Using the focusing operator, target-oriented prestack angle-domain common-image gathers (ADCIGs) could be derived to perform amplitude-versus-angle analysis. To preserve the amplitude information in the ADCIGs, an amplitude-balancing factor is applied by embedding a synthetic data set using the real acquisition geometry to remove the geometry imprint artifact. Applying the CFP-based target-oriented imaging to the time-lapse data sets revealed changes at the reservoir level in the poststack and prestack time-lapse signals, consistent with the CO2 injection history and rock physics.
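One plausible reading of the amplitude-balancing step is sketched below: the geometry imprint is measured by applying the same imaging to a synthetic data set built with the real acquisition geometry, and the field ADCIGs are divided by that imprint. The function and argument names are hypothetical, and the exact balancing formula is an assumption.

```python
import numpy as np

def amplitude_balancing(image_field, image_synthetic, reference, eps=1e-12):
    """Remove the acquisition-geometry imprint from angle gathers.

    image_field     : migrated field ADCIGs (e.g., angle x position array)
    image_synthetic : the same operator applied to the embedded synthetic data set
    reference       : imprint-free response expected from the synthetic model
    """
    factor = reference / (np.abs(image_synthetic) + eps)   # balancing factor
    return image_field * factor
```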


2019 ◽  
Vol 38 (4) ◽  
pp. 268-273
Author(s):  
Maheswara Phani ◽  
Sushobhan Dutta ◽  
Kondal Reddy ◽  
Sreedurga Somasundaram

Raageshwari Deep Gas (RDG) Field is situated in the southern part of the Barmer Basin in Rajasthan, India, at a depth of 3000 m. With both clastic and volcanic lithologies, the main reservoirs are tight, and hydraulic fracturing is required to enhance productivity, especially to improve permeability through interaction of induced fractures with natural fractures. Therefore, optimal development of the RDG Field reservoirs requires characterization of faults and natural fractures. To address this challenge, a wide-azimuth 3D seismic data set over the RDG Field was processed to sharply define faults and capture anisotropy related to open natural fractures. Anisotropy was indicated by the characteristic sinusoidal moveout of gather reflection events after conventional tilted transverse isotropy (TTI) imaging; accordingly, we used orthorhombic imaging to correct for these effects, quantify fracture-related anisotropy, and yield a more accurate subsurface image. During prestack depth migration (PSDM) processing of the RDG data, TTI and orthorhombic velocity modeling was done with azimuthal sectoring of reflection arrivals. The resultant PSDM data using this velocity model show substantial improvement in image quality and vertical resolution at the reservoir level compared to vintage seismic data. The improved data quality enabled analysis of specialized seismic attributes such as curvature and thinned fault likelihood for more reliable characterization of faults and fractures. These attributes delineate the location and distribution of probable fracture networks within the volcanic reservoirs. Interpreted subtle faults, associated with fracture zones, were validated with microseismic, production, and image log data.
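Azimuthal sectoring of reflection arrivals, mentioned for the orthorhombic velocity modeling, amounts to binning traces by source-receiver azimuth before sector-wise analysis. A minimal sketch follows; the sector layout (folding 0-360 degrees onto 0-180 degrees and using equal-width bins) is an assumption.

```python
import numpy as np

def azimuth_sectors(azimuths, n_sectors=4):
    """Assign each trace to an azimuth sector (0..n_sectors-1) for sector-wise
    velocity analysis, as used when building azimuthally varying models.
    Azimuths are in degrees; reciprocal azimuths are folded onto 0-180 degrees."""
    edges = np.linspace(0.0, 180.0, n_sectors + 1)
    folded = np.mod(azimuths, 180.0)
    return np.clip(np.digitize(folded, edges) - 1, 0, n_sectors - 1)
```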


2015 ◽  
Vol 19 (12) ◽  
pp. 4747-4764 ◽  
Author(s):  
F. Alshawaf ◽  
B. Fersch ◽  
S. Hinz ◽  
H. Kunstmann ◽  
M. Mayer ◽  
...  

Abstract. Data fusion aims at integrating multiple data sources that can be redundant or complementary to produce complete, accurate information about the parameter of interest. In this work, data fusion of precipitable water vapor (PWV) estimated from remote sensing observations and data from the Weather Research and Forecasting (WRF) modeling system is applied to provide complete, high-quality grids of PWV. Our goal is to correctly infer PWV on spatially continuous, highly resolved grids from heterogeneous data sets. This is done by a geostatistical data fusion approach based on the method of fixed-rank kriging. The first data set contains absolute maps of atmospheric PWV produced by combining observations from the Global Navigation Satellite Systems (GNSS) and Interferometric Synthetic Aperture Radar (InSAR). These PWV maps have a high spatial density and a millimeter accuracy; however, the data are missing in regions of low coherence (e.g., forests and vegetated areas). The PWV maps simulated by the WRF model represent the second data set. The model maps are available for wide areas, but they have a coarse spatial resolution and limited accuracy. The PWV maps inferred by the data fusion at any spatial resolution show better quality than those inferred from either data set alone. In addition, by using the fixed-rank kriging method, the computational burden is significantly lower than that for ordinary kriging.
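Fixed-rank kriging keeps the computation tractable by modeling the spatial covariance with a small number r of basis functions, so only r x r systems need to be solved regardless of how many observations are fused. The sketch below shows that structure with Gaussian basis functions and the Woodbury identity; the basis choice and the assumption that the low-rank covariance K and nugget variance are known are simplifications, not the authors' implementation.

```python
import numpy as np

def gaussian_basis(coords, centers, scale):
    """Low-rank spatial basis functions S (n_points x n_basis)."""
    d2 = ((coords[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / scale ** 2)

def fixed_rank_kriging(z, coords_obs, coords_pred, centers, scale, K, sigma2):
    """Minimal fixed-rank kriging predictor.

    The field is modeled as y(s) = S(s) @ eta + noise, with Cov(eta) = K (r x r)
    and nugget variance sigma2. Prediction uses the Woodbury identity so only
    r x r systems are solved. K, sigma2, and the basis centers/scale are assumed
    known (in practice they are estimated, e.g., by EM or method of moments)."""
    S = gaussian_basis(coords_obs, centers, scale)    # n x r
    S0 = gaussian_basis(coords_pred, centers, scale)  # m x r
    # Sigma^{-1} z via Woodbury, where Sigma = sigma2 * I + S K S^T
    A = np.linalg.inv(K) + S.T @ S / sigma2           # r x r
    Sinv_z = z / sigma2 - S @ np.linalg.solve(A, S.T @ z) / sigma2 ** 2
    # kriging prediction of the smooth component at the target locations
    return S0 @ (K @ (S.T @ Sinv_z))
```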


2019 ◽  
Vol 38 (11) ◽  
pp. 872a1-872a9 ◽  
Author(s):  
Mauricio Araya-Polo ◽  
Stuart Farris ◽  
Manuel Florez

Exploration seismic data are heavily manipulated before human interpreters are able to extract meaningful information regarding subsurface structures. This manipulation adds modeling and human biases and is limited by methodological shortcomings. Alternatively, using seismic data directly is becoming possible thanks to deep learning (DL) techniques. A DL-based workflow is introduced that uses analog velocity models and realistic raw seismic waveforms as input and produces subsurface velocity models as output. When insufficient data are used for training, DL algorithms tend to overfit or fail. Gathering large amounts of labeled and standardized seismic data sets is not straightforward. This shortage of quality data is addressed by building a generative adversarial network (GAN) to augment the original training data set, which is then used by DL-driven seismic tomography as input. The DL tomographic operator predicts velocity models with high statistical and structural accuracy after being trained with GAN-generated velocity models. Beyond the field of exploration geophysics, the use of machine learning in earth science is challenged by the lack of labeled data or properly interpreted ground truth, since we seldom know what truly exists beneath the earth's surface. The unsupervised approach (using GANs to generate labeled data) illustrates a way to mitigate this problem and opens geology, geophysics, and planetary sciences to more DL applications.
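A generative adversarial network for this kind of augmentation pairs a generator, which maps random latent vectors to synthetic velocity-model patches, against a discriminator trained to separate real from generated patches. The PyTorch sketch below is a generic, minimal GAN on flattened patches; the architecture, patch size, and hyperparameters are assumptions and do not reproduce the network used in the article.

```python
import torch
import torch.nn as nn

latent_dim, model_size = 64, 32 * 32   # velocity patches flattened to 32 x 32 (assumed)

G = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(),
                  nn.Linear(256, model_size), nn.Sigmoid())   # generator
D = nn.Sequential(nn.Linear(model_size, 256), nn.LeakyReLU(0.2),
                  nn.Linear(256, 1))                           # discriminator (logits)

opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

def train_step(real_batch):
    """One adversarial update; real_batch holds flattened, normalized velocity patches."""
    n = real_batch.shape[0]
    fake = G(torch.randn(n, latent_dim))
    # discriminator: real -> 1, fake -> 0
    opt_d.zero_grad()
    loss_d = bce(D(real_batch), torch.ones(n, 1)) + bce(D(fake.detach()), torch.zeros(n, 1))
    loss_d.backward()
    opt_d.step()
    # generator: fool the discriminator
    opt_g.zero_grad()
    loss_g = bce(D(fake), torch.ones(n, 1))
    loss_g.backward()
    opt_g.step()
    return loss_d.item(), loss_g.item()

# After training, G(torch.randn(k, latent_dim)) yields k synthetic velocity patches
# that can be added to the tomography network's training set.
```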


2019 ◽  
Vol 7 (3) ◽  
pp. SE113-SE122 ◽  
Author(s):  
Yunzhi Shi ◽  
Xinming Wu ◽  
Sergey Fomel

Salt boundary interpretation is important for the understanding of salt tectonics and for velocity model building for seismic migration. Conventional methods consist of computing salt attributes and extracting salt boundaries. We have formulated the problem as 3D image segmentation and evaluated an efficient approach based on deep convolutional neural networks (CNNs) with an encoder-decoder architecture. To train the model, we design a data generator that extracts randomly positioned subvolumes from a large-scale 3D training data set, applies data augmentation, and then feeds a large number of subvolumes into the network, using salt/nonsalt binary labels generated by thresholding the velocity model as ground truth. We test the model on validation data sets and compare the blind-test predictions with the ground truth. Our results indicate that our method is capable of automatically capturing subtle salt features from the 3D seismic image with little or no need for manual input. We further test the model on a field example to demonstrate the generalization of this deep CNN method across different data sets.
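The training data generator described above can be sketched directly: random subvolumes are cut from the seismic image and the velocity model, the velocity is thresholded into salt/nonsalt labels, and simple augmentation is applied. In the sketch below, the salt velocity threshold, patch size, and flip-only augmentation are assumptions.

```python
import numpy as np

def subvolume_generator(seismic, velocity, patch=64, v_salt=4500.0, batch=8, rng=None):
    """Yield batches of randomly positioned subvolumes and salt/nonsalt labels.
    Labels are obtained by thresholding the velocity model; augmentation here is
    limited to random axis flips."""
    rng = rng or np.random.default_rng()
    nz, ny, nx = seismic.shape
    while True:
        xs, ys = [], []
        for _ in range(batch):
            i = rng.integers(0, nz - patch)
            j = rng.integers(0, ny - patch)
            k = rng.integers(0, nx - patch)
            img = seismic[i:i+patch, j:j+patch, k:k+patch].copy()
            lab = (velocity[i:i+patch, j:j+patch, k:k+patch] > v_salt).astype(np.float32)
            for ax in range(3):                  # simple augmentation: random flips
                if rng.random() < 0.5:
                    img, lab = np.flip(img, ax), np.flip(lab, ax)
            xs.append(img)
            ys.append(lab)
        yield np.stack(xs), np.stack(ys)
```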


2017 ◽  
Vol 5 (3) ◽  
pp. SJ81-SJ90 ◽  
Author(s):  
Kainan Wang ◽  
Jesse Lomask ◽  
Felix Segovia

Well-log-to-seismic tying is a key step in many interpretation workflows for oil and gas exploration. Synthetic seismic traces from the wells are often manually tied to seismic data; this process can be very time consuming and, in some cases, inaccurate. Automatic methods, such as dynamic time warping (DTW), can match synthetic traces to seismic data. Although these methods are extremely fast, they tend to create interval velocities that are not geologically realistic. We have modified DTW to create a blocked dynamic warping (BDW) method. BDW generates an automatic, optimal well tie that honors geologically consistent velocity constraints. Consequently, it results in updated velocities that are more realistic than those from other methods. BDW constrains the updated velocity to be constant or linearly variable inside each geologic layer. With an optimal correlation between synthetic seismograms and surface seismic data, this algorithm returns an automatically updated time-depth curve and an updated interval velocity model that still retains the original geologic velocity boundaries. In other words, the algorithm finds the optimal solution for tying the synthetic to the seismic data while restricting the interval velocity changes to coincide with the initial input blocking. We demonstrate the application of the BDW technique on a synthetic data example and a field data set.
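For reference, plain dynamic time warping between a synthetic and a seismic trace can be written as a small dynamic program that limits how fast the shift may change from sample to sample. The sketch below implements only that unconstrained DTW; the BDW blocking constraint (constant or linear velocity updates within each geologic layer) is described in the abstract but not implemented here.

```python
import numpy as np

def dtw_shifts(synthetic, seismic, max_shift=20):
    """Basic dynamic time warping between a synthetic trace and a seismic trace,
    returning one integer shift per sample (shift changes by at most one sample
    per step)."""
    n = len(synthetic)
    shifts = np.arange(-max_shift, max_shift + 1)
    # alignment error for every (sample, candidate shift)
    err = np.empty((n, len(shifts)))
    for j, s in enumerate(shifts):
        idx = np.clip(np.arange(n) + s, 0, n - 1)
        err[:, j] = (synthetic - seismic[idx]) ** 2
    # accumulate errors with the smoothness constraint on the shift
    acc = err.copy()
    for i in range(1, n):
        for j in range(len(shifts)):
            lo, hi = max(0, j - 1), min(len(shifts), j + 2)
            acc[i, j] += acc[i - 1, lo:hi].min()
    # backtrack the minimum-error shift sequence
    path = np.zeros(n, dtype=int)
    path[-1] = int(np.argmin(acc[-1]))
    for i in range(n - 2, -1, -1):
        lo, hi = max(0, path[i + 1] - 1), min(len(shifts), path[i + 1] + 2)
        path[i] = lo + int(np.argmin(acc[i, lo:hi]))
    return shifts[path]
```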


2020 ◽  
Vol 8 (1) ◽  
pp. T141-T149
Author(s):  
Ritesh Kumar Sharma ◽  
Satinder Chopra ◽  
Larry R. Lines

Multicomponent seismic data offer several advantages for characterizing reservoirs with the use of the vertical component (PP) and mode-converted (PS) data. Joint impedance inversion inverts both of these data sets simultaneously; hence, it is considered superior to simultaneous impedance inversion. However, the success of joint impedance inversion depends on how accurately the PS data are mapped into the PP time domain. Normally, this is attempted by performing well-to-seismic ties for the PP and PS data sets and matching different horizons picked on the PP and PS data. Although this seems to be a straightforward approach, there are a few issues associated with it. One of them is the lower resolution of the PS data compared with the PP data, which makes it difficult to correlate the equivalent reflection events on the two data sets. Even after a few consistent horizons are tracked, the horizon-matching process introduces artifacts in the PS data when mapped into PP time. We have evaluated such challenges using a data set from the Western Canadian Sedimentary Basin and then developed a novel workflow for addressing them. The importance of our workflow was demonstrated by comparing data examples generated with and without its adoption.
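The PP-time registration of PS data that the workflow depends on is, at its simplest, a stretch governed by the Vp/Vs ratio: for a given reflector, t_pp = 2 t_ps / (1 + gamma) with gamma = Vp/Vs. The sketch below applies this mapping with an effective (constant) gamma and resamples onto a regular PP-time grid; handling an interval-varying gamma, the bandwidth difference, and residual event alignment (the issues discussed above) is beyond this illustration.

```python
import numpy as np

def ps_to_pp_time(trace_ps, t_ps, gamma, dt_pp):
    """Map a PS trace to PP time using an effective Vp/Vs ratio (gamma).

    For a reflector, t_pp = 2 * t_ps / (1 + gamma), so each PS sample is re-posted
    at its equivalent PP time and interpolated onto a regular PP-time grid. This
    pointwise formula is exact only for a constant gamma (an assumption here)."""
    t_pp_of_ps = 2.0 * t_ps / (1.0 + gamma)
    t_pp_grid = np.arange(0.0, t_pp_of_ps[-1], dt_pp)
    return t_pp_grid, np.interp(t_pp_grid, t_pp_of_ps, trace_ps)
```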

