Inpainting of local wavefront attributes using artificial intelligence for enhancement of massive 3-D pre-stack seismic data

2020, Vol. 223 (3), pp. 1888-1898
Author(s): Kirill Gadylshin, Ilya Silvestrov, Andrey Bakulin

SUMMARY: We propose an advanced version of non-linear beamforming assisted by artificial intelligence (NLBF-AI) that includes the additional steps of encoding and interpolating wavefront attributes using inpainting with a deep neural network (DNN). Inpainting can efficiently and accurately fill the holes in wavefront attributes caused by acquisition-geometry gaps and data-quality issues. Inpainting with a DNN delivers excellent interpolation quality with negligible computational effort and performs particularly well in the challenging case of irregular holes, where other interpolation methods struggle. Since conventional brute-force attribute estimation is very costly, we can further intentionally create additional holes, or masks, to restrict the expensive conventional estimation to a smaller subvolume and obtain the missing attributes with cost-effective inpainting. Using a marine seismic data set with ocean bottom nodes, we show that inpainting can reliably recover wavefront attributes even with masked areas reaching 50–75 per cent. We validate the quality of the results by comparing attributes and enhanced data from NLBF-AI and conventional NLBF using full-density data without decimation.
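The core idea, estimating attributes only on a subvolume and inpainting the deliberately masked remainder, can be sketched with a toy stand-in. A simple iterative neighbour-average fill replaces the paper's DNN here; the attribute map, mask fraction, and function names are all illustrative:

```python
import numpy as np

def inpaint_attributes(attr, mask, n_iter=200):
    """Fill masked holes in a 2-D wavefront-attribute map.

    Iterative 4-neighbour averaging stands in for the DNN inpainting
    of NLBF-AI; `mask` is True where attributes are missing
    (acquisition gaps or deliberately skipped estimation).
    """
    # Seed unknown cells with the mean of the known attribute values
    filled = np.where(mask, np.nanmean(attr[~mask]), attr)
    for _ in range(n_iter):
        # 4-neighbour average, computed everywhere but written back
        # only into the masked (unknown) cells
        avg = 0.25 * (np.roll(filled, 1, 0) + np.roll(filled, -1, 0)
                      + np.roll(filled, 1, 1) + np.roll(filled, -1, 1))
        filled[mask] = avg[mask]
    return filled

# Smooth synthetic attribute map with ~60 per cent of cells masked out
y, x = np.mgrid[0:40, 0:40]
attr = np.sin(x / 8.0) + 0.5 * np.cos(y / 10.0)
rng = np.random.default_rng(0)
mask = rng.random(attr.shape) < 0.6
recovered = inpaint_attributes(attr, mask)
err = np.abs(recovered[mask] - attr[mask]).mean()
```

On smooth attribute fields this crude diffusion fill already recovers masked values closely; the point of the DNN in the paper is to keep that accuracy for the irregular, large holes where local averaging breaks down.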

Geophysics, 2007, Vol. 72 (4), pp. V79-V86
Author(s): Kurang Mehta, Andrey Bakulin, Jonathan Sheiman, Rodney Calvert, Roel Snieder

The virtual source method has recently been proposed to image and monitor below complex and time-varying overburden. The method requires surface shooting recorded at downhole receivers placed below the distorting or changing part of the overburden. Redatuming with the measured Green’s function allows the reconstruction of a complete downhole survey as if the sources were also buried at the receiver locations. There are still some challenges that need to be addressed in the virtual source method, such as limited acquisition aperture and energy coming from the overburden. We demonstrate that up-down wavefield separation can substantially improve the quality of virtual source data. First, it allows us to eliminate artifacts associated with the limited acquisition aperture typically used in practice. Second, it allows us to reconstruct a new optimized response in the absence of downgoing reflections and multiples from the overburden. These improvements are illustrated on a synthetic data set of a complex layered model modeled after the Fahud field in Oman, and on ocean-bottom seismic data acquired in the Mars field in the deepwater Gulf of Mexico.
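The kinematic heart of the virtual source method, cross-correlating two downhole receivers and stacking over surface shots so the common overburden path cancels, can be illustrated with a toy delta-function model (all parameters are invented, and the up-down separation step is omitted):

```python
import numpy as np

# Virtual-source redatuming by cross-correlation: stacking the
# correlation of two downhole receivers over many surface shots turns
# receiver A into a virtual source recorded at receiver B.
# Toy kinematic model: spike arrivals, times in samples, and receiver B
# sees every event `lag_ab` samples after receiver A.
nt, n_shots, lag_ab = 400, 50, 25
rng = np.random.default_rng(1)
stack = np.zeros(2 * nt - 1)
for _ in range(n_shots):
    t_a = int(rng.integers(40, 200))   # shot-dependent overburden delay
    rec_a = np.zeros(nt); rec_a[t_a] = 1.0
    rec_b = np.zeros(nt); rec_b[t_a + lag_ab] = 1.0
    # correlation cancels the common overburden path t_a
    stack += np.correlate(rec_b, rec_a, mode="full")

# Energy focuses at the inter-receiver traveltime, independent of t_a
peak_lag = int(np.argmax(stack)) - (nt - 1)
```

Each shot contributes a spike at the same lag regardless of its own overburden delay, which is why the stack focuses while shot-dependent distortions cancel.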


Geophysics, 2013, Vol. 78 (5), pp. W31-W44
Author(s): Anton Ziolkowski

I consider the problem of finding the impulse response, or Green’s function, from a measured response including noise, given an estimate of the source time function. This process is usually known as signature deconvolution. Classical signature deconvolution provides no measure of the quality of the result and does not separate signal from noise. Recovery of the earth impulse response is here formulated as the calculation of a Wiener filter in which the estimated source signature is the input and the measured response is the desired output. Convolution of this filter with the estimated source signature is the part of the measured response that is correlated with the estimated signature. Subtraction of the correlated part from the measured response yields the estimated noise, or the uncorrelated part. The fraction of energy not contained in this uncorrelated component is defined as the quality of the filter. If the estimated source signature contains errors, the estimated earth impulse response is incomplete, and the estimated noise contains signal, recognizable as trace-to-trace correlation. The method can be applied to many types of geophysical data, including earthquake seismic data, exploration seismic data, and controlled source electromagnetic data; it is illustrated here with examples of marine seismic and marine transient electromagnetic data.
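The construction described, a Wiener filter with the estimated signature as input and the measured response as desired output, plus the correlated/uncorrelated split and the quality measure, can be sketched as follows. This is a minimal frequency-domain version with white-noise regularisation; the regularisation weight, wavelet, and test reflectivity are assumptions:

```python
import numpy as np

def wiener_signature_decon(d, s, eps=1e-3):
    """Estimate the earth impulse response by Wiener filtering.

    `s` (estimated source signature) is the filter input and `d`
    (measured response) the desired output. Returns the filter (the
    impulse-response estimate), the part of `d` correlated with `s`,
    the uncorrelated residual ('noise'), and the quality: the fraction
    of the energy of `d` contained in the correlated part.
    """
    n = len(d)
    S, D = np.fft.rfft(s, n), np.fft.rfft(d, n)
    lam = eps * np.max(np.abs(S) ** 2)          # white-noise regularisation
    F = np.conj(S) * D / (np.abs(S) ** 2 + lam)
    f = np.fft.irfft(F, n)                      # Wiener filter / impulse-response estimate
    correlated = np.fft.irfft(F * S, n)         # part of d predicted by s
    noise = d - correlated                      # uncorrelated part
    quality = 1.0 - np.sum(noise ** 2) / np.sum(d ** 2)
    return f, correlated, noise, quality

# Synthetic check: sparse reflectivity convolved with a known wavelet
n = 256
g = np.zeros(n); g[[30, 80, 140]] = [1.0, -0.6, 0.4]  # true impulse response
t = np.arange(64)
s = np.sin(2 * np.pi * t / 16) * np.exp(-t / 12)       # source signature
d = np.convolve(g, s)[:n]
f, corr, noise, q = wiener_signature_decon(d, s)
```

With a noise-free trace and an accurate signature the quality approaches one and the filter recovers the spike positions; errors in the signature would instead leak correlated energy into `noise`, exactly the diagnostic the abstract describes.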


Geophysics, 2005, Vol. 70 (5), pp. U51-U65
Author(s): Stig-Kyrre Foss, Bjørn Ursin, Maarten V. de Hoop

We present a method of reflection tomography for anisotropic elastic parameters from PP and PS reflection seismic data. The method is based upon the differential semblance misfit functional in scattering angle and azimuth (DSA) acting on common-image-point gathers (CIGs) to find fitting velocity models. The CIGs are amplitude corrected using a generalized Radon transform applied to the data. Depth consistency between the PP and PS images is enforced by penalizing any mis-tie between imaged key reflectors. The mis-tie is evaluated by means of map migration-demigration applied to the geometric information (times and slopes) contained in the data. In our implementation, we simplify the codepthing approach to zero-scattering-angle data only. The resulting measure is incorporated as a regularization in the DSA misfit functional. We then resort to an optimization procedure, restricting ourselves to transversely isotropic (TI) velocity models. In principle, depending on the available surface-offset range and orientation of reflectors in the subsurface, by combining the DSA with codepthing, the anisotropic parameters for TI models can be determined, provided the orientation of the symmetry axis is known. A proposed strategy is applied to an ocean-bottom-seismic field data set from the North Sea.
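The differential-semblance principle underlying the misfit can be illustrated with a toy measure on a synthetic common-image-point gather. This sketch captures only the core penalty, image variation across scattering angle, and none of the amplitude correction, azimuth dependence, or codepthing regularisation of the actual method:

```python
import numpy as np

def differential_semblance(cig):
    """DSA-style misfit on a common-image gather (depth x angle):
    penalise variation of the image across scattering angle. It is
    zero when the velocity model flattens the gather."""
    return float(np.sum(np.diff(cig, axis=1) ** 2))

depth = np.arange(100)
event = np.exp(-(depth - 50.0) ** 2 / 20.0)

# Correct velocity: the event images at the same depth for every angle
flat = np.tile(event[:, None], (1, 21))
# Wrong velocity: the imaged depth drifts with scattering angle
curved = np.array([np.exp(-(depth - 50.0 - 0.3 * a) ** 2 / 20.0)
                   for a in range(21)]).T
```

A flat gather scores zero while residual moveout raises the misfit, which is what makes the functional usable as an optimisation target for the velocity model.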


Geophysics, 2006, Vol. 71 (3), pp. S99-S110
Author(s): Daniel A. Rosales, Biondo Biondi

A new partial-prestack migration operator to manipulate multicomponent data, called converted-wave azimuth moveout (PS-AMO), transforms converted-wave prestack data with an arbitrary offset and azimuth to equivalent data with a new offset and azimuth position. This operator is a sequential application of converted-wave dip moveout and its inverse. As expected, PS-AMO reduces to the known expression of AMO for the extreme case when the P velocity is the same as the S velocity. Moreover, PS-AMO preserves the resolution of dipping events and internally applies a correction for the lateral shift between the common-midpoint and the common-reflection/conversion point. An implementation of PS-AMO in the log-stretch frequency-wavenumber domain is computationally efficient. The main applications for the PS-AMO operator are geometry regularization, data-reduction through partial stacking, and interpolation of unevenly sampled data. We test our PS-AMO operator by solving 3D acquisition geometry-regularization problems for multicomponent, ocean-bottom seismic data. The geometry-regularization problem is defined as a regularized least-squares-objective function. To preserve the resolution of dipping events, the regularization term uses the PS-AMO operator. Application of this methodology on a portion of the Alba 3D, multicomponent, ocean-bottom seismic data set shows that we can satisfactorily obtain an interpolated data set that honors the physics of converted waves.
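The lateral shift between the common midpoint and the conversion point that PS-AMO corrects for can be quantified, for a flat layer, with the standard asymptotic conversion-point formula. This formula is textbook converted-wave geometry, not taken from the paper, and the numbers below are illustrative:

```python
def asymptotic_conversion_point(x_src, x_rec, vp, vs):
    """Lateral position of the asymptotic conversion point for a
    P-down, S-up reflection: a fraction vp / (vp + vs) of the
    source-receiver offset, measured from the source."""
    return x_src + (x_rec - x_src) * vp / (vp + vs)

# When VP == VS the conversion point collapses to the common midpoint,
# mirroring the reduction of PS-AMO to ordinary AMO in that limit.
cmp_like = asymptotic_conversion_point(0.0, 1000.0, 2000.0, 2000.0)
# With VP > VS the conversion point shifts toward the receiver.
ccp = asymptotic_conversion_point(0.0, 1000.0, 2000.0, 1000.0)
```

The VP = VS limit is the same consistency check the abstract cites: the converted-wave operator must degenerate to its pure-mode counterpart when the two velocities coincide.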


Geophysics, 1996, Vol. 61 (6), pp. 1804-1812
Author(s): Ho-Young Lee, Byung-Koo Hyun, Young-Sae Kong

We have improved the quality of high‐resolution marine seismic data using a simple PC‐based acquisition and processing system. The system consists of a PC, an A/D converter, and a magneto‐optical disk drive. The system has been designed to acquire single‐channel data at up to 60,000 samples per second and to perform data processing of seismic data by a simple procedure. Test surveys have been carried out off Pohang, southern East Sea of Korea. The seismic systems used for the test were an air gun and a 3.5 kHz sub‐bottom profiling system. Spectral characteristics of the sources were analyzed. Simple digital signal processes which include gain recovery, deconvolution, band‐pass filter, and swell filter were performed. The quality of seismic sections produced by the system is greatly enhanced in comparison to analog sections. The PC‐based system for acquisition and processing of high‐resolution marine seismic data is economical and versatile.
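A single-channel chain of the kind described, gain recovery followed by a band-pass (and swell) filter, can be sketched in a few lines. The sampling rate, gain law, corner frequencies, and synthetic trace are all invented for illustration:

```python
import numpy as np

# Minimal single-channel processing chain in the spirit of the
# PC-based system: gain recovery, then a zero-phase band-pass filter
# applied in the frequency domain (which also removes swell noise).
fs = 10_000.0                       # Hz (the system records up to 60 000 sps)
t = np.arange(2048) / fs

# Synthetic trace: decaying reflections plus low-frequency swell
trace = np.zeros_like(t)
trace[[300, 700, 1500]] = [1.0, 0.5, 0.2]
trace = np.convolve(trace, np.hanning(21), mode="same")  # wavelet
trace *= np.exp(-t * 20)            # amplitude loss with traveltime
trace += 0.3 * np.sin(2 * np.pi * 5 * t)  # 5 Hz swell

gained = trace * (1 + t * 20)       # crude gain recovery (hypothetical t-gain)

# Zero-phase band-pass: keep 100-2000 Hz, reject swell and hiss
spec = np.fft.rfft(gained)
freqs = np.fft.rfftfreq(len(gained), d=1 / fs)
band = (freqs >= 100) & (freqs <= 2000)
filtered = np.fft.irfft(spec * band, len(gained))
```

A brick-wall spectral mask is the bluntest possible band-pass; a production system would taper the band edges, but the structure of the chain is the same.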


2021, pp. 1-60
Author(s): Darrell A. Terry, Camelia C. Knapp

The presence of marine gas hydrates is routinely inferred from the identification of bottom simulating reflectors (BSRs) in common depth-point (CDP) seismic images, with additional seismic studies such as amplitude variation with offset (AVO) analysis applied for corroboration. Though confirmation requires drilling and sampling, seismic analysis has proven to be a cost-effective way to identify the presence of marine gas hydrates. Here, single-channel far-offset seismic images are investigated as what appears to be a more reliable and cost-effective indicator of bottom simulating reflectors than traditional CDP processing or AVO analysis. We take a non-traditional approach to processing the seismic data, one more directly relevant to imaging the gas/gas-hydrate contact: instead of applying the traditional CDP processing workflows of the oil industry, we examine the considerable information in the data to explore how its character changes as offset angle increases. Three cases from different environments are selected for detailed analysis: 1) stratigraphy running parallel to the ocean bottom; 2) a potential bottom simulating reflector running parallel to the ocean bottom and cross-cutting dipping reflections; and 3) a suspected thermal intrusion without a recognizable bottom simulating reflector. The investigation covers recently collected multi-channel seismic data from the deep waters of the central Aleutian Basin beneath the Bering Sea, the pre-processing of the data sets, and the methodology for processing and display used to generate single-channel seismic images. Descriptions are provided of the single-channel near- and far-offset images for the example cases.
Results indicate that BSRs related to marine gas hydrates, and originating from the presence of free gas, are more easily and uniquely identifiable in single-channel displays of far-offset seismic images than in traditional CDP displays.


Geophysics, 2008, Vol. 73 (6), pp. B109-B115
Author(s): Michael V. DeAngelo, Paul E. Murray, Bob A. Hardage, Randy L. Remington

Using 2D four-component ocean-bottom-cable (2D 4-C OBC) seismic data processed in common-receiver gathers, we developed robust VP and VS interval velocities for the near-seafloor strata. A vital element of the study was to implement iterative interpretation techniques to correlate near-seafloor P-P and P-SV images. Initially, depth-equivalent P-P and P-SV layers were interpreted by visually matching similar events in both seismic modes. Complementary 1D ray-tracing analyses then determined interval values of subseafloor VP and VS velocities across a series of earth layers extending from the seafloor to below the base of the hydrate stability zone (BHSZ), to further constrain these interpretations. Iterating interpretation of depth-equivalent horizons with velocity analyses allowed us to converge on physically reasonable velocity models. Simultaneous VP and VS velocity analysis provided additional model constraints in areas where the data quality of one reflection mode (usually P-P in the near-seafloor environment) would not provide adequate information to derive reliable velocities.
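As a simplified stand-in for deriving layer velocities from depth-equivalent horizons, the classic Dix conversion from stacking (RMS) velocities to interval velocities can be sketched; the paper's iterative 1D ray tracing is more general, and the layer model below is synthetic:

```python
import numpy as np

def dix_interval_velocity(t0, v_rms):
    """Convert zero-offset times and RMS velocities at layer bases to
    interval velocities (Dix equation) - a simplified stand-in for the
    iterative 1-D ray tracing used to derive layer VP/VS."""
    t0, v_rms = np.asarray(t0, float), np.asarray(v_rms, float)
    num = v_rms[1:] ** 2 * t0[1:] - v_rms[:-1] ** 2 * t0[:-1]
    v_int = np.sqrt(num / (t0[1:] - t0[:-1]))
    # first layer: interval velocity equals the RMS velocity
    return np.concatenate([[v_rms[0]], v_int])

# Constant-velocity layer model: the conversion round-trips exactly
v_true = np.array([1500.0, 1700.0, 2100.0])   # interval velocities, m/s
dt = np.array([0.4, 0.3, 0.5])                # two-way time per layer, s
t0 = np.cumsum(dt)                            # times at layer bases
v_rms = np.sqrt(np.cumsum(v_true ** 2 * dt) / t0)
v_int = dix_interval_velocity(t0, v_rms)
```

The round-trip check makes the layer-stripping logic explicit: each interval velocity is recovered from the difference of cumulative squared-velocity moments between adjacent horizons.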


Geophysics, 2010, Vol. 75 (4), pp. R83-R91
Author(s): Hassan Masoomzadeh, Penny J. Barton, Satish C. Singh

We have developed a pragmatic new processing strategy to enhance seismic information obtained from long-offset multichannel seismic data. The conventional processing approach, which treats data on a sample-by-sample basis, is applied at a coarser scale on groups of samples. Using this approach, a reflected event and its vicinity remain unstretched during the normal moveout correction. Isomoveout curves (lines of equal moveout) in the time-velocity panel are employed to apply a constant moveout correction to selected individual events, leading to a nonstretch stack. A zigzag stacking-velocity function is introduced as a combination of segments of appropriate isomoveout curves. By employing a zigzag velocity function, stretching of key events is avoided and thus information at far offset is preserved in the stack. The method is also computationally cost-effective. However, the zigzag stacking-velocity field must be consistent with target horizons. This method of horizon-consistent nonstretch moveout has been applied to a wide-angle data set from the North Atlantic margin, providing improved images of the basement interface, which was previously poorly imaged.
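The contrast between sample-by-sample NMO (which stretches the wavelet) and a constant moveout shift for a picked event can be sketched on a single spike trace. This is a deliberately minimal illustration of the nonstretch idea, with one event, one offset, and invented parameters:

```python
import numpy as np

# Conventional NMO maps every sample to its own zero-offset time,
# stretching shallow, far-offset wavelets. The nonstretch alternative
# shifts the whole window around a picked event by that event's own
# (constant) moveout, so the wavelet is moved, not stretched.

def nmo_shift_constant(trace, dt, t_event, offset, v):
    """Apply one constant time shift equal to the picked event's moveout."""
    t_x = np.sqrt(t_event ** 2 + (offset / v) ** 2)  # reflection time at offset
    shift = int(round((t_x - t_event) / dt))         # moveout in samples
    out = np.zeros_like(trace)
    if shift < len(trace):
        out[: len(trace) - shift] = trace[shift:]    # move event up to t_event
    return out

dt = 0.004                                           # sample interval, s
trace = np.zeros(1000)
t_event, offset, v = 1.0, 3000.0, 2000.0             # s, m, m/s
t_x = np.sqrt(t_event ** 2 + (offset / v) ** 2)      # arrival at 3 km offset
trace[int(round(t_x / dt))] = 1.0                    # spike 'wavelet'
corrected = nmo_shift_constant(trace, dt, t_event, offset, v)
```

Because every sample in the window receives the same shift, the wavelet around the target horizon arrives at its zero-offset time unstretched; the zigzag stacking-velocity function in the paper is what strings such constant-moveout segments together across several horizons.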


2020
Author(s): Mark Tamisiea, Benjamin Krichman, Himanshu Save, Srinivas Bettadpur, Zhigui Kang, ...
To assess the quality of the CSR solutions, we compare results against external data sets that have contemporaneous availability. These evaluations fall into three categories: changes in terrestrial water storage against data from the North American and Global Land Data Assimilation Systems; variations in ocean bottom pressure against data from the Deep-ocean Assessment and Reporting of Tsunamis (DART) network; and estimates of the low-degree and -order Stokes coefficients compared against those inferred from satellite laser ranging observations (i.e. the CSR monthly 5x5 gravity harmonics from the MEaSUREs project). Because the mission provides a unique measurement of mass changes in the Earth system, evaluating the new solutions against other data sets and models is challenging. We therefore focus primarily on the relative agreement of these data sets with the GRACE-FO solutions, in relation to their historic agreement with the GRACE solutions.

