Improving the virtual source method by wavefield separation

Geophysics ◽  
2007 ◽  
Vol 72 (4) ◽  
pp. V79-V86 ◽  
Author(s):  
Kurang Mehta ◽  
Andrey Bakulin ◽  
Jonathan Sheiman ◽  
Rodney Calvert ◽  
Roel Snieder

The virtual source method has recently been proposed to image and monitor below complex and time-varying overburden. The method requires surface shooting recorded at downhole receivers placed below the distorting or changing part of the overburden. Redatuming with the measured Green’s function allows the reconstruction of a complete downhole survey as if the sources were also buried at the receiver locations. There are still some challenges that need to be addressed in the virtual source method, such as limited acquisition aperture and energy coming from the overburden. We demonstrate that up-down wavefield separation can substantially improve the quality of virtual source data. First, it allows us to eliminate artifacts associated with the limited acquisition aperture typically used in practice. Second, it allows us to reconstruct a new optimized response in the absence of downgoing reflections and multiples from the overburden. These improvements are illustrated on a synthetic data set of a complex layered model modeled after the Fahud field in Oman, and on ocean-bottom seismic data acquired in the Mars field in the deepwater Gulf of Mexico.
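The redatuming step can be sketched in a few lines: for each surface shot, the trace recorded at one downhole receiver is crosscorrelated with the trace at the chosen virtual-source receiver, and the correlations are stacked over shots. A minimal pure-Python toy (impulsive traces and geometry invented for illustration, not the paper's data):

```python
def xcorr(a, b):
    """Cross-correlation c[lag] = sum_i a[i] * b[i - lag], lags -(n-1)..(n-1)."""
    n = len(a)
    out = [0.0] * (2 * n - 1)
    for lag in range(-(n - 1), n):
        s = 0.0
        for i in range(n):
            j = i - lag
            if 0 <= j < n:
                s += a[i] * b[j]
        out[lag + n - 1] = s
    return out

n, d = 32, 5                          # trace length; receiver-to-receiver delay
shots = []
for t in (4, 9, 14):                  # three surface shots, different arrival times
    ra = [0.0] * n; ra[t] = 1.0       # trace at virtual-source receiver A
    rb = [0.0] * n; rb[t + d] = 1.0   # trace at receiver B, d samples later
    shots.append((ra, rb))

vs = [0.0] * (2 * n - 1)              # virtual-source trace: stack of correlations
for ra, rb in shots:
    vs = [u + v for u, v in zip(vs, xcorr(rb, ra))]

peak_lag = max(range(len(vs)), key=lambda i: vs[i]) - (n - 1)
print(peak_lag)  # → 5: the A-to-B traveltime, as if a source had fired at A
```

The shot-dependent arrival times cancel in the correlation; only the inter-receiver traveltime survives, which is the essence of turning a downhole receiver into a virtual source.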

Geophysics ◽  
2008 ◽  
Vol 73 (3) ◽  
pp. S73-S80 ◽  
Author(s):  
Kurang Mehta ◽  
Jon L. Sheiman ◽  
Roel Snieder ◽  
Rodney Calvert

Time-lapse monitoring is a powerful tool for tracking subsurface changes resulting from fluid migration. Conventional time-lapse monitoring can be done by observing differences between two seismic surveys over the surveillance period. Along with changes in the subsurface, differences between the two surveys are also caused by variations in the near-surface overburden and by acquisition discrepancies. The virtual-source method monitors below the time-varying near-surface by redatuming the data down to the subsurface receiver locations: it crosscorrelates the signals from surface shooting recorded by subsurface receivers placed below the near-surface. For the Mars field data, redatuming the recorded response down to the permanently placed ocean-bottom cable (OBC) receivers using the virtual-source method allows one to reconstruct a survey as if virtual sources were buried at the OBC receiver locations and the medium above them were a homogeneous half-space. Separating the recorded wavefields into upgoing and downgoing (up-down) waves before crosscorrelation makes the resultant virtual-source data independent of the time-varying near-surface (seawater). For time-lapse monitoring, variation of the source signature between the two surveys and from shot to shot is also undesirable. Deconvolving the prestack crosscorrelated data (the correlation gather) by the power spectrum of the source-time function yields virtual-source data independent of the source signature. Incorporating up-down wavefield separation and deconvolution of the correlation gather by the source power spectrum into the virtual-source method suppresses the causes of nonrepeatability in the seawater along with acquisition and source-signature discrepancies. This processing combination strengthens the virtual-source method for time-lapse monitoring.
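The deconvolution step can be sketched in the frequency domain: the per-shot cross spectrum is divided by the source power spectrum plus a small stabilizer before stacking, which removes the shot-varying signature. A pure-Python toy with a naive DFT (wavelets, delays, and the stabilizer value are invented for illustration):

```python
import cmath

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * f * t / n) for t in range(n))
            for f in range(n)]

def idft(X):
    n = len(X)
    return [(sum(X[f] * cmath.exp(2j * cmath.pi * f * t / n) for f in range(n)) / n).real
            for t in range(n)]

n, d, eps = 32, 4, 1e-6
stack = [0.0 + 0.0j] * n
for t0, amp in ((3, 1.0), (7, 3.0)):           # two shots, different signatures
    w = [0.0] * n
    w[0], w[1] = amp, 0.5 * amp                # shot-dependent source wavelet
    a = [0.0] * n; a[t0], a[t0 + 1] = amp, 0.5 * amp          # receiver A
    b = [0.0] * n; b[t0 + d], b[t0 + d + 1] = amp, 0.5 * amp  # receiver B, d later
    Af, Bf, Sf = dft(a), dft(b), dft(w)
    for f in range(n):
        # cross spectrum deconvolved by the source power spectrum (stabilized)
        stack[f] += Af[f].conjugate() * Bf[f] / (abs(Sf[f]) ** 2 + eps)

vs = idft(stack)
peak = max(range(n), key=lambda t: vs[t])
print(peak)  # → 4: a signature-free arrival at the inter-receiver lag d
```

Without the division, each shot's contribution would carry the squared source spectrum, so shots with different signatures would stack inconsistently; with it, the stacked virtual-source trace is independent of the per-shot amplitude.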


2020 ◽  
Vol 223 (3) ◽  
pp. 1888-1898
Author(s):  
Kirill Gadylshin ◽  
Ilya Silvestrov ◽  
Andrey Bakulin

SUMMARY We propose an advanced version of non-linear beamforming assisted by artificial intelligence (NLBF-AI) that includes the additional steps of encoding and interpolating wavefront attributes using inpainting with a deep neural network (DNN). Inpainting can efficiently and accurately fill holes in wavefront attributes caused by acquisition-geometry gaps and data-quality issues. Inpainting with a DNN delivers excellent interpolation quality with negligible computational effort and performs particularly well in the challenging case of irregular holes, where other interpolation methods struggle. Since conventional brute-force attribute estimation is very costly, we can further intentionally create additional holes, or masks, to restrict the expensive conventional estimation to a smaller subvolume and obtain the missing attributes with cost-effective inpainting. Using a marine seismic data set with ocean-bottom nodes, we show that inpainting can reliably recover wavefront attributes even with masked areas reaching 50–75 per cent. We validate the quality of the results by comparing attributes and enhanced data from NLBF-AI with those from conventional NLBF using full-density data without decimation.
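The masking-and-fill workflow can be illustrated without a DNN. Below, a simple iterative neighbour-average fill stands in for the paper's DNN inpainting, just to show how holes in a smooth attribute grid can be recovered from their surroundings (grid values and hole positions are invented):

```python
# Smooth "wavefront attribute" grid with an irregular hole punched into it.
rows, cols = 8, 8
attr = [[0.1 * r + 0.05 * c for c in range(cols)] for r in range(rows)]

mask = [[False] * cols for _ in range(rows)]
for r, c in [(3, 3), (3, 4), (4, 3), (4, 4), (2, 5)]:   # irregular hole
    mask[r][c] = True

grid = [[0.0 if mask[r][c] else attr[r][c] for c in range(cols)] for r in range(rows)]

# Diffuse values from known neighbours into the hole (Gauss-Seidel sweeps);
# this is a crude stand-in for the learned inpainting, not the paper's method.
for _ in range(200):
    for r in range(rows):
        for c in range(cols):
            if mask[r][c]:
                nbrs = [grid[r + dr][c + dc]
                        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1))
                        if 0 <= r + dr < rows and 0 <= c + dc < cols]
                grid[r][c] = sum(nbrs) / len(nbrs)

err = max(abs(grid[r][c] - attr[r][c]) for r in range(rows) for c in range(cols))
print(err < 0.05)  # prints True: the smooth field is recovered in the hole
```

A linear attribute field is harmonic, so this diffusion fill recovers it almost exactly; the paper's DNN handles the much harder case of large, irregular masks on real attribute volumes.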


Geophysics ◽  
2017 ◽  
Vol 82 (1) ◽  
pp. O11-O22 ◽  
Author(s):  
James Beckwith ◽  
Roger Clark ◽  
Linda Hodgson

The intrinsic seismic quality factor Q is known from poroelastic rock-physics theory to be frequency dependent, even within the typical bandwidths of individual surface- and borehole-based surveys, in which measurement methods usually deliver a frequency-independent Q. Measuring frequency-dependent Q instead offers better characterization of seismic properties and, moreover, a potential step toward estimating permeability directly from seismic data. We have therefore introduced a method to measure frequency-dependent Q from pairs of reflections in prestack surface seismic data, a data type that, unlike a vertical seismic profile, offers useful areal coverage. Although, in principle, any analytic form with a manageable number of parameters could be prescribed, the frequency dependence of Q is modeled as a power law. Inversion is done with a simple grid search over the power-law coefficient and exponent, seeking a minimum-norm misfit. Using a numerical experiment and a synthetic data set, we found the method to be robust and accurate down to a signal-to-noise ratio of approximately 0.65. Frequency-dependent Q was then estimated for some 955 superbins of a 3D prestack ocean-bottom-cable data set over the Kinnoull field, central North Sea. All combinations of eight prominent reflections between Top Miocene and Base Cretaceous were treated together to give some 21,000 frequency-dependent and (for comparison) constant-Q results. The median coefficient and exponent were 0.0074 and 0.06, respectively, with sharply peaked distributions (positive excess kurtosis).
Outlier, strongly frequency-dependent results coincide with low-frequency "shadows" under amplitude anomalies, which adversely affect the spectra of reflections. The inferred frequency-dependent Q at 32.5 Hz, the center of the available bandwidth, is not statistically different from the frequency-independent Q of 181 (with a standard error of one, from the same distribution) derived from the same data. Hence, for this data set, a constant-Q assumption would in fact be adequate. However, our method has the ability to measure stable estimates of frequency-dependent Q.
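The grid-search inversion can be sketched as follows, assuming a power law of the form 1/Q(f) = a · f**b and an L1 misfit; the paper's exact parameterization and norm may differ, and the data here are a noise-free toy:

```python
freqs = [10.0, 20.0, 30.0, 40.0, 50.0]
a_true, b_true = 0.005, 0.1
obs = [a_true * f ** b_true for f in freqs]   # noise-free 1/Q "measurements"

best = None
for ia in range(1, 41):                       # coefficient grid: 0.00025 .. 0.01
    a = ia * 0.00025
    for ib in range(-20, 21):                 # exponent grid: -0.2 .. 0.2
        b = ib * 0.01
        misfit = sum(abs(a * f ** b - o) for f, o in zip(freqs, obs))  # L1
        if best is None or misfit < best[0]:
            best = (misfit, a, b)

_, a_est, b_est = best
print(round(a_est, 5), round(b_est, 2))  # recovers a = 0.005, b = 0.1 on this toy
```

On real spectra the misfit surface is noisier, but the exhaustive two-parameter search remains cheap and avoids local minima, which is why a grid search is adequate here.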


Author(s):  
Raul E. Avelar ◽  
Karen Dixon ◽  
Boniphace Kutela ◽  
Sam Klump ◽  
Beth Wemple ◽  
...  

The calibration of safety performance functions (SPFs) is a mechanism included in the Highway Safety Manual (HSM) to adjust SPFs in the HSM for use in intended jurisdictions. Critically, the quality of the calibration procedure must be assessed before using the calibrated SPFs. Multiple resources to aid practitioners in calibrating SPFs have been developed in the years following the publication of the HSM 1st edition. Similarly, the literature suggests multiple ways to assess the goodness-of-fit (GOF) of a calibrated SPF to a data set from a given jurisdiction. This paper uses the calibration results of multiple intersection SPFs to a large Mississippi safety database to examine the relations between multiple GOF metrics. The goal is to develop a sensible single index that leverages the joint information from multiple GOF metrics to assess overall quality of calibration. A factor analysis applied to the calibration results revealed three underlying factors explaining 76% of the variability in the data. From these results, the authors developed an index and performed a sensitivity analysis. The key metrics were found to be, in descending order: the deviation of the cumulative residual (CURE) plot from the 95% confidence area, the mean absolute deviation, the modified R-squared, and the value of the calibration factor. This paper also presents comparisons between the index and alternative scoring strategies, as well as an effort to verify the results using synthetic data. The developed index is recommended to comprehensively assess the quality of the calibrated intersection SPFs.
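The idea of collapsing several GOF metrics into one index can be sketched with standardized scores and fixed weights. The paper derives its weights from a factor analysis, so the metric values, directions, and weights below are purely illustrative:

```python
# Hypothetical calibration results for three SPFs (values invented).
sites = {
    "SPF_A": {"cure_dev": 0.02, "mad": 1.1, "mod_r2": 0.85, "cal_factor": 1.02},
    "SPF_B": {"cure_dev": 0.30, "mad": 2.9, "mod_r2": 0.55, "cal_factor": 1.60},
    "SPF_C": {"cure_dev": 0.10, "mad": 1.8, "mod_r2": 0.72, "cal_factor": 1.15},
}

def zscores(values):
    m = sum(values) / len(values)
    sd = (sum((v - m) ** 2 for v in values) / len(values)) ** 0.5
    return [(v - m) / sd for v in values]

names = list(sites)
# Lower-is-better metrics get negated z-scores; |cal_factor - 1| is penalized.
feats = {
    "cure_dev":   [-z for z in zscores([sites[n]["cure_dev"] for n in names])],
    "mad":        [-z for z in zscores([sites[n]["mad"] for n in names])],
    "mod_r2":     zscores([sites[n]["mod_r2"] for n in names]),
    "cal_factor": [-z for z in zscores([abs(sites[n]["cal_factor"] - 1) for n in names])],
}
weights = {"cure_dev": 0.4, "mad": 0.3, "mod_r2": 0.2, "cal_factor": 0.1}  # illustrative

index = {n: sum(weights[k] * feats[k][i] for k in weights) for i, n in enumerate(names)}
ranked = sorted(index, key=index.get, reverse=True)
print(ranked[0])  # → SPF_A: best on every metric, so best index
```

The descending weight order mirrors the paper's finding (CURE deviation most influential, calibration factor least), but the numeric weights themselves are not from the paper.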


Geophysics ◽  
2006 ◽  
Vol 71 (5) ◽  
pp. U67-U76 ◽  
Author(s):  
Robert J. Ferguson

The possibility of improving regularization/datuming of seismic data is investigated by treating wavefield extrapolation as an inversion problem. Weighted, damped least squares is then used to produce the regularized/datumed wavefield. Regularization/datuming is extremely costly because of the cost of computing the Hessian, so an efficient approximation is introduced, achieved by computing only a limited number of diagonals in the operators involved. Real and synthetic data examples demonstrate the utility of this approach. For synthetic data, regularization/datuming is demonstrated for large extrapolation distances using a highly irregular recording array. Without approximation, regularization/datuming returns a regularized wavefield with reduced operator artifacts when compared to a nonregularizing method such as generalized phase shift plus interpolation (PSPI). Approximate regularization/datuming returns a regularized wavefield at approximately two orders of magnitude lower cost, but it is dip limited, though in a controllable way, compared to the full method. The Foothills structural data set, a freely available data set from the Rocky Mountains of Canada, demonstrates application to real data. The data have highly irregular sampling along the shot coordinate, and they suffer from significant near-surface effects. Approximate regularization/datuming returns common-receiver data that are superior in appearance to conventionally datumed data.
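The contrast between the full and approximate schemes can be sketched on a tiny system: the full solution solves the damped normal equations with the complete Hessian, while the cheap variant keeps only its main diagonal (the operator and data values are invented, and the paper keeps a limited number of diagonals, not just one):

```python
def matvec(A, x):
    return [sum(a * b for a, b in zip(row, x)) for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

def solve(A, b):          # Gaussian elimination with partial pivoting
    n = len(A)
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

G = [[1.0, 0.2, 0.0],     # toy extrapolation operator (diagonally dominant)
     [0.1, 1.0, 0.2],
     [0.0, 0.1, 1.0]]
d = [1.2, 0.9, 1.1]
lam = 0.01                # damping weight

Gt = transpose(G)
H = [[sum(Gt[i][k] * G[k][j] for k in range(3)) for j in range(3)] for i in range(3)]
rhs = matvec(Gt, d)

# Full damped least squares: (G^T G + lam I) m = G^T d.
full = solve([[H[i][j] + (lam if i == j else 0.0) for j in range(3)]
              for i in range(3)], rhs)
# Cheap variant: keep only the main diagonal of the Hessian (trivial solve).
diag = [rhs[i] / (H[i][i] + lam) for i in range(3)]

print([round(v, 2) for v in full], [round(v, 2) for v in diag])
```

The diagonal-only solve costs next to nothing but ignores the off-diagonal coupling, mirroring the paper's cost-versus-accuracy (dip-limitation) trade-off.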


Geophysics ◽  
2006 ◽  
Vol 71 (5) ◽  
pp. C81-C92 ◽  
Author(s):  
Helene Hafslund Veire ◽  
Hilde Grude Borgos ◽  
Martin Landrø

Effects of pressure and fluid saturation can have the same degree of impact on seismic amplitudes and differential traveltimes in the reservoir interval; thus, they are often inseparable by analysis of a single stacked seismic data set. In such cases, time-lapse AVO analysis offers an opportunity to discriminate between the two effects. To use information about pressure- and saturation-related changes in reservoir modeling and simulation, we quantify the uncertainty in the estimates. One way of analyzing uncertainties is to formulate the problem in a Bayesian framework, in which the solution is represented by a probability density function (PDF), providing estimates of the uncertainties as well as direct estimates of the properties. A stochastic model for estimating pressure and saturation changes from time-lapse seismic AVO data is investigated within this framework. Well-known rock-physics relationships are used to set up a prior stochastic model. PP reflection-coefficient differences are used to establish a likelihood model linking reservoir variables and time-lapse seismic data. The methodology incorporates correlation between the different variables of the model as well as spatial dependencies for each variable. In addition, possible bottlenecks causing large uncertainties in the estimates can be identified through sensitivity analysis of the system. The method has been tested on 1D synthetic data and on field time-lapse seismic AVO data from the Gullfaks Field in the North Sea.
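The Bayesian formulation can be sketched as a grid evaluation of posterior ∝ likelihood × prior over pressure and saturation changes. The linear forward model, noise levels, and priors below are invented stand-ins for the paper's rock-physics relationships:

```python
import math

def forward(dp, ds):                 # hypothetical linearized AVO response
    return [0.8 * dp + 0.3 * ds,     # intercept difference
            -0.2 * dp + 0.6 * ds]    # gradient difference

truth = (0.5, 0.3)                   # "true" pressure / saturation changes
data = forward(*truth)               # noise-free synthetic observation
sigma_d, sigma_m = 0.05, 1.0         # data noise; zero-mean Gaussian prior width

best, best_p = None, -1.0
for i in range(101):                 # grid over (dp, ds) in [-1, 1] x [-1, 1]
    dp = -1.0 + 0.02 * i
    for j in range(101):
        ds = -1.0 + 0.02 * j
        pred = forward(dp, ds)
        ll = -sum((p - o) ** 2 for p, o in zip(pred, data)) / (2 * sigma_d ** 2)
        lp = -(dp ** 2 + ds ** 2) / (2 * sigma_m ** 2)
        post = math.exp(ll + lp)     # unnormalized posterior density
        if post > best_p:
            best, best_p = (dp, ds), post

print(round(best[0], 2), round(best[1], 2))  # MAP estimate recovers (0.5, 0.3)
```

Because the whole grid of posterior values is available, the same loop also yields uncertainty (the spread of the posterior), which is the point of the Bayesian treatment rather than a single best-fit answer.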


Geophysics ◽  
2017 ◽  
Vol 82 (3) ◽  
pp. R199-R217 ◽  
Author(s):  
Xintao Chai ◽  
Shangxu Wang ◽  
Genyang Tang

Seismic data are nonstationary due to subsurface anelastic attenuation and dispersion effects. These effects, also referred to as the earth's Q-filtering effects, can diminish seismic resolution. We previously developed a method of nonstationary sparse reflectivity inversion (NSRI) for resolution enhancement, which avoids the intrinsic instability associated with inverse Q filtering and generates superior Q-compensation results. Applying NSRI to data sets that contain multiples (addressing surface-related multiples only) requires a demultiple preprocessing step because NSRI cannot distinguish primaries from multiples and will treat them as interference convolved with incorrect Q values. However, multiples contain information about subsurface properties. To use the information carried by multiples, we adapt NSRI, with the feedback model and NSRI theory, to the context of nonstationary seismic data with surface-related multiples. Consequently, not only are the benefits of NSRI (e.g., circumventing the intrinsic instability associated with inverse Q filtering) extended, but multiples are also taken into account. Our method is limited to a 1D implementation. Theoretical and numerical analyses verify that, given a wavelet, the input Q values primarily affect the inverted reflectivities and exert little effect on the estimated multiples; i.e., multiple estimation need not consider Q-filtering effects explicitly. There are, however, benefits to NSRI considering multiples: the periodicity and amplitude of the multiples imply the position of the reflectivities and the amplitude of the wavelet, and multiples assist in overcoming the scaling and shifting ambiguities of conventional problems in which multiples are not considered.
Experiments using a 1D algorithm on a synthetic data set, the publicly available Pluto 1.5 data set, and a marine data set support the aforementioned findings and reveal the stability, capabilities, and limitations of the proposed method.
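The Q-filtering effect itself is easy to quantify for a constant-Q model: a spectral component of frequency f arriving at traveltime t is attenuated by exp(-pi * f * t / Q), so high frequencies decay fastest and resolution degrades with depth (dispersion is ignored here for brevity; the numbers are illustrative):

```python
import math

Q = 100.0
for t in (0.5, 1.0, 2.0):                        # traveltimes in seconds
    atten_30 = math.exp(-math.pi * 30.0 * t / Q)  # 30 Hz amplitude factor
    atten_60 = math.exp(-math.pi * 60.0 * t / Q)  # 60 Hz amplitude factor
    print(t, round(atten_30, 3), round(atten_60, 3))
```

The 60 Hz component always decays faster than the 30 Hz component, which is the resolution loss NSRI compensates for; naively dividing spectra by these small factors is what makes inverse Q filtering unstable.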


Geophysics ◽  
2021 ◽  
pp. 1-42
Author(s):  
Yike Liu ◽  
Yanbao Zhang ◽  
Yingcai Zheng

Multiples follow long paths and carry more information on the subsurface than primary reflections, making them particularly useful for imaging. However, seismic migration using multiples can generate crosstalk artifacts in the resulting images because multiples of different orders interfere with each other, and these artifacts greatly degrade image quality. We propose to form a supergather by applying phase-encoding functions to image multiples and stacking several encoded controlled-order multiples. The multiples are separated into different orders using multiple-decomposition strategies. The method is referred to as phase-encoded migration of all-order multiples (PEM). The migration can be performed with only two finite-difference solutions of the wave equation: backward-extrapolating the blended virtual-receiver data and forward-propagating the summed virtual-source data. The proposed approach significantly attenuates crosstalk artifacts and also significantly reduces computational costs. Numerical examples demonstrate that PEM can remove relatively strong crosstalk artifacts generated by multiples and is a promising approach for imaging subsurface targets.
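The benefit of encoding can be shown with a toy: random polarity codes (standing in for the paper's phase-encoding functions) keep matched-order contributions coherent while cross-order crosstalk terms average toward zero over encodings:

```python
import random

random.seed(0)
orders, n_enc = 4, 500
# contrib[i][j]: toy image contribution of source-side multiple order i with
# receiver-side order j; the i != j terms are pure crosstalk (values invented).
contrib = [[1.0 if i == j else 0.5 for j in range(orders)] for i in range(orders)]

acc = [[0.0] * orders for _ in range(orders)]
for _ in range(n_enc):
    code = [random.choice((-1.0, 1.0)) for _ in range(orders)]  # encoding function
    for i in range(orders):
        for j in range(orders):
            acc[i][j] += code[i] * code[j] * contrib[i][j]

avg = [[acc[i][j] / n_enc for j in range(orders)] for i in range(orders)]
worst_cross = max(abs(avg[i][j]) for i in range(orders)
                  for j in range(orders) if i != j)
print(avg[0][0], worst_cross < 0.2)  # matched orders stay at 1.0; crosstalk shrinks
```

Because code[i] * code[i] is always 1, the matched-order (image) terms stack coherently, while the mismatched products are random signs that cancel, which is the mechanism by which the encoded supergather suppresses crosstalk.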


2017 ◽  
Vol 5 (3) ◽  
pp. SJ81-SJ90 ◽  
Author(s):  
Kainan Wang ◽  
Jesse Lomask ◽  
Felix Segovia

Well-log-to-seismic tying is a key step in many interpretation workflows for oil and gas exploration. Synthetic seismic traces from wells are often manually tied to seismic data; this process can be very time consuming and, in some cases, inaccurate. Automatic methods, such as dynamic time warping (DTW), can match synthetic traces to seismic data. Although these methods are extremely fast, they tend to create interval velocities that are not geologically realistic. We modify DTW to create a blocked dynamic warping (BDW) method. BDW generates an automatic, optimal well tie that honors geologically consistent velocity constraints, and it consequently yields updated velocities that are more realistic than those from other methods. BDW constrains the updated velocity to be constant or linearly variable inside each geologic layer. With an optimal correlation between synthetic seismograms and surface seismic data, the algorithm returns an automatically updated time-depth curve and an updated interval-velocity model that retains the original geologic velocity boundaries. In other words, the algorithm finds the optimal solution for tying the synthetic to the seismic data while restricting interval-velocity changes to coincide with the initial input blocking. We demonstrate the application of the BDW technique on a synthetic example and a field data set.
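The alignment step underlying BDW is classic dynamic time warping; a minimal dynamic-programming implementation on toy traces is below. BDW's blocked velocity constraints are not implemented here; this only shows plain DTW absorbing a bulk shift between a synthetic and a seismic trace:

```python
def dtw(a, b):
    """Unconstrained DTW distance between two sequences (L1 local cost)."""
    n, m = len(a), len(b)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # stretch a
                                 D[i][j - 1],      # stretch b
                                 D[i - 1][j - 1])  # match
    return D[n][m]

synthetic = [0.0, 1.0, 0.0, -1.0, 0.0, 0.5, 0.0]
seismic   = [0.0, 0.0, 1.0, 0.0, -1.0, 0.0, 0.5, 0.0]  # same events, shifted
print(dtw(synthetic, seismic))  # → 0.0: the pure shift is absorbed by the warp
```

Unconstrained DTW like this is what produces geologically unrealistic interval velocities: the warp path can stretch freely sample by sample. BDW restricts the path so that the implied velocity is constant or linear within each geologic block.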


Geophysics ◽  
2005 ◽  
Vol 70 (5) ◽  
pp. U51-U65 ◽  
Author(s):  
Stig-Kyrre Foss ◽  
Bjørn Ursin ◽  
Maarten V. de Hoop

We present a method of reflection tomography for anisotropic elastic parameters from PP and PS reflection seismic data. The method is based upon the differential semblance misfit functional in scattering angle and azimuth (DSA) acting on common-image-point gathers (CIGs) to find fitting velocity models. The CIGs are amplitude corrected using a generalized Radon transform applied to the data. Depth consistency between the PP and PS images is enforced by penalizing any mis-tie between imaged key reflectors. The mis-tie is evaluated by means of map migration-demigration applied to the geometric information (times and slopes) contained in the data. In our implementation, we simplify the codepthing approach to zero-scattering-angle data only. The resulting measure is incorporated as a regularization in the DSA misfit functional. We then resort to an optimization procedure, restricting ourselves to transversely isotropic (TI) velocity models. In principle, depending on the available surface-offset range and orientation of reflectors in the subsurface, by combining the DSA with codepthing, the anisotropic parameters for TI models can be determined, provided the orientation of the symmetry axis is known. A proposed strategy is applied to an ocean-bottom-seismic field data set from the North Sea.
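The differential-semblance idea can be sketched numerically: for a correct velocity model, common-image-point gathers are flat in scattering angle, so a misfit built from amplitude differences across angle is small; a wrong velocity curves the gathers and inflates the misfit (gather values below are invented):

```python
def dsa_misfit(cig):
    """Differential-semblance-style misfit: cig[depth][angle] amplitudes;
    sum of squared differences between neighbouring scattering angles."""
    return sum((row[k + 1] - row[k]) ** 2
               for row in cig for k in range(len(row) - 1))

flat   = [[1.0, 1.0, 1.0, 1.0],        # correct velocity: flat gathers
          [0.5, 0.5, 0.5, 0.5]]
curved = [[1.0, 0.9, 0.7, 0.4],        # wrong velocity: moveout across angle
          [0.5, 0.45, 0.35, 0.2]]

print(dsa_misfit(flat), round(dsa_misfit(curved), 4))  # flat gather → zero misfit
```

Minimizing such a smooth, differentiable functional over velocity models is what makes differential semblance attractive for gradient-based tomography; the paper adds the PP/PS codepthing term as a regularization on top of it.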

