Automatic, geologic layer-constrained well-seismic tie through blocked dynamic warping

2017 ◽  
Vol 5 (3) ◽  
pp. SJ81-SJ90 ◽  
Author(s):  
Kainan Wang ◽  
Jesse Lomask ◽  
Felix Segovia

Well-log-to-seismic tying is a key step in many interpretation workflows for oil and gas exploration. Synthetic seismic traces from the wells are often manually tied to seismic data; this process can be very time-consuming and, in some cases, inaccurate. Automatic methods, such as dynamic time warping (DTW), can match synthetic traces to seismic data. Although these methods are extremely fast, they tend to create interval velocities that are not geologically realistic. We have modified DTW to create a blocked dynamic warping (BDW) method. BDW generates an automatic, optimal well tie that honors geologically consistent velocity constraints. Consequently, it results in updated velocities that are more realistic than those of other methods. BDW constrains the updated velocity to be constant or linearly variable inside each geologic layer. With an optimal correlation between synthetic seismograms and surface seismic data, this algorithm returns an automatically updated time-depth curve and an updated interval velocity model that retains the original geologic velocity boundaries. In other words, the algorithm finds the optimal solution for tying the synthetic to the seismic data while restricting the interval velocity changes to coincide with the initial input blocking. We have demonstrated the application of the BDW technique on a synthetic example and a field data set.
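The DTW core that BDW builds on can be sketched in a few lines. The following is a minimal, illustrative implementation of classic dynamic time warping between two 1D traces (the function name and the absolute-difference cost are illustrative choices, not the paper's); BDW's layer-constrained velocity update is not reproduced here.

```python
import numpy as np

def dtw_path(a, b):
    """Classic dynamic time warping between two 1D traces.

    Returns the accumulated-cost matrix and the optimal warp path.
    This is the unconstrained DTW core; the paper's BDW additionally
    restricts velocity updates to geologic-layer boundaries.
    """
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    # Backtrack from (n, m) to recover the alignment path.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([D[i - 1, j - 1], D[i - 1, j], D[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return D[1:, 1:], path[::-1]

# A trace matched against itself aligns along the diagonal with zero cost.
t = np.linspace(0, 1, 50)
trace = np.sin(2 * np.pi * 5 * t)
D, path = dtw_path(trace, trace)
```

The warp path plays the role of the time-depth correction: in a well tie, `a` would be the synthetic and `b` the extracted seismic trace.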

Geophysics ◽  
1993 ◽  
Vol 58 (6) ◽  
pp. 873-882 ◽  
Author(s):  
Roelof Jan Versteeg

To get a correct earth image from seismic data acquired over complex structures, it is essential to use prestack depth migration. A necessary condition for obtaining a correct image is that the prestack depth migration is done with an accurate velocity model. In cases where we need to use prestack depth migration, determination of such a model using conventional methods does not give satisfactory results. Thus, new iterative methods for velocity model determination have been developed. The convergence of these methods can be accelerated by defining constraints on the model in such a way that the method looks only for those components of the true earth velocity field that influence the migrated image. To determine these components, the sensitivity of the prestack depth migration result to the velocity model is examined using a complex synthetic data set (the Marmousi data set) for which the exact model is known. The images obtained with increasingly smoothed versions of the true model are compared, and it is shown that the minimum spatial wavelength that needs to be present in the model to obtain an accurate depth image from the data set is on the order of 200 m. The model space that has to be examined to find an accurate velocity model from complex seismic data can thus be constrained. This will increase the speed and probability of convergence of iterative velocity model determination methods.
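As a rough illustration of "increasingly smoothed versions of the true model", the sketch below low-passes a gridded velocity model with a simple Gaussian filter whose width is tied to a minimum spatial wavelength. The function and the sigma-to-wavelength mapping are illustrative assumptions, not the smoothing procedure used in the paper.

```python
import numpy as np

def smooth_velocity(v, dx, min_wavelength):
    """Low-pass a 2D gridded velocity model so that spatial wavelengths
    shorter than `min_wavelength` (same units as dx) are damped.

    NumPy-only Gaussian smoothing; edge padding avoids boundary dimming.
    """
    sigma = min_wavelength / (2.0 * np.pi * dx)   # filter width in grid cells
    half = int(np.ceil(3 * sigma))
    x = np.arange(-half, half + 1)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    k /= k.sum()
    # Separable 2D convolution: smooth rows, then columns.
    vp = np.pad(v, half, mode="edge")
    out = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, vp)
    out = np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, out)
    return out[half:-half, half:-half]

# Smooth a blocky two-layer model sampled at 10 m; wavelengths < 200 m are damped.
v = np.where(np.arange(100)[:, None] < 50, 2000.0, 3000.0) * np.ones((100, 100))
vs = smooth_velocity(v, dx=10.0, min_wavelength=200.0)
```

Comparing migrations of the data through `v` and `vs` is the kind of sensitivity experiment the abstract describes.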


Geophysics ◽  
2006 ◽  
Vol 71 (3) ◽  
pp. U37-U46 ◽  
Author(s):  
Tariq Alkhalifah ◽  
Claudio Bagaini

Wave-equation-based redatuming is expensive and requires a detailed knowledge of the shallow velocity field. We derive the analytical expression of a new prestack wavefield extrapolation operator, the topographic datuming operator (TDO), which applies redatuming based on a straight-ray approximation above and below a chosen datum. This redatuming operator is applied directly to common-source gathers to downward continue the sources and the receivers simultaneously to the datum level, without resorting to common-receiver gathers. As a result, the method is far more efficient and robust than conventional wave-equation-based redatuming and does not require an accurate depth-domain interval velocity model. In addition, TDO, unlike wave-equation-based redatuming, requires only effective velocities above the datum and thus can be applied using attributes valid for static correction methods. Effective velocities beneath the datum permit us to replace the surface integral needed for wave-equation redatuming with a line integral. In the particular case of infinite (in practice, very high with respect to the shallow layers) velocity beneath the datum, the TDO impulse response collapses to a point, and TDO redatuming is equivalent to conventional static correction, which may therefore be regarded as a special case of the newly derived operator. The computational cost of applying TDO is slightly higher than that of static corrections, yet it provides higher quality results, partially attributable to the ability of TDO to suppress diffractions emanating from anomalies above the datum. Because TDO is based on a geometrical-optics approximation, velocity after TDO is not biased by the vertical-shift correction associated with conventional static correction. Application to a synthetic data set demonstrates the features of the method.
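The limiting case mentioned above, where TDO collapses to a conventional static correction, can be illustrated with a toy vertical-shift correction. The function name, sign convention, and the use of `np.roll` (which wraps samples around rather than zero-padding) are illustrative simplifications:

```python
import numpy as np

def static_shift(trace, dt, elevation, datum, v_repl):
    """Conventional elevation static: shift a trace by the two-way vertical
    traveltime between the surface elevation and the datum, using a
    replacement velocity. Per the abstract, TDO reduces to this when the
    velocity beneath the datum is effectively infinite.
    """
    t_shift = 2.0 * (datum - elevation) / v_repl  # two-way vertical time (s)
    n_shift = int(round(t_shift / dt))            # shift in samples
    return np.roll(trace, n_shift)                # toy: wraparound ignored

# A spike at sample 50, receiver 20 m above a datum, replacement velocity 2000 m/s:
trace = np.zeros(100)
trace[50] = 1.0
shifted = static_shift(trace, dt=0.004, elevation=100.0, datum=80.0, v_repl=2000.0)
```

The two-way shift here is 2 x 20 m / 2000 m/s = 0.02 s, i.e. 5 samples at a 4 ms interval, so the spike moves to sample 45.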


Energies ◽  
2021 ◽  
Vol 14 (14) ◽  
pp. 4105
Author(s):  
Shaoyong Liu ◽  
Wenting Zhu ◽  
Zhe Yan ◽  
Peng Xu ◽  
Huazhong Wang

The estimation of the subsurface acoustic impedance (AI) model is an important step of seismic data processing for oil and gas exploration. Full waveform inversion (FWI) is a powerful way to invert for subsurface parameters with surface-acquired seismic data. Nevertheless, the strongly nonlinear relationship between the seismic data and the subsurface model causes nonconvergence and instability problems in practice. To divide the nonlinear inversion into more nearly linear steps, a 2D AI inversion imaging method is proposed to estimate the broadband AI model based on a broadband reflectivity. First, a novel scheme based on Gaussian beam migration (GBM) is proposed to produce the point spread function (PSF) and a conventional image of the subsurface. Then, the broadband reflectivity can be obtained by deconvolving the image with respect to the calculated PSF. Assuming that the low-wavenumber part of the AI model can be deduced from the background velocity, we implemented the AI inversion imaging scheme by merging the obtained broadband reflectivity as the high-wavenumber part of the AI model, producing a broadband AI result. The broadband migration based on GBM, the computational hotspot of the proposed 2D AI inversion imaging, involves only two GBM processes and one Gaussian beam demigration (Born modeling) process. Hence, the developed broadband GBM is more efficient than broadband imaging using least-squares migrations (LSMs), which require multiple iterations (each iteration includes one Born modeling and one migration process) to minimize the objective function of data residuals. Numerical examples on both synthetic and field data demonstrate the validity and application potential of the proposed method.
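The "deconvolve the image with respect to the PSF" step can be sketched with a Wiener-style wavenumber-domain division. This toy version assumes a single spatially invariant PSF, whereas the paper's GBM-derived PSFs vary with position; `eps` is an illustrative damping parameter.

```python
import numpy as np

def psf_deconvolve(image, psf, eps=1e-3):
    """Wiener-style deconvolution of a migrated image by its point-spread
    function, performed in the wavenumber domain. The damping term eps
    stabilizes the division where the PSF spectrum is weak.
    """
    F_img = np.fft.fft2(image)
    F_psf = np.fft.fft2(np.fft.ifftshift(psf), s=image.shape)
    denom = np.abs(F_psf) ** 2 + eps * np.max(np.abs(F_psf)) ** 2
    return np.real(np.fft.ifft2(F_img * np.conj(F_psf) / denom))

# Blur a sparse reflectivity with a Gaussian PSF, then recover it.
y, x = np.mgrid[0:64, 0:64]
psf = np.exp(-((y - 32) ** 2 + (x - 32) ** 2) / (2 * 2.0 ** 2))
psf /= psf.sum()
refl = np.zeros((64, 64))
refl[32, 32] = 1.0
refl[20, 40] = 0.5
image = np.real(np.fft.ifft2(np.fft.fft2(refl) * np.fft.fft2(np.fft.ifftshift(psf))))
est = psf_deconvolve(image, psf)
```

The deconvolved image sharpens both reflectors back toward their original positions, which is the "broadband reflectivity" step in the workflow above.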


Geophysics ◽  
2006 ◽  
Vol 71 (5) ◽  
pp. U67-U76 ◽  
Author(s):  
Robert J. Ferguson

The possibility of improving regularization/datuming of seismic data is investigated by treating wavefield extrapolation as an inversion problem. Weighted, damped least squares is then used to produce the regularized/datumed wavefield. Regularization/datuming is extremely costly because it requires computing the Hessian, so an efficient approximation is introduced. The approximation is achieved by computing a limited number of diagonals in the operators involved. Real and synthetic data examples demonstrate the utility of this approach. For synthetic data, regularization/datuming is demonstrated for large extrapolation distances using a highly irregular recording array. Without approximation, regularization/datuming returns a regularized wavefield with reduced operator artifacts when compared to a nonregularizing method such as generalized phase shift plus interpolation (PSPI). Approximate regularization/datuming returns a regularized wavefield at approximately two orders of magnitude lower cost, but it is dip limited, though in a controllable way, compared to the full method. The Foothills structural data set, a freely available data set from the Rocky Mountains of Canada, demonstrates application to real data. The data have highly irregular sampling along the shot coordinate, and they suffer from significant near-surface effects. Approximate regularization/datuming returns common-receiver data that are superior in appearance compared to conventional datuming.
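The weighted, damped least-squares solve at the heart of this formulation has the standard normal-equations form, sketched below. The paper's extrapolation operators and its diagonal Hessian approximation are not reproduced; `G`, `w`, and `lam` are generic placeholders.

```python
import numpy as np

def weighted_damped_lsq(G, d, w, lam):
    """Weighted, damped least-squares solution of G m ~ d:
        m = (G^T W G + lam I)^(-1) G^T W d,
    with W = diag(w). The lam I term is the damping that stabilizes the
    inversion; computing/approximating G^T W G is the Hessian cost the
    abstract refers to.
    """
    W = np.diag(w)
    A = G.T @ W @ G + lam * np.eye(G.shape[1])
    return np.linalg.solve(A, G.T @ W @ d)

# Sanity check: with clean data and tiny damping, the true model is recovered.
rng = np.random.default_rng(0)
G = rng.normal(size=(20, 5))
m_true = np.arange(1.0, 6.0)
d = G @ m_true
m_est = weighted_damped_lsq(G, d, np.ones(20), 1e-8)
```

In the regularization/datuming context, `d` holds the irregularly recorded wavefield, `G` the extrapolation operator, and `m_est` the regularized/datumed wavefield.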


Geophysics ◽  
2006 ◽  
Vol 71 (5) ◽  
pp. C81-C92 ◽  
Author(s):  
Helene Hafslund Veire ◽  
Hilde Grude Borgos ◽  
Martin Landrø

Effects of pressure and fluid saturation can have the same degree of impact on seismic amplitudes and differential traveltimes in the reservoir interval; thus, they are often inseparable by analysis of a single stacked seismic data set. In such cases, time-lapse AVO analysis offers an opportunity to discriminate between the two effects. We quantify the uncertainty in these estimations so that information about pressure- and saturation-related changes can be used in reservoir modeling and simulation. One way of analyzing uncertainties is to formulate the problem in a Bayesian framework. Here, the solution of the problem is represented by a probability density function (PDF), providing estimates of uncertainties as well as direct estimates of the properties. A stochastic model for estimation of pressure and saturation changes from time-lapse seismic AVO data is investigated within a Bayesian framework. Well-known rock-physics relationships are used to set up a prior stochastic model. PP reflection coefficient differences are used to establish a likelihood model linking reservoir variables and time-lapse seismic data. The methodology incorporates correlation between the different variables of the model as well as spatial dependencies for each of the variables. In addition, possible bottlenecks causing large uncertainties in the estimations can be identified through sensitivity analysis of the system. The method has been tested on 1D synthetic data and on field time-lapse seismic AVO data from the Gullfaks Field in the North Sea.
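For a linear-Gaussian special case, the Bayesian solution as a PDF can be written in closed form; the sketch below computes the posterior mean and covariance. This is a textbook stand-in: the paper's rock-physics prior and AVO likelihood are nonlinear and handled stochastically, not with this formula.

```python
import numpy as np

def gaussian_posterior(G, d, m0, Cm, Cd):
    """Posterior mean and covariance for the linear-Gaussian model
    d = G m + e, with prior m ~ N(m0, Cm) and noise e ~ N(0, Cd).
    The posterior covariance is the direct uncertainty estimate the
    abstract describes; small eigenvalues flag well-resolved directions,
    large ones flag the 'bottlenecks'.
    """
    Cm_inv = np.linalg.inv(Cm)
    Cd_inv = np.linalg.inv(Cd)
    C_post = np.linalg.inv(Cm_inv + G.T @ Cd_inv @ G)
    m_post = C_post @ (Cm_inv @ m0 + G.T @ Cd_inv @ d)
    return m_post, C_post

# One parameter, two measurements of value 2, unit prior and noise variances:
m_post, C_post = gaussian_posterior(
    np.array([[1.0], [1.0]]), np.array([2.0, 2.0]),
    np.array([0.0]), np.eye(1), np.eye(2))
```

Here the posterior mean is pulled from the prior mean 0 toward the data value 2, landing at 4/3, with variance reduced from 1 to 1/3 by the two measurements.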


10.1144/sp509 ◽  
2021 ◽  
Vol 509 (1) ◽  
pp. NP-NP
Author(s):  
J. Hendry ◽  
P. Burgess ◽  
D. Hunt ◽  
X. Janson ◽  
V. Zampetti

Modern seismic data have become an essential tool for studying carbonate platforms and reservoirs in impressive detail. Whilst driven primarily by oil and gas exploration and development, data sharing and collaboration are delivering fundamental geological knowledge on carbonate systems, revealing platform geomorphologies and how their evolution on millennial time scales, as well as kilometric length scales, was forced by long-term eustatic, oceanographic or tectonic factors. Quantitative interrogation of modern seismic attributes in carbonate reservoirs permits flow units and barriers arising from depositional and diagenetic processes to be imaged and extrapolated between wells. This volume reviews the variety of carbonate platform and reservoir characteristics that can be interpreted from modern seismic data, illustrating the benefits of creative interaction between geophysical and carbonate geological experts at all stages of a seismic campaign. Papers cover carbonate exploration, including the uniquely challenging South Atlantic pre-salt reservoirs, seismic modelling of carbonates, and seismic indicators of fluid flow and diagenesis.


2019 ◽  
Vol 217 (3) ◽  
pp. 1727-1741 ◽  
Author(s):  
D W Vasco ◽  
Seiji Nakagawa ◽  
Petr Petrov ◽  
Greg Newman

We introduce a new approach for locating earthquakes using arrival times derived from waveforms. The most costly computational step of the algorithm scales as the number of stations in the active seismographic network. In this approach, a variation on existing grid search methods, a series of full waveform simulations is conducted for all receiver locations, with sources positioned successively at each station. The traveltime field over the region of interest is calculated by applying a phase-picking algorithm to the numerical wavefields produced from each simulation. An event is located by subtracting the stored traveltime field from the arrival time at each station. This provides a shifted and time-reversed traveltime field for each station. The shifted and time-reversed fields all approach the origin time of the event at the source location. The mean or median value at the source location thus approximates the event origin time. Measures of dispersion about this mean or median time at each grid point, such as the sample standard error and the average deviation, are minimized at the correct source position. Uncertainty in the event position is provided by the contours of standard error defined over the grid. An application of this technique to a synthetic data set indicates that the approach provides stable locations even when the traveltimes are contaminated by additive random noise containing a significant number of outliers, and by velocity model errors. It is found that the waveform-based method outperforms one based upon the eikonal equation for a velocity model with rapid spatial variations in properties due to layering. A comparison with conventional location algorithms in both laboratory and field settings demonstrates that the technique performs at least as well as existing techniques.
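The location recipe described above reduces to a small amount of array arithmetic once the per-station traveltime fields are precomputed. The sketch below assumes those fields are given on a flattened grid (the full-waveform simulation and phase picking that produce them are not reproduced) and uses the standard deviation as the dispersion measure:

```python
import numpy as np

def locate_event(travel_times, arrivals):
    """Grid-search event location following the abstract's recipe:
    subtract each station's stored traveltime field from its arrival
    time, then pick the grid node where the shifted, time-reversed
    fields agree best (minimum standard deviation across stations).

    travel_times: (n_stations, n_grid) precomputed traveltime fields.
    arrivals:     (n_stations,) picked arrival times.
    Returns (best grid-node index, estimated origin time).
    """
    shifted = arrivals[:, None] - travel_times   # candidate origin times
    spread = np.std(shifted, axis=0)             # dispersion at each node
    best = int(np.argmin(spread))
    return best, float(np.median(shifted[:, best]))

# 1D toy: unit velocity, stations at x = 10, 50, 90; event at node 30, t0 = 5.
stations = np.array([10.0, 50.0, 90.0])
grid = np.arange(100.0)
tt = np.abs(grid[None, :] - stations[:, None])
arrivals = 5.0 + np.abs(30.0 - stations)
best, t0 = locate_event(tt, arrivals)
```

At the true source node all three shifted fields equal the origin time, so the dispersion drops to zero there, exactly as the abstract argues.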


Geophysics ◽  
2017 ◽  
Vol 82 (3) ◽  
pp. R199-R217 ◽  
Author(s):  
Xintao Chai ◽  
Shangxu Wang ◽  
Genyang Tang

Seismic data are nonstationary due to subsurface anelastic attenuation and dispersion effects. These effects, also referred to as the earth's Q-filtering effects, can diminish seismic resolution. We previously developed a method of nonstationary sparse reflectivity inversion (NSRI) for resolution enhancement, which avoids the intrinsic instability associated with inverse Q filtering and generates superior Q compensation results. Applying NSRI to data sets that contain multiples (addressing surface-related multiples only) requires a demultiple preprocessing step, because NSRI cannot distinguish primaries from multiples and will treat them as interference convolved with incorrect Q values. However, multiples contain information about subsurface properties. To use the information carried by multiples, we adapt NSRI, using the feedback model and NSRI theory, to the context of nonstationary seismic data with surface-related multiples. Consequently, not only are the benefits of NSRI (e.g., circumventing the intrinsic instability associated with inverse Q filtering) extended, but multiples are also taken into account. Our method is limited to a 1D implementation. Theoretical and numerical analyses verify that, given a wavelet, the input Q values primarily affect the inverted reflectivities and exert little effect on the estimated multiples; i.e., multiple estimation need not consider Q-filtering effects explicitly. However, there are benefits for NSRI in considering multiples: the periodicity and amplitude of the multiples imply the position of the reflectivities and the amplitude of the wavelet. Multiples assist in overcoming the scaling and shifting ambiguities of conventional problems in which multiples are not considered. Experiments using a 1D algorithm on a synthetic data set, the publicly available Pluto 1.5 data set, and a marine data set support the aforementioned findings and reveal the stability, capabilities, and limitations of the proposed method.
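The stationary, multiple-free core of sparse reflectivity inversion can be illustrated with an L1-regularized deconvolution solved by ISTA. This toy version omits exactly what distinguishes NSRI, namely Q attenuation and surface-related multiples; the solver choice and parameter values are illustrative assumptions.

```python
import numpy as np

def sparse_reflectivity_ista(d, wavelet, lam=0.05, n_iter=200):
    """Sparse-spike reflectivity inversion for the stationary, multiple-free
    convolutional model d = w * r, solved by ISTA (L1-regularized least
    squares). NSRI additionally models Q attenuation and surface-related
    multiples, which this toy version omits.
    """
    n = len(d)
    # Build the causal convolution operator as a dense matrix (fine at toy sizes).
    W = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            k = i - j
            if 0 <= k < len(wavelet):
                W[i, j] = wavelet[k]
    step = 1.0 / np.linalg.norm(W, 2) ** 2   # step size from the spectral norm
    r = np.zeros(n)
    for _ in range(n_iter):
        g = r - step * W.T @ (W @ r - d)                         # gradient step
        r = np.sign(g) * np.maximum(np.abs(g) - lam * step, 0.0)  # soft threshold
    return r

# A two-spike reflectivity convolved with a short wavelet, then inverted:
wavelet = np.array([1.0, 0.5])
r_true = np.zeros(50)
r_true[10] = 1.0
r_true[30] = -0.8
d = np.convolve(r_true, wavelet)[:50]
r_est = sparse_reflectivity_ista(d, wavelet)
```

The L1 penalty is what makes the inversion "sparse": the recovered reflectivity concentrates at the two spike positions rather than smearing energy across the trace.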


Geophysics ◽  
2003 ◽  
Vol 68 (6) ◽  
pp. 1782-1791 ◽  
Author(s):  
M. Graziella Kirtland Grech ◽  
Don C. Lawton ◽  
Scott Cheadle

We have developed an anisotropic prestack depth migration code that can migrate either vertical seismic profile (VSP) or surface seismic data. We use this migration code in a new method for integrated VSP and surface seismic depth imaging. Instead of splicing the VSP image into the section derived from surface seismic data, we use the same migration algorithm and a single velocity model to migrate both data sets to a common output grid. We then scale and sum the two images to yield one integrated depth‐migrated section. After testing this method on synthetic surface seismic and VSP data, we applied it to field data from a 2D surface seismic line and a multioffset VSP from the Rocky Mountain Foothills of southern Alberta, Canada. Our results show that the resulting integrated image exhibits significant improvement over that obtained from (a) the migration of either data set alone or (b) the conventional splicing approach. The integrated image uses the broader frequency bandwidth of the VSP data to provide higher vertical resolution than the migration of the surface seismic data. The integrated image also shows enhanced structural detail, since no part of the surface seismic section is eliminated, and good event continuity through the use of a single migration–velocity model, obtained by an integrated interpretation of borehole and surface seismic data. This enhanced migrated image enabled us to perform a more robust interpretation with good well ties.
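The "scale and sum" step for merging the two depth images can be sketched with a single least-squares scalar fit over the VSP image's footprint. The paper likely applies more careful, spatially variable weighting; this global scalar is an illustrative simplification.

```python
import numpy as np

def merge_images(surface_img, vsp_img):
    """Scale the VSP depth image to match the surface-seismic image in a
    least-squares sense over the VSP image's nonzero footprint, then sum.
    Both images are assumed to be migrated to the same output grid with
    the same algorithm and velocity model, as in the abstract.
    """
    mask = vsp_img != 0.0
    scale = np.sum(surface_img[mask] * vsp_img[mask]) / np.sum(vsp_img[mask] ** 2)
    return surface_img + scale * vsp_img

# Toy grids: the VSP image covers only the upper half of the section.
surface = np.ones((4, 4))
vsp = np.zeros((4, 4))
vsp[:2, :] = 2.0
merged = merge_images(surface, vsp)
```

Summing on a common grid (rather than splicing) is what preserves the full surface-seismic section while adding the VSP's broader bandwidth where it exists.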


2021 ◽  
pp. 1-45
Author(s):  
Qin Su ◽  
Huahui Zeng ◽  
Yancan Tian ◽  
HaiLiang Li ◽  
Lei Lyu ◽  
...  

Seismic processing and interpretation techniques provide important tools for oil and gas exploration in the Songliao Basin in eastern China, which is dominated by terrestrial facies. In the Songliao Basin, a large number of thin-sand reservoirs are widely distributed; these are the primary targets of potential oil and gas exploration and exploitation. An important job of exploration in the Songliao Basin is to accurately describe the distribution of these thin-sand belts and the sand-body shapes. However, the thickness of these thin-sand reservoirs is generally below the resolution of conventional seismic processing. Most of the reservoirs are thin interbeds of sand and mudstone with strong vertical and lateral variations, which makes it difficult to accurately predict the vertical and horizontal distribution of the thin-sand bodies using conventional seismic processing and interpretation methods. Additionally, these lithologic traps are difficult to identify due to their complex controlling factors, complicated distribution characteristics, and strong concealment. These challenges motivate us to improve seismic data quality to help delineate the thin-sand reservoirs. In this paper, we use an integrated broadband, wide-azimuth, and high-density seismic exploration technique to help delineate the thin reservoirs. We first use field single-point excitation and single-point receiver acquisition to obtain seismic data with wide frequency bands, wide azimuth angles, and high fold, which contain rich geological information. Next, we perform near-surface Q-compensation, viscoelastic prestack time migration, seismic attribute analysis, and seismic waveform indication inversion on the newly acquired data. A 3D case study indicates the benefits of this method in improving the imaging of thin-sand bodies and the accuracy of inversion and reservoir characterization.

