Automatic phase correction of common‐midpoint stacked data

Geophysics ◽  
1987 ◽  
Vol 52 (1) ◽  
pp. 51-59 ◽  
Author(s):  
S. Levy ◽  
D. W. Oldenburg

The residual wavelet on a processed seismic section is often not zero phase despite all efforts to make it so. In this paper we adopt the convolutional model for the processed seismogram, assume that the residual phase shift can be approximated by a frequency‐independent constant, and use the varimax norm to generate an algorithm to estimate the residual phase directly. Application of our algorithm to reflectivities from well logs suggests that it should work in the majority of cases so long as the reflectivity is non‐Gaussian. An application of our algorithm to stacked data enhances the interpretability of the seismic section and leads to an improved match between the recovered relative acoustic impedance and a measured velocity log.
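The constant-phase estimation described in the abstract can be sketched with the varimax norm and a phase rotation built from the analytic signal. This is a minimal illustration of the idea, not the authors' implementation; the angle grid, function names, and trace construction are assumptions.

```python
import numpy as np
from scipy.signal import hilbert

def varimax(x):
    # Varimax norm: sum of fourth powers normalized by squared energy;
    # larger values mean a spikier (more non-Gaussian) trace.
    return np.sum(x**4) / np.sum(x**2)**2

def estimate_phase_correction(trace, n_angles=181):
    # Rotate the trace by a constant phase via its analytic signal and
    # return the rotation angle that maximizes the varimax norm; the
    # residual phase of the trace is the negative of this angle.
    z = hilbert(trace)
    angles = np.linspace(-np.pi / 2, np.pi / 2, n_angles)
    scores = [varimax(np.real(z * np.exp(1j * a))) for a in angles]
    return angles[int(np.argmax(scores))]
```

Consistent with the well-log evidence the abstract cites, the estimator only works when the reflectivity is non-Gaussian (spiky); for a Gaussian series the varimax norm is insensitive to rotation.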

1988 ◽  
Vol 42 (2) ◽  
pp. 336-341 ◽  
Author(s):  
Colleen A. McCoy ◽  
James A. De Haseth

Several sources of phase-correction-induced spectral anomalies in FT-IR vibrational circular dichroism (VCD) spectra have been investigated. Misidentification of the zero-phase retardation position in dichroic interferograms that exhibit no optical or electronic bias can produce spectral errors. Such errors arise from the introduction of a linear phase error into the phase curve. When the zero-phase retardation position is correctly identified, other spectral anomalies, such as “reflected peaks,” can appear in VCD spectra. These peaks are readily observed in quarter-wave plate reference spectra. The anomalies are directly correlated with the arctangent function used to define the phase curve and result solely from the nature of the VCD signal. VCD spectra can exhibit negative, as well as positive, peaks; consequently, the phase correction must be designed to accommodate negative features. Both the Mertz and Forman phase-correction algorithms have been modified to correct the phase of VCD interferograms without error. Such corrections are not necessary, or even desirable, for normal absorption spectrometry.
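The linear phase error from a mispicked zero-retardation position, and a Mertz-style multiplicative correction that preserves signed features, can be demonstrated numerically. This is a toy sketch with a synthetic single-band spectrum, not the authors' modified algorithm; the band shape and the one-sample ZPD error are assumptions.

```python
import numpy as np

n = 512
freqs = np.fft.rfftfreq(n)                       # cycles per sample
band = np.exp(-((freqs - 0.2) / 0.02) ** 2)      # toy single absorption band
ifg = np.fft.irfft(band)                         # symmetric interferogram, ZPD at sample 0

# Mispicking the ZPD by delta samples multiplies the spectrum by
# exp(-2*pi*i*f*delta): a purely linear phase error, as described above.
delta = 1
spec_bad = np.fft.rfft(np.roll(ifg, delta))

# Mertz-style correction: multiply by exp(-i*phase) and keep the real part,
# which preserves the sign of (possibly negative) VCD features.
phase = -2 * np.pi * freqs * delta
corrected = np.real(spec_bad * np.exp(-1j * phase))
```

Keeping the signed real part, rather than the magnitude spectrum, is what allows negative VCD peaks to survive the correction.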


1985 ◽  
Vol 25 (1) ◽  
pp. 254
Author(s):  
T.J.C. Prudence ◽  
J. Flentri

The Kanpa 1A Vertical Seismic Profile (VSP) was conducted for Shell by Schlumberger and incorporated variable time and depth sampling, different source offsets and recording in cased and uncased hole. Processing was performed using Shell proprietary programs, with particular attention to:
- editing and resampling of the data set
- separation of upgoing and downgoing waves using FK and median filters
- comparison of gain recovery based on modelling the amplitude decay of direct arrivals with an averaging process (AGC)
- sensitivity of the final VSP stack to blanking of residual tube wave noise
- zero-phase whitening of the VSP trace

Stacked VSP traces for Kanpa 1A were compared with a zero-phase seismic section and synthetic seismogram at the well. The VSP/seismic match is good and, due to poor synthetic/seismic correlation, was the basis for the final seismic/well tie. Interpretation of deep VSP data enabled the estimation of formation boundaries below the total depth of the well.

It is concluded that VSPs can be invaluable in establishing well ties where seismic is poor or when detailed correlation is required (e.g. stratigraphic traps). Reflectors "ahead of the bit" can be interpreted from VSPs based on assumed velocities, the VSP/seismic tie, and the predicted thickness and seismic character of the target interval. A consistent field configuration is recommended for acquisition, with attention to tube wave suppression and adequate spatial and temporal sampling. Previous processing experience is advantageous if quick and reliable VSP results are required for decisions while drilling.
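The median-filter wavefield separation mentioned in the processing list can be illustrated generically (this is not Shell's proprietary code): flatten each trace on its first-break time so the downgoing wave is horizontal, estimate it with a running median across receivers, then unflatten and subtract to leave the upgoing wavefield. The geometry, window half-width, and circular-shift alignment are assumptions of this sketch.

```python
import numpy as np

def separate_wavefields(vsp, first_breaks, half_width=5):
    # Flatten on first breaks, median-filter across receivers to estimate
    # the (now horizontal) downgoing wave, unflatten, and subtract.
    # NOTE: np.roll wraps circularly; a production version would pad instead.
    n_rec = vsp.shape[0]
    flat = np.array([np.roll(tr, -fb) for tr, fb in zip(vsp, first_breaks)])
    down_flat = np.empty_like(flat)
    for i in range(n_rec):
        lo, hi = max(0, i - half_width), min(n_rec, i + half_width + 1)
        down_flat[i] = np.median(flat[lo:hi], axis=0)
    down = np.array([np.roll(tr, fb) for tr, fb in zip(down_flat, first_breaks)])
    return down, vsp - down
```

The median rejects the upgoing events because, after flattening, they still move across the window while the downgoing wave does not.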


2020 ◽  
pp. 1-1 ◽  
Author(s):  
Liang Hu ◽  
Xueyang Tian ◽  
Guiling Wu ◽  
Mengya Kong ◽  
Jianguo Shen ◽  
...  

Geophysics ◽  
1984 ◽  
Vol 49 (4) ◽  
pp. 379-397 ◽  
Author(s):  
Bruce Gibson ◽  
Ken Larner

Predictive deconvolution is commonly applied to seismic data generated with a Vibroseis® source. Unfortunately, when this process invokes a minimum‐phase assumption, the phase of the resulting trace will not be correct. Nonetheless, spiking deconvolution is an attractive process because it restores attenuated higher frequencies, thus increasing resolution. For detailed stratigraphic analyses, however, it is desirable that the phase of the data be treated properly as well. The most common solution is to apply a phase‐shifting filter that corrects for errors attributable to a zero‐phase source. The phase correction is given by the minimum‐phase spectrum of the correlated Vibroseis wavelet. Because no minimum‐phase spectrum truly exists for this bandlimited wavelet, white noise is added to its amplitude spectrum in order to design the phase‐correction filter. Different levels of white noise, however, produce markedly different results when field data sections are filtered. A simple argument suggests that the amount of white noise used should match that added in designing the (minimum‐phase) spiking deconvolution operator. This choice, however, also produces inconsistent results; field data again show that the phase treatment is sensitive to the amount of added white noise. Synthetic data tests show that the standard phase‐correction procedure breaks down when earth attenuation is severe. Deterministically reducing the earth‐filter effects before deconvolution improved the resulting phase treatment for the synthetic data. After application of the inverse attenuation filter to the field data, however, phase differences again remain for different levels of added white noise. These inconsistencies are attributable to the phase action of spiking deconvolution. This action is dependent upon the shape of the signal spectrum as well as the spectral shape and level of contaminating noise. Thus, in practice the proper treatment of phase in data‐dependent processing requires extensive knowledge of the spectral characteristics of both signal and noise. With such knowledge, one could apply deterministic techniques that either eliminate the need for statistical deconvolution or condition the data so as to better satisfy the statistical model assumed in data‐dependent processing.
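The white-noise-stabilized minimum-phase computation discussed above can be sketched with the real-cepstrum (homomorphic) method: add a fraction of the peak amplitude to the band-limited spectrum so the logarithm is defined, then fold the cepstrum to obtain the minimum phase. The noise parameterization and band shape are assumptions; the example reproduces the sensitivity the abstract reports, in that different noise levels yield clearly different phase curves.

```python
import numpy as np

def minimum_phase_spectrum(amp, white_noise_pct=1.0):
    # Stabilize the band-limited amplitude spectrum with additive "white
    # noise" so log|A| is defined everywhere, then compute the minimum
    # phase by folding the real cepstrum of the log amplitude.
    eps = white_noise_pct / 100.0 * amp.max()
    log_amp = np.log(amp + eps)              # amp given on the full fftfreq grid
    n = len(log_amp)
    cep = np.fft.ifft(log_amp).real          # real cepstrum of log amplitude
    fold = np.zeros(n)
    fold[0] = 1.0
    fold[1:n // 2] = 2.0                     # double positive quefrencies
    fold[n // 2] = 1.0                       # keep the Nyquist term once
    return np.fft.fft(cep * fold).imag       # imaginary part = minimum phase

# Toy band-limited (Klauder-like) amplitude spectrum:
n = 512
f = np.abs(np.fft.fftfreq(n))
amp = np.exp(-((f - 0.25) / 0.05) ** 2)
phase_low = minimum_phase_spectrum(amp, white_noise_pct=0.1)
phase_high = minimum_phase_spectrum(amp, white_noise_pct=10.0)
# The two phase curves differ markedly: the "minimum phase" of a
# band-limited wavelet depends on the stabilization level chosen.
```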


2019 ◽  
Vol 10 (3) ◽  
pp. 1227-1242
Author(s):  
O. Abiola ◽  
F. O. Obasuyi

Capillary pressure is an important characteristic that indicates the zones of interaction between two-phase fluids, or fluid and rock, occurring in the subsurface. The analysis of transition zones (TZs) using Goda (Sam) et al.'s empirical capillary pressure from well logs and 3D seismic data in ‘Stephs’ field, Niger Delta, was carried out to remove the effect of mobile water above the oil–water contact in reservoirs in the absence of core data/information. Two reservoirs (RES B and C) were utilized for this study, with net thicknesses (NTG) ranging from 194.14 to 209.08 m. Petrophysical parameters computed from well logs indicate that the reservoirs' effective porosity ranges from 10 to 30% and the permeability ranges from 100 to > 1000 mD, which are characteristics of a good hydrocarbon-bearing zone. Checkshot data were used to tie the wells to the seismic section. Faults and horizons were mapped on the seismic section. Time structure maps were generated, and a velocity model was used to convert them to their depth equivalents. A total of six faults were mapped, three of which are major growth faults (F1, F4 and F5) that cut across the study area. Reservoir properties were modelled using SIS and SGS. The capillary pressure log, curves and models generated were useful in identifying the impact of mobile water in the reservoir, as they show the trend of saturating and interacting fluids. The volume of oil estimated from reservoirs B and C without taking the TZ into consideration was 273 × 10⁶ and 406 × 10⁶ mmbbls, respectively, and was found to be higher than the volume estimated from the two reservoirs while taking the TZ into consideration, 242 × 10⁶ and 256 × 10⁶ mmbbls, respectively. The results indicate the presence of mobile water, which further establishes that conventionally recoverable hydrocarbon (RHC) is usually overestimated; hence, TZ analysis should be performed to refine RHC estimates for cost-effective extraction and profit maximization.
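The overestimation mechanism described above reduces to a simple volumetric argument: ignoring the transition zone understates the average water saturation above the oil–water contact and therefore overstates oil in place. The sketch below uses the standard STOIIP expression rather than the study's capillary-pressure workflow, and every input number is an illustrative placeholder, not a value from the 'Stephs' field.

```python
def stoiip(grv_m3, ntg, phi, sw, bo=1.2):
    # STOIIP = GRV * net-to-gross * porosity * (1 - Sw) / Bo
    return grv_m3 * ntg * phi * (1.0 - sw) / bo

# Hypothetical inputs (not the field values from the study):
grv = 5.0e8                                          # gross rock volume, m^3
v_no_tz = stoiip(grv, ntg=0.8, phi=0.25, sw=0.25)    # single Sw above the OWC
v_tz = stoiip(grv, ntg=0.8, phi=0.25, sw=0.40)       # higher average Sw with a TZ
```

With these placeholder numbers the TZ-aware estimate is 20% smaller, qualitatively matching the reduction the study reports.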


1963 ◽  
Vol 51 (3) ◽  
pp. 536-536
Author(s):  
P. Rademacher ◽  
D. Randise
