A Comparison of Different Methods of Velocity Analysis in Reflection Seismic Data Processing with AVO Anomalies

Author(s): M. Bashardoust, S. Torabi, M. Nabi-Bidhendi

Author(s): Gemma Aiello

The techniques of seismic surveying, especially reflection seismics, have changed considerably in recent years. This change has been driven mainly by the oil industry, which developed geophysical methods for hydrocarbon exploration. The basic technique of seismic exploration consists of generating seismic waves artificially in the ground (the source) and measuring the times required to cover the source-receiver path. Seismic data processing of three multichannel seismic profiles located in the Gulf of Naples, with an overall length of 150 kilometers, is presented here. The seismic processing techniques used to elaborate the data are up to date; some are based on complex mathematical models that allow good velocity analyses to be obtained for producing stacked sections ready to be interpreted. This paper presents the procedures for processing the multichannel seismic data starting from the field data, together with sketch diagrams of the elaboration processes applied during the several phases of the whole workflow. The software used were ProMAX 2D (Landmark Ltd.) and Seismic Unix (Colorado School of Mines). The processing steps included pre-processing, sorting, velocity analysis, normal move-out (NMO) correction, stacking, band-pass filtering, multiple removal, predictive deconvolution, and spiking deconvolution.
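As a concrete illustration of the velocity-analysis, NMO, and stacking steps named above, the following Python/NumPy sketch applies a normal move-out correction to a common-midpoint gather and stacks the corrected traces. The constant NMO velocity, the array shapes, and all function and variable names are assumptions for illustration, not the paper's actual ProMAX 2D or Seismic Unix flow.

```python
import numpy as np

def nmo_correct(gather, offsets, dt, v_nmo):
    """NMO-correct a CMP gather.

    gather  : 2-D array (n_samples, n_traces), one trace per offset
    offsets : 1-D array of source-receiver offsets in meters
    dt      : sample interval in seconds
    v_nmo   : NMO velocity in m/s (a single value here for simplicity;
              a real flow uses the time-variant velocity picked during
              velocity analysis)
    """
    n_samples, n_traces = gather.shape
    t0 = np.arange(n_samples) * dt          # zero-offset two-way times
    corrected = np.zeros_like(gather)
    for j in range(n_traces):
        # hyperbolic traveltime: t(x) = sqrt(t0^2 + x^2 / v^2)
        t = np.sqrt(t0**2 + (offsets[j] / v_nmo) ** 2)
        # map each sample back to its zero-offset time by interpolation
        corrected[:, j] = np.interp(t, t0, gather[:, j], left=0.0, right=0.0)
    return corrected

def stack(corrected):
    """Sum the NMO-corrected traces into a single stacked trace."""
    return corrected.mean(axis=1)

# usage: a synthetic 1 s gather sampled at 4 ms with six offsets
dt, offsets = 0.004, np.linspace(100.0, 1100.0, 6)
gather = np.random.randn(250, offsets.size)  # stand-in for field data
stacked = stack(nmo_correct(gather, offsets, dt, v_nmo=2000.0))
```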


Geophysics, 2016, Vol. 81 (3), pp. U1-U12
Author(s): Michelângelo G. Silva, Milton J. Porsani, Bjorn Ursin

Velocity-independent seismic data processing requires information about the local slope in the data. From estimates of local time and space derivatives of the data, a total least-squares algorithm gives an estimate of the local slope at each data point. Total least squares minimizes the orthogonal distance from the data points (the local time and space derivatives) to the fitted straight line defining the local slope. This gives a more consistent estimate of the local slope than standard least squares because it takes into account uncertainty in the temporal and spatial derivatives. The total least-squares slope estimate is the same as the one obtained from using the structure tensor with a rectangular window function. The estimate of the local slope field is used to extrapolate all traces in a seismic gather to the smallest recorded offset without using velocity information. Extrapolation to zero offset is done using a hyperbolic traveltime function in which slope information replaces the knowledge of the normal moveout (NMO) velocity. The new data processing method requires no velocity analysis and there is little stretch effect. All major reflections and diffractions that are present at zero offset will be reproduced in the output zero-offset section. Therefore, if multiple reflections are undesired in the output, they should be removed before data extrapolation to zero offset. The automatic method is sensitive to noise, so for poor signal-to-noise ratios, standard NMO velocities for primary reflections can be used to compute the slope field. Synthetic and field data examples indicate that compared with standard seismic data processing (velocity analysis, mute, NMO correction, and stack), our method provides an improved zero-offset section in complex data areas.
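A minimal sketch of the slope estimate described above, assuming a simple NumPy/SciPy implementation: at each sample the local slope is read off the principal eigenvector of a 2x2 structure tensor built from local time and space derivatives and averaged over a rectangular window, which the abstract notes is equivalent to the total-least-squares fit. The derivative stencils, window size, and names are illustrative choices, not the authors' code.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_slope_tls(data, dt, dx, half_win=3):
    """Estimate the local slope p (in s/m) of a 2-D section data[t, x]."""
    # central-difference estimates of the time and space derivatives
    d_t = np.gradient(data, dt, axis=0)
    d_x = np.gradient(data, dx, axis=1)

    # structure-tensor entries, averaged over a rectangular window
    size = 2 * half_win + 1
    s_tt = uniform_filter(d_t * d_t, size=size)
    s_tx = uniform_filter(d_t * d_x, size=size)
    s_xx = uniform_filter(d_x * d_x, size=size)

    # largest eigenvalue of [[s_tt, s_tx], [s_tx, s_xx]] in closed form
    lam = 0.5 * (s_tt + s_xx + np.sqrt((s_tt - s_xx) ** 2 + 4.0 * s_tx**2))
    # the corresponding (un-normalized) eigenvector is (s_tx, lam - s_tt);
    # for a local plane wave u(t - p*x), u_x = -p * u_t, so the gradient
    # direction is proportional to (1, -p) and the slope is p = -v_x / v_t
    v_t, v_x = s_tx, lam - s_tt
    eps = 1e-12 * np.max(s_tt) + 1e-30      # guards the division
    return -v_x / (v_t + eps)
```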


Geophysics, 1971, Vol. 36 (6), pp. 1043-1073
Author(s): William A. Schneider

The subject matter of this review paper pertains to developments during the past several years in the area of reflection seismic data processing and analysis. While this subject area is extensive in both its breadth and scope, one indisputable fact emerges: the computer is now more pervasive than ever. Processing areas which were computer intensive, such as signal enhancement, are now even more so; and those formerly exclusive domains of man, such as seismic interpretation, are beginning to feel the encroachment of the large number crunchers. What the future holds is anyone’s guess, but it is quite probable that man and computer will share the throne if the interactive seismic processing systems on the drawing boards come to pass. For the present and recent past, however, the most intensively developed areas of seismic data processing and analysis include 1) computer extraction of processing parameters such as stacking velocity and statics, 2) automated detection and tracking of reflections in multidimensional parameter space to provide continuous estimates of traveltime, amplitude, moveout (velocity), dip, etc., 3) direct digital migration in two dimensions, giving improved subsurface “pictures” and utilizing diffraction energy normally lost by specular processing techniques, and 4) development of quantitative understanding of the limitations imposed by current seismic processing practice and assumptions with regard to structural and stratigraphic model building, and recognition of the ultimate need for an iterative signal processing—information extraction—model building closed loop system.
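Of the four development areas listed, item 1 lends itself to a short worked example. The Python sketch below, a hypothetical illustration rather than anything from the paper, scans trial NMO velocities over a CMP gather and builds a semblance panel whose time-by-time maxima pick the stacking velocity; the scan range, gather layout, and smoothing length are all assumed.

```python
import numpy as np

def semblance_panel(gather, offsets, dt, velocities, smooth=5):
    """Return an (n_samples, n_velocities) semblance panel for data[t, x]."""
    n_samples, n_traces = gather.shape
    t0 = np.arange(n_samples) * dt
    panel = np.zeros((n_samples, velocities.size))
    box = np.ones(smooth) / smooth          # short time-smoothing window
    for k, v in enumerate(velocities):
        # NMO-correct the gather with the trial velocity v
        corr = np.zeros_like(gather)
        for j in range(n_traces):
            t = np.sqrt(t0**2 + (offsets[j] / v) ** 2)
            corr[:, j] = np.interp(t, t0, gather[:, j], left=0.0, right=0.0)
        # semblance: stack energy over total trace energy, per time sample
        num = corr.sum(axis=1) ** 2
        den = n_traces * (corr**2).sum(axis=1) + 1e-30
        panel[:, k] = np.convolve(num, box, mode="same") / \
                      np.convolve(den, box, mode="same")
    return panel

# the stacking velocity at each time is the semblance-maximizing trial:
# v_stack = velocities[panel.argmax(axis=1)]
```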

