Techniques and methods of seismic data processing in active volcanic areas: some applications to multichannel seismic profiles (Gulf of Naples, Southern Tyrrhenian Sea, Italy)

Author(s):  
Aiello Gemma

The techniques of seismic surveying, especially reflection seismics, have changed considerably in recent years. The main contribution to this evolution came from the oil industry, which developed geophysical methods for hydrocarbon exploration. The basic technique of seismic exploration consists of generating seismic waves artificially in the ground (the source) and measuring the times required to travel the source-receiver path. The seismic data processing of three multichannel seismic profiles located in the Gulf of Naples, with an overall length of 150 kilometers, is presented here. The processing techniques used to elaborate the seismic data are up to date; some of them are based on complex mathematical models that allow good velocity analyses to be obtained for the production of stacked sections ready to be interpreted. This paper shows the procedures for processing the multichannel seismic data starting from the field data, and sketch diagrams of the elaboration processes applied during the several phases of the processing have been constructed. The software used are Promax2D (Landmark Ltd.) and Seismic Unix (Colorado School of Mines). The steps of the seismic data processing included pre-processing, sorting, velocity analysis, normal move-out (NMO) correction, stacking, band-pass filtering, multiple removal, predictive deconvolution and spiking deconvolution.
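To make the listed flow more concrete, here is a minimal Python sketch of the NMO-correction and stacking step only, assuming a single CMP gather `cmp_gather[t, offset]`, its offsets in metres, and a picked stacking-velocity function `vnmo(t0)` in m/s; it is a generic illustration, not the Promax2D or Seismic Unix job actually run on these profiles.

```python
# Minimal sketch of hyperbolic NMO correction and stacking (illustrative only;
# array layout, stretch-mute factor and velocity function are assumptions).
import numpy as np

def nmo_stack(cmp_gather, dt, offsets, vnmo, stretch_mute=1.5):
    """Apply hyperbolic NMO correction to a CMP gather and stack the traces."""
    cmp_gather = np.asarray(cmp_gather, dtype=float)
    nt, _ = cmp_gather.shape
    t0 = np.arange(nt) * dt                         # zero-offset two-way times (s)
    nmo = np.zeros_like(cmp_gather)
    for j, x in enumerate(offsets):
        # hyperbolic moveout: t(x)^2 = t0^2 + x^2 / vnmo(t0)^2
        tx = np.sqrt(t0**2 + (x / vnmo)**2)
        # interpolate the trace at the moveout times (fractional sample indices)
        corrected = np.interp(tx / dt, np.arange(nt), cmp_gather[:, j],
                              left=0.0, right=0.0)
        # mute samples whose NMO stretch (approximated by tx/t0) is too large
        stretch = np.divide(tx, t0, out=np.ones_like(tx), where=t0 > 0)
        corrected[stretch > stretch_mute] = 0.0
        nmo[:, j] = corrected
    fold = np.maximum((nmo != 0.0).sum(axis=1), 1)  # live traces per time sample
    return nmo.sum(axis=1) / fold                   # stacked zero-offset trace
```

In a Seismic Unix flow the corresponding steps would typically be handled by programs such as sunmo and sustack, with velocities picked beforehand during velocity analysis.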

2013, Vol. 748, pp. 1099-1103
Author(s):  
Xiao Bo Peng

The current understanding of complex-lithology reservoirs is insufficient, and it is impossible to resolve all the problems during the exploration stage. Exploration therefore needs to be combined with development, so that knowledge of the complex geological conditions is built up gradually during rolling development, effectively integrating exploration and development. Building on previous studies, this paper applies the seismic data of the oil region and the adjacent blocks to investigate the area from all aspects, with the aim of describing the geological and reservoir characteristics in detail. In the seismic data processing, we focus on the main target layers and strive to improve the data resolution while preserving the original information. Finally, we discuss the seismic data analysis workflow: the structural interpretation starts from the study of traps, and advanced technologies are applied to the analysis of small faults and minor structures in order to describe the reservoir.


2013, pp. 42-55
Author(s):  
P. Zagorodnyuk,
G. Lisny

The research shows that the explicit account of seismic wave velocity anisotropy is preferable to the traditional implicit one, whose meaning has become clear thanks to the authors' publications. The main advantages of an explicit account of seismic anisotropy are: the use of a depth scale, rather than a time scale, in seismic data processing and interpretation; depth-velocity modeling without the technologies of the common-midpoint method, which prevents a number of errors and is highly important for 3-D seismics; and the possibility of using the technology of anisotropic decomposition of seismic images. The results of seismic data processing in the Dnieper-Donetsk depression convincingly demonstrate the advantages of the explicit account of seismic anisotropy for seismic imaging of anisotropic media.
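For context, one standard way to make velocity anisotropy explicit is through Thomsen-style parameters for a transversely isotropic (VTI) medium; the widely used relations below, written with the conventional symbols V_P0, delta, epsilon and eta, are shown only as an illustration and are not necessarily the parameterization adopted by the authors.

```latex
% Illustrative VTI relations in conventional Thomsen-style notation
% (a general example, not necessarily the authors' parameterization).
V_{\mathrm{nmo}} = V_{P0}\sqrt{1 + 2\delta}, \qquad
\eta = \frac{\varepsilon - \delta}{1 + 2\delta}, \qquad
t^{2}(x) = t_{0}^{2} + \frac{x^{2}}{V_{\mathrm{nmo}}^{2}}
  - \frac{2\eta\,x^{4}}{V_{\mathrm{nmo}}^{2}\left[t_{0}^{2}V_{\mathrm{nmo}}^{2} + (1+2\eta)\,x^{2}\right]}.
```

Carrying such parameters explicitly through processing, rather than absorbing them into an effective isotropic velocity, is what distinguishes the approach advocated above from a purely isotropic, time-domain treatment.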


Geophysics, 2016, Vol. 81(3), pp. U1-U12
Author(s):  
Michelângelo G. Silva,
Milton J. Porsani,
Bjorn Ursin

Velocity-independent seismic data processing requires information about the local slope in the data. From estimates of local time and space derivatives of the data, a total least-squares algorithm gives an estimate of the local slope at each data point. Total least squares minimizes the orthogonal distance from the data points (the local time and space derivatives) to the fitted straight line defining the local slope. This gives a more consistent estimate of the local slope than standard least squares because it takes into account uncertainty in the temporal and spatial derivatives. The total least-squares slope estimate is the same as the one obtained from using the structure tensor with a rectangular window function. The estimate of the local slope field is used to extrapolate all traces in a seismic gather to the smallest recorded offset without using velocity information. Extrapolation to zero offset is done using a hyperbolic traveltime function in which slope information replaces the knowledge of the normal moveout (NMO) velocity. The new data processing method requires no velocity analysis and there is little stretch effect. All major reflections and diffractions that are present at zero offset will be reproduced in the output zero-offset section. Therefore, if multiple reflections are undesired in the output, they should be removed before data extrapolation to zero offset. The automatic method is sensitive to noise, so for poor signal-to-noise ratios, standard NMO velocities for primary reflections can be used to compute the slope field. Synthetic and field data examples indicate that compared with standard seismic data processing (velocity analysis, mute, NMO correction, and stack), our method provides an improved zero-offset section in complex data areas.
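The sketch below is a minimal Python illustration of the total-least-squares slope estimate described above, implemented through the structure-tensor formulation with a rectangular window, which the abstract notes is equivalent; the array layout, window size and smoothing choice are assumptions, not the authors' code.

```python
# Minimal sketch of TLS local-slope estimation from local derivatives
# (illustrative assumptions: gather d[t, x], sampling dt in s and dx in m).
import numpy as np
from scipy.ndimage import uniform_filter

def local_slope_tls(d, dt, dx, win=9):
    """Total-least-squares estimate of the local slope p = dtau/dx (s/m) of d[t, x]."""
    d_t = np.gradient(d, dt, axis=0)   # local time derivative
    d_x = np.gradient(d, dx, axis=1)   # local space derivative
    # Windowed second moments of the derivatives (structure-tensor entries);
    # with a rectangular window this equals the TLS fit to the derivative pairs.
    a = uniform_filter(d_x * d_x, size=win)
    b = uniform_filter(d_x * d_t, size=win)
    c = uniform_filter(d_t * d_t, size=win)
    # TLS fits the local plane-wave relation d_x + p*d_t ~ 0: the coefficient
    # vector (1, p), up to scale, is the eigenvector of C = [[a, b], [b, c]]
    # associated with the smallest eigenvalue.
    C = np.stack([np.stack([a, b], axis=-1), np.stack([b, c], axis=-1)], axis=-2)
    _, v = np.linalg.eigh(C)           # eigenvalues returned in ascending order
    n = v[..., :, 0]                   # eigenvector of the smallest eigenvalue
    eps = 1e-12                        # guard against division by zero (near-vertical events)
    return n[..., 1] / (n[..., 0] + eps)
```

The estimated slope field p(t, x) would then replace the NMO velocity in the hyperbolic extrapolation of each trace to zero offset described in the abstract.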

