SEISMIC SIGNAL DETECTION USING SIGN BITS

Geophysics ◽  
1973 ◽  
Vol 38 (6) ◽  
pp. 1042-1052 ◽  
Author(s):  
M. D. Cochran

By casting the problem of seismic signal detection as one of statistical detection theory, one can develop a myriad of detection statistics or detectors. Of these, one of the most promising appears to be sign‐bit semblance. This nonparametric detector makes use of only the sign bits of the seismic data and, hence, requires less storage and is faster to compute than other detection statistics. In addition, it is independent of the noise statistics, as are all nonparametric detectors. An automatic velocity analysis and interpretation program has been developed using sign‐bit semblance as the detection statistic. The statistical properties of sign‐bit semblance were such that this system could do a velocity analysis and an interpretation with no human intervention. In this mode of operation it yielded state‐of‐the‐art accuracy at greatly increased speed and with greatly reduced storage requirements. These results indicate that sign‐bit semblance can be used to advantage for certain other seismic‐data processing problems.
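The core idea of the detector described above can be sketched compactly: ordinary semblance measures the ratio of stack energy to total energy, and the sign-bit variant applies the same ratio after quantizing every sample to its sign. The following is a minimal illustrative sketch, not the paper's implementation; the function name and the gather layout (samples × traces) are assumptions.

```python
import numpy as np

def sign_bit_semblance(gather):
    """Semblance computed from sign bits only.

    gather: 2D array (n_samples, n_traces) of moveout-corrected traces.
    Quantizing to signs (+1, 0, -1) discards amplitudes, so only the sign
    bits need to be stored, and the result is independent of the noise
    amplitude distribution. Returns a value in [0, 1].
    """
    s = np.sign(gather)                    # keep only the sign bits
    n_traces = gather.shape[1]
    num = np.sum(np.sum(s, axis=1) ** 2)   # energy of the sign-bit stack
    den = n_traces * np.sum(s ** 2)        # total sign-bit energy
    return num / den if den > 0 else 0.0
```

A gather whose traces all share the same polarity at every sample scores 1.0; incoherent noise scores near 1/n_traces.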

Geophysics ◽  
1996 ◽  
Vol 61 (6) ◽  
pp. 1846-1858 ◽  
Author(s):  
Claudio Bagaini ◽  
Umberto Spagnolini

Continuation to zero offset [better known as dip moveout (DMO)] is a standard tool for seismic data processing. In this paper, the concept of DMO is extended by introducing a set of operators: the continuation operators. These operators, which are implemented in integral form with a defined amplitude distribution, perform the mapping between common shot or common offset gathers for a given velocity model. The application of the shot continuation operator for dip‐independent velocity analysis allows a direct implementation in the acquisition domain by exploiting the comparison between real data and data continued in the shot domain. Shot and offset continuation allow the restoration of missing shot or missing offset by using a velocity model provided by common shot velocity analysis or another dip‐independent velocity analysis method.


2013 ◽  
Vol 373-375 ◽  
pp. 694-697 ◽  
Author(s):  
Guang Xun Chen ◽  
Yan Hui Du ◽  
Lei Zhang ◽  
Pan Ke Qin

The method commonly used for high-resolution velocity analysis in seismic data processing and interpretation is based on a signal-estimation algorithm. However, its numerical realization is complicated and time consuming because the signal-noise separation requires extensive loop calculations before the energy function can be constructed. In this paper, we improve the method on the basis of multi-trace signal estimation. The improved method makes full use of amplitude information, which enhances its noise immunity and greatly improves resolution. At the same time, it is computationally cheaper than other methods because it does not require multiple loop calculations.


2021 ◽  
pp. 555-564
Author(s):  
Kamal K. Ali ◽  
Ahmed Wanas ◽  
Mohanad E. Mahdi

In the current study, 2D seismic data from west An-Najaf (line WN-36) were received after several processing steps performed by the Oil Exploration Company in 2018. Surface Consistent Amplitude Compensation (SCAC) was applied to the seismic data. The processing sequence in our study started by sorting the data into common-midpoint (CMP) gathers in order to apply velocity analysis using the Interactive Velocity Analysis Application (INVA) within the Omega system. Velocity semblance was prepared to perform the normal-moveout (NMO) correction versus time. An accurate root-mean-square velocity (VRMS) was selected, controlled by the flatness of the primary events. The resulting seismic velocity section for the study area shows that the velocity analysis became smoother and provided an accurate seismic section.
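The flatness criterion used to pick VRMS rests on the hyperbolic moveout equation: an event at zero-offset time t0 arrives at offset x at time t(x) = sqrt(t0² + x²/v²), and applying the inverse map with the correct velocity flattens it. A minimal sketch of that relationship (function names are illustrative, not part of the Omega/INVA system):

```python
import numpy as np

def nmo_traveltime(t0, offset, v_rms):
    """Hyperbolic moveout: arrival time at a given offset
    for a reflection with zero-offset time t0."""
    return np.sqrt(t0**2 + (offset / v_rms)**2)

def nmo_correct(t, offset, v_rms):
    """Map a recorded time back to zero-offset time. The event becomes
    flat across offsets exactly when v_rms is picked correctly, which is
    the visual criterion used when scanning the semblance panel."""
    return np.sqrt(np.maximum(t**2 - (offset / v_rms)**2, 0.0))
```

Scanning trial velocities and keeping the one that maximizes semblance of the corrected gather is the essence of interactive velocity analysis.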


Geophysics ◽  
2017 ◽  
Vol 82 (5) ◽  
pp. V297-V309 ◽  
Author(s):  
Hamish Wilson ◽  
Lutz Gross

Spectral noise, low resolution, and attenuation of semblance peaks due to amplitude variation with offset (AVO) anomalies hamper the reliability of velocity analysis in the semblance spectrum for seismic data processing. Increasing resolution and reducing noise while accounting for AVO has posed a challenge in various semblance schemes due to a trade-off in resolution and AVO detectability. A new semblance scheme is introduced that aims to remove this trade-off. The new scheme uses the concepts of bootstrapped differential semblance with trend-based AB semblance. Results indicate that the new scheme indeed increases spectral resolution, reduces noise, and accounts for AVO anomalies. These improvements facilitate velocity control for automatic and manual picking methods and, hence, provide a means for more reliable apparent velocity models.
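The trend-based AB component mentioned above replaces the constant-amplitude assumption of conventional semblance with a linear amplitude trend a + b·offset, so events whose polarity reverses with offset are not penalized. A minimal sketch of that idea follows; it is not the authors' bootstrapped scheme, and the function name and gather layout are assumptions.

```python
import numpy as np

def ab_semblance(gather, offsets):
    """Trend-based (AB) semblance: correlate each time sample with its
    best-fitting linear amplitude trend a + b*offset instead of a
    constant, so AVO polarity reversals still score highly.

    gather: (n_samples, n_traces) moveout-corrected window.
    offsets: (n_traces,) source-receiver offset of each trace.
    """
    X = np.column_stack([np.ones_like(offsets), offsets])
    num = den = 0.0
    for row in gather:
        coef, *_ = np.linalg.lstsq(X, row, rcond=None)
        fit = X @ coef
        num += np.sum(fit**2)   # equals sum(fit*row) by the projection identity
        den += np.sum(row**2)
    return num / den if den > 0.0 else 0.0
```

A gather whose amplitudes vary exactly linearly with offset, even through a polarity flip, scores 1.0, where conventional semblance would collapse.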


Geophysics ◽  
1993 ◽  
Vol 58 (12) ◽  
pp. 1809-1819 ◽  
Author(s):  
Jianchao Li ◽  
Ken Larner

Suppressing noise and enhancing useful seismic signal by filtering is one of the important tasks of seismic data processing. Conventional filtering methods are implemented through either the convolution operation or various mathematical transforms. We describe a methodology for studying and implementing filters, which, unlike conventional filtering methods, is based on solving differential equations in the time and space domain. We call this differential‐equation‐based filtering (DEBF). DEBF does not require that seismic data be stationary, so filtering parameters can vary with every time and space point. Examples with two‐dimensional (2-D) synthetic and field seismic data demonstrate that the DEBF method accomplishes the desired time‐ and space‐varying temporal and move‐out filtering at lower cost than conventional frequency‐wavenumber‐domain filtering. The computational advantage in 3-D would be much greater.
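The key property claimed above, that a filter posed as a differential equation admits coefficients varying at every time and space point without the cost of a nonstationary convolution, can be illustrated with the simplest such filter: an explicit diffusion (heat-equation) smoother. This is an illustrative stand-in for the idea, not the authors' DEBF formulation; the function name and parameters are assumptions.

```python
import numpy as np

def diffusion_filter(data, n_steps=10, rate=0.2):
    """Smooth a 2D (time, space) panel by explicit diffusion steps.

    Each step applies u += rate * laplacian(u) with periodic boundaries.
    Stability of the explicit scheme requires rate <= 0.25. The rate could
    be made an array varying per sample, which is exactly the flexibility
    that a differential-equation formulation buys over convolution.
    """
    u = data.astype(float).copy()
    for _ in range(n_steps):
        lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
               np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u)
        u += rate * lap
    return u
```

Diffusion damps high wavenumbers while preserving the mean, the same qualitative behavior as a low-pass filter, but with purely local operations.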


Author(s):  
Aiello Gemma

The techniques of seismic surveying, especially reflection seismics, have varied considerably in recent years. The main contribution to this variation came from the oil industry, which developed geophysical methods for hydrocarbon exploration. The basic technique of seismic exploration consists of generating seismic waves artificially in the ground (the source) and measuring the times required to traverse the source-receiver path. Seismic data processing of three multichannel seismic profiles located in the Gulf of Naples, with an overall length of 150 km, is presented here. The processing techniques used to elaborate the seismic data are up to date; some are based on complex mathematical models that yield good velocity analyses for the production of stacked sections ready to be interpreted. This paper shows the procedures for processing multichannel seismic data starting from the field data, and sketch diagrams of the elaboration processes applied during the several phases of the workflow have been constructed. The software used were ProMAX 2D (Landmark Ltd.) and Seismic Unix (Colorado School of Mines). The seismic processing steps included pre-processing, sorting, velocity analysis, normal-moveout (NMO) correction, stacking, band-pass filtering, multiple removal, predictive deconvolution, and spiking deconvolution.


Geophysics ◽  
2016 ◽  
Vol 81 (3) ◽  
pp. U1-U12
Author(s):  
Michelângelo G. Silva ◽  
Milton J. Porsani ◽  
Bjorn Ursin

Velocity-independent seismic data processing requires information about the local slope in the data. From estimates of local time and space derivatives of the data, a total least-squares algorithm gives an estimate of the local slope at each data point. Total least squares minimizes the orthogonal distance from the data points (the local time and space derivatives) to the fitted straight line defining the local slope. This gives a more consistent estimate of the local slope than standard least squares because it takes into account uncertainty in the temporal and spatial derivatives. The total least-squares slope estimate is the same as the one obtained from using the structure tensor with a rectangular window function. The estimate of the local slope field is used to extrapolate all traces in a seismic gather to the smallest recorded offset without using velocity information. Extrapolation to zero offset is done using a hyperbolic traveltime function in which slope information replaces the knowledge of the normal moveout (NMO) velocity. The new data processing method requires no velocity analysis and there is little stretch effect. All major reflections and diffractions that are present at zero offset will be reproduced in the output zero-offset section. Therefore, if multiple reflections are undesired in the output, they should be removed before data extrapolation to zero offset. The automatic method is sensitive to noise, so for poor signal-to-noise ratios, standard NMO velocities for primary reflections can be used to compute the slope field. Synthetic and field data examples indicate that compared with standard seismic data processing (velocity analysis, mute, NMO correction, and stack), our method provides an improved zero-offset section in complex data areas.
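The equivalence stated above between the total-least-squares slope fit and the structure tensor with a rectangular window can be made concrete: for a local plane wave u(t, x) = f(t − p·x), the gradient (∂u/∂t, ∂u/∂x) is proportional to (1, −p), so the dominant eigenvector of the structure tensor built from windowed derivatives gives the orthogonal-distance fit of that direction. A minimal sketch, with an assumed function name and flattened derivative windows:

```python
import numpy as np

def tls_local_slope(ut, ux):
    """Total-least-squares local slope p from windows of derivatives.

    ut, ux: 1D arrays of local time and space derivatives of the data
    within a window. The structure tensor's dominant eigenvector is the
    TLS (orthogonal-distance) fit of the common gradient direction
    (1, -p), which accounts for uncertainty in both derivatives.
    """
    S = np.array([[np.dot(ut, ut), np.dot(ut, ux)],
                  [np.dot(ut, ux), np.dot(ux, ux)]])
    w, V = np.linalg.eigh(S)
    v = V[:, np.argmax(w)]          # dominant gradient direction
    return -v[1] / v[0]             # slope p, from (ut, ux) ∝ (1, -p)
```

Unlike ordinary least squares on ux against ut, the eigenvector ratio is symmetric in the two derivatives, which is why the estimate is more consistent when both are noisy.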

