Automatic data extrapolation to zero offset along local slope

Geophysics ◽  
2016 ◽  
Vol 81 (3) ◽  
pp. U1-U12
Author(s):  
Michelângelo G. Silva ◽  
Milton J. Porsani ◽  
Bjorn Ursin

Velocity-independent seismic data processing requires information about the local slope in the data. From estimates of local time and space derivatives of the data, a total least-squares algorithm gives an estimate of the local slope at each data point. Total least squares minimizes the orthogonal distance from the data points (the local time and space derivatives) to the fitted straight line defining the local slope. This gives a more consistent estimate of the local slope than standard least squares because it takes into account uncertainty in the temporal and spatial derivatives. The total least-squares slope estimate is the same as the one obtained from using the structure tensor with a rectangular window function. The estimate of the local slope field is used to extrapolate all traces in a seismic gather to the smallest recorded offset without using velocity information. Extrapolation to zero offset is done using a hyperbolic traveltime function in which slope information replaces the knowledge of the normal moveout (NMO) velocity. The new data processing method requires no velocity analysis and there is little stretch effect. All major reflections and diffractions that are present at zero offset will be reproduced in the output zero-offset section. Therefore, if multiple reflections are undesired in the output, they should be removed before data extrapolation to zero offset. The automatic method is sensitive to noise, so for poor signal-to-noise ratios, standard NMO velocities for primary reflections can be used to compute the slope field. Synthetic and field data examples indicate that compared with standard seismic data processing (velocity analysis, mute, NMO correction, and stack), our method provides an improved zero-offset section in complex data areas.
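The two core steps of this abstract can be sketched in code. This is a minimal illustration, not the authors' implementation: the function names (`local_slope_tls`, `zero_offset_time`), the boxcar window size, and the small stabilization constant are assumptions. The structure-tensor formulation is the rectangular-window equivalent the abstract mentions, and the zero-offset mapping uses the hyperbolic relation t0^2 = t(t - x p), which follows from t^2 = t0^2 + x^2/v^2 together with p = dt/dx = x/(v^2 t), so the slope p replaces the NMO velocity.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_slope_tls(d, dt, dx, win=5):
    """Estimate the local slope p = dt/dx at every sample via the structure
    tensor with a rectangular (boxcar) window, which is equivalent to the
    total least-squares fit through the derivative points (gt, gx)."""
    gt, gx = np.gradient(d, dt, dx)              # local time and space derivatives
    # windowed second moments (structure tensor entries)
    Jtt = uniform_filter(gt * gt, win)
    Jxx = uniform_filter(gx * gx, win)
    Jtx = uniform_filter(gt * gx, win)
    # dominant eigenvector (n_t, n_x) of [[Jtt, Jtx], [Jtx, Jxx]] is normal
    # to the local event; the slope follows from n_t * dt + n_x * dx = 0
    lam = 0.5 * (Jtt + Jxx + np.sqrt((Jtt - Jxx) ** 2 + 4.0 * Jtx ** 2))
    return -(lam - Jtt) / (Jtx + 1e-12)          # eps guards flat regions

def zero_offset_time(t, x, p):
    """Map an event at time t and offset x with local slope p to its
    zero-offset time, assuming hyperbolic moveout: t0^2 = t (t - x p)."""
    return np.sqrt(np.maximum(t * (t - x * p), 0.0))
```

For a plane-wave gather the estimated slope field is constant, and feeding each sample's slope into `zero_offset_time` moves the event to its zero-offset position without any velocity picking.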

Geophysics ◽  
1996 ◽  
Vol 61 (6) ◽  
pp. 1846-1858 ◽  
Author(s):  
Claudio Bagaini ◽  
Umberto Spagnolini

Continuation to zero offset [better known as dip moveout (DMO)] is a standard tool for seismic data processing. In this paper, the concept of DMO is extended by introducing a set of operators: the continuation operators. These operators, which are implemented in integral form with a defined amplitude distribution, perform the mapping between common shot or common offset gathers for a given velocity model. The application of the shot continuation operator for dip‐independent velocity analysis allows a direct implementation in the acquisition domain by exploiting the comparison between real data and data continued in the shot domain. Shot and offset continuation allow the restoration of missing shot or missing offset by using a velocity model provided by common shot velocity analysis or another dip‐independent velocity analysis method.
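The kinematic core of offset continuation can be sketched as below. This is only the traveltime mapping under a constant-velocity hyperbolic moveout assumption; the paper's operators are integral operators with a defined amplitude distribution and handle dip, which this toy (`continue_offset` is a hypothetical name) ignores.

```python
import numpy as np

def continue_offset(t1, h1, h2, v):
    """Map a traveltime observed at offset h1 to the equivalent traveltime
    at offset h2, assuming constant velocity v and hyperbolic moveout:
    t^2 = t0^2 + (h/v)^2, so t0^2 is invariant between offsets."""
    t0_sq = np.maximum(t1**2 - (h1 / v) ** 2, 0.0)   # zero-offset time squared
    return np.sqrt(t0_sq + (h2 / v) ** 2)
```

Setting h2 = 0 recovers continuation to zero offset; restoring a missing offset amounts to evaluating the mapping at the absent h2 given a velocity model.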


Geophysics ◽  
2018 ◽  
Vol 83 (1) ◽  
pp. A27-A32 ◽  
Author(s):  
Yangkang Chen ◽  
Sergey Fomel

The seislet transform uses a prediction operator that is connected to the local slope or frequency of seismic events. We have combined the 1D nonstationary seislet transform with empirical-mode decomposition (EMD) in the f-x domain. We used EMD to decompose data into smoothly variable frequency components for the following 1D seislet transform. The resultant representation showed remarkable sparsity. We developed a detailed algorithm and used a field example to demonstrate the application of the new seislet transform for sparsity-promoting seismic data processing.
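The EMD step used here can be illustrated with a bare-bones 1D sifting loop. This is a sketch of plain EMD only, not the paper's EMD-seislet transform: the stopping rule (a fixed number of sifting passes), the cubic-spline envelopes, and the function names are assumptions.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def sift(x, n_sift=8):
    """Extract one intrinsic mode function (IMF) by sifting: repeatedly
    subtract the mean of the upper and lower extrema envelopes."""
    h = x.astype(float)
    t = np.arange(len(x))
    for _ in range(n_sift):
        mx = np.where((h[1:-1] > h[:-2]) & (h[1:-1] > h[2:]))[0] + 1
        mn = np.where((h[1:-1] < h[:-2]) & (h[1:-1] < h[2:]))[0] + 1
        if len(mx) < 4 or len(mn) < 4:       # too few extrema for envelopes
            break
        upper = CubicSpline(mx, h[mx])(t)
        lower = CubicSpline(mn, h[mn])(t)
        h = h - 0.5 * (upper + lower)        # remove the local mean
    return h

def emd(x, n_imfs=2):
    """Peel off n_imfs IMFs; by construction sum(imfs) + residual == x."""
    imfs, res = [], x.astype(float)
    for _ in range(n_imfs):
        imf = sift(res)
        imfs.append(imf)
        res = res - imf
    return imfs, res
```

Each IMF has a smoothly variable instantaneous frequency, which is what makes the subsequent 1D nonstationary seislet transform effective.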


Author(s):  
Gemma Aiello

The techniques of seismic surveying, especially reflection seismics, have changed considerably in recent years. This change has come mainly from the oil industry, which developed geophysical methods in the search for oil. The basic technique of seismic exploration consists of generating seismic waves artificially in the ground (the source) and measuring the times required for the waves to travel the source-receiver path. Seismic data processing of three multichannel seismic profiles located in the Gulf of Naples, with an overall length of 150 kilometers, is presented here. The seismic processing techniques used to process the data are up to date. Some of them are based on complex mathematical models that yield good velocity analyses for the production of stacked sections ready to be interpreted. This paper shows the procedures for processing multichannel seismic data starting from the field data. Sketch diagrams of the processes applied during the several phases of the whole workflow have been constructed. The software used are ProMAX 2D (Landmark Ltd.) and Seismic Unix (Colorado School of Mines). The steps of seismic data processing include pre-processing, sorting, velocity analysis, normal moveout (NMO) correction, stacking, band-pass filtering, multiple removal, predictive deconvolution, and spiking deconvolution.
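Two of the listed steps, NMO correction and stacking, can be sketched together in a few lines. This is a simplified illustration, not the processing sequence of the paper: `nmo_stack` is a hypothetical name, a single constant NMO velocity is assumed (real velocity analysis yields a time-variant velocity), and no stretch mute is applied.

```python
import numpy as np

def nmo_stack(gather, offsets, dt, v_nmo):
    """Apply NMO correction to a CMP gather (nt x ntraces) and stack.
    Each output sample at zero-offset time t0 reads the input trace at the
    hyperbolic moveout time t = sqrt(t0^2 + (h / v_nmo)^2)."""
    nt, ntr = gather.shape
    t0 = np.arange(nt) * dt                       # zero-offset times
    stacked = np.zeros(nt)
    for j, h in enumerate(offsets):
        t = np.sqrt(t0**2 + (h / v_nmo) ** 2)     # moveout time per sample
        idx = t / dt                              # fractional sample index
        stacked += np.interp(idx, np.arange(nt), gather[:, j], right=0.0)
    return stacked / ntr
```

After correction, a reflection that was hyperbolic across offsets becomes flat, so the sum across traces reinforces the event while incoherent noise averages down.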


Geophysics ◽  
1989 ◽  
Vol 54 (11) ◽  
pp. 1455-1465 ◽  
Author(s):  
William S. Harlan

Hyperbolic reflections and convolutional wavelets are fundamental models for seismic data processing. Each sample of a “stacked” zero‐offset section can parameterize an impulsive hyperbolic reflection in a midpoint gather. Convolutional wavelets can model source waveforms and near‐surface filtering at the shot and geophone positions. An optimized inversion of the combined modeling equations for hyperbolic traveltimes and convolutional wavelets makes explicit any interdependence and nonuniqueness in these two sets of parameters. I first estimate stacked traces that best model the recorded data and then find nonimpulsive wavelets to improve the fit with the data. These wavelets are used for a new estimate of the stacked traces, and so on. Estimated stacked traces model short average wavelets with a superposition of approximately parallel hyperbolas; estimated wavelets adjust the phases and amplitudes of inconsistent traces, including static shifts. Deconvolution of land data with estimated wavelets makes wavelets consistent over offset; remaining static shifts are midpoint‐consistent. This phase balancing improves the resolution of stacked data and of velocity analyses. If precise velocity functions are not known, then many stacked traces can be inverted simultaneously, each with a different velocity function. However, the increased number of overlain hyperbolas can more easily model the effects of inconsistent wavelets. As a compromise, I limit velocity functions to reasonable regions selected from a stacking velocity analysis—a few functions cover velocities of primary and multiple reflections. Multiple reflections are modeled separately and then subtracted from marine data. The model can be extended to include more complicated amplitude changes in reflectivity. Migrated reflectivity functions would add an extra constraint on the continuity of reflections over midpoint. 
Including the effect of dip moveout in the model would make stacking and migration velocities equivalent.
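The alternating estimation described above (stacked traces first, then wavelets, then stacked traces again, and so on) can be reduced to a single-trace toy: alternating least-squares updates of a reflectivity series and a convolutional wavelet. This is a sketch of the alternation idea only, with hypothetical names; it omits the hyperbolic summation over midpoint gathers, the amplitude modeling, and the static terms of the actual method.

```python
import numpy as np
from scipy.linalg import toeplitz

def conv_matrix(w, n):
    """Toeplitz matrix C such that C @ r == np.convolve(w, r) for len(r) == n."""
    col = np.concatenate([w, np.zeros(n - 1)])
    row = np.zeros(n)
    row[0] = w[0]
    return toeplitz(col, row)

def alternate_inversion(d, nw, nr, n_iter=20):
    """Alternate least-squares updates: fix the wavelet and solve for the
    reflectivity, then fix the reflectivity and solve for the wavelet.
    Each half-step is a linear lstsq, so the data misfit never increases."""
    w = np.zeros(nw)
    w[0] = 1.0                                   # start from an impulsive wavelet
    r = np.zeros(nr)
    for _ in range(n_iter):
        r, *_ = np.linalg.lstsq(conv_matrix(w, nr), d, rcond=None)
        w, *_ = np.linalg.lstsq(conv_matrix(r, nw), d, rcond=None)
    return w, r
```

The interdependence the abstract points out is visible here: scaling the wavelet up and the reflectivity down leaves the fit unchanged, so the factorization is nonunique without extra constraints.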

