SEISMIC DATA ENHANCEMENT—A CASE HISTORY

Geophysics ◽  
1960 ◽  
Vol 25 (1) ◽  
pp. 283-311 ◽  
Author(s):  
R. J. Graebner

The theory relating to many methods—for example, multiple seismometer techniques—which the geophysicist may control to improve record quality is well known. However, its application has not been fully exploited. An example of the reduction of theory to practice in one area characterized by poor records is presented. It comprises a series of analytical tests designed to discover the cause of poor records, to examine the effect of each variable on the signal‐to‐noise ratio, and to evaluate the solutions predicted by theory. The tests showed that the poor record quality was attributable chiefly to relatively strong surface and near‐surface waves propagating outward from the shot. Wave length filtering by means of suitable shot and seismometer patterns, and compositing through data processing methods, greatly improved record quality and permitted magnetic recording of reflected signals over a broad frequency range. The tests established, in the allotted time, that the quality of the data would meet clearly specified standards of performance. Experience has shown that better seismic data can generally be obtained when the design of techniques is based on the special character of the signal and noise determined from simple tests rather than when the design is based on general assumptions.
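
The wavelength filtering that shot and seismometer patterns perform can be made concrete with the textbook wavenumber response of a uniform linear array. The sketch below is our illustration, not material from the paper; the element count and spacing are arbitrary.

```python
# Wavenumber response of a uniform linear array of N receivers at spacing d.
# Surface waves whose horizontal wavelengths fall in the response notches are
# attenuated before recording, which is the "wave length filtering" above.
import numpy as np

def array_response(k, n_elements, spacing):
    """|sin(N*pi*k*d) / (N*sin(pi*k*d))| for horizontal wavenumber k in cycles/m."""
    x = np.pi * k * spacing
    num = np.sin(n_elements * x)
    den = n_elements * np.sin(x)
    out = np.ones_like(k)                 # 0/0 points are the response maxima (=1)
    np.divide(num, den, out=out, where=np.abs(den) > 1e-12)
    return np.abs(out)

k = np.linspace(1e-4, 0.05, 2000)         # wavenumber axis, cycles/m
resp = array_response(k, n_elements=12, spacing=10.0)
print(f"deepest notch near a {1.0 / k[np.argmin(resp)]:.0f} m horizontal wavelength")
```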

2020 ◽  
Vol 17 (5) ◽  
pp. 893-905
Author(s):  
Weihua Zhang ◽  
Li Yang ◽  
Wenpeng Si ◽  
Houyu Liu

Abstract The 'dual complexity' of the surface and underground structures in foothill belts hinders accurate seismic imaging of complex geological structures. In this paper, the propagation of the seismic wavefield in the foothill belt is studied through seismic forward modelling, together with its influence on seismic data acquisition and imaging. A foothill belt with typical 'dual-complexity' characteristics is investigated. Single-shot records and their imaging effects, simulated with different absorption coefficients and different near-surface structure models, are analysed. The results suggest that strong surface waves and their scattered noise, generated by the complex near surface of the foothill belt, are the main reasons for the low signal-to-noise ratio and the difficulties in imaging the seismic data. The viscoelastic-medium modelling method effectively suppresses the surface waves and their scattered noise, which improves seismic data quality and imaging in the foothill belt, and thus it is a suitable forward modelling method for foothill belts.
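
As a toy illustration of the role the absorption coefficient plays in such modelling (our 1D simplification, not the authors' viscoelastic code), a damped scalar wave equation shows how raising the attenuation term suppresses the recorded wave energy:

```python
# u_tt = c^2 u_xx - gamma * u_t, solved with finite differences. Increasing the
# absorption coefficient gamma mimics how a viscoelastic medium attenuates
# strong wave energy in a synthetic shot record.
import numpy as np

def simulate(gamma, nx=400, nt=1200, dx=5.0, dt=1e-3, c=2000.0):
    """Returns the trace recorded at one receiver for damping gamma (1/s)."""
    u_prev, u = np.zeros(nx), np.zeros(nx)
    r2 = (c * dt / dx) ** 2                       # CFL number squared (0.16 here)
    rec = np.zeros(nt)
    for it in range(nt):
        lap = np.zeros(nx)
        lap[1:-1] = u[2:] - 2.0 * u[1:-1] + u[:-2]
        u_next = (2.0 * u - (1.0 - 0.5 * gamma * dt) * u_prev + r2 * lap) \
                 / (1.0 + 0.5 * gamma * dt)
        u_next[nx // 4] += np.exp(-((it * dt - 0.05) / 0.01) ** 2)   # source pulse
        u_prev, u = u, u_next
        rec[it] = u[3 * nx // 4]                  # "geophone" 1 km from the source
    return rec

for gamma in (0.0, 20.0):                         # absorption coefficient, 1/s
    print(f"gamma={gamma:5.1f}  peak recorded amplitude={np.abs(simulate(gamma)).max():.4f}")
```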


Geophysics ◽  
2019 ◽  
Vol 84 (1) ◽  
pp. A19-A24 ◽  
Author(s):  
Aleksander S. Serdyukov ◽  
Aleksander V. Yablokov ◽  
Anton A. Duchkov ◽  
Anton A. Azarov ◽  
Valery D. Baranov

We have addressed the problem of estimating surface-wave phase velocities through the spectral processing of seismic data. This is the key step of the well-known near-surface seismic exploration method, called multichannel analysis of surface waves. To increase the accuracy and ensure the unambiguity of the selection of dispersion curves, we have developed a new version of the frequency-wavenumber (f-k) transform based on the S-transform. We obtain the frequency-time representation of seismic data. We analyze the obtained S-transform frequency-time representation in a slant-stacking manner but use a spatial Fourier transform instead of amplitude stacking. Finally, we build the f-k image by analyzing the spatial spectra for different steering values of the surface-wave group velocities. The time localization of the surface-wave packet at each frequency increases the signal-to-noise ratio because noise at other time steps (outside the effective width of the corresponding wavelet) is excluded. The new f-k transform, i.e., the slant f-k (SFK) transform, renders a better spectral analysis than the conventional f-k transform and yields more accurate phase-velocity estimation, which is critical for surface-wave analysis. The advantages of the SFK transform have been confirmed by synthetic- and field-data processing.
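
A compact sketch of our reading of the SFK recipe follows; it is simplified (nearest-sample time picking, magnitude-only imaging) and is not the authors' implementation:

```python
import numpy as np

def s_transform(trace):
    """Discrete Stockwell transform via the standard FFT algorithm; rows are
    frequency indices 0..n//2 in cycles per record length."""
    n = len(trace)
    H = np.fft.fft(trace)
    m = np.arange(n)
    m[m > n // 2] -= n                            # signed frequency index
    S = np.zeros((n // 2 + 1, n), dtype=complex)
    S[0] = trace.mean()
    for j in range(1, n // 2 + 1):
        S[j] = np.fft.ifft(np.roll(H, -j) * np.exp(-2.0 * (np.pi * m / j) ** 2))
    return S

def sfk_panel(gather, offsets, dt, group_velocities):
    """Slant f-k image: localize each frequency at t = offset / v_group, take a
    spatial FFT of the localized values, keep the best-focused spectrum."""
    n_tr, n_t = gather.shape
    S = np.array([s_transform(tr) for tr in gather])   # (n_tr, n_freq, n_t)
    panel = np.zeros((S.shape[1], n_tr))
    for j in range(1, S.shape[1]):
        for v in group_velocities:
            it = np.clip((offsets / v / dt).astype(int), 0, n_t - 1)
            local = S[np.arange(n_tr), j, it]          # one complex value per trace
            panel[j] = np.maximum(panel[j], np.abs(np.fft.fft(local)))
    return panel

# toy gather: a single 20 Hz mode travelling at 400 m/s
dt, v_true = 0.002, 400.0
t = np.arange(512) * dt
offsets = 50.0 + 10.0 * np.arange(24)
gather = np.array([np.sin(2 * np.pi * 20.0 * (t - x / v_true))
                   * np.exp(-((t - x / v_true - 0.05) / 0.05) ** 2)
                   for x in offsets])
panel = sfk_panel(gather, offsets, dt, np.linspace(200.0, 800.0, 25))
j_peak = np.unravel_index(panel.argmax(), panel.shape)[0]
print(f"energy peaks near {j_peak / (512 * dt):.1f} Hz")
```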


Geophysics ◽  
2009 ◽  
Vol 74 (3) ◽  
pp. V43-V48 ◽  
Author(s):  
Guochang Liu ◽  
Sergey Fomel ◽  
Long Jin ◽  
Xiaohong Chen

Stacking plays an important role in improving the signal-to-noise ratio and imaging quality of seismic data. However, for low-fold-coverage seismic profiles, the result of conventional stacking is not always satisfactory. To address this problem, we have developed a method of stacking in which we use local correlation as a weight for stacking common-midpoint gathers after NMO processing or common-image-point gathers after prestack migration. Application of the method to synthetic and field data showed that stacking using local correlation can be more effective in suppressing random noise and artifacts than other stacking methods.
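
A bare-bones sketch of the weighting idea: the published method computes local correlation with shaping regularization, whereas here a simple sliding-window normalized correlation against a pilot trace stands in.

```python
import numpy as np

def local_correlation(a, b, half_win=10):
    """Sliding-window normalized correlation of two equal-length traces."""
    n, w = len(a), np.zeros(len(a))
    for i in range(n):
        s = slice(max(0, i - half_win), min(n, i + half_win + 1))
        den = np.linalg.norm(a[s]) * np.linalg.norm(b[s])
        w[i] = np.dot(a[s], b[s]) / den if den > 0 else 0.0
    return w

def local_corr_stack(gather, half_win=10):
    """gather: (n_traces, n_samples), NMO-corrected CMP gather."""
    pilot = gather.mean(axis=0)                   # conventional stack as the pilot
    weights = np.array([np.clip(local_correlation(tr, pilot, half_win), 0.0, None)
                        for tr in gather])        # negative correlation -> weight 0
    norm = weights.sum(axis=0)
    return (weights * gather).sum(axis=0) / np.where(norm > 0, norm, 1.0)

rng = np.random.default_rng(0)
t = np.arange(500)
signal = np.exp(-((t - 250) / 20.0) ** 2)
gather = signal + 0.1 * rng.normal(size=(30, 500))
gather[:5] += 2.0 * rng.normal(size=(5, 500))     # a few artifact-ridden traces
print("rms error, equal weights:", np.sqrt(np.mean((gather.mean(axis=0) - signal) ** 2)))
print("rms error, local-corr   :", np.sqrt(np.mean((local_corr_stack(gather) - signal) ** 2)))
```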


2014 ◽  
Vol 2 (1) ◽  
pp. SA93-SA97 ◽  
Author(s):  
Saleh Al-Dossary ◽  
Yuchun Eugene Wang ◽  
Mark McFarlane

The new seismic disorder attribute quantitatively describes the degree of randomness embedded in 3D poststack seismic data. We compute seismic disorder using a filter operation that removes simple structures including constant values, constant slopes, and steps in axial directions. We define the power of the filtered data as the seismic disorder attribute, which approximately represents data randomness. Seismic data irregularities are caused by a variety of reasons, including random reflection, diffraction, near-surface variations, and acquisition noise. Consequently, the spatial distribution of the seismic disorder attribute may help hydrocarbon exploration in several ways, including identifying geologic features such as fracture zones, gas chimneys, and terminated unconformities; indicating the signal-to-noise ratio to assess data quality; and providing a confidence index for reservoir simulation and engineering projects. We present three case studies and a comparison to other noise-estimation methods.
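
The paper's filter is characterized only by what it removes. The stand-in below is our approximation: axial second differences annihilate constants and constant slopes (a true step still leaves a residue at its edge, which the authors' filter would also remove), and the local power of the residual serves as the disorder value.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def disorder_attribute(volume, half_win=2):
    """volume: 3D poststack amplitude cube. Returns local power of the residual
    left after axial second differences (which annihilate constants and ramps)."""
    residual = np.zeros(volume.shape)
    for axis in range(3):
        residual += np.abs(np.diff(volume, n=2, axis=axis, prepend=0, append=0))
    return uniform_filter(residual ** 2, size=2 * half_win + 1)

# toy cube: a smooth ramp everywhere plus one noisy pocket
cube = np.fromfunction(lambda i, j, k: 0.1 * i + 0.05 * j, (32, 32, 64))
cube[10:15, 10:15, 20:30] += np.random.default_rng(0).normal(0.0, 1.0, (5, 5, 10))
attr = disorder_attribute(cube)
print("mean disorder in the pocket:", attr[10:15, 10:15, 20:30].mean())
print("mean disorder elsewhere    :", attr[20:, 20:, 40:].mean())
```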


Geophysics ◽  
1989 ◽  
Vol 54 (11) ◽  
pp. 1384-1396
Author(s):  
Howard Renick ◽  
R. D. Gunn

The Triangle Ranch Headquarters Canyon Reef field is long and narrow and in an area where near‐surface evaporites and associated collapse features degrade seismic data quality and interpretational reliability. Below this disturbed section, the structure of rocks is similar to the deeper Canyon Reef structure. The shallow structure exhibits very gentle relief and can be mapped by drilling shallow holes on a broad grid. The shallow structural interpretation provides a valuable reference datum for mapping, as well as providing a basis for planning a seismic program. By computing an isopach between the variable seismic datum and the Canyon Reef reflection and subtracting the isopach map from the datum map, we map Canyon Reef structure. The datum map is extrapolated from the shallow core holes. In the area, near‐surface complexities produce seismic noise and severe static variations. The crux of the exploration problem is to balance seismic signal‐to‐noise ratio and geologic resolution. Adequate geologic resolution is impossible without understanding the exploration target. As we understood the target better, we modified our seismic acquisition parameters. Studying examples of data with high signal‐to‐noise ratio and poor resolution and examples of better defined structure on apparently noisier data led us to design an acquisition program for resolution and to reduce noise with arithmetic processes that do not reduce structural resolution. Combining acquisition and processing parameters for optimum structural resolution with the isopach mapping method has improved wildcat success from about 1 in 20 to better than 1 in 2. It has also enabled an 80 percent development drilling success ratio as opposed to slightly over 50 percent in all previous drilling.
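
The mapping arithmetic described above reduces to a grid subtraction; a trivial sketch with hypothetical numbers:

```python
import numpy as np

datum_elev = np.array([[1200.0, 1195.0],          # shallow-core-hole datum map, ft
                       [1198.0, 1190.0]])
isopach = np.array([[4300.0, 4310.0],             # datum-to-reef thickness, ft,
                    [4295.0, 4305.0]])            # from seismic isochrons x velocity
reef_structure = datum_elev - isopach             # Canyon Reef structural elevation
print(reef_structure)
```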


2021 ◽  
Author(s):  
Ramy Elasrag ◽  
Thuraya Al Ghafri ◽  
Faaeza Al Katheer ◽  
Yousuf Al-Aufi ◽  
Ivica Mihaljevic ◽  
...  

Abstract Acquiring surface seismic data can be challenging in areas of intense human activity, where infrastructure (roads, houses, rigs) often leaves large gaps in the fold of coverage that can span several kilometres. Modern interpolation algorithms can fill such gaps up to a certain extent, but the quality of the reconstructed seismic data diminishes as the acquisition gap widens. This is where vintage seismic acquisition can aid processing and imaging, especially if the previous acquisition did not face the same surface obstacles. In this paper we present how a legacy seismic survey helped fill the data gaps of a new acquisition and produced an improved seismic image. The new acquisition survey is part of the Mega 3D onshore effort undertaken by ADNOC, characterized by dense shot and receiver spacing with a focus on full azimuth and broadband. Owing to surface infrastructure, the data could not be completely acquired, leaving a sizable gap in the target area. However, a legacy seismic acquisition undertaken in 2014 had access to these gap zones, as the infrastructure was not present at the time. The legacy seismic data had been processed and imaged previously, but a simple post-imaging merge would not be adequate because the two datasets were processed using different workflows and imaged using different velocity models. To synchronize the two datasets, we processed them in parallel. Data matching and merging were done before regularization. The data were regularized to radial geometry using 5D Matching Pursuit with Fourier Interpolation (MPFI). This provided 12 well-sampled azimuth sectors that went through surface-consistent processing, multiple attenuation, and residual noise attenuation. The near-surface model was built using data-driven image-based statics (DIBS), while reflection tomography was used to build the anisotropic velocity model. Imaging was done using prestack Kirchhoff depth migration. Processing the legacy survey from the beginning improved its signal-to-noise ratio, which allowed the data merge to avoid degrading the quality of the final image. Building one near-surface model allowed both datasets to match well in the time domain. Bringing the datasets to the same level was an important precondition for matching and merging. Amplitude and phase analysis showed that the two surveys align well, with minimal differences. Only the portion of the legacy survey that covers the gap was used in the regularization, allowing MPFI to reconstruct the missing data. The regularized data went through surface multiple attenuation and further noise attenuation as preconditioning for migration. The final image, created using both datasets, has allowed the target to be imaged better.
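
The data-matching step mentioned above can be illustrated with a standard least-squares (Wiener) matching filter that shapes a legacy trace toward its new-survey counterpart; this generic sketch is ours, not the production workflow:

```python
import numpy as np

def matching_filter(legacy, target, nfilt=31):
    """Least-squares FIR filter f minimizing ||legacy * f - target||^2."""
    n = len(legacy)
    A = np.zeros((n, nfilt))                      # causal convolution matrix
    for j in range(nfilt):
        A[j:, j] = legacy[:n - j]
    f, *_ = np.linalg.lstsq(A, target, rcond=None)
    return A @ f                                  # legacy trace shaped to the target

rng = np.random.default_rng(1)
new = rng.normal(size=400)                        # stand-in new-survey trace
legacy = np.convolve(new, [0.8, 0.4, -0.1], mode="same") + 0.05 * rng.normal(size=400)
matched = matching_filter(legacy, new)
print("rms misfit before:", np.sqrt(np.mean((legacy - new) ** 2)))
print("rms misfit after :", np.sqrt(np.mean((matched - new) ** 2)))
```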


2020 ◽  
Vol 8 (4) ◽  
pp. T941-T952
Author(s):  
Jiachun You ◽  
Yajuan Xue ◽  
Junxing Cao ◽  
Canping Li

Because swell noise is very common in marine seismic data, attenuating it is extremely important for improving the signal-to-noise ratio (S/N). Rather than processing the noise in the time domain, we have built a frequency-domain convolutional neural network (CNN) based on the short-time Fourier transform to address swell noise. In numerical experiments, we quantitatively evaluate the denoising performance of the time- and frequency-domain CNNs, compare the impact of network structures on attenuating swell noise, and study how network parameter choices affect the quality of the denoised signal based on peak S/N, structural similarity, and root-mean-square-error indices. These results help us build an optimal CNN model. Furthermore, to illustrate the superiority of our proposed method, we compare the conventional and proposed CNN methods. To address the generalization capability of the CNN, we adopt transfer learning, using fine-tuning to adjust the weights of the pretrained model with a small amount of target data. The application of transfer learning improves the quality of the denoised images, which further shows that our proposed method with transfer learning has the potential to be deployed in actual seismic data acquisition.
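
A minimal sketch of the ingredients, with an architecture of our own choosing (the abstract does not specify one); scipy supplies the STFT and PyTorch the network:

```python
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import stft

class SwellDenoiser(nn.Module):
    """Small conv net acting on STFT magnitudes (architecture is ours)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 3, padding=1),
        )

    def forward(self, x):                      # x: (batch, 1, n_freq, n_frames)
        return self.net(x)

fs = 500.0                                     # assumed sampling rate, Hz
trace = np.random.randn(2048).astype(np.float32)   # stand-in for a noisy trace
_, _, Z = stft(trace, fs=fs, nperseg=128)
mag = torch.from_numpy(np.abs(Z).astype(np.float32))[None, None]

model = SwellDenoiser()                        # would be trained with an L2 loss
denoised_mag = model(mag)                      # against clean-data STFTs; the
                                               # noisy phase is reused on inverse STFT
# Transfer learning as in the paper: keep pretrained weights, freeze the early
# layers, and fine-tune the rest on a small amount of target-survey data.
for p in model.net[:4].parameters():
    p.requires_grad = False
print(denoised_mag.shape)                      # torch.Size([1, 1, 65, 33])
```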


Geophysics ◽  
2020 ◽  
Vol 85 (3) ◽  
pp. V283-V296 ◽  
Author(s):  
Andrey Bakulin ◽  
Ilya Silvestrov ◽  
Maxim Dmitriev ◽  
Dmitry Neklyudov ◽  
Maxim Protasov ◽  
...  

We have developed nonlinear beamforming (NLBF), a method for enhancing modern 3D prestack seismic data acquired onshore with small field arrays or single sensors in which weak reflected signals are buried beneath the strong scattered noise induced by a complex near surface. The method is based on the ideas of multidimensional stacking techniques, such as the common-reflection-surface stack and multifocusing, but it is designed specifically to improve the prestack signal-to-noise ratio of modern 3D land seismic data. Essentially, NLBF searches for coherent local events in the prestack data and then performs beamforming along the estimated surfaces. Comparing different gathers that can be extracted from modern 3D data acquired with orthogonal acquisition geometries, we determine that the cross-spread domain (CSD) is typically the most convenient and efficient. Conventional noise removal applied to modern data from small arrays or single sensors does not adequately reveal the underlying reflection signal. Instead, NLBF supplements these conventional tools and performs final aggregation of weak and still broken reflection signals, where the strength is controlled by the summation aperture. We have developed the details of the NLBF algorithm in CSD and determined the capabilities of the method on real 3D land data with the focus on enhancing reflections and early arrivals. We expect NLBF to help streamline seismic processing of modern high-channel-count and single-sensor data, leading to improved images as well as better prestack data for estimation of reservoir properties.
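
A rough 2D sketch of the NLBF idea (ours; the actual method scans second-order traveltime surfaces in 3D cross-spread gathers rather than a single local slope):

```python
import numpy as np

def nlbf_enhance(gather, dt, dx, slopes, aperture=5):
    """gather: (n_traces, n_samples). Returns a locally beamformed gather."""
    n_tr, n_t = gather.shape
    out = np.zeros_like(gather)
    offs = np.arange(-aperture, aperture + 1)
    for i in range(n_tr):
        nbrs = np.clip(i + offs, 0, n_tr - 1)          # summation aperture
        for j in range(n_t):
            best_val, best_sem = 0.0, -1.0
            for p in slopes:                            # local slope scan, s/m
                shifts = np.clip(j + np.round(p * offs * dx / dt).astype(int),
                                 0, n_t - 1)
                vals = gather[nbrs, shifts]
                sem = vals.sum() ** 2 / (len(vals) * (vals ** 2).sum() + 1e-12)
                if sem > best_sem:                      # most coherent local event
                    best_sem, best_val = sem, vals.mean()
            out[i, j] = best_val                        # beamformed (summed) sample
    return out
```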


Geophysics ◽  
2004 ◽  
Vol 69 (4) ◽  
pp. 1091-1101 ◽  
Author(s):  
Gabriel Perez ◽  
Ken Larner

Founded on the assumption of surface consistency, reflection‐based residual‐statics correct for the time distortions in seismic data arising from rapid near‐surface variations. The assumption is founded on a vertical‐path, ray‐theoretical model of wave propagation in the near surface. Since ray theory does not hold for models with rapid spatial variation, we resort to finite‐difference modeling to study the influence of wave‐theoretical aspects on the character of time distortions and the validity of the surface‐consistency assumption. For near‐surface models that are admittedly simple and idealized, we find that the character of time distortions is highly influenced by a wavefront‐healing phenomenon whose strength depends on the ratio of the size of the frequency‐dependent Fresnel zone to the wavelength of lateral variation in the model. As experienced in practice, the quality of the surface‐consistency assumption degrades with increasing ratio of spreadlength to reflector depth. The validity of the assumption is best for longer‐wavelength anomalies in a weathering layer that is relatively thin. Wavefront healing, however, limits significantly the time‐distortion problem where the weathering layer is relatively thick. Interestingly, wavefront healing also helps to reduce the size of the time‐distortion problem when the velocity in the surface layer is large relative to that in the layers beneath, such as in areas of permafrost.
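
The ratio the authors invoke can be estimated with the standard first-Fresnel-zone formula; the helper below is our back-of-envelope aid with hypothetical numbers, not code from the paper.

```python
import numpy as np

def fresnel_radius(velocity, frequency, depth):
    """First Fresnel zone radius r = sqrt(lambda * z / 2), with lambda = v / f."""
    return np.sqrt((velocity / frequency) * depth / 2.0)

r = fresnel_radius(velocity=2000.0, frequency=30.0, depth=500.0)  # hypothetical values
print(f"Fresnel radius ~ {r:.0f} m; anomalies much narrower than this tend to heal")
```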


2014 ◽  
Vol 490-491 ◽  
pp. 1548-1552
Author(s):  
Zhi Xin Di

As prospecting programs have progressed, land exploration has entered the lithostratigraphic stage. Searching for medium, small, or subtle reservoirs demands seismic data of higher resolvability. In explosive-source areas, surface-layer velocity, shot-point lithology, and the ghost interface are the three key elements influencing the frequency content of the shot. Given that single-shot quality differs markedly because of the alternating multi-lithology near-surface deposits of the Yellow River Delta, we study the characteristics of the frequency response to lithology and ghosting using microseismogram log data, to provide a reliable basis for scientifically selecting shooting parameters.
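
The ghost-interface element lends itself to a worked example: for a source a distance d below a strong reflector with reflection coefficient r, the far-field ghost response is |1 + r e^{-i2πf(2d/v)}|. The sketch below is our illustration with hypothetical values, not material from the paper.

```python
import numpy as np

def ghost_response(freq_hz, depth_m, velocity_ms, refl_coeff=-1.0):
    """Far-field amplitude |1 + r * exp(-i 2 pi f tau)|, tau = 2 d / v."""
    tau = 2.0 * depth_m / velocity_ms
    return np.abs(1.0 + refl_coeff * np.exp(-2j * np.pi * freq_hz * tau))

f = np.linspace(1.0, 120.0, 500)
for depth in (5.0, 15.0):                          # hypothetical shot depths, m
    g = ghost_response(f, depth, velocity_ms=1600.0)
    print(f"shot depth {depth:4.1f} m: first notch at {1600.0 / (2 * depth):5.1f} Hz, "
          f"max boost {g.max():.2f}")
```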

