Antileakage least-squares spectral analysis for seismic data regularization and random noise attenuation

Geophysics ◽  
2018 ◽  
Vol 83 (3) ◽  
pp. V157-V170 ◽  
Author(s):  
Ebrahim Ghaderpour ◽  
Wenyuan Liao ◽  
Michael P. Lamoureux

Spatial transformation of an irregularly sampled data series to a regularly sampled data series is a challenging problem in many areas such as seismology. Discrete Fourier analysis is limited to regularly sampled data series. On the other hand, the least-squares spectral analysis (LSSA) can analyze an irregularly sampled data series. Although the LSSA method takes into account the correlation among the sinusoidal basis functions of irregularly spaced series, it still suffers from the problem of spectral leakage: energy leaks from one spectral peak into another. We have developed an iterative method called antileakage LSSA to attenuate the spectral leakage and consequently regularize irregular data series. In this method, we first search for a spectral peak with the highest energy, and then we remove (suppress) it from the original data series. In the next step, we search for a new peak with the highest energy in the residual data series and remove the new and the old components simultaneously from the original data series using a least-squares method. We repeat this procedure until all significant spectral peaks are estimated and removed simultaneously from the original data series. In addition, we address another problem, which is random noise attenuation in the data series, by applying a certain confidence level for significant peaks in the spectrum. We demonstrate the robustness of our method on irregularly sampled synthetic and real data sets, and we compare the results with the antileakage Fourier transform and arbitrary sampled Fourier transform.
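The iteration described above can be sketched numerically. The toy Python implementation below (sample counts, frequency grid, and the fixed number of peaks are illustrative choices, not the authors' parameters) picks the most energetic peak from the residual, then refits all picked sinusoids simultaneously to the original series:

```python
import numpy as np

def alssa(t, y, freqs, n_peaks=3):
    """Toy antileakage LSSA: iteratively pick the most energetic
    frequency, then re-fit ALL picked sinusoids simultaneously by
    least squares and subtract them from the original series."""
    picked, resid, coef = [], y.copy(), None
    for _ in range(n_peaks):
        # least-squares spectral energy of the current residual
        energies = []
        for f in freqs:
            A = np.column_stack([np.cos(2*np.pi*f*t), np.sin(2*np.pi*f*t)])
            c, *_ = np.linalg.lstsq(A, resid, rcond=None)
            energies.append(np.sum((A @ c)**2))
        picked.append(freqs[int(np.argmax(energies))])
        # simultaneous fit of all picked components to the ORIGINAL data
        A = np.column_stack([np.cos(2*np.pi*f*t) for f in picked] +
                            [np.sin(2*np.pi*f*t) for f in picked])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ coef
    return sorted(picked), coef

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 10.0, 200))       # irregular sampling
y = np.sin(2*np.pi*1.0*t) + 0.5*np.sin(2*np.pi*2.3*t)
freqs = np.arange(0.1, 5.0, 0.1)
peaks, _ = alssa(t, y, freqs, n_peaks=2)
print([round(float(f), 1) for f in peaks])
```

The key difference from a plain matching pursuit is the simultaneous least-squares refit against the original data at every step, which is what suppresses leakage between correlated basis functions.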

Geophysics ◽  
2003 ◽  
Vol 68 (5) ◽  
pp. 1633-1638 ◽  
Author(s):  
Yanghua Wang

The spectrum of a discrete Fourier transform (DFT) is estimated by linear inversion, and used to produce desirable seismic traces with regular spatial sampling from an irregularly sampled data set. The essence of such a wavefield reconstruction method is to solve the DFT inverse problem with a particular constraint which imposes a sparseness criterion on the least‐squares solution. A working definition for the sparseness constraint is presented to improve stability and efficiency. Then a sparseness measurement is used to measure the relative sparseness of the two DFT spectra obtained from inversion with or without the sparseness constraint. It is a pragmatic indicator of the magnitude of sparseness needed for wavefield reconstruction. For seismic trace regularization, an antialiasing condition must be fulfilled for the regularizing trace interval, whereas optimal trace coordinates in the output can be obtained by minimizing the distances between the newly generated traces and the original traces in the input. Application to real seismic data reveals the effectiveness of the technique and the significance of the sparseness constraint in the least‐squares solution.
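A rough sketch of sparseness-constrained DFT inversion is given below. The Cauchy-like reweighting is one common way to impose sparseness on a least-squares spectrum; Wang's actual constraint definition and antialiasing handling differ in detail, and all grids and parameters here are illustrative:

```python
import numpy as np

def sparse_dft_inversion(x, d, k, n_iter=10, lam=1e-2):
    """Toy sparseness-constrained DFT inversion: estimate spectrum m from
    irregular samples d(x) by iteratively reweighted least squares, where
    weak coefficients are damped more heavily at each pass."""
    F = np.exp(2j*np.pi*np.outer(x, k))        # DFT synthesis on irregular grid
    m = np.linalg.lstsq(F, d, rcond=None)[0]   # unconstrained starting spectrum
    for _ in range(n_iter):
        # diagonal model weights: strong damping for small coefficients
        w = lam / (np.abs(m)**2 + 1e-3*np.max(np.abs(m))**2)
        m = np.linalg.solve(F.conj().T @ F + np.diag(w), F.conj().T @ d)
    return m

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 1, 60))             # irregular spatial positions
k = np.arange(-10, 11)                         # wavenumber grid
d = np.exp(2j*np.pi*3*x) + 0.5*np.exp(-2j*np.pi*5*x)
m = sparse_dft_inversion(x, d, k)
# reconstruct the wavefield on a regular output grid
x_reg = np.linspace(0, 1, 32, endpoint=False)
d_reg = np.exp(2j*np.pi*np.outer(x_reg, k)) @ m
```

Once the sparse spectrum `m` is estimated, reconstruction on any desired regular grid is a single synthesis, as in the last line.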


2013 ◽  
Vol 3 (1) ◽  
pp. 368-372
Author(s):  
A. Zahedi ◽  
M. H. Kahaei

In this paper, a new method for frequency estimation of irregularly sampled data is proposed. In comparison with previous sparsity-based methods, where the sparsity constraint is applied to a least-squares fitting problem, the proposed method is based on a sparsity-constrained weighted least-squares problem. The resulting problem is solved in an iterative manner, allowing the solution obtained at each iteration to be used to determine the weights of the least-squares fitting term at the next iteration. Such an appropriate weighting of the least-squares fitting term enhances the performance of the proposed method. Simulation results verify that the proposed method can detect the spectral peaks using a very short data record. Compared with previous methods, the proposed one is less likely to miss actual spectral peaks or to exhibit spurious ones.
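One plausible reading of such a sparsity-promoting iteration is sketched below: coefficients that were small at the previous iteration are penalized more at the next. The weighting scheme, grid, and record length here are illustrative assumptions, not the authors' formulation:

```python
import numpy as np

def reweighted_freq_est(t, y, freqs, n_iter=8, lam=0.5):
    """Toy sparsity via iterative reweighting: each pass damps
    coefficients that were small at the previous pass."""
    # cosine and sine atoms on an irregular time axis
    A = np.hstack([np.cos(2*np.pi*np.outer(t, freqs)),
                   np.sin(2*np.pi*np.outer(t, freqs))])
    c = np.linalg.lstsq(A, y, rcond=None)[0]   # minimum-norm start
    for _ in range(n_iter):
        w = lam / (np.abs(c) + 1e-3)           # small coefficient -> big penalty
        c = np.linalg.solve(A.T @ A + np.diag(w), A.T @ y)
    # amplitude spectrum from the cosine/sine coefficient pairs
    return np.hypot(c[:len(freqs)], c[len(freqs):])

rng = np.random.default_rng(4)
t = np.sort(rng.uniform(0, 4, 25))             # very short irregular record
y = np.sin(2*np.pi*1.5*t)
freqs = np.arange(0.25, 5.0, 0.25)
amp = reweighted_freq_est(t, y, freqs)
print(freqs[np.argmax(amp)])
```

With only 25 irregular samples, the plain minimum-norm fit smears energy across the grid; the reweighting concentrates it on the true peak.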


Geophysics ◽  
2000 ◽  
Vol 65 (5) ◽  
pp. 1364-1371 ◽  
Author(s):  
Shuki Ronen ◽  
Christopher L. Liner

Conventional processes, such as Kirchhoff dip moveout (DMO) and prestack full migration, are based on independent imaging of subsets of the data before stacking or amplitude variation with offset (AVO) analysis. Least‐squares DMO (LSDMO) and least‐squares migration (LSMig) are a family of developing processing methods that are based on inversion of reverse DMO and demigration operators. LSDMO and LSMig find the earth model that best fits the data and a priori assumptions which can be imposed as constraints. Such inversions are more computer intensive, but have significant advantages compared to conventional processing when applied to irregularly sampled data. Various conventional processes are approximations of the inversions in LSDMO and LSMig. Often, processing is equivalent to using the transpose of a matrix which LSDMO/LSMig inverts. Such transpose processing is accurate when the data sampling is adequate. In practice, costly survey design, real‐time coverage quality control, in‐fill acquisition, redundancy editing, and prestack interpolation are used to create a survey geometry such that the transpose is a good approximation of the inverse. Normalized DMO and migration are approximately equivalent to applying the above transpose processing followed by a diagonal correction. However, in most cases, the required correction is not actually diagonal. In such cases LSDMO and LSMig can produce earth models with higher resolution and higher fidelity than normalized DMO and migration. The promise of LSMig and LSDMO is reduced acquisition cost, improved resolution, and reduced acquisition footprint. The computational cost and, more importantly, the turnaround time are major factors in the commercialization of these methods. With parallel computing, these methods are now becoming practical.
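The transpose-versus-inverse distinction can be made concrete with a toy linear operator. The interpolating "demigration" matrix below is purely illustrative (it stands in for the reverse-DMO/demigration operators of the abstract); it shows that under irregular coverage a diagonal (fold-style) correction of the transpose does not recover the model, while the least-squares inverse does:

```python
import numpy as np

rng = np.random.default_rng(2)
n_model = 20
# irregular acquisition: one trace near every model point, plus 30 extra
# traces wherever the random draw happens to cluster them
pos = np.concatenate([np.arange(n_model - 1) + 0.3,
                      rng.uniform(0, n_model - 1, 30)])
# toy forward ("demigration") operator: linear interpolation model -> traces
A = np.zeros((len(pos), n_model))
j = pos.astype(int)
A[np.arange(len(pos)), j] = 1 - (pos - j)
A[np.arange(len(pos)), j + 1] = pos - j

m_true = np.zeros(n_model)
m_true[7] = 1.0                                 # a single reflector
d = A @ m_true                                  # noise-free data

m_adj = A.T @ d                                 # transpose ("migration")
m_norm = m_adj / np.diag(A.T @ A)               # diagonal (fold) correction
m_ls, *_ = np.linalg.lstsq(A, d, rcond=None)    # least-squares "migration"
```

Because the Gram matrix `A.T @ A` has off-diagonal terms coupling neighboring model points, the diagonally corrected transpose smears the reflector onto its neighbors; the least-squares solution does not.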


Geophysics ◽  
2005 ◽  
Vol 70 (4) ◽  
pp. V87-V95 ◽  
Author(s):  
Sheng Xu ◽  
Yu Zhang ◽  
Don Pham ◽  
Gilles Lambaré

Seismic data regularization, which spatially transforms irregularly sampled acquired data to regularly sampled data, is a long-standing problem in seismic data processing. Data regularization can be implemented using Fourier theory by using a method that estimates the spatial frequency content on an irregularly sampled grid. The data can then be reconstructed on any desired grid. Difficulties arise from the nonorthogonality of the global Fourier basis functions on an irregular grid, which results in the problem of “spectral leakage”: energy from one Fourier coefficient leaks onto others. We investigate the nonorthogonality of the Fourier basis on an irregularly sampled grid and propose a technique called “antileakage Fourier transform” to overcome the spectral leakage. In the antileakage Fourier transform, we first solve for the most energetic Fourier coefficient, assuming that it causes the most severe leakage. To attenuate all aliases and the leakage of this component onto other Fourier coefficients, the data component corresponding to this most energetic Fourier coefficient is subtracted from the original input on the irregular grid. We then use this new input to solve for the next Fourier coefficient, repeating the procedure until all Fourier coefficients are estimated. This procedure is equivalent to “reorthogonalizing” the global Fourier basis on an irregularly sampled grid. We demonstrate the robustness and effectiveness of this technique with successful applications to both synthetic and real data examples.
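A minimal sketch of the antileakage Fourier transform loop follows, assuming simple mean-based least-squares estimates of each coefficient on the irregular grid (the published method's treatment of weights and aliases is more elaborate, and all sizes here are illustrative):

```python
import numpy as np

def alft(x, d, k, n_iter=20):
    """Toy antileakage Fourier transform: repeatedly estimate the most
    energetic Fourier coefficient on the irregular grid and subtract its
    contribution from the data before estimating the next one."""
    m = np.zeros(len(k), dtype=complex)
    r = d.astype(complex)
    for _ in range(n_iter):
        # per-coefficient least-squares estimates from the current residual
        c = np.array([np.mean(r * np.exp(-2j*np.pi*ki*x)) for ki in k])
        i = int(np.argmax(np.abs(c)))
        m[i] += c[i]                             # keep the strongest one
        r = r - c[i] * np.exp(2j*np.pi*k[i]*x)   # remove its leakage source
    return m

rng = np.random.default_rng(3)
x = np.sort(rng.uniform(0, 1, 80))               # irregular input grid
d = np.exp(2j*np.pi*4*x) + 0.3*np.exp(2j*np.pi*9*x)
k = np.arange(0, 16)                             # Fourier coefficient grid
m = alft(x, d, k)
x_reg = np.linspace(0, 1, 32, endpoint=False)    # regular output grid
d_reg = np.exp(2j*np.pi*np.outer(x_reg, k)) @ m
```

Subtracting the strongest component before estimating the next is what "reorthogonalizes" the basis in practice: each remaining coefficient is estimated from data that no longer contain the leakage of the components already resolved.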

