Seismic data interpolation method based on wavefield restoration

2018 ◽  
Author(s):  
Xie Junfa ◽  
Wang Xiaowei ◽  
Wang Yuchao ◽  
Hu Ziduo ◽  
Zhang Tao

Geophysics ◽  
2020 ◽  
Vol 85 (4) ◽  
pp. WA115-WA136 ◽  
Author(s):  
Hao Zhang ◽  
Xiuyan Yang ◽  
Jianwei Ma

We have developed a seismic data interpolation method based on denoising convolutional neural networks (CNNs). It provides a simple and efficient way around the scarcity of the geophysical training labels that deep learning methods usually require. The method consists of two steps: (1) training a set of CNN denoisers on natural-image noisy-clean pairs and (2) integrating the trained denoisers into the projection onto convex sets (POCS) framework to perform seismic data interpolation. We call it the CNN-POCS method. It relaxes the requirement of end-to-end deep learning approaches that training and test seismic data share similar features. Additionally, the method is flexible and applicable to different types of missing traces because the missing or down-sampling locations are not involved in the training step; it is therefore plug-and-play in nature. These properties indicate high generalizability and reduce the need for problem-specific training. Results on synthetic and field data show promising interpolation performance in terms of signal-to-noise ratio, dealiasing, and weak-feature reconstruction, compared with traditional f-x prediction filtering, the curvelet transform, and block-matching 3D filtering methods.
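
A minimal sketch of the plug-and-play POCS loop described above: a denoiser replaces the usual sparsity-promoting thresholding inside the POCS iteration, and the observed traces are re-inserted at every step. The `denoise` function here is a simple Gaussian smoother standing in for the pre-trained CNN denoiser, and the mask, iteration count, and noise schedule are illustrative assumptions, not the authors' settings.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def denoise(x, sigma):
    """Stand-in for a CNN denoiser trained on natural noisy-clean image pairs."""
    return gaussian_filter(x, sigma=sigma)

def cnn_pocs_interpolate(d_obs, mask, n_iter=50, sigma_start=2.0, sigma_end=0.5):
    """POCS-style interpolation: denoise the estimate, then restore observed samples."""
    x = d_obs.copy()
    sigmas = np.linspace(sigma_start, sigma_end, n_iter)  # decreasing noise level
    for sigma in sigmas:
        x = denoise(x, sigma)                       # projection onto the "plausible signal" set
        x = mask * d_obs + (1.0 - mask) * x         # projection onto the data-consistency set
    return x

if __name__ == "__main__":
    # toy 2-D section with 40% of the traces removed
    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 128)
    clean = np.array([np.sin(2 * np.pi * (8 * t + 0.02 * k)) for k in range(64)]).T
    mask = (rng.random(64) > 0.4).astype(float)[None, :]   # dead-trace mask
    recon = cnn_pocs_interpolate(clean * mask, mask)
    print("relative error:", np.linalg.norm(recon - clean) / np.linalg.norm(clean))
```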


Geophysics ◽  
2021 ◽  
pp. 1-52
Author(s):  
Nanying Lan ◽  
Zhang Fanchang ◽  
Chuanhui Li

Due to limitations imposed by acquisition cost, obstacles, and inaccessible regions, originally acquired seismic data are often sparsely or irregularly sampled in space, which seriously degrades their ability to image underground structures. Fortunately, compressed sensing provides theoretical support for interpolating and recovering irregularly or under-sampled data. Under the compressed-sensing framework, we propose a robust interpolation method for high-dimensional seismic data based on elastic half-norm regularization and tensor dictionary learning. Inspired by the Elastic-Net, we first develop the elastic half-norm regularization as a sparsity constraint and establish a robust high-dimensional interpolation model with it. Then, considering the multidimensional structure and spatial correlation of seismic data, we introduce a tensor dictionary learning algorithm to train a high-dimensional adaptive tensor dictionary from the original data. This tensor dictionary serves as the sparse transform for interpolation because it captures more detailed seismic features and yields a fast, near-optimal sparse representation of high-dimensional seismic data. Finally, we solve the robust interpolation model with an efficient iterative thresholding algorithm in the transform space and perform the space conversion with a modified imputation algorithm to recover the wavefield at unobserved spatial positions. We conduct high-dimensional interpolation experiments on model and field seismic data sampled on a regular grid. The results demonstrate that this method achieves superior performance and higher computational efficiency for both noise-free and noisy data, compared with widely used dictionary-learning-based interpolation methods.
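
A minimal sketch of interpolation by iterative thresholding in a sparse transform, under stated assumptions: a 2-D DCT stands in for the learned tensor dictionary, and the "elastic half norm" shrinkage is modelled as an L1/2 half-thresholding step (closed form of Xu et al.) followed by a ridge-type scaling, by analogy with the Elastic-Net. The parameter values and toy data are illustrative, not the authors' configuration.

```python
import numpy as np
from scipy.fft import dctn, idctn

def half_threshold(c, lam):
    """Half-thresholding operator for the L1/2 quasi-norm."""
    a = np.maximum(np.abs(c), 1e-12)
    t = (54 ** (1 / 3) / 4) * lam ** (2 / 3)                        # thresholding level
    phi = np.arccos(np.clip((lam / 8) * (a / 3) ** (-1.5), -1.0, 1.0))
    shrunk = (2 / 3) * c * (1 + np.cos((2 / 3) * (np.pi - phi)))
    return np.where(a > t, shrunk, 0.0)

def elastic_half_interp(d_obs, mask, lam=0.02, mu=0.05, n_iter=100):
    """Iterative thresholding with data re-insertion at observed traces."""
    x = d_obs.copy()
    for _ in range(n_iter):
        c = dctn(x, norm="ortho")                  # fixed transform as dictionary stand-in
        c = half_threshold(c, lam) / (1.0 + mu)    # L1/2 shrinkage + L2 (ridge) scaling
        x = idctn(c, norm="ortho")
        x = mask * d_obs + (1.0 - mask) * x        # keep observed samples
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 128)
    clean = np.array([np.cos(2 * np.pi * (10 * t + 0.03 * k)) for k in range(64)]).T
    mask = (rng.random(64) > 0.3).astype(float)[None, :]
    rec = elastic_half_interp(clean * mask, mask)
    print("relative error:", np.linalg.norm(rec - clean) / np.linalg.norm(clean))
```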


Geophysics ◽  
2021 ◽  
pp. 1-57
Author(s):  
Yang Liu ◽  
Geng WU ◽  
Zhisheng Zheng

Although the amount of seismic data acquired with wide-azimuth geometry is increasing, it is difficult to achieve regular spatial sampling owing to limitations imposed by the surface environment and economic factors. Interpolation is an economical way to address this issue. The current state-of-the-art methods for seismic data interpolation are iterative, but iterative methods tend to incur high computational cost, which restricts their application to large, high-dimensional data sets. Hence, we developed a two-step, non-iterative method to interpolate nonstationary seismic data based on streaming prediction filters (SPFs) with varying smoothness in the time-space domain, and we extended these filters to two spatial dimensions. Streaming computation, the kernel of the method, directly calculates the coefficients of the nonstationary SPF from an overdetermined equation with local smoothness constraints. In addition to the traditional streaming prediction-error filter (PEF), we propose a similarity matrix to improve the constraint so that the smoothness of adjacent filter coefficients adapts to the varying data. We also design filters that are non-causal in space, using several neighboring traces around the target trace to predict the signal, which yields more accurate interpolation than the causal-in-space version. Compared with the Fourier projection onto convex sets (POCS) interpolation method, the proposed method offers advantages such as fast computation and nonstationary event reconstruction. Applications to synthetic and nonstationary field data show that it can interpolate high-dimensional data with low computational cost and reasonable accuracy, even in the presence of aliased and conflicting events.
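
A minimal sketch of the streaming idea behind the method, under simplifying assumptions: the filter coefficients are updated sample by sample in closed form from the overdetermined equation d(t) + a·b ≈ 0 with a local smoothness penalty ||a - a_prev||², and a second, non-iterative pass predicts samples from the stored coefficients. The actual method uses 2-D, non-causal-in-space filters and a similarity matrix; this 1-D causal version, its filter length, and its regularization weight are illustrative assumptions.

```python
import numpy as np

def streaming_pef(trace, nf=4, eps=0.1):
    """Estimate time-varying coefficients a_t with trace[t] ≈ -a_t · trace[t-nf:t],
    updated sample by sample (no global iteration)."""
    a = np.zeros(nf)
    coeffs = np.zeros((len(trace), nf))
    for t in range(nf, len(trace)):
        b = trace[t - nf:t][::-1]               # preceding samples (causal window)
        r = trace[t] + a @ b                    # streaming prediction residual
        a = a - r * b / (eps ** 2 + b @ b)      # closed-form local update
        coeffs[t] = a
    return coeffs

def predict_trace(known, coeffs, nf=4):
    """Second, non-iterative step: predict each sample from its causal
    neighborhood using the stored nonstationary coefficients."""
    out = known.copy()
    for t in range(nf, len(out)):
        out[t] = -coeffs[t] @ known[t - nf:t][::-1]
    return out

if __name__ == "__main__":
    t = np.linspace(0, 1, 400)
    d = np.sin(2 * np.pi * (10 + 15 * t) * t)   # nonstationary sweep
    c = streaming_pef(d)
    rec = predict_trace(d, c)
    print("max prediction residual:", np.max(np.abs(rec[8:] - d[8:])))
```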


2021 ◽  
Vol 18 (4) ◽  
pp. 529-538
Author(s):  
Liyan Zhang ◽  
Ang Li ◽  
Jianguo Yang ◽  
Shichao Li ◽  
Yulai Yao ◽  
...  

To improve the imaging quality of wide-azimuth seismic data and enhance the uniformity of attributes between adjacent bins, we developed a novel interpolation method in the offset-vector tile (OVT) domain for wide-azimuth data. The orthogonal matching pursuit (OMP) interpolation method based on the Fourier transform is a frequency-domain processing technique built on discrete Fourier interpolation; it achieves anti-aliasing by extracting weight factors within the effective band from alias-free low-frequency data. For data reconstruction, the OMP-based interpolation in the OVT domain uses seismic data in five dimensions: the two spatial coordinates, time, offset, and azimuth. Compared with conventional three-dimensional interpolation, five-dimensional interpolation in the OVT domain is more accurate and achieves better results in practical applications.
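
A minimal sketch of the greedy selection step behind such an approach: orthogonal matching pursuit over a discrete Fourier dictionary reconstructs one spatial slice from irregularly observed traces, and the optional `weights` argument stands in for the low-frequency weight factor used for anti-aliasing. The dictionary size, sparsity level, and toy signal are illustrative assumptions, not the paper's five-dimensional implementation.

```python
import numpy as np

def omp_fourier(obs, obs_idx, n, k, weights=None):
    """Recover an n-sample slice from obs at positions obs_idx using at most
    k Fourier components chosen greedily."""
    F = np.exp(2j * np.pi * np.outer(np.arange(n), np.arange(n)) / n) / np.sqrt(n)
    A = F[obs_idx, :]                              # dictionary restricted to observed rows
    if weights is None:
        weights = np.ones(n)
    r, support = obs.astype(complex), []
    for _ in range(k):
        corr = weights * np.abs(A.conj().T @ r)    # (weighted) correlation with residual
        support.append(int(np.argmax(corr)))
        As = A[:, support]
        coef, *_ = np.linalg.lstsq(As, obs, rcond=None)   # re-fit on current support
        r = obs - As @ coef
    x = np.zeros(n, dtype=complex)
    x[support] = coef
    return (F @ x).real                            # back to the spatial domain

if __name__ == "__main__":
    n = 64
    m = np.arange(n)
    true = np.cos(2 * np.pi * 3 * m / n) + 0.5 * np.sin(2 * np.pi * 7 * m / n)
    keep = np.sort(np.random.default_rng(1).choice(n, size=40, replace=False))
    rec = omp_fourier(true[keep], keep, n, k=6)
    print("max error:", np.max(np.abs(rec - true)))
```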


2020 ◽  
Vol 1631 ◽  
pp. 012110
Author(s):  
Xiaoguo Xie ◽  
Shuling Pan ◽  
Bing Luo ◽  
Cailing Chen ◽  
Kai Chen

2012 ◽  
Vol 588-589 ◽  
pp. 1312-1315
Author(s):  
Yi Kun Zhang ◽  
Ming Hui Zhang ◽  
Xin Hong Hei ◽  
Deng Xin Hua ◽  
Hao Chen

Aiming at building a Lidar data interpolation model, this paper designs and implements a GA-BP interpolation method. The method uses a genetic algorithm to optimize a BP (back-propagation) neural network, which greatly improves the network's accuracy and convergence rate. Experimental results show that the proposed method achieves higher interpolation accuracy than both a plain BP neural network and linear interpolation.
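
A minimal sketch of the GA-BP idea, under stated assumptions: a genetic algorithm searches for good initial weights of a small back-propagation network, which gradient descent then refines and uses to interpolate scattered samples. The network size, GA settings, and the synthetic profile are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
H = 8                                                  # hidden units

def unpack(w):
    W1 = w[:H].reshape(1, H); b1 = w[H:2*H]
    W2 = w[2*H:3*H].reshape(H, 1); b2 = w[3*H:]
    return W1, b1, W2, b2

def forward(w, x):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(x @ W1 + b1)
    return h @ W2 + b2, h

def mse(w, x, t):
    y, _ = forward(w, x)
    return float(np.mean((y - t) ** 2))

def ga_init(x, t, pop=40, gens=60, dim=3*H + 1):
    """Genetic search over flat weight vectors: elitist selection,
    uniform crossover, Gaussian mutation; fitness = training MSE."""
    P = rng.normal(0, 1, (pop, dim))
    for _ in range(gens):
        fit = np.array([mse(w, x, t) for w in P])
        elite = P[np.argsort(fit)[:pop // 2]]          # keep the better half
        kids = []
        for _ in range(pop - len(elite)):
            a, b = elite[rng.integers(len(elite), size=2)]
            mix = rng.random(dim) < 0.5                # uniform crossover
            kids.append(np.where(mix, a, b) + rng.normal(0, 0.1, dim))  # mutation
        P = np.vstack([elite, kids])
    return P[np.argmin([mse(w, x, t) for w in P])]

def bp_refine(w, x, t, lr=0.05, epochs=500):
    """Plain back-propagation refinement of the GA-selected initial weights."""
    w = w.copy()
    for _ in range(epochs):
        W1, b1, W2, b2 = unpack(w)
        y, h = forward(w, x)
        dy = 2 * (y - t) / len(x)
        dW2 = h.T @ dy; db2 = dy.sum(0)
        dz = (dy @ W2.T) * (1 - h ** 2)
        dW1 = x.T @ dz; db1 = dz.sum(0)
        w -= lr * np.concatenate([dW1.ravel(), db1, dW2.ravel(), db2])
    return w

if __name__ == "__main__":
    # known samples of a synthetic elevation profile, with gaps to interpolate
    x_known = np.sort(rng.uniform(-1, 1, 30)).reshape(-1, 1)
    t_known = np.sin(3 * x_known) + 0.3 * x_known ** 2
    w = bp_refine(ga_init(x_known, t_known), x_known, t_known)
    x_query = np.linspace(-1, 1, 200).reshape(-1, 1)
    y_query, _ = forward(w, x_query)                   # interpolated profile
    print("training MSE after GA+BP:", mse(w, x_known, t_known))
```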

