Truncated nuclear norm minimization for tensor completion
Author(s): Long-Ting Huang ◽ H. C. So ◽ Yuan Chen ◽ Wen-Qin Wang

Geophysics ◽ 2020 ◽ pp. 1-60
Author(s): Ouyang Shao ◽ Lingling Wang ◽ Xiangyun Hu ◽ Zhidan Long

Because many similar geological structures exist underground, seismic profiles contain an abundance of self-repeating patterns. We can therefore divide a seismic profile into groups of blocks that share similar seismic structure. The matrix formed by stacking the similar blocks in each group should be of low rank, so the seismic denoising problem can be recast as a series of low-rank matrix approximation (LRMA) problems. LRMA-based models commonly adopt the nuclear norm as a convex surrogate for the rank of a matrix. However, nuclear norm minimization (NNM) shrinks all rank components equally and can therefore bias the solution in practice. The recently introduced truncated nuclear norm (TNN), defined as the sum of the smallest singular values (those beyond the r largest), has been shown to approximate the rank of a matrix more accurately. Based on this, we propose a novel denoising method using truncated nuclear norm minimization (TNNM). The objective function of this method consists of two terms: an F-norm data-fidelity term and a truncated nuclear norm regularization term. We present an efficient two-step iterative algorithm to minimize this objective. We then apply the proposed TNNM algorithm to the groups of blocks with similar seismic structure and aggregate all the resulting denoised blocks to obtain the denoised seismic data. We update the denoised result during each iteration to gradually attenuate heavy noise. Numerical experiments demonstrate that, compared with FX-Decon, the curvelet method, and NNM-based methods, TNNM not only attenuates noise more effectively, even when the SNR is as low as -10 dB and the seismic data have complex structures, but also accurately preserves the seismic structures without inducing Gibbs artifacts.
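The abstract names a two-step iterative algorithm without spelling it out. As a rough illustration only, here is a minimal NumPy sketch of one plausible instantiation, following the standard two-step truncated-nuclear-norm scheme: fix the top-r singular subspaces from the current estimate, then solve the remaining convex subproblem in closed form by singular-value thresholding. The names tnnm_denoise and svt and the parameters r (truncation rank) and lam (regularization weight) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def svt(M, tau):
    # Singular-value soft-thresholding: shrink each singular value by tau.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def tnnm_denoise(Y, r, lam, n_iter=100, tol=1e-5):
    # Sketch of TNNM denoising of a matrix Y of stacked similar blocks:
    #   min_X 0.5*||X - Y||_F^2 + lam * (sum of singular values beyond the r largest)
    X = Y.copy()
    for _ in range(n_iter):
        # Step 1: fix A, B as the top-r left/right singular subspaces of X,
        # using TNN(X) = ||X||_* - max_{A A^T = I, B B^T = I} tr(A X B^T).
        U, _, Vt = np.linalg.svd(X, full_matrices=False)
        A, B = U[:, :r].T, Vt[:r, :]
        # Step 2: with A, B fixed, the convex subproblem
        #   min_X 0.5*||X - Y||_F^2 + lam*||X||_* - lam*tr(A X B^T)
        # is solved exactly by thresholding a shifted data matrix.
        X_new = svt(Y + lam * (A.T @ B), lam)
        if np.linalg.norm(X_new - X) <= tol * np.linalg.norm(X):
            return X_new
        X = X_new
    return X
```

In the block-matching pipeline the abstract describes, each Y would be formed by stacking the vectorized similar blocks of one group as columns; the denoised columns would then be reshaped into blocks and aggregated, averaging any overlaps, to rebuild the seismic profile.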


2020 ◽ Vol 130 ◽ pp. 4-11
Author(s): Yang Mu ◽ Ping Wang ◽ Liangfu Lu ◽ Xuyun Zhang ◽ Lianyong Qi
