The relevance vector machine for seismic Bayesian compressive sensing

Geophysics ◽  
2020 ◽  
Vol 85 (4) ◽  
pp. WA279-WA292
Author(s):  
Georgios Pilikos

Missing traces in seismic surveys create gaps in the data and cause problems in later stages of the seismic processing workflow through aliasing or incoherent noise. Compressive sensing (CS) is a framework that encompasses data reconstruction algorithms and acquisition processes. However, most CS algorithms are ad hoc, focusing on data reconstruction without uncertainty quantification or feature learning. To avoid this, a probabilistic data-driven model, the relevance vector machine (RVM), is used to reconstruct seismic data and simultaneously quantify uncertainty. Sparsity is modeled using dictionaries of basis functions, and the model remains flexible by adding or removing them iteratively. Random irregular sampling with time-slice processing is used to reconstruct data without aliasing. Experiments on synthetic and field data sets illustrate the method's effectiveness, with state-of-the-art reconstruction accuracy. In addition, a hybrid approach is used in which the domain of operation is smaller while, simultaneously, dictionaries of basis functions learned from seismic data are used. Furthermore, the uncertainty in predictions is quantified using the predictive variance of the RVM, which is high when the reconstruction accuracy is low and vice versa. This could be used to evaluate source/receiver configurations and guide seismic survey design.
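
To make the idea concrete, below is a minimal sketch (not the paper's implementation) of sparse Bayesian reconstruction with uncertainty quantification. It uses scikit-learn's ARDRegression, a sparse Bayesian model closely related to the RVM, together with a DCT dictionary of basis functions and random irregular sampling of a synthetic 1D trace; the dictionary, the signal, and the sampling rate are illustrative assumptions.

```python
# Minimal sketch of sparse Bayesian reconstruction with uncertainty,
# using scikit-learn's ARDRegression as an RVM-like stand-in and a DCT
# dictionary of basis functions (both are illustrative assumptions).
import numpy as np
from scipy.fft import idct
from sklearn.linear_model import ARDRegression

rng = np.random.default_rng(0)
n = 256

# Dictionary of basis functions: columns are DCT atoms.
D = idct(np.eye(n), axis=0, norm="ortho")

# Synthetic "trace": sparse in the DCT domain plus mild noise.
w_true = np.zeros(n)
w_true[[3, 17, 40]] = [1.0, -0.6, 0.3]
signal = D @ w_true + 0.01 * rng.standard_normal(n)

# Random irregular sampling: keep roughly 30% of the samples.
mask = rng.random(n) < 0.3
A = D[mask, :]          # dictionary rows at observed positions
y = signal[mask]

# Sparse Bayesian regression: irrelevant weights are pruned automatically.
model = ARDRegression()
model.fit(A, y)

# Reconstruct the full trace and quantify per-sample uncertainty.
recon, std = model.predict(D, return_std=True)
print("relative error:", np.linalg.norm(recon - signal) / np.linalg.norm(signal))
print("mean predictive std at missing samples:", std[~mask].mean())
```

The predictive standard deviation plays the role of the RVM's predictive variance in the abstract: it is largest where the reconstruction is least constrained by observed samples.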

Geophysics ◽  
2019 ◽  
Vol 84 (2) ◽  
pp. P15-P25 ◽  
Author(s):  
Georgios Pilikos ◽  
A. C. Faul

Compressive sensing is used to improve the efficiency of seismic data acquisition and survey design. Nevertheless, most methods are ad hoc, and their only aim is to fill in the gaps in the data. Algorithms might be able to predict missing receivers' values; however, it is also desirable to associate each prediction with a degree of uncertainty. We used beta process factor analysis (BPFA) and its predictive variance and achieved a high correlation between uncertainty and the respective reconstruction error. Comparisons with other algorithms in the literature and results on synthetic and field data illustrate the advantages of using BPFA for uncertainty quantification. This could be useful when modeling the degree of uncertainty for different source/receiver configurations to guide future seismic survey design.
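
The uncertainty measure described here is the predictive variance of the factor model. A minimal sketch of how variance and error can be compared, assuming posterior samples of the reconstruction are already available (for example from a Gibbs sampler for BPFA, which is not reproduced here):

```python
# Minimal sketch: correlating predictive uncertainty with reconstruction
# error, assuming `posterior_samples` holds reconstructions drawn from a
# fitted BPFA-style model (the sampler itself is not shown).
import numpy as np

def uncertainty_vs_error(posterior_samples, truth):
    """posterior_samples: (n_draws, n_traces, n_times) array of draws."""
    mean_recon = posterior_samples.mean(axis=0)   # point estimate
    variance = posterior_samples.var(axis=0)      # predictive variance
    abs_error = np.abs(mean_recon - truth)        # per-sample error
    # Correlation between uncertainty and actual error over all samples.
    corr = np.corrcoef(variance.ravel(), abs_error.ravel())[0, 1]
    return mean_recon, variance, corr

# Hypothetical usage with random stand-in data:
rng = np.random.default_rng(1)
truth = rng.standard_normal((50, 200))
draws = truth + 0.1 * rng.standard_normal((20, 50, 200))
_, var, corr = uncertainty_vs_error(draws, truth)
print("uncertainty/error correlation:", corr)
```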


Geophysics ◽  
2021 ◽  
pp. 1-44
Author(s):  
Mengli Zhang

The time-lapse seismic method plays a critical role in reservoir monitoring and characterization. However, time-lapse data acquisitions are costly. Sparse acquisitions combined with post-acquisition data reconstruction could reduce the cost and facilitate more frequent application of time-lapse seismic monitoring. We present a sparse time-lapse seismic data reconstruction methodology based on compressive sensing. The method works with a hybrid of repeated and non-repeated sample locations. To make use of the additional information from non-repeated locations, we take the view that non-repeated samples in space are equivalent to irregular samples in calendar time. Therefore, we use these irregular samples in time, which come from non-repeated samples in space, to improve the performance of the compressive sensing reconstruction. Tests on synthetic and field datasets indicate that our method can achieve a sufficiently accurate reconstruction using as few as 10% of the receivers or traces. The method not only works with spatially irregular sampling, addressing the land-accessibility problem and reducing the number of nodal sensors, but also uses the non-repeated measurements to improve the reconstruction accuracy.
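
The abstract does not specify the solver, so the sketch below uses one common choice for irregularly sampled seismic data: projection onto convex sets (POCS) with Fourier-domain thresholding. Mixing repeated and non-repeated trace locations into a single observation mask, and the synthetic gather itself, are illustrative assumptions rather than the paper's exact workflow.

```python
# Minimal sketch of sparsity-promoting reconstruction of irregularly
# sampled data via POCS-style iterative Fourier thresholding.
import numpy as np

def pocs_reconstruct(data, mask, n_iter=100):
    """data: 2D gather with zeros at missing traces; mask: True where observed."""
    recon = data.copy()
    for i in range(n_iter):
        coeffs = np.fft.fft2(recon)
        # Shrinking threshold: keep progressively more of the spectrum.
        thresh = np.abs(coeffs).max() * (1.0 - (i + 1) / n_iter)
        coeffs[np.abs(coeffs) < thresh] = 0.0
        recon = np.real(np.fft.ifft2(coeffs))
        recon[mask] = data[mask]          # reinsert observed samples
    return recon

# Hypothetical usage: repeated (baseline) and non-repeated (monitor)
# receivers contribute different observed traces of the same gather.
rng = np.random.default_rng(2)
t = np.linspace(0, 1, 128)
full = np.sin(2 * np.pi * (10 * t[None, :] + 0.3 * np.arange(64)[:, None] / 64))
repeated = rng.random(64) < 0.05          # ~5% repeated locations
non_repeated = rng.random(64) < 0.05      # ~5% additional, non-repeated
mask = np.zeros_like(full, dtype=bool)
mask[repeated | non_repeated, :] = True
observed = np.where(mask, full, 0.0)
recon = pocs_reconstruct(observed, mask)
print("relative error:", np.linalg.norm(recon - full) / np.linalg.norm(full))
```

The point illustrated is only that non-repeated locations enlarge the observed set fed to the reconstruction, which is what the paper exploits.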


2014 ◽  
Vol 599-601 ◽  
pp. 1411-1415
Author(s):  
Yan Hai Wu ◽  
Meng Xin Ma ◽  
Nan Wu ◽  
Jing Wang

Traditional reconstruction methods for Compressive Sensing (CS) mostly depend on an L1-norm linear regression model. Here we propose Bayesian Compressive Sensing (BCS) to reconstruct the signal. It provides a posterior distribution over the parameters rather than a point estimate, so the uncertainty of the estimate can be used to adapt the data reconstruction process. In this paper, we employ a hierarchical form of the Laplace prior and, to improve reconstruction efficiency, we segment the image into blocks, apply different sample rates to different kinds of blocks, and use the relevance vector machine (RVM) to obtain a sparse representation of the signal during reconstruction. Finally, we present experimental results on images and compare them with state-of-the-art CS algorithms, demonstrating the superior performance of the proposed approach.
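
The block-wise, adaptive-rate sampling idea can be sketched as follows. The block classifier (simple variance thresholding), the rates, and the Gaussian measurement matrices are assumptions for illustration; the RVM/BCS recovery of each block from its measurements is omitted.

```python
# Minimal sketch of block-wise compressive sampling with adaptive rates:
# smooth blocks receive fewer measurements than detailed blocks.
import numpy as np

def sample_blocks(image, block=8, low_rate=0.2, high_rate=0.6, rng=None):
    if rng is None:
        rng = np.random.default_rng(0)
    h, w = image.shape
    measurements = []
    for r in range(0, h - h % block, block):
        for c in range(0, w - w % block, block):
            patch = image[r:r + block, c:c + block].ravel()
            # Classify the block by its variance (detailed vs. smooth).
            rate = high_rate if patch.var() > 0.01 else low_rate
            m = max(1, int(rate * patch.size))
            Phi = rng.standard_normal((m, patch.size)) / np.sqrt(m)
            measurements.append((r, c, Phi, Phi @ patch))
    return measurements

# Hypothetical usage on a synthetic image with a smooth and a textured half.
rng = np.random.default_rng(3)
img = np.zeros((64, 64))
img[:, 32:] = rng.standard_normal((64, 32))      # "detailed" right half
meas = sample_blocks(img, rng=rng)
total = sum(item[3].size for item in meas)
print("overall sample rate:", total / img.size)
```

Each tuple (row, column, Phi, y) would then be passed to the Bayesian solver, which reconstructs the block and reports its posterior uncertainty.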


2016 ◽  
Vol 130 ◽  
pp. 194-208 ◽  
Author(s):  
Shuwei Gan ◽  
Shoudong Wang ◽  
Yangkang Chen ◽  
Xiaohong Chen ◽  
Weiling Huang ◽  
...  

Author(s):  
Feng Qian ◽  
Cangcang Zhang ◽  
Lingtian Feng ◽  
Cai Lu ◽  
Gulan Zhang ◽  
...  

Geophysics ◽  
2006 ◽  
Vol 71 (5) ◽  
pp. U67-U76 ◽  
Author(s):  
Robert J. Ferguson

The possibility of improving regularization/datuming of seismic data is investigated by treating wavefield extrapolation as an inversion problem. Weighted, damped least squares is then used to produce the regularized/datumed wavefield. Regularization/datuming is extremely costly because the Hessian must be computed, so an efficient approximation is introduced that computes only a limited number of diagonals of the operators involved. Real and synthetic data examples demonstrate the utility of this approach. For synthetic data, regularization/datuming is demonstrated for large extrapolation distances using a highly irregular recording array. Without approximation, regularization/datuming returns a regularized wavefield with reduced operator artifacts when compared to a nonregularizing method such as generalized phase shift plus interpolation (PSPI). Approximate regularization/datuming returns a regularized wavefield at approximately two orders of magnitude lower cost, but it is dip limited, though in a controllable way, compared to the full method. The Foothills structural data set, a freely available data set from the Rocky Mountains of Canada, demonstrates application to real data. The data have highly irregular sampling along the shot coordinate, and they suffer from significant near-surface effects. Approximate regularization/datuming returns common receiver data that are superior in appearance compared to conventional datuming.
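
The contrast between the full damped least-squares solution and the diagonal approximation of the Hessian can be shown compactly. In the sketch below the extrapolation operator A is a random stand-in rather than a phase-shift operator, and keeping only the main diagonal is the simplest case of the paper's limited-diagonals idea.

```python
# Minimal sketch of damped least-squares regularization/datuming and a
# diagonal-Hessian approximation (A is a random stand-in operator).
import numpy as np

rng = np.random.default_rng(4)
n_out, n_in = 80, 120                      # irregular input traces -> regular output
A = rng.standard_normal((n_in, n_out))     # forward extrapolation/sampling operator
x_true = rng.standard_normal(n_out)
y = A @ x_true + 0.05 * rng.standard_normal(n_in)
eps = 1e-2                                 # damping

# Full Hessian solution: x = (A^H A + eps I)^{-1} A^H y
H = A.conj().T @ A
x_full = np.linalg.solve(H + eps * np.eye(n_out), A.conj().T @ y)

# Approximation: keep only the main diagonal of the Hessian, so the
# "inverse" becomes a cheap element-wise division.
h_diag = np.einsum("ij,ij->j", A.conj(), A).real
x_approx = (A.conj().T @ y) / (h_diag + eps)

print("full-solve error:  ", np.linalg.norm(x_full - x_true) / np.linalg.norm(x_true))
print("diag-approx error: ", np.linalg.norm(x_approx - x_true) / np.linalg.norm(x_true))
```

As in the paper, the approximation trades accuracy (here, a larger residual error; there, dip limitation) for a dramatic reduction in cost, since no dense Hessian is formed or inverted.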

