Bayesian modeling for uncertainty quantification in seismic compressive sensing

Geophysics ◽  
2019 ◽  
Vol 84 (2) ◽  
pp. P15-P25 ◽  
Author(s):  
Georgios Pilikos ◽  
A. C. Faul

Compressive sensing is used to improve the efficiency of seismic data acquisition and survey design. Nevertheless, most methods are ad hoc, and their only aim is to fill in the gaps in the data. Algorithms might be able to predict the values at missing receivers; however, it is also desirable to associate each prediction with a degree of uncertainty. We used beta process factor analysis (BPFA) and its predictive variance, achieving a high correlation between the uncertainty and the corresponding reconstruction error. Comparisons with other algorithms in the literature and results on synthetic and field data illustrate the advantages of using BPFA for uncertainty quantification. This could be useful when modeling the degree of uncertainty for different source/receiver configurations to guide future seismic survey design.
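
A minimal sketch of the core idea of pairing each reconstructed sample with a predictive variance. This is not the authors' BPFA implementation (BPFA uses a beta-Bernoulli prior with Gibbs sampling); scikit-learn's BayesianRidge is assumed here as a stand-in Bayesian model, and the signal, sampling mask, and DCT dictionary are illustrative assumptions:

```python
import numpy as np
from scipy.fft import idct
from sklearn.linear_model import BayesianRidge

# Illustrative 1D "trace" (not field data): two superposed sinusoids.
rng = np.random.default_rng(0)
n = 256
t = np.arange(n)
trace = np.sin(2 * np.pi * 0.03 * t) + 0.5 * np.sin(2 * np.pi * 0.11 * t)

# Dictionary of basis functions: columns are inverse-DCT atoms,
# in which smooth seismic-like signals are approximately sparse.
D = idct(np.eye(n), axis=0, norm='ortho')

# Random irregular sampling: keep roughly half of the receivers.
observed = rng.random(n) < 0.5

# Fit a Bayesian linear model on the observed rows of the dictionary.
model = BayesianRidge(fit_intercept=False)
model.fit(D[observed], trace[observed])

# Predict the missing samples together with a per-sample standard deviation.
pred, std = model.predict(D[~observed], return_std=True)
error = np.abs(pred - trace[~observed])

# The paper's diagnostic: predictive uncertainty should track the
# actual reconstruction error at the unobserved positions.
print("corr(|error|, predictive std):", np.corrcoef(error, std)[0, 1])
```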

2018 ◽  
Vol 58 (2) ◽  
pp. 773
Author(s):  
John Archer ◽  
Milos Delic ◽  
Frank Nicholson

Through a combination of innovative survey design, new technology, and the introduction of novel operational techniques, the trace density of a 3D seismic survey in the Cooper Basin was increased from a baseline of 140 000 to 1 600 000 traces km⁻², and the bandwidth of the data was extended from four to six octaves. The dataset was acquired in substantially the same time frame and for the same cost as the baseline survey.
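
For scale, a short back-of-the-envelope check of those two figures; the numbers come from the abstract, and the octave-to-frequency-ratio conversion is the standard one:

```python
# Trace-density increase reported in the abstract.
baseline, final = 140_000, 1_600_000            # traces per km^2
print(f"density gain: {final / baseline:.1f}x")  # ~11.4x

# An n-octave bandwidth spans a factor of 2**n between the lowest and
# highest usable frequencies, so going from 4 to 6 octaves widens
# that ratio from 16x to 64x.
for octaves in (4, 6):
    print(f"{octaves} octaves -> f_max/f_min = {2 ** octaves}x")
```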


Geophysics ◽  
2020 ◽  
Vol 85 (4) ◽  
pp. WA279-WA292
Author(s):  
Georgios Pilikos

Missing traces in seismic surveys create gaps in the data and cause problems in later stages of the seismic processing workflow through aliasing or incoherent noise. Compressive sensing (CS) is a framework that encompasses data reconstruction algorithms and acquisition processes. However, CS algorithms are mainly ad hoc, focusing on data reconstruction without uncertainty quantification or feature learning. To avoid ad hoc algorithms, a probabilistic data-driven model, the relevance vector machine (RVM), is used to reconstruct seismic data and simultaneously quantify uncertainty. Sparsity is modeled using dictionaries of basis functions, and the model remains flexible by adding or removing basis functions iteratively. Random irregular sampling with time-slice processing is used to reconstruct data without aliasing. Experiments on synthetic and field data sets illustrate the method's effectiveness, with state-of-the-art reconstruction accuracy. In addition, a hybrid approach is used that operates on a smaller domain while using dictionaries of basis functions learned from seismic data. Furthermore, the uncertainty in predictions is quantified using the predictive variance of the RVM, which is high when the reconstruction accuracy is low and vice versa. This could be used to evaluate source/receiver configurations, guiding seismic survey design.
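
A minimal sketch of that pipeline under stated assumptions: scikit-learn's ARDRegression (sparse Bayesian learning with an automatic-relevance-determination prior, closely related to the RVM) stands in for the paper's model, a fixed DCT dictionary stands in for one learned from seismic data, and the "time slice" is synthetic:

```python
import numpy as np
from scipy.fft import idct
from sklearn.linear_model import ARDRegression

# Synthetic stand-in for one time slice: a sparse superposition of DCT atoms.
rng = np.random.default_rng(1)
n = 200
D = idct(np.eye(n), axis=0, norm='ortho')   # dictionary of basis functions
coeffs = np.zeros(n)
coeffs[rng.choice(n, size=5, replace=False)] = rng.normal(0, 2, size=5)
slice_ = D @ coeffs + 0.01 * rng.normal(size=n)

# Random irregular sampling: 40% of traces observed, the rest missing.
observed = rng.random(n) < 0.4

# ARD iteratively drives the weights of irrelevant atoms to zero,
# mimicking the RVM's iterative addition/removal of basis functions.
rvm = ARDRegression(fit_intercept=False)
rvm.fit(D[observed], slice_[observed])

# Reconstruct missing traces with a per-trace predictive std.
pred, std = rvm.predict(D[~observed], return_std=True)
error = np.abs(pred - slice_[~observed])
print("mean |error|:", error.mean())
print("corr(|error|, predictive std):", np.corrcoef(error, std)[0, 1])
```

Swapping the fixed DCT dictionary for atoms learned from seismic data would correspond to the hybrid approach the abstract describes; the uncertainty diagnostic in the last line is unchanged.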


2014 ◽  
Author(s):  
Charles Mosher ◽  
Chengbo Li ◽  
Larry Morley ◽  
Yonchang Ji ◽  
Frank Janiszewski ◽  
...  

2017 ◽  
Vol 36 (8) ◽  
pp. 661-669 ◽  
Author(s):  
Charles C. Mosher ◽  
Chengbo Li ◽  
Frank D. Janiszewski ◽  
Laurence S. Williams ◽  
Tiffany C. Carey ◽  
...  

2017 ◽  
Vol 36 (8) ◽  
pp. 642-645 ◽  
Author(s):  
Richard G. Baraniuk ◽  
Philippe Steeghs
