Multichannel adaptive deconvolution based on streaming prediction-error filter

2021, Vol. 18 (6), pp. 825-833
Author(s): Qinghan Wang, Yang Liu, Cai Liu, Zhisheng Zheng

Abstract: Deconvolution improves the resolution of seismic data by compressing seismic wavelets, which is of great significance in high-resolution processing of seismic data. Prediction-error filtering/least-squares inverse filtering is widely used in seismic deconvolution and usually assumes that the seismic data are stationary. However, owing to factors such as earth filtering, actual seismic wavelets are time- and space-varying. Adaptive prediction-error filters can effectively characterise the nonstationarity of seismic data by iterative methods, but iteration leads to slow computation and high memory cost when dealing with large-scale data. We propose an adaptive deconvolution method based on a streaming prediction-error filter. Instead of slow iterations, mathematically underdetermined problems with new local-smoothness constraints are solved analytically to predict time-varying seismic wavelets. To avoid discontinuity of the deconvolution results along the space axis, both time and space constraints are used to implement multichannel adaptive deconvolution. Meanwhile, we define a time-varying prediction-step parameter that preserves the relative amplitude relationship among different reflections. The new deconvolution improves resolution along the time direction while reducing computational cost through streaming computation, making it suitable for nonstationary large-scale data. Synthetic-model and field-data tests show that the proposed method effectively improves the resolution of nonstationary seismic data while maintaining the lateral continuity of seismic events. Furthermore, the relative amplitude relationship of different reflections is reasonably preserved.
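The core idea in the abstract — replacing iteration with an analytic, per-sample solution of an underdetermined problem under a local-smoothness constraint — can be sketched for a single channel as below. This is a minimal illustration in the spirit of streaming prediction-error filtering; the function name, the filter length `nfilt`, and the smoothness weight `eps` are assumptions for the sketch, not the authors' multichannel implementation (which also carries a space constraint and a time-varying prediction step).

```python
import numpy as np

def streaming_pef_decon(trace, nfilt=5, eps=1.0):
    """Single-channel streaming prediction-error filter deconvolution (sketch).

    At each sample t the underdetermined problem
        min_a |d_t + a . d_past|^2 + eps^2 * |a - a_prev|^2
    is solved analytically (no iterations), giving a time-varying filter
    whose prediction error is taken as the deconvolved output.
    """
    a = np.zeros(nfilt)                 # filter coefficients, updated each sample
    out = np.zeros(len(trace))
    for t in range(nfilt, len(trace)):
        d = trace[t - nfilt:t][::-1]          # past samples, most recent first
        r = trace[t] + a @ d                  # prediction error with old filter
        a = a - (r / (eps**2 + d @ d)) * d    # closed-form one-step update
        out[t] = trace[t] + a @ d             # error with the updated filter
    return out
```

Because each sample requires only one dot product and one vector update, the cost is linear in the data size with constant memory, which is the property that makes the streaming formulation attractive for large-scale data.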

2009, Vol. 28 (11), pp. 2737-2740
Author(s): Xiao ZHANG, Shan WANG, Na LIAN

2016
Author(s): John W. Williams, Simon Goring, Eric Grimm, Jason McLachlan

2008, Vol. 9 (10), pp. 1373-1381
Author(s): Ding-yin Xia, Fei Wu, Xu-qing Zhang, Yue-ting Zhuang

2021, Vol. 77 (2), pp. 98-108
Author(s): R. M. Churchill, C. S. Chang, J. Choi, J. Wong, S. Klasky, ...

Author(s): Krzysztof Jurczuk, Marcin Czajkowski, Marek Kretowski

Abstract: This paper concerns the evolutionary induction of decision trees (DTs) for large-scale data. Such a global approach is one of the alternatives to top-down inducers: it searches for the tree structure and the split tests simultaneously, which in many situations improves both the prediction quality and the size of the resulting classifiers. However, being population-based and iterative, it can be too computationally demanding to apply directly to big data mining. The paper demonstrates that this barrier can be overcome by smart distributed/parallel processing. Moreover, we ask whether the global approach can truly compete with greedy systems on large-scale data. For this purpose, we propose a novel multi-GPU approach that combines knowledge of global DT induction and evolutionary algorithm parallelization with efficient use of GPU memory and computing resources. The search for the tree structure and tests is performed on a CPU, while the fitness calculations are delegated to GPUs. A data-parallel decomposition strategy and the CUDA framework are applied. Experimental validation on both artificial and real-life datasets yields very satisfactory acceleration: the solution can process even billions of instances in a few hours on a single workstation equipped with 4 GPUs. The impact of data characteristics (size and dimension) on the convergence and speedup of the evolutionary search is also shown. As the number of GPUs grows, nearly linear scalability is observed, which suggests that data-size boundaries for evolutionary DT mining are fading.
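The division of labour described above — evolution of tree structures on the CPU, data-parallel fitness evaluation on the GPUs — can be sketched in plain Python. The chunked evaluation below stands in for the CUDA kernels (one chunk per hypothetical device); all names here (`tree_predict`, `chunk_errors`, `parallel_fitness`, `n_devices`) are illustrative assumptions for the sketch, not the authors' code.

```python
import numpy as np

def tree_predict(tree, x):
    """Evaluate a binary decision tree stored as nested tuples:
    (feature_index, threshold, left, right), with int leaves as class labels."""
    while isinstance(tree, tuple):
        feat, thr, left, right = tree
        tree = left if x[feat] <= thr else right
    return tree

def chunk_errors(tree, X, y):
    """Misclassification count on one data chunk (stand-in for a GPU kernel)."""
    preds = np.array([tree_predict(tree, x) for x in X])
    return int(np.sum(preds != y))

def parallel_fitness(tree, X, y, n_devices=4):
    """Data-parallel fitness: split rows across n_devices chunks,
    evaluate each chunk independently, then reduce to a single accuracy."""
    chunks = np.array_split(np.arange(len(X)), n_devices)
    total_errors = sum(chunk_errors(tree, X[i], y[i]) for i in chunks)
    return 1.0 - total_errors / len(X)
```

Because each instance contributes to the fitness independently, the per-chunk error counts can be computed concurrently and summed in a cheap reduction, which is why this decomposition scales nearly linearly with the number of devices when the dataset is large.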


Author(s): Xingyi Wang, Yu Li, Yiquan Chen, Shiwen Wang, Yin Du, ...
