A distributed optimisation framework combining natural gradient with Hessian-free for discriminative sequence training

2021
Author(s): Adnan Haider, Chao Zhang, Florian L. Kreyssig, Philip C. Woodland

2021, Vol 164, pp. 112056
Author(s): Melanie Bon, Jacques Grall, Joao B. Gusmao, Maritza Fajardo, Chris Harrod, ...

1986, Vol 22 (13), pp. 2017-2029
Author(s): D. M. Mackay, D. L. Freyberg, P. V. Roberts, J. A. Cherry

2021, pp. 1-12
Author(s): Junqing Ji, Xiaojia Kong, Yajing Zhang, Tongle Xu, Jing Zhang

The traditional blind source separation (BSS) algorithm is mainly used for signal separation under a noiseless model, and it does not apply to data with a low signal-to-noise ratio (SNR). To solve this problem, an adaptive variable step size natural gradient BSS algorithm based on an improved wavelet threshold is proposed in this paper. Firstly, an improved wavelet threshold method is used to reduce the noise in the signal. Secondly, the wavelet coefficient layers with obvious periodicity are denoised using a morphological component analysis (MCA) algorithm, and the processed wavelet coefficients are recombined to obtain the ideal model. Thirdly, the recombined signal is pre-whitened, and a new separation-matrix update formula for the natural gradient algorithm is constructed by defining a new separation-degree estimation function. Finally, the adaptive variable step size natural gradient BSS algorithm is used to separate the denoised signal. The results show that the algorithm not only adaptively adjusts the step size according to different signals, but also improves convergence speed, stability, and separation accuracy.
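
The abstract describes the pipeline only at a high level, so the following is a minimal Python sketch of its final stages: pre-whitening followed by an adaptive variable step size natural-gradient update. The soft-threshold denoiser, the separation-degree estimate, and the step-size rule below are illustrative assumptions standing in for the paper's improved wavelet threshold, MCA stage, and separation-degree estimation function, whose exact formulas are not given in the abstract.

```python
import numpy as np

def soft_threshold(coeffs, thr):
    """Soft thresholding, used here as a crude stand-in for the paper's
    improved wavelet-threshold denoising stage."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - thr, 0.0)

def prewhiten(X):
    """Centre and whiten the mixtures X (shape: n_channels x n_samples)."""
    X = X - X.mean(axis=1, keepdims=True)
    cov = X @ X.T / X.shape[1]
    d, E = np.linalg.eigh(cov)
    V = E @ np.diag(1.0 / np.sqrt(d)) @ E.T    # whitening matrix
    return V @ X

def natural_gradient_bss(X, mu0=0.01, n_iter=500):
    """Natural-gradient BSS on pre-whitened mixtures with an adaptive step size.

    The step-size rule (scaling mu0 by a rough 'separation degree' taken from
    the off-diagonal energy of E[f(y) y^T]) is an illustrative assumption,
    not the formula defined in the paper.
    """
    n, T = X.shape
    W = np.eye(n)
    for _ in range(n_iter):
        Y = W @ X
        fY = np.tanh(Y)                        # score nonlinearity
        C = fY @ Y.T / T
        off = C - np.diag(np.diag(C))
        sep = np.linalg.norm(off)              # crude separation-degree estimate
        mu = mu0 * sep / (1.0 + sep)           # larger steps while poorly separated
        W = W + mu * (np.eye(n) - C) @ W       # natural-gradient update
    return W @ X, W

# Usage sketch: two super-Gaussian sources, mixed and lightly contaminated.
rng = np.random.default_rng(0)
S = rng.laplace(size=(2, 4000))                # hypothetical test sources
A = np.array([[0.6, 0.4], [0.45, 0.55]])       # hypothetical mixing matrix
X = A @ S + 0.05 * rng.standard_normal(S.shape)
X = soft_threshold(X, 0.02)                    # denoising stand-in
Y_sep, W = natural_gradient_bss(prewhiten(X))
```

The core update W ← W + μ(I − f(y)yᵀ)W is the standard natural-gradient BSS rule; the paper's contribution lies in how the step size μ and the separation-degree function are defined, which this sketch only approximates.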

