Unified Algorithm Framework for Nonconvex Stochastic Optimization in Deep Neural Networks

IEEE Access ◽  
2021 ◽  
pp. 1-1
Author(s):  
Yini Zhu ◽  
Hideaki Iiduka

2020 ◽  
Vol 31 (12) ◽  
pp. 5079-5091 ◽  
Author(s):  
Haoqian Wang ◽  
Yi Luo ◽  
Wangpeng An ◽  
Qingyun Sun ◽  
Jun Xu ◽  
...  

Author(s):  
Yasutoshi Ida ◽  
Yasuhiro Fujiwara ◽  
Sotetsu Iwamura

Adaptive learning rate algorithms such as RMSProp are widely used for training deep neural networks. RMSProp offers efficient training because it uses first-order gradients to approximate Hessian-based preconditioning. However, since these first-order gradients include noise caused by stochastic optimization, the approximation may be inaccurate. In this paper, we propose a novel adaptive learning rate algorithm called SDProp. Its key idea is to handle this noise effectively by preconditioning based on the covariance matrix. For various neural networks, our approach is more efficient and effective than RMSProp and its variant.
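The abstract contrasts RMSProp's second-moment preconditioning with SDProp's covariance-based preconditioning. A minimal sketch of the two update rules follows; since the abstract does not give SDProp's exact update, the covariance-based rule shown here (running gradient mean `m` plus a centered second moment `c`, i.e. a diagonal covariance estimate rather than a full matrix) is an illustrative assumption, not the authors' published formula.

```python
import numpy as np

def rmsprop_step(theta, grad, state, lr=0.01, decay=0.99, eps=1e-8):
    # RMSProp: precondition by a running *uncentered* second moment of
    # the gradient, a cheap diagonal Hessian-based approximation.
    state["v"] = decay * state["v"] + (1 - decay) * grad ** 2
    return theta - lr * grad / (np.sqrt(state["v"]) + eps), state

def covariance_step(theta, grad, state, lr=0.01, decay=0.99, eps=1e-8):
    # Hypothetical covariance-based variant (assumed, for illustration):
    # track a running mean of the gradient and precondition by the running
    # *variance* (the diagonal of the gradient covariance), so stochastic
    # noise is modeled separately from the mean gradient signal.
    state["m"] = decay * state["m"] + (1 - decay) * grad
    state["c"] = decay * state["c"] + (1 - decay) * (grad - state["m"]) ** 2
    return theta - lr * grad / (np.sqrt(state["c"]) + eps), state
```

On a noisy quadratic both rules shrink the step along high-variance coordinates; the centered form separates the mean gradient from its fluctuation, which is one plausible reading of "handling the noise by preconditioning based on the covariance matrix."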


2014 ◽  
Author(s):  
Erik McDermott ◽  
Georg Heigold ◽  
Pedro J. Moreno ◽  
Andrew Senior ◽  
Michiel Bacchiani

Author(s):  
Alex Hernández-García ◽  
Johannes Mehrer ◽  
Nikolaus Kriegeskorte ◽  
Peter König ◽  
Tim C. Kietzmann

2018 ◽  
Author(s):  
Chi Zhang ◽  
Xiaohan Duan ◽  
Ruyuan Zhang ◽  
Li Tong
