Exploiting potential of deep neural networks by layer-wise fine-grained parallelism
2020 ◽ Vol 102 ◽ pp. 210-221
Author(s): Wenbin Jiang, Yangsong Zhang, Pai Liu, Jing Peng, Laurence T. Yang, ...

Author(s): Xiuwen Yi, Zhewen Duan, Ruiyuan Li, Junbo Zhang, Tianrui Li, ...

2021 ◽ Vol 12 (1) ◽ pp. 268
Author(s): Jiali Deng, Haigang Gong, Minghui Liu, Tianshu Xie, Xuan Cheng, ...

The learning rate has been shown to be one of the most critical hyper-parameters for the overall performance of deep neural networks. In this paper, we propose a new method for setting the global learning rate, named Random Amplify Learning Rates (RALR), to improve the performance of any optimizer when training deep neural networks. Instead of monotonically decreasing the learning rate, RALR amplifies it between reasonable boundary values with a given probability, helping training escape saddle points and local minima. Training with RALR rather than a conventionally decreasing schedule further improves network performance without extra computational cost. Notably, RALR is complementary to state-of-the-art data augmentation and regularization methods. We empirically study its performance on image classification, fine-grained classification, object detection, and machine translation tasks. Experiments demonstrate that RALR brings notable improvements while preventing overfitting when training deep neural networks. For example, ResNet-110 trained on CIFAR-100 with RALR achieves a 1.34% classification-accuracy gain over ResNet-110 trained conventionally.
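
The abstract specifies RALR only at a high level: keep a conventional decaying schedule as the default, but with a given probability multiply the learning rate by a random factor drawn between boundary values. Below is a minimal PyTorch sketch of that idea; the class name RandomAmplifyLR and the values of p, the amplification bounds, and the decay factor are illustrative assumptions, not taken from the paper.

```python
import random

import torch
from torch import nn

class RandomAmplifyLR:
    """Hypothetical RALR-style scheduler (a sketch, not the paper's code).

    Default behaviour is a conventional multiplicative decay; with
    probability `p`, the learning rate for the next epoch is instead
    amplified by a random factor drawn from [low, high].
    """

    def __init__(self, optimizer, base_lr, p=0.1, low=2.0, high=10.0, decay=0.99):
        self.optimizer = optimizer
        self.base_lr = base_lr            # underlying, decaying learning rate
        self.p = p                        # amplification probability (assumed value)
        self.low, self.high = low, high   # boundary values for the factor (assumed)
        self.decay = decay                # conventional per-epoch decay (assumed)

    def step(self):
        # Conventional monotone decay is the default behaviour.
        self.base_lr *= self.decay
        lr = self.base_lr
        # With probability p, amplify the rate to help training escape
        # saddle points or sharp local minima.
        if random.random() < self.p:
            lr *= random.uniform(self.low, self.high)
        for group in self.optimizer.param_groups:
            group["lr"] = lr

# Usage: call step() once per epoch alongside any standard optimizer.
model = nn.Linear(10, 2)
opt = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
sched = RandomAmplifyLR(opt, base_lr=0.1)
for epoch in range(5):
    # ... the usual forward/backward/opt.step() loop would run here ...
    sched.step()
    print(f"epoch {epoch}: lr = {opt.param_groups[0]['lr']:.4f}")
```

Whether amplification fires per step or per epoch, and how the boundary values are chosen, are design choices the paper's experiments would pin down; the sketch applies it per epoch for simplicity.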


IEEE Access ◽ 2019 ◽ Vol 7 ◽ pp. 122740-122757
Author(s): Yang-Yang Zheng, Jian-Lei Kong, Xue-Bo Jin, Xiao-Yi Wang, Ting-Li Su, ...

Author(s): Erzhuo Shao, Huandong Wang, Jie Feng, Tong Xia, Hedong Yang, ...

Author(s): Alex Hernández-García, Johannes Mehrer, Nikolaus Kriegeskorte, Peter König, Tim C. Kietzmann

2018
Author(s): Chi Zhang, Xiaohan Duan, Ruyuan Zhang, Li Tong
