adaptive learning rate
Recently Published Documents

TOTAL DOCUMENTS: 138 (five years: 45)
H-INDEX: 12 (five years: 3)

2021, Vol. 11 (20), pp. 9468
Author(s): Yunyun Sun, Yutong Liu, Haocheng Zhou, Huijuan Hu

Deep learning has shown promising results in a variety of domains, and the automatic identification of plant diseases with deep convolutional neural networks currently attracts considerable attention. This article extends the stochastic gradient descent (SGD) momentum optimizer and presents a discount momentum (DM) deep learning optimizer for plant disease identification. To examine the recognition and generalization capability of the DM optimizer, we study hyper-parameter tuning and several convolutional neural network models on the PlantVillage dataset, and further conduct comparison experiments against popular non-adaptive learning rate methods. The proposed approach achieves an average validation accuracy of no less than 97% for plant disease prediction on several state-of-the-art deep learning models, and shows low sensitivity to hyper-parameter settings. Experimental results demonstrate that the DM method yields higher identification performance while remaining competitive with other non-adaptive learning rate methods in terms of both training speed and generalization.
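The abstract does not state the DM update rule. As a rough illustration only, here is a minimal NumPy sketch of one plausible way to "discount" the accumulated velocity in SGD with momentum; the `discount` factor, its placement, and all constants are assumptions, not the authors' published method:

```python
import numpy as np

def dm_step(w, grad, velocity, lr=0.01, momentum=0.9, discount=0.99):
    """One hypothetical discount-momentum (DM) update step.

    Standard SGD momentum keeps v <- momentum * v + grad; here the
    accumulated velocity is additionally discounted at every step.
    NOTE: an illustrative guess, not the paper's exact rule.
    """
    velocity = discount * momentum * velocity + grad
    w = w - lr * velocity
    return w, velocity

# Usage on a toy quadratic loss f(w) = 0.5 * ||w||^2 (gradient = w):
w = np.array([1.0, -2.0])
v = np.zeros_like(w)
for _ in range(200):
    w, v = dm_step(w, grad=w, velocity=v)
print(w)  # moves toward the minimizer at the origin
```

Discounting the velocity keeps stale gradient directions from dominating late in training, which is one way a momentum variant could reduce sensitivity to the momentum hyper-parameter.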


Author(s): Mr. Dhanaji Vilas Mirajkar

Artificial neural networks (ANNs) rely on learning algorithms, which must be tuned to optimize convergence; optimizing convergence improves both the speed and the accuracy of the decision-making process. One of the most widely used learning algorithms is backpropagation. The objective of this study is to apply the backpropagation algorithm to a multivariate time series problem. To improve the accuracy of a neural network, it is important to find an optimized architecture for the problem under consideration, and the learning rate is another important factor affecting performance. This study proposes an extended adaptive learning approach in which, during the first half of training, the learning rate is adapted from the error trend of a number of previous iterations; during the second half, it is adapted according to a standard adaptive learning rate algorithm. The performance of three variations of the backpropagation algorithm is compared on two standard datasets. Experimental results show that, in both training and validation, the ANN with the extended adaptive learning rate outperforms the other two variations.
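The abstract does not spell out the adaptation rule. A minimal sketch, assuming the learning rate is raised while the error has been falling over a recent window and cut otherwise, with a conventional decay after the midpoint; the window size, scaling factors, decay constant, and the midpoint switch are all assumptions:

```python
def adapt_lr(lr, error_history, epoch, total_epochs,
             window=5, up=1.05, down=0.7, decay=0.01):
    """Hypothetical sketch of the 'extended adaptive' schedule.

    First half of training: follow the recent error trend, raising
    the rate while the error keeps falling and cutting it otherwise.
    Second half: hand over to a conventional time-based decay.
    All constants here are illustrative assumptions.
    """
    if epoch < total_epochs // 2:
        if len(error_history) >= window + 1:
            recent = error_history[-(window + 1):]
            if all(a > b for a, b in zip(recent, recent[1:])):
                return lr * up    # error strictly decreasing: speed up
            return lr * down      # error stalled or rising: back off
        return lr                 # not enough history yet
    return lr / (1.0 + decay * epoch)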


Author(s): Mahmoud Smaida, Serhii Yaroshchak, Ahmed Y. Ben Sasi

One of the most important hyper-parameters for model training and generalization is the learning rate. Many recent studies have shown that optimizing the learning rate schedule is very useful for training deep neural networks to obtain accurate and efficient results. In this paper, different learning rate schedules using several comprehensive optimization techniques are compared in order to measure the accuracy of a convolutional neural network (CNN) model that classifies four ophthalmic conditions. A deep CNN based on Keras and TensorFlow was deployed using Python on a database of 1692 images covering four types of ophthalmic cases: glaucoma, myopia, diabetic retinopathy, and normal eyes. The CNN model was trained on a Google Colab GPU with different learning rate schedules and adaptive learning algorithms: constant learning rate, time-based decay, step-based decay, exponential decay, and adaptive learning rate optimization techniques were all addressed. The Adam adaptive learning rate method outperformed the other optimization techniques and achieved the best model accuracy, 92.58% on the training set and 80.49% on the validation set.
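The abstract names the schedules but not their constants. A minimal Keras sketch of the textbook forms of these decay schedules, where the initial rate, decay constants, and drop interval below are assumptions:

```python
import math
import tensorflow as tf

lr0 = 0.01  # assumed initial learning rate

def time_based(epoch):
    return lr0 / (1.0 + 0.01 * epoch)            # lr_t = lr0 / (1 + k*t)

def step_based(epoch):
    return lr0 * 0.5 ** math.floor(epoch / 10)   # halve every 10 epochs

def exponential(epoch):
    return lr0 * math.exp(-0.1 * epoch)          # lr_t = lr0 * e^(-k*t)

# A schedule plugs into model.fit through a Keras callback; a constant
# rate is just the optimizer's fixed learning_rate, and Adam is the
# adaptive method the abstract reports as the best performer.
schedule_cb = tf.keras.callbacks.LearningRateScheduler(step_based)
adam = tf.keras.optimizers.Adam(learning_rate=0.001)

# model.compile(optimizer=adam, loss="categorical_crossentropy")
# model.fit(x_train, y_train, epochs=50, callbacks=[schedule_cb])
```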


Author(s): Vakada Naveen, Yaswanth Mareedu, Neeharika Sai Mandava, Sravya Kaveti, G. Krishna Kishore

2021, pp. 381-396
Author(s): Zhiyong Hao, Yixuan Jiang, Huihua Yu, Hsiao-Dong Chiang
