Recurrent gradient descent adaptive learning rate and momentum neural network for rainfall forecasting

Author(s):  
Bayu Kanigoro ◽  
Afan Galih Salman ◽  
Yen Lina Prasetio

Artificial neural network (ANN) technology can be applied to rainfall prediction through a learning approach, with prediction accuracy measured by the coefficient of determination (R2) and the root mean square error (RMSE). This research implements an Elman recurrent ANN, heuristically optimized on El Niño–Southern Oscillation (ENSO) variables (wind, the Southern Oscillation Index (SOI), sea surface temperature (SST), and outgoing longwave radiation (OLR)), to forecast regional monthly rainfall in Bongan, Bali. The heuristic learning optimization extends the standard gradient descent learning algorithm into two training algorithms: gradient descent with momentum and gradient descent with an adaptive learning rate. The partitioning of the input data affects the performance of the Elman recurrent network in the estimation process: the first data group (75% training data, 25% testing data) produces a maximum R2 of 74.6%, while the second data group (50% training data, 50% testing data) produces a maximum R2 of 49.8%.
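As a rough illustration of the training heuristic and the evaluation metrics described above, the sketch below shows a single gradient-descent step with momentum plus a simple adaptive learning-rate rule, together with the R2 and RMSE measures. The function names, the exact update rule, and the increase/decrease factors are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean square error of the rainfall estimates."""
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

def r_squared(y_true, y_pred):
    """Coefficient of determination (R^2) of the rainfall estimates."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return 1.0 - ss_res / ss_tot

def momentum_adaptive_step(w, grad, velocity, lr, prev_error, error,
                           momentum=0.9, lr_inc=1.05, lr_dec=0.7):
    """One gradient-descent update with momentum, followed by a simple
    adaptive learning-rate adjustment: grow the rate while the error keeps
    falling, shrink it when the error rises (a common heuristic; the exact
    rule used in the paper may differ)."""
    velocity = momentum * velocity - lr * grad
    w = w + velocity
    lr = lr * (lr_inc if error < prev_error else lr_dec)
    return w, velocity, lr
```

In the study, updates of this kind would be applied to the weights of the Elman recurrent network trained on the ENSO predictors, with R2 and RMSE evaluated on the 25% (or 50%) held-out testing data.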


2018 ◽  
Vol 3 (2) ◽  
Author(s):  
Zahratul Fitri

Abstract— The backpropagation algorithm belongs to the family of Artificial Neural Networks (ANN) with one or more hidden layers. It is a multi-layer algorithm widely used to solve a broad range of problems; however, its learning process is quite slow. In this study the authors analyze how to extend the backpropagation algorithm with learning rate and momentum parameters so as to minimize the error and obtain an accurate epoch count when computing the weight changes. The results show that the extension performs best with momentum values of 0.9 and 1.0 and a learning rate greater than 0.7, demonstrating that training with these momentum and learning rate parameter values is very effective for accelerating the rate of convergence. Keywords— Backpropagation Algorithm, Momentum Parameter, Adaptive Learning Rate
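For concreteness, here is a minimal sketch of the kind of weight-update rule being tuned: plain backpropagation on a tiny feed-forward network where each update adds a momentum fraction of the previous update. The network size, data, and loss are hypothetical; only the momentum of 0.9 and a learning rate above 0.7 echo the values reported above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-input, 3-hidden, 1-output network on toy data; the study's
# actual architecture and dataset are not specified here.
X = rng.random((8, 2))
y = (X.sum(axis=1, keepdims=True) > 1.0).astype(float)

W1, W2 = rng.normal(size=(2, 3)), rng.normal(size=(3, 1))
dW1_prev, dW2_prev = np.zeros_like(W1), np.zeros_like(W2)

lr, momentum = 0.8, 0.9  # learning rate > 0.7 and momentum 0.9, per the study

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -60.0, 60.0)))

for epoch in range(1000):
    # Forward pass
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)

    # Backward pass (squared-error loss, sigmoid derivatives)
    delta_out = (out - y) * out * (1 - out)
    delta_h = (delta_out @ W2.T) * h * (1 - h)

    # Weight change = gradient step plus momentum times the previous change
    dW2 = -lr * h.T @ delta_out + momentum * dW2_prev
    dW1 = -lr * X.T @ delta_h + momentum * dW1_prev
    W2, W1 = W2 + dW2, W1 + dW1
    dW2_prev, dW1_prev = dW2, dW1
```

The momentum term reuses a fraction of the previous weight change, which smooths the updates and is what accelerates convergence when combined with a suitably large learning rate.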


Author(s):  
S. M. Yang ◽  
G. S. Lee

Abstract One of the major difficulties in neural network applications is the selection of the parameters in the network configuration and the coefficients in the learning rule that give fast convergence as well as the best system performance. This paper develops a network design methodology so that the optimal design parameters and coefficients can be determined systematically, avoiding lengthy trial-and-error. The methodology combines the Taguchi method of quality engineering with a back-propagation network using an adaptive learning rate, exploiting their respective advantages in implementation feasibility and performance robustness. Vibration suppression experiments on a composite smart structure with an embedded piezoelectric sensor/actuator validate that the methodology provides an efficient neural controller design, covering the plant order, the number of hidden layer neurons, the number of training patterns, and the coefficients of the adaptive learning rate.
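A minimal sketch of how a Taguchi-style orthogonal-array sweep over the network design factors might look is given below. The L4(2^3) array, the factor names, and their levels are illustrative placeholders rather than the experiment actually reported, and the training routine is a stub.

```python
import numpy as np

# Hypothetical L4(2^3) orthogonal array: each row is one experiment, each
# column the level (0 or 1) of one two-level design factor.
L4 = np.array([[0, 0, 0],
               [0, 1, 1],
               [1, 0, 1],
               [1, 1, 0]])

# Illustrative design factors and levels; the paper's actual factors also
# include the plant order and the adaptive learning-rate coefficients.
factor_levels = {
    "hidden_neurons":    [4, 8],
    "training_patterns": [100, 200],
    "lr_increase":       [1.05, 1.20],
}

def train_and_score(hidden_neurons, training_patterns, lr_increase):
    """Stand-in for training the adaptive-learning-rate back-propagation
    controller and measuring its quality characteristic (e.g. settling time
    of the vibration response); returns a dummy score so the sweep runs."""
    rng = np.random.default_rng(hidden_neurons * training_patterns)
    return rng.random()

best_score, best_config = float("inf"), None
for row in L4:
    config = {name: levels[row[i]]
              for i, (name, levels) in enumerate(factor_levels.items())}
    score = train_and_score(**config)
    if score < best_score:
        best_score, best_config = score, config

print("best configuration:", best_config, "score:", best_score)
```

The point of the orthogonal array is that only four runs are needed to compare the three two-level factors in a balanced way, instead of the eight runs of a full factorial sweep.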

