Hybrid Particle Swarm Optimization With Genetic Algorithm to Train Artificial Neural Networks for Short-Term Load Forecasting

2022, pp. 227–241
2019, Vol. 10 (1), pp. 1–14
Author(s): Kuruge Darshana Abeyrathna, Chawalit Jeenanunta

This research proposes a new training algorithm for artificial neural networks (ANNs) to improve short-term load forecasting (STLF) performance. The proposed algorithm addresses a well-known training issue in ANNs, entrapment in local minima, by applying genetic algorithm (GA) operations within particle swarm optimization (PSO) whenever the swarm converges to a local minimum. The training ability of the hybridized algorithm is evaluated using load data gathered by the Electricity Generating Authority of Thailand. The ANN is trained with the new algorithm on one year of data to forecast the 48 equal periods of each day in 2013. During the testing phase, the mean absolute percentage error (MAPE) is used to evaluate the performance of the hybridized training algorithm and to compare it with the MAPEs obtained with backpropagation, GA, and PSO. The yearly average MAPE and the average MAPEs for weekdays, Mondays, weekends, holidays, and bridging holidays show that the PSO+GA algorithm outperforms the other training algorithms for STLF.
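
The abstract describes the hybrid only at a high level, so the sketch below is a minimal, assumption-laden illustration of the general idea rather than the authors' implementation: PSO searches the weight space of a small single-hidden-layer network, GA-style uniform crossover and Gaussian mutation are applied to the swarm whenever the global best stalls (used here as a stand-in for "converges to a local minimum"), and MAPE serves as both the fitness function and the evaluation metric. The network size, stall criterion, and all hyperparameters (`n_hid`, `stall_limit`, `p_mut`, and so on) are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)


def forward(weights, X, n_in, n_hid):
    """Predict with a single-hidden-layer ANN whose weights are one flat vector."""
    i = 0
    W1 = weights[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = weights[i:i + n_hid]; i += n_hid
    W2 = weights[i:i + n_hid].reshape(n_hid, 1); i += n_hid
    b2 = weights[i]
    h = np.tanh(X @ W1 + b1)
    return (h @ W2 + b2).ravel()


def mape(y_true, y_pred):
    """Mean absolute percentage error (the paper's evaluation metric), in %."""
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))


def pso_ga_train(X, y, n_hid=8, n_particles=30, iters=300, stall_limit=20,
                 w=0.7, c1=1.5, c2=1.5, p_mut=0.1, mut_scale=0.5):
    """Train ANN weights with PSO; when the global best stalls (taken here as a
    sign of convergence to a local minimum), apply GA crossover and mutation to
    regenerate the swarm. MAPE is used as the fitness function."""
    n_in = X.shape[1]
    dim = n_in * n_hid + n_hid + n_hid + 1          # sizes of W1, b1, W2, b2
    pos = rng.normal(scale=0.5, size=(n_particles, dim))
    vel = np.zeros_like(pos)
    fit = np.array([mape(y, forward(p, X, n_in, n_hid)) for p in pos])
    pbest, pbest_fit = pos.copy(), fit.copy()
    g = pbest[np.argmin(pbest_fit)].copy()
    g_fit = pbest_fit.min()
    stall = 0
    for _ in range(iters):
        # Standard PSO velocity and position update.
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
        pos = pos + vel
        fit = np.array([mape(y, forward(p, X, n_in, n_hid)) for p in pos])
        better = fit < pbest_fit
        pbest[better], pbest_fit[better] = pos[better], fit[better]
        if pbest_fit.min() < g_fit:
            g, g_fit, stall = pbest[np.argmin(pbest_fit)].copy(), pbest_fit.min(), 0
        else:
            stall += 1
        if stall >= stall_limit:
            # GA step: uniform crossover of the better personal bests plus
            # Gaussian mutation, to kick the swarm out of the local minimum.
            parents = pbest[np.argsort(pbest_fit)[:n_particles // 2]]
            children = []
            for _ in range(n_particles):
                a, b = parents[rng.integers(len(parents), size=2)]
                child = np.where(rng.random(dim) < 0.5, a, b)
                mask = rng.random(dim) < p_mut
                children.append(child + mask * rng.normal(scale=mut_scale, size=dim))
            pos, vel, stall = np.array(children), np.zeros((n_particles, dim)), 0
    return g, g_fit


# Toy usage on synthetic data (illustrative only; not the EGAT load data).
X = rng.random((200, 4))
y = 2.0 + X @ np.array([1.0, -0.5, 0.3, 0.8])
weights, train_mape = pso_ga_train(X, y)
print(f"training MAPE: {train_mape:.2f}%")
```

With real STLF data one would replace the toy `X` and `y` with lagged load and calendar features and produce forecasts for each of the day's 48 periods; those modelling details are beyond what the abstract specifies.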


