Hyperparameters optimization of neural network using improved particle swarm optimization for modeling of electromagnetic inverse problems

Author(s):  
Debanjali Sarkar ◽  
Taimoor Khan ◽  
Fazal Ahmed Talukdar

Abstract: Optimization of the hyperparameters of an artificial neural network (ANN) usually involves a trial-and-error approach, which is not only computationally expensive but also fails to predict a near-optimal solution most of the time. To design a better-optimized ANN model, evolutionary algorithms are widely utilized to determine the hyperparameters. This work proposes hyperparameter optimization of the ANN model using an improved particle swarm optimization (IPSO) algorithm. The ANN hyperparameters considered are the number of hidden layers, the number of neurons in each hidden layer, the activation function, and the training function. The proposed technique is validated using inverse modeling of two meander-line electromagnetic bandgap (EBG) unit cells and a slotted ultra-wideband antenna loaded with EBG structures. Three other evolutionary algorithms, viz. hybrid PSO, conventional PSO, and the genetic algorithm, are also adopted for hyperparameter optimization of the ANN models for comparative analysis. The performance of all the models is evaluated using quantitative assessment parameters, viz. mean square error, mean absolute percentage deviation, and coefficient of determination (R²). The comparative investigation establishes that the ANN models tuned using IPSO predict more accurately and efficiently than those tuned with the other evolutionary algorithms.
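
The following is a minimal sketch of the general idea described in the abstract: a PSO-style search over the four named hyperparameters (hidden layers, neurons per layer, activation function, training/solver function). It is not the authors' IPSO algorithm or code; the data, search bounds, PSO constants, and the use of scikit-learn's MLPRegressor as the surrogate ANN are all illustrative assumptions.

```python
# Sketch: conventional PSO over a mixed discrete hyperparameter space (assumptions, not IPSO).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score
from sklearn.datasets import make_regression

rng = np.random.default_rng(0)
X, y = make_regression(n_samples=200, n_features=6, noise=0.1, random_state=0)  # placeholder data

ACTIVATIONS = ["logistic", "tanh", "relu"]
SOLVERS = ["lbfgs", "sgd", "adam"]          # stand-in for the "training function" choice

# Each particle is a 4-D continuous vector decoded into discrete hyperparameters:
# [hidden layers in 1..3, neurons per layer in 4..32, activation index, solver index].
LOW  = np.array([1.0, 4.0, 0.0, 0.0])
HIGH = np.array([3.0, 32.0, len(ACTIVATIONS) - 1e-9, len(SOLVERS) - 1e-9])

def decode(pos):
    return dict(hidden_layer_sizes=(int(round(pos[1])),) * int(round(pos[0])),
                activation=ACTIVATIONS[int(pos[2])],
                solver=SOLVERS[int(pos[3])])

def fitness(pos):
    # Negative cross-validated MSE of the candidate ANN: higher is better.
    model = MLPRegressor(max_iter=500, random_state=0, **decode(pos))
    return cross_val_score(model, X, y, cv=3, scoring="neg_mean_squared_error").mean()

n_particles, n_iter, dim = 8, 10, 4
w, c1, c2 = 0.7, 1.5, 1.5                   # inertia and acceleration constants (assumed)
pos = rng.uniform(LOW, HIGH, size=(n_particles, dim))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_val.argmax()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, LOW, HIGH)
    vals = np.array([fitness(p) for p in pos])
    improved = vals > pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmax()].copy()

print("Best hyperparameters:", decode(gbest), "CV score:", pbest_val.max())
```

The continuous-to-discrete decoding step is one common way to let PSO handle categorical choices such as the activation and training functions; the paper's IPSO adds further improvements not shown here.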

2013 ◽  
Vol 333-335 ◽  
pp. 1384-1387
Author(s):  
Jin Jie Yao ◽  
Xiang Ju ◽  
Li Ming Wang ◽  
Jin Xiao Pan ◽  
Yan Han

Target localization technology has been intensively studied and broadly applied in many fields. This paper presents an improved particle swarm optimization technique for training a back-propagation neural network for position estimation in target localization. The proposed scheme, called IPSO-BP, combines particle swarm optimization (PSO), a back-propagation neural network (BP), an adaptive inertia weight, and hybrid mutation. To verify the proposed IPSO-BP approach, comparisons are made with the PSO-based BP approach (PSO-BP) and a general back-propagation neural network (BP). The computational results show that the proposed IPSO-BP approach exhibits much better performance in the training process and better prediction ability in the validation process than the two baseline approaches.
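
Below is a minimal sketch of the two PSO modifications named in this abstract, an adaptive (here, linearly decreasing) inertia weight and a hybrid mutation step; the exact update rules used in the paper may differ. A toy quadratic objective stands in for the BP network's training error, and all constants are illustrative assumptions.

```python
# Sketch: PSO with adaptive inertia weight and hybrid mutation (assumed forms, not the paper's).
import numpy as np

rng = np.random.default_rng(1)

def objective(x):
    # Toy stand-in for the BP network's mean squared training error.
    return np.sum((x - 2.0) ** 2)

dim, n_particles, n_iter = 5, 20, 100
w_max, w_min = 0.9, 0.4              # inertia weight decreases linearly over iterations
c1 = c2 = 2.0
p_mut = 0.1                          # probability of mutating a particle each iteration

pos = rng.uniform(-10, 10, (n_particles, dim))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([objective(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for t in range(n_iter):
    w = w_max - (w_max - w_min) * t / n_iter       # adaptive (linearly decreasing) inertia
    r1, r2 = rng.random((2, n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel

    # Hybrid mutation: mix a uniform re-initialization with a Gaussian perturbation.
    for i in range(n_particles):
        if rng.random() < p_mut:
            if rng.random() < 0.5:
                pos[i] = rng.uniform(-10, 10, dim)           # uniform (global) mutation
            else:
                pos[i] = pos[i] + rng.normal(0, 0.5, dim)    # Gaussian (local) mutation

    vals = np.array([objective(p) for p in pos])
    better = vals < pbest_val
    pbest[better], pbest_val[better] = pos[better], vals[better]
    gbest = pbest[pbest_val.argmin()].copy()

print("Best value found:", pbest_val.min(), "at", gbest.round(3))
```

In the IPSO-BP setting, the best position found by this search would initialize or refine the BP network's weights before (or instead of) gradient-based fine-tuning.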


Author(s):  
Goran Klepac

A trained neural network can produce a wide range of output values, depending on the combination of input values it receives. Finding the optimal combination of input values that achieves a specific output of the neural network model is not a trivial task. This need arises, for example, in profiling, where the neural network characterizes a specific profile in terms of its inputs, or in recommendation systems realized with neural networks. Evolutionary algorithms such as particle swarm optimization, which is illustrated in this chapter, can solve these problems.
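
A minimal sketch of that idea follows: given an already trained network, PSO searches the input space for a combination of input values whose prediction is closest to a desired target output. The network, data, target value, and PSO constants are synthetic placeholders, not the chapter's example.

```python
# Sketch: inverting a trained network with PSO to find inputs that yield a target output.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.datasets import make_regression

X, y = make_regression(n_samples=300, n_features=4, noise=0.1, random_state=0)   # placeholder data
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=1000, random_state=0).fit(X, y)

target = 50.0                                   # desired network output (e.g., a profiling goal)
low, high = X.min(axis=0), X.max(axis=0)        # constrain the search to the observed input range

rng = np.random.default_rng(2)
n_particles, n_iter, dim = 30, 100, X.shape[1]
w, c1, c2 = 0.7, 1.5, 1.5

def fitness(x):
    # Squared gap between the network's prediction and the target output (lower is better).
    return (model.predict(x.reshape(1, -1))[0] - target) ** 2

pos = rng.uniform(low, high, (n_particles, dim))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, low, high)
    vals = np.array([fitness(p) for p in pos])
    better = vals < pbest_val
    pbest[better], pbest_val[better] = pos[better], vals[better]
    gbest = pbest[pbest_val.argmin()].copy()

print("Input combination:", gbest.round(3),
      "-> predicted output:", model.predict(gbest.reshape(1, -1))[0])
```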

