Novel self-adjusted particle swarm optimization algorithm for feature selection

Computing, 2021
Author(s): Bo Wei, Xuan Wang, Xuewen Xia, Mingfeng Jiang, Zuohua Ding, ...

2021, Vol 2021, pp. 1-11
Author(s): Bingsheng Chen, Huijie Chen, Mengshan Li

Feature selection can filter out irrelevant features and improve the accuracy of data classification in pattern classification. At present, back propagation (BP) neural networks and the particle swarm optimization algorithm combine well with feature selection. On this basis, this paper adds interference factors to the BP neural network and the particle swarm optimization algorithm to improve the accuracy and practicability of feature selection. This paper summarizes the basic methods and requirements of feature selection and combines the benefits of global optimization with the feedback mechanism of BP neural networks to select features based on back propagation and particle swarm optimization (BP-PSO). Firstly, a chaotic model is introduced to increase the diversity of particles in the initialization of particle swarm optimization, and an adaptive factor is introduced to enhance the global search ability of the algorithm. Then, the number of features is optimized so that it is reduced while the accuracy of feature selection is preserved. Finally, different data sets are introduced to test the accuracy of feature selection, and wrapper-mode and filter-mode evaluation mechanisms are used to verify the practicability of the model. The results show that the average accuracy of BP-PSO is 8.65% higher than that of the second-best model, NDFs, across the data sets, and the performance of BP-PSO is 2.31% to 18.62% higher than the benchmark methods on all data sets. This shows that BP-PSO can select more discriminative feature subsets, which verifies the accuracy and practicability of the model.
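
The abstract above names two modifications to standard PSO for feature selection: chaotic initialization to diversify the initial swarm, and an adaptive factor to strengthen global search. The paper itself gives no code; the Python sketch below illustrates one common reading of those ideas, assuming a logistic map for the chaotic initialization, a linearly decreasing inertia weight in the role of the adaptive factor, and a threshold of 0.5 on each position component to mark a feature as selected. All names, parameter values, and the toy fitness are illustrative assumptions, not the authors' implementation.

import numpy as np

def chaotic_init(n_particles, n_features, warmup=100):
    # Initial positions in (0, 1) iterated through the logistic map x <- 4x(1 - x),
    # which spreads the swarm more broadly than a single uniform draw.
    rng = np.random.default_rng(0)
    x = rng.uniform(0.01, 0.99, size=(n_particles, n_features))
    for _ in range(warmup):
        x = 4.0 * x * (1.0 - x)
    return x

def pso_feature_selection(fitness, n_features, n_particles=30, iters=100,
                          c1=2.0, c2=2.0, w_max=0.9, w_min=0.4):
    # PSO wrapper: fitness(mask) returns a score to maximise for a boolean feature mask.
    rng = np.random.default_rng(1)
    pos = chaotic_init(n_particles, n_features)
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([fitness(p > 0.5) for p in pos])
    gbest = pbest[np.argmax(pbest_val)].copy()
    for t in range(iters):
        w = w_max - (w_max - w_min) * t / iters        # adaptive inertia weight
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0.0, 1.0)
        vals = np.array([fitness(p > 0.5) for p in pos])
        improved = vals > pbest_val
        pbest[improved] = pos[improved]
        pbest_val[improved] = vals[improved]
        gbest = pbest[np.argmax(pbest_val)].copy()
    return gbest > 0.5                                  # final feature mask

# Toy usage: reward masks that match the first 5 of 20 features, penalise subset size.
target = np.zeros(20, dtype=bool)
target[:5] = True
mask = pso_feature_selection(lambda m: (m == target).mean() - 0.01 * m.sum(), 20)

In a wrapper setting as described in the abstract, fitness would typically be the cross-validated accuracy of a classifier (for example, a BP neural network) minus a penalty on the number of selected features.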


Particle Swarm Optimization is a nature-inspired stochastic evolutionary algorithm that iteratively improves candidate solutions with respect to a particular objective function. The problem becomes challenging if the objective function is not properly identified or evaluated, which results in slow convergence and an inability to find the optimal solution. Hence, we propose a novel rough-set-based particle swarm optimization algorithm using the golden ratio principle for an efficient feature selection process that focuses on two objectives: first, to obtain a reduced subset of features without compromising the originality of the data, and second, to yield a highly optimal result. Since many subsets of features may yield a meaningful solution, we use the golden ratio principle to extract the most reduced subset with a highly optimal solution. The algorithm has been tested on several benchmark datasets. The results show that the proposed algorithm identifies a small set of features without compromising the optimal solution, thus achieving the stated objectives.
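
This abstract names two ingredients without giving their formulas: a rough-set measure of subset quality and a golden-ratio rule for choosing among candidate subsets. The sketch below shows one plausible reading only, assuming the classical rough-set dependency degree gamma(R, D) = |POS_R(D)| / |U| as the quality measure and a phi-to-1 weighting of quality against reduction when ranking candidates; the function names, the weighting scheme, and the toy decision table are assumptions, not the authors' formulation.

import numpy as np

PHI = (1 + 5 ** 0.5) / 2   # golden ratio

def dependency_degree(X, y, subset):
    # |POS_R(D)| / |U|: fraction of objects whose equivalence class under the
    # attributes in `subset` is consistent with a single decision value.
    if len(subset) == 0:
        return 0.0
    keys = [tuple(row) for row in X[:, subset]]
    decisions = {}
    for k, d in zip(keys, y):
        decisions.setdefault(k, set()).add(d)
    pos = sum(1 for k in keys if len(decisions[k]) == 1)
    return pos / len(y)

def golden_ratio_pick(candidates, X, y):
    # Among candidate subsets, prefer high dependency and strong reduction,
    # weighted PHI : 1 in favour of dependency (assumed weighting).
    n = X.shape[1]
    def score(subset):
        return PHI * dependency_degree(X, y, subset) + (1 - len(subset) / n)
    return max(candidates, key=score)

# Toy usage on a small discrete decision table: attribute 0 alone already
# determines the decision, so the most reduced consistent subset wins.
X = np.array([[0, 1, 0], [1, 1, 0], [0, 0, 1], [1, 0, 1]])
y = np.array([0, 1, 0, 1])
best = golden_ratio_pick([[0], [1], [2], [0, 2]], X, y)   # -> [0]

In the proposed algorithm such a score would act as the PSO objective or as a tie-breaker among equally fit particles; the abstract does not specify which, so the ranking shown here is only an illustration of the trade-off it describes.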

