Binary PSO
Recently Published Documents

TOTAL DOCUMENTS: 118 (FIVE YEARS: 35)
H-INDEX: 16 (FIVE YEARS: 2)

Informatica ◽ 2022 ◽ pp. 1-21
Author(s): Emrah Hancer, Marina Bardamova, Ilya Hodashinsky, Konstantin Sarin, Artem Slezkin, ...

2021
Author(s): Hoai Nguyen

Classification aims to identify the class label of an instance according to the information from its characteristics or features. Unfortunately, many classification problems have a large feature set containing irrelevant and redundant features, which reduce the classification performance. To address this problem, feature selection is used to select a small subset of relevant features. There are three main types of feature selection methods: wrapper, embedded, and filter approaches. Wrappers use a classification algorithm to evaluate candidate feature subsets. In embedded approaches, the selection process is embedded in the training process of a classification algorithm. Different from the other two approaches, filters do not involve any classification algorithm during the selection process. Feature selection is an important process, but it is not an easy task due to its large search space and complex feature interactions. Because of its potential global search ability, Evolutionary Computation (EC), especially Particle Swarm Optimization (PSO), has been widely and successfully applied to feature selection. However, there is potential to improve the effectiveness and efficiency of EC-based feature selection.

The overall goal of this thesis is to investigate and improve the capability of EC for feature selection: to select small feature subsets while maintaining or even improving the classification performance compared to using all features. Different aspects of feature selection are considered in this thesis, such as the number of objectives (single-objective/multi-objective), the fitness function (filter/wrapper), and the search mechanism.

This thesis introduces a new fitness function based on mutual information that is calculated by an estimation approach instead of the traditional counting approach. Results show that the estimation approach works well on both continuous and discrete data. More importantly, mutual information calculated by the estimation approach can capture feature interactions better than the traditional counting approach.

This thesis develops a novel binary PSO algorithm, which is the first work to redefine core concepts of PSO such as velocity and momentum to suit the characteristics of binary search spaces. Experimental results show that the proposed binary PSO algorithm evolves better solutions than other binary EC algorithms when the search spaces are large and complex. Specifically, on feature selection, the proposed binary PSO algorithm can select smaller feature subsets with similar or better classification accuracy, especially when there are a large number of features.

This thesis proposes surrogate models for wrapper-based feature selection. The surrogate models use surrogate training sets, which are subsets of informative instances selected from the training set. Experimental results show that the proposed surrogate models help PSO reduce the computational cost while maintaining or even improving the classification performance compared to using only the original training set.

The thesis develops the first wrapper-based multi-objective feature selection algorithm using MOEA/D. A new decomposition strategy using multiple reference points for MOEA/D is designed, which can deal with characteristics of multi-objective feature selection such as highly discontinuous Pareto fronts and complex relationships between objectives. The experimental results show that the proposed algorithm can evolve more diverse non-dominated sets than other multi-objective algorithms.

This thesis introduces the first PSO-based feature selection algorithm for transfer learning. In the proposed algorithm, the fitness function uses classification performance to reduce the differences between domains while maintaining the discriminative ability on the target domain. The experimental results show that the proposed algorithm can select feature subsets that achieve better classification performance than four state-of-the-art feature-based transfer learning algorithms.
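
For context, a minimal sketch of a conventional wrapper-based binary PSO for feature selection is shown below. It uses the classic sigmoid-transfer formulation, not the redefined binary PSO proposed in the thesis; the KNN classifier, cross-validation setup, and parameter values are illustrative assumptions.

```python
# Minimal sketch: classic sigmoid-transfer binary PSO as a wrapper feature selector.
# Not the thesis's redefined binary PSO; classifier and parameters are assumptions.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def fitness(mask, X, y):
    """Wrapper fitness: cross-validated accuracy of KNN on the selected features."""
    if mask.sum() == 0:
        return 0.0
    clf = KNeighborsClassifier(n_neighbors=5)
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

def binary_pso(X, y, n_particles=30, n_iter=50, w=0.7, c1=1.5, c2=1.5, rng=None):
    rng = np.random.default_rng(rng)
    n_feat = X.shape[1]
    pos = rng.integers(0, 2, size=(n_particles, n_feat))   # binary positions (feature masks)
    vel = rng.uniform(-1, 1, size=(n_particles, n_feat))    # real-valued velocities
    pbest = pos.copy()
    pbest_fit = np.array([fitness(p, X, y) for p in pos])
    gbest = pbest[pbest_fit.argmax()].copy()

    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, n_feat))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        prob = 1.0 / (1.0 + np.exp(-vel))                    # sigmoid transfer function
        pos = (rng.random((n_particles, n_feat)) < prob).astype(int)
        fit = np.array([fitness(p, X, y) for p in pos])
        improved = fit > pbest_fit
        pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
        gbest = pbest[pbest_fit.argmax()].copy()
    return gbest, pbest_fit.max()
```

In practice the wrapper fitness is usually augmented with a small penalty on the number of selected features, which is what pushes the search toward the small subsets discussed above.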


2021 ◽ pp. 14-26
Author(s): Wissem Abbes, Zied Kechaou, Amir Hussain, Adel M. Alimi

Author(s): Jinyan Li, Yaoyang Wu, Simon Fong, Antonio J. Tallón-Ballesteros, Xin-she Yang, ...

2021 ◽ Vol 2021 ◽ pp. 1-15
Author(s): Tung Khac Truong

The discounted {0-1} knapsack problem (DKP01) is a variant of the knapsack problem with a group structure and discount relationships among items. This paper applies a moth-flame optimization algorithm with an effective solution representation designed for the DKP01, which shows good search ability. A new encoding scheme uses a shorter binary vector to reduce the search domain and speed up computation. A greedy repair procedure helps the algorithm converge quickly and reduces the gap between the best-found solution and the optimal solution. Experimental results on 30 DKP01 instances are used to evaluate the proposed algorithm. The results demonstrate that the proposed algorithm outperforms two binary PSO algorithms and a genetic algorithm in solving the 30 DKP01 instances. The Wilcoxon rank-sum test is used to support these claims.
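
As a rough illustration of the repair idea, a value-density-based greedy repair for DKP01 might look like the sketch below. The group representation is an assumption, and the paper's exact repair procedure and its shortened binary encoding are not reproduced here.

```python
# Sketch of a greedy repair/fill step for the discounted {0-1} knapsack problem:
# each group offers items A, B, and a discounted bundle A+B, and at most one
# option per group may be chosen. One plausible density-based repair; not
# necessarily the procedure used in the paper. Weights are assumed positive.
def greedy_repair(choice, values, weights, capacity):
    """choice[g] in {-1, 0, 1, 2}: selected option of group g, or -1 for none.
    values[g][k], weights[g][k]: value and weight of option k in group g."""
    total_w = sum(weights[g][c] for g, c in enumerate(choice) if c >= 0)

    # Repair phase: drop the selected option with the worst value density
    # until the solution fits the capacity.
    while total_w > capacity:
        g_worst = min((g for g, c in enumerate(choice) if c >= 0),
                      key=lambda g: values[g][choice[g]] / weights[g][choice[g]])
        total_w -= weights[g_worst][choice[g_worst]]
        choice[g_worst] = -1

    # Fill phase: greedily add the best-density option from empty groups
    # while spare capacity remains.
    candidates = sorted(((values[g][k] / weights[g][k], g, k)
                         for g, c in enumerate(choice) if c == -1
                         for k in range(3)), reverse=True)
    for _, g, k in candidates:
        if choice[g] == -1 and weights[g][k] <= capacity - total_w:
            choice[g] = k
            total_w += weights[g][k]
    return choice
```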


2021 ◽ Vol 11 (4) ◽ pp. 2715-2725
Author(s): A. Gayatri, M. Surendra Kumar, A.M. Prasad

To design an efficient communication system, controlling the energy in the sidelobes of the far-field pattern of the antenna array is essential. This paper presents a method for synthesizing a thinned antenna array that optimizes three objectives simultaneously: the number of active elements, the peak sidelobe level (PSLL), and the first-null beamwidth (FNBW). These objectives conflict with one another, and the multi-objective formulation provides appreciable flexibility for any specified application. Planar arrays of 20×10 and 10×10 elements are synthesized using a modified binary PSO (MBPSO) in which the position-updating equation uses a modified sigmoid function that incorporates spread distance. Numerical results show that MBPSO performs well: the 20×10 array with a 54% filled aperture (108 elements) produces a maximum PSLL and FNBW of -19.28 dB and 28° in the remaining φ plane, respectively, and the far-field pattern at three cutting planes shows low PSLLs of about -20 dB. The 10×10 planar array with a 52% thinning percentage produces best PSLLs of -22.04 dB and -23.44 dB in the φ = 0° and φ = 90° principal planes, respectively, with an FNBW of around 31° in both planes. A compromise solution with a PSLL of -19.28 dB and an FNBW of 27° is also achieved.
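
To make the objective concrete, the sketch below shows one way a binary thinning mask could be scored on PSLL in a principal cut, assuming half-wavelength element spacing and a broadside main beam. The paper's modified sigmoid update with spread distance and its exact fitness formulation are not reproduced; the sampling choices are assumptions.

```python
# Sketch: score a 0/1 thinning mask of a planar array by the peak sidelobe level
# in the phi = 0 cut, for half-wavelength spacing and a broadside beam.
# Illustrative only; not the paper's fitness function.
import numpy as np

def psll_cut_db(mask, n_points=721):
    """mask: 2D 0/1 array of active elements; returns PSLL (dB) in the phi=0 cut."""
    col_sum = mask.sum(axis=1)                      # collapse the other axis for this cut
    if col_sum.sum() == 0:
        return 0.0                                  # no active elements: no pattern
    m_idx = np.arange(mask.shape[0])
    u = np.linspace(-1.0, 1.0, n_points)            # u = sin(theta), d = lambda/2
    af = np.abs(np.exp(1j * np.pi * np.outer(u, m_idx)) @ col_sum)
    af_db = 20 * np.log10(af / af.max() + 1e-12)    # normalized pattern in dB

    peak = n_points // 2                            # broadside main beam at u = 0
    nulls = np.where(np.diff(np.sign(np.diff(af_db[peak:]))) > 0)[0]
    first_null = peak + nulls[0] + 1 if nulls.size else peak + 1
    return af_db[first_null:].max()                 # highest sidelobe beyond the first null
```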

