A General Framework for Auto-Weighted Feature Selection via Global Redundancy Minimization

2019 ◽  
Vol 28 (5) ◽  
pp. 2428-2438 ◽  
Author(s):  
Feiping Nie ◽  
Sheng Yang ◽  
Rui Zhang ◽  
Xuelong Li
Data Mining ◽  
2011 ◽  
pp. 80-105 ◽  
Author(s):  
Yong Seong Kim ◽  
W. Nick Street ◽  
Filippo Menczer

Feature subset selection is an important problem in knowledge discovery, not only for the insight gained from determining relevant modeling variables, but also for the improved understandability, scalability, and, possibly, accuracy of the resulting models. The purpose of this chapter is to provide a comprehensive analysis of feature selection via evolutionary search in supervised and unsupervised learning. To this end, we first discuss a general framework for feature selection based on a new search algorithm, the Evolutionary Local Selection Algorithm (ELSA). The search is formulated as a multi-objective optimization problem that examines the trade-off between the complexity of the generated solutions and their quality. ELSA considers multiple objectives efficiently while avoiding computationally expensive global comparisons. We combine ELSA with Artificial Neural Networks (ANNs) and the Expectation-Maximization (EM) algorithm for feature selection in supervised and unsupervised learning, respectively. Further, we present a new two-level evolutionary algorithm, Meta-Evolutionary Ensembles (MEE), in which feature selection is used to promote diversity among classifiers in the same ensemble.
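The multi-objective selection idea described above can be illustrated with a toy Pareto-dominance loop. This is a minimal sketch, not the authors' ELSA: the per-feature relevance scores (standing in for a model's validation quality), the mutation rate, and the final scalarized tie-break are all hypothetical.

```python
import random

random.seed(0)
N_FEATURES = 10
# Hypothetical relevance scores, a stand-in for the quality a trained
# model (e.g. an ANN) would assign to each feature on validation data.
RELEVANCE = [0.9, 0.1, 0.8, 0.05, 0.7, 0.02, 0.6, 0.01, 0.5, 0.03]

def fitness(mask):
    """Two objectives: quality (relevance captured by the kept features)
    and complexity (fraction of features kept; lower is better)."""
    quality = sum(r for r, keep in zip(RELEVANCE, mask) if keep)
    complexity = sum(mask) / N_FEATURES
    return quality, complexity

def dominates(a, b):
    """Pareto dominance: a is at least as good on both objectives
    and strictly better on at least one."""
    qa, ca = fitness(a)
    qb, cb = fitness(b)
    return qa >= qb and ca <= cb and (qa > qb or ca < cb)

def mutate(mask, rate=0.1):
    # Flip each bit independently with probability `rate`.
    return [bit ^ (random.random() < rate) for bit in mask]

# Local-selection-style loop: each candidate competes only against its
# own offspring, avoiding global comparisons across the whole population.
population = [[random.randint(0, 1) for _ in range(N_FEATURES)]
              for _ in range(20)]
for _ in range(200):
    for i, parent in enumerate(population):
        child = mutate(parent)
        if dominates(child, parent):
            population[i] = child

# Pick one mask from the evolved front via a simple scalarization.
best = max(population, key=lambda m: fitness(m)[0] - fitness(m)[1])
print(best)
```

Because each candidate is compared only with its own mutant, the loop never ranks the whole population at once, which is the efficiency point the abstract makes about avoiding expensive global comparison.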


Author(s):  
LIOR ROKACH ◽  
BARAK CHIZI ◽  
ODED MAIMON

Feature selection is the process of identifying the relevant features in a dataset and discarding everything else as irrelevant or redundant. Because feature selection reduces the dimensionality of the data, it enables learning algorithms to operate more effectively and rapidly. In some cases classification performance improves; in others the resulting classifier is more compact and easier to interpret. Much work has been done on feature selection methods for creating ensembles of classifiers; these works examine how feature selection can help an ensemble of classifiers gain diversity. This paper examines the opposite direction: whether ensemble methodology can be used to improve feature selection itself. We present a general framework for creating several feature subsets and then combining them into a single subset. Theoretical and empirical results presented in this paper support the hypothesis that this approach can help to find a better feature subset.
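The combine-several-subsets idea lends itself to a short sketch. This is a hypothetical majority-vote combiner, not the specific aggregation scheme the paper proposes; the feature names and the `min_votes` threshold are illustrative.

```python
from collections import Counter

def combine_subsets(subsets, min_votes=2):
    """Hypothetical combiner: keep a feature if it appears in at least
    `min_votes` of the candidate subsets (majority-style voting)."""
    votes = Counter(f for subset in subsets for f in subset)
    return {f for f, v in votes.items() if v >= min_votes}

# Three candidate subsets, e.g. produced by different selection
# methods or different resamples of the training data.
candidates = [
    {"age", "income", "balance"},
    {"age", "income", "tenure"},
    {"age", "balance", "region"},
]
print(sorted(combine_subsets(candidates)))
# → ['age', 'balance', 'income']
```

Raising `min_votes` makes the combined subset more conservative: with `min_votes=3` only `age`, which every candidate agrees on, survives.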


2011 ◽  
Vol 38 (8) ◽  
pp. 10018-10024 ◽  
Author(s):  
Bárbara B. Pineda-Bautista ◽  
J.A. Carrasco-Ochoa ◽  
J. Fco. Martínez-Trinidad

2015 ◽  
Vol 35 ◽  
pp. 740-748 ◽  
Author(s):  
Sebastián Maldonado ◽  
Álvaro Flores ◽  
Thomas Verbraken ◽  
Bart Baesens ◽  
Richard Weber

Author(s):  
Lindsey M. Kitchell ◽  
Francisco J. Parada ◽  
Brandi L. Emerick ◽  
Tom A. Busey

2008 ◽  
Vol 1 (2) ◽  
pp. 109-134 ◽  
Author(s):  
Stephen R. Anderson

Alternations between allomorphs that are not directly related by phonological rule, but whose selection is governed by phonological properties of the environment, have attracted the sporadic attention of phonologists and morphologists. Such phenomena are commonly limited to rather small corners of a language's structure, however, and as a result have not been a major theoretical focus. This paper examines a set of alternations in Surmiran, a Swiss Rumantsch language, that have this character and that pervade the entire system of the language. It is shown that the alternations in question, best attested in the verbal system, are not conditioned by any coherent set of morphological properties (either straightforwardly or in the extended sense of ‘morphomes’ explored in other Romance languages by Maiden). These alternations are, however, straightforwardly aligned with the location of stress in words, and an analysis is proposed within the general framework of Optimality Theory to express this. The resulting system of phonologically conditioned allomorphy turns out to include the great majority of patterning which one might be tempted to treat as productive phonology, but which has been rendered opaque (and subsequently morphologized) as a result of the working of historical change.

