Author(s):  
Sébastien Gadat

Variable selection for classification is a crucial paradigm in image analysis. Indeed, images are generally described by a large number of features (pixels, edges, …), while it is difficult to obtain enough samples to draw reliable inference for classification using the full feature set. The authors describe in this chapter some simple and effective feature selection methods based on a filter strategy. They also present more sophisticated methods, based on margin criteria or stochastic approximation techniques, that achieve strong classification performance with a very small proportion of the variables. Most of these “wrapper” methods are dedicated to a particular classifier, except the Optimal Feature Weighting algorithm (denoted OFW in the sequel), which is a meta-algorithm and works with any classifier. A large part of this chapter is dedicated to the description of OFW and hybrid OFW algorithms. The authors also illustrate several other methods on practical face detection problems.
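The abstract describes OFW only at a high level. A minimal sketch of an OFW-style wrapper scheme, assuming a multiplicative-reweighting update and a toy error oracle (neither is the authors' exact algorithm; the names `ofw_sketch` and `error_rate` are hypothetical): maintain a probability distribution over features, sample small feature subsets from it, score each subset with any base classifier, and reinforce features that appear in low-error subsets.

```python
import random

def ofw_sketch(error_rate, n_features, n_iter=500, step=0.5, subset_size=2):
    # Hedged sketch of an OFW-style meta-algorithm (not the authors' exact update):
    # keep a probability vector p over features, draw a small subset according
    # to p, score it with any classifier's estimated error, and multiplicatively
    # reweight the features that appeared in low-error subsets.
    p = [1.0 / n_features] * n_features
    for _ in range(n_iter):
        subset = set(random.choices(range(n_features), weights=p, k=subset_size))
        reward = 1.0 - error_rate(subset)    # low error => high reward
        for j in subset:
            p[j] *= 1.0 + step * reward      # reinforce useful features
        total = sum(p)
        p = [w / total for w in p]           # renormalise to a distribution
    return p

random.seed(1)
# Toy error oracle: features 0 and 1 are informative, the rest are noise.
err = lambda s: 0.1 if s & {0, 1} else 0.5
weights = ofw_sketch(err, n_features=10)
```

Because the update is multiplicative, informative features (which earn higher rewards whenever drawn) accumulate mass faster than noise features, so the learned distribution concentrates on them; any classifier can supply `error_rate`, which is what makes the scheme a meta-algorithm.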


1979 ◽  
Vol 4 (2) ◽  
pp. 139-160
Author(s):  
Thomas J. Hummel ◽  
Charles B. Johnston

Results are given on an investigation of stochastic approximation procedures of the Robbins-Monro type. Attention is focused on formal methods for selecting successive values of a single treatment variable for a specific case in sequential experimentation. Empirical results obtained by Monte Carlo methods are used to compare several formal stochastic approximation techniques and stopping rules. Marked differences were found among the five approximation procedures studied. A procedure using a finite memory performed most effectively. Two procedures suggested in the literature were judged inefficient under various conditions.
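The basic Robbins-Monro iteration underlying the procedures compared above can be sketched as follows, with a toy noisy linear response standing in for the experimental treatment variable (the function names and the step-size choice a_n = 1/n are illustrative assumptions, not the article's specific procedures):

```python
import random

def robbins_monro(noisy_response, target, theta0=0.0, n_iter=5000):
    # Robbins-Monro iteration: theta_{n+1} = theta_n - a_n * (Y_n - target),
    # with step sizes a_n = 1/n, satisfying sum a_n = inf and sum a_n**2 < inf,
    # to find the root of M(theta) = target from noisy observations Y_n.
    theta = theta0
    for n in range(1, n_iter + 1):
        y = noisy_response(theta)           # noisy observation Y_n at theta_n
        theta -= (1.0 / n) * (y - target)   # step toward the root
    return theta

random.seed(0)
# Toy regression function M(theta) = 2*theta + 1 observed with Gaussian noise;
# the root of M(theta) = 5 is theta* = 2.
est = robbins_monro(lambda t: 2.0 * t + 1.0 + random.gauss(0, 0.5), target=5.0)
```

The stopping-rule question studied in the article arises because the iterate keeps fluctuating around the root; fixed-iteration truncation, as above, is only the simplest choice.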

