Kernel Canonical Discriminant Analysis Based on Variable Selection

Author(s):  
Seiichi Ikeda ◽  
Yoshiharu Sato

We have shown that support vector regression and classification models are essentially linear in a reproducing kernel Hilbert space (RKHS). To overcome the overfitting problem, a regularization term is added to the optimization process, but deciding the coefficient of the regularization term involves difficulties. We introduce the variable selection concept to the linear model in RKHS, where each kernel function is treated as a transformed variable when its value is given by an observation. We show that kernel canonical discriminant functions for multiclass problems can be discussed under variable selection, which enables us to reduce the number of kernel functions in the discriminant function; that is, the discriminant function is obtained as a linear combination of a sufficiently small number of kernel functions, so we can expect reasonable prediction. We compare the variable selection performance of canonical discriminant functions with that of support vector machines.
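The core idea above can be sketched numerically: a discriminant built as a linear combination of a reduced set of kernel functions, with the centres restricted to a small selected subset of training points. The selection indices, kernel, and regularized least-squares fit below are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

def gaussian_kernel(x, z, gamma=1.0):
    # Gaussian kernel treated as a "variable transformation" of x.
    return np.exp(-gamma * np.sum((x - z) ** 2))

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)

# Variable selection idea: keep only a small subset of kernel centres
# (indices below are hypothetical, standing in for a selection procedure).
selected = [0, 5, 10, 15, 20]
Phi = np.array([[gaussian_kernel(x, X[j]) for j in selected] for x in X])

# Fit coefficients by regularized least squares on the reduced basis.
lam = 1e-3
c = np.linalg.solve(Phi.T @ Phi + lam * np.eye(len(selected)), Phi.T @ y)

def discriminant(x):
    # Linear combination of a small number of kernel functions.
    return sum(ci * gaussian_kernel(x, X[j]) for ci, j in zip(c, selected))

acc = np.mean(np.sign([discriminant(x) for x in X]) == y)
```

The discriminant uses 5 kernel functions instead of all 40, which is the compactness the abstract argues should yield reasonable prediction.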

2017 ◽  
Vol 17 (15&16) ◽  
pp. 1292-1306 ◽  
Author(s):  
Rupak Chatterjee ◽  
Ting Yu

The support vector machine (SVM) is a popular machine learning classification method which produces a nonlinear decision boundary in a feature space by constructing linear boundaries in a transformed Hilbert space. It is well known that these algorithms, when executed on a classical computer, do not scale well with the size of the feature space, in terms of both the number of data points and their dimensionality. One of the most significant limitations of classical algorithms using nonlinear kernels is that the kernel function has to be evaluated for all pairs of input feature vectors, which themselves may be of substantially high dimension. This can lead to excessively long computation times during training and during prediction for a new data point. Here, we propose using both canonical and generalized coherent states to calculate specific nonlinear kernel functions. The key link will be the reproducing kernel Hilbert space (RKHS) property for SVMs that naturally arises from canonical and generalized coherent states. Specifically, we discuss the evaluation of radial kernels through a positive operator valued measure (POVM) on a quantum optical system based on canonical coherent states. A similar procedure may also lead to calculations of kernels not usually used in classical algorithms, such as those arising from generalized coherent states.
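The link between canonical coherent states and radial kernels can be checked classically: the squared overlap of two canonical coherent states is exactly a Gaussian kernel, |⟨α|β⟩|² = exp(−|α − β|²). The sketch below expands coherent states in a truncated Fock basis (the truncation level N is an assumption) and verifies this identity numerically.

```python
import numpy as np
from math import factorial

def coherent(alpha, N=40):
    # Fock-basis amplitudes c_n = e^{-|alpha|^2/2} alpha^n / sqrt(n!),
    # truncated at N levels (a numerical approximation).
    n = np.arange(N)
    norms = np.sqrt(np.array([float(factorial(k)) for k in n]))
    return np.exp(-abs(alpha) ** 2 / 2) * alpha ** n / norms

a, b = 0.7 + 0.2j, -0.3 + 0.5j
overlap_sq = abs(np.vdot(coherent(a), coherent(b))) ** 2
gaussian = np.exp(-abs(a - b) ** 2)   # the radial (Gaussian) kernel value
```

This is the RKHS property the abstract leans on: the quantum state overlap plays the role of the kernel evaluation.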


2018 ◽  
Vol 2018 ◽  
pp. 1-7 ◽  
Author(s):  
Jinshan Qi ◽  
Xun Liang ◽  
Rui Xu

By utilizing kernel functions, support vector machines (SVMs) successfully solve linearly inseparable problems, and their applicable areas have consequently been greatly extended. Using multiple kernels (MKs) to improve SVM classification accuracy has been a hot topic in the SVM research community for several years. However, most MK learning (MKL) methods employ an L1-norm constraint on the kernel combination weights, which yields a sparse yet nonsmooth solution for the kernel weights. Alternatively, the Lp-norm constraint on the kernel weights keeps all information in the base kernels. Nonetheless, the solution of Lp-norm constrained MKL is nonsparse and sensitive to noise. Recently, some scholars presented an efficient sparse generalized MKL (L1- and L2-norms based GMKL) method, in which a combination of the L1- and L2-norms establishes an elastic constraint on the kernel weights. In this paper, we further extend the GMKL to a more generalized MKL method based on the p-norm, by joining the L1- and Lp-norms. Consequently, the L1- and L2-norms based GMKL is a special case of our method when p=2. Experiments demonstrate that our L1- and Lp-norms based MKL offers higher classification accuracy than the L1- and L2-norms based GMKL, while retaining the properties of the L1- and L2-norms based GMKL.
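The building block shared by all of these MKL variants is a convex combination of base Gram matrices whose weights satisfy a norm constraint. The sketch below combines three base kernels under an Lp-norm constraint on the weights; the choice of p, the base kernels, and the fixed weights are illustrative, not the paper's GMKL optimization.

```python
import numpy as np

def lp_normalize(d, p):
    # Project nonnegative weights onto the Lp unit sphere, so that
    # ||d||_p = 1 (the Lp-norm constraint on kernel weights).
    d = np.maximum(d, 0.0)
    return d / np.linalg.norm(d, ord=p)

X = np.random.default_rng(1).normal(size=(10, 3))
base_kernels = [
    X @ X.T,                                                 # linear
    (X @ X.T + 1.0) ** 2,                                    # polynomial
    np.exp(-0.5 * np.sum((X[:, None] - X[None]) ** 2, -1)),  # Gaussian
]

p = 1.5                               # an intermediate norm, 1 < p < 2
d = lp_normalize(np.array([0.2, 0.5, 0.3]), p)
K = sum(di * Ki for di, Ki in zip(d, base_kernels))
```

A nonnegative combination of positive semidefinite Gram matrices is itself positive semidefinite, so `K` is a valid SVM kernel matrix; learning `d` (rather than fixing it) is where the L1/Lp trade-off between sparsity and information retention arises.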


This article is presented in the context of 2D global facial recognition, using Gabor wavelet feature extraction algorithms and Support Vector Machines (SVM) for facial recognition, the latter incorporating linear, cubic, and Gaussian kernel functions. The models generated by these kernels were validated by cross-validation in Matlab. The objective is to observe the facial recognition results in each case. An efficient technique is proposed that combines the mentioned algorithms for a database of 2D images. The technique was processed in its training and testing phases for the FERET [1] and MUCT [2] facial image databases, and the models generated by the technique were used to perform the tests, whose results achieved facial recognition of individuals above 96%.
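The kernel comparison described above can be sketched with scikit-learn standing in for the Matlab workflow: three SVM classifiers with linear, cubic, and Gaussian kernels, each scored by k-fold cross-validation. The iris dataset is a stand-in for the facial feature vectors, which are not available here.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)   # placeholder for Gabor feature vectors

scores = {}
for name, clf in [
    ("linear", SVC(kernel="linear")),
    ("cubic", SVC(kernel="poly", degree=3)),   # cubic polynomial kernel
    ("gaussian", SVC(kernel="rbf")),
]:
    # 5-fold cross-validation, the validation scheme the article uses.
    scores[name] = cross_val_score(clf, X, y, cv=5).mean()
```

Each entry of `scores` is the mean cross-validated accuracy for one kernel, which is the per-kernel comparison the article reports.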


Author(s):  
Alina Lazar ◽  
Bradley A. Shellito

Support Vector Machines (SVM) are powerful tools for classification of data. This article describes the functionality of SVM, including their design and operation. SVM have been shown to provide high classification accuracies and have good generalization capabilities. SVM can classify linearly separable data as well as nonlinearly separable data through the use of the kernel function. The advantages of using SVM are discussed along with the standard types of kernel functions. Furthermore, the effectiveness of applying SVM to large, spatial datasets derived from Geographic Information Systems (GIS) is also described. Future trends and applications are also discussed. The extracted dataset described here contains seven independent variables related to urban development plus a class label which distinguishes urban areas from rural areas. This large dataset, with over a million instances, demonstrates the generalization capabilities of the SVM methods. Also, the spatial property allows experts to analyze the error signal.
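The claim that the kernel function is what lets an SVM handle nonlinearly separable data can be shown on the smallest such example, XOR: no linear boundary classifies all four points, while an RBF kernel separates them exactly. This toy sketch is not the GIS dataset; the `gamma` and `C` values are assumptions.

```python
import numpy as np
from sklearn.svm import SVC

# XOR: the classic nonlinearly separable dataset.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0])

# A linear kernel cannot exceed 3/4 training accuracy on XOR.
linear_acc = SVC(kernel="linear").fit(X, y).score(X, y)

# An RBF kernel (large C approximates a hard margin) fits it perfectly.
rbf_acc = SVC(kernel="rbf", gamma=2.0, C=1000.0).fit(X, y).score(X, y)
```

The same mechanism is what lets an SVM draw nonlinear urban/rural boundaries in the seven-variable GIS feature space.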


2005 ◽  
Vol 17 (1) ◽  
pp. 177-204 ◽  
Author(s):  
Charles A. Micchelli ◽  
Massimiliano Pontil

In this letter, we provide a study of learning in a Hilbert space of vector-valued functions. We motivate the need for extending learning theory of scalar-valued functions by practical considerations and establish some basic results for learning vector-valued functions that should prove useful in applications. Specifically, we allow an output space Y to be a Hilbert space, and we consider a reproducing kernel Hilbert space of functions whose values lie in Y. In this setting, we derive the form of the minimal norm interpolant to a finite set of data and apply it to study some regularization functionals that are important in learning theory. We consider specific examples of such functionals corresponding to multiple-output regularization networks and support vector machines, for both regression and classification. Finally, we provide classes of operator-valued kernels of the dot product and translation-invariant type.
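For a separable operator-valued kernel K(x, x′) = k(x, x′)·B, with k a scalar kernel and B a positive-definite matrix acting on the output space Y, the minimal norm interpolant described above takes the familiar form f(x) = Σᵢ K(x, xᵢ)cᵢ with coefficients solving a Gram system. The sketch below uses a Gaussian k and an arbitrary fixed B; both choices are illustrative, and a tiny ridge is added purely for numerical stability.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(6, 2))   # inputs
Y = rng.normal(size=(6, 3))   # vector-valued outputs in Y = R^3

k = lambda a, b: np.exp(-np.sum((a - b) ** 2))        # scalar kernel
Kx = np.array([[k(a, b) for b in X] for a in X])      # scalar Gram matrix
B = np.array([[2.0, 0.5, 0.0],                        # positive-definite
              [0.5, 1.0, 0.2],                        # output-space operator
              [0.0, 0.2, 1.5]])

# The block Gram operator of K(x_i, x_j) = k(x_i, x_j) B is Kx kron B;
# solve (Kx kron B) c = vec(Y) for the interpolation coefficients c_i.
G = np.kron(Kx, B)
c = np.linalg.solve(G + 1e-10 * np.eye(G.shape[0]), Y.ravel()).reshape(6, 3)

def f(x):
    # Minimal norm interpolant: f(x) = sum_i k(x, x_i) * B @ c_i
    return sum(k(x, xi) * (B @ ci) for xi, ci in zip(X, c))
```

By construction f(xᵢ) = yᵢ at every data point, and among all functions in the vector-valued RKHS that interpolate the data, this one has minimal norm.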


2014 ◽  
Vol 511-512 ◽  
pp. 467-474
Author(s):  
Jun Tu ◽  
Cheng Liang Liu ◽  
Zhong Hua Miao

Feature selection plays an important role in terrain classification for outdoor robot navigation. For terrain classification, the image data usually have a large number of feature dimensions, and better feature selection usually results in higher labeling accuracy. In this work, a novel approach for terrain perception using the Importance Factor based I-Relief algorithm and Feature Weighted Support Vector Machines (IFIR-FWSVM) is put forward. Firstly, the weight of each feature for classification is computed using the Importance Factor based I-Relief algorithm (IFIR), and irrelevant features are eliminated. Then the weighted features are used to compute the SVM kernel functions and to train the classifier. Finally, the trained SVM is employed to predict terrain labels in the far-field regions. Experimental results based on DARPA datasets show that the proposed IFIR-FWSVM method is superior to the traditional SVM.
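The feature-weighting step can be sketched as follows: scaling each feature by the square root of its relevance weight before an RBF kernel is equivalent to a per-feature kernel bandwidth, k(x, z) = exp(−γ Σ_d w_d (x_d − z_d)²). The weights below are hand-set to mimic what a relevance estimator such as IFIR might produce; they are not the IFIR computation itself.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(3)
X = rng.normal(size=(100, 4))
y = (X[:, 0] > 0).astype(int)        # only feature 0 carries the label

# Hypothetical relevance weights: irrelevant features are driven to zero,
# standing in for IFIR's output.
w = np.array([1.0, 0.0, 0.0, 0.0])

# Scaling inputs by sqrt(w) feature-weights the RBF kernel.
Xw = X * np.sqrt(w)

acc = SVC(kernel="rbf", C=10.0).fit(Xw, y).score(Xw, y)
```

With the noise features weighted out, the classifier only has to model the single relevant dimension, which is the mechanism behind FWSVM's accuracy gain.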


2014 ◽  
Vol 644-650 ◽  
pp. 4314-4318
Author(s):  
Xin You Wang ◽  
Ya Li Ning ◽  
Xi Ping He

Conventional methods that operate directly on the image find it difficult to obtain good results because they perform poorly in high dimensions. In this paper, a new method is proposed that uses Least Squares Support Vector Machines (LS-SVM) for image segmentation. Furthermore, the parameters of the kernel functions are optimized by the Particle Swarm Optimization (PSO) algorithm. The method was applied to various standard data sets and in a color image segmentation experiment. The results show that LS-SVM can use a variety of image features, the experiments achieved good segmentation results, and the time needed for segmentation is greatly reduced compared with the standard SVM.

