Kernel Canonical Correlation Analysis and Least Squares Support Vector Machines

Author(s): Tony Van Gestel, Johan A. K. Suykens, Jos De Brabanter, Bart De Moor, Joos Vandewalle
Author(s): Md. Ashad Alam, Kenji Fukumizu

It is well known that the performance of kernel methods depends on the choice of appropriate kernels and associated parameters. While cross-validation (CV) is a useful method of kernel and parameter selection for supervised learning methods such as support vector machines, there are no general, well-founded methods for unsupervised kernel methods. This paper discusses CV for kernel canonical correlation analysis (KCCA) and proposes a new regularization approach for KCCA. As we demonstrate with Gaussian kernels, the CV errors for KCCA tend to decrease as the bandwidth parameter of the kernel decreases, which yields inappropriate features in which all the data are concentrated at a few points. This is caused by the ill-posedness of KCCA under CV. To solve this problem, we propose to impose constraints on the fourth-order moments of the canonical variables in addition to their variances. Experiments on synthetic and real-world data demonstrate that the proposed higher-order regularized KCCA can be applied effectively with CV to find appropriate kernel and regularization parameters.
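As a rough illustration of the quantity being regularized, the sketch below computes the first canonical correlation of a standard variance-regularized KCCA with a Gaussian kernel, in the spirit of the Bach-Jordan formulation. This is not the authors' higher-order method; the function names and the parameters `sigma` (kernel bandwidth) and `kappa` (regularization strength) are illustrative choices, not from the paper.

```python
import numpy as np

def gaussian_gram(X, sigma):
    # Gaussian (RBF) Gram matrix with bandwidth sigma.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def center_gram(K):
    # Double-center the Gram matrix: K -> H K H with H = I - 11^T / n.
    n = K.shape[0]
    H = np.eye(n) - np.full((n, n), 1.0 / n)
    return H @ K @ H

def kcca_first_correlation(X, Y, sigma, kappa):
    # Variance-regularized KCCA maximizes a' Kx Ky b subject to
    # a'(Kx + kappa I)^2 a = b'(Ky + kappa I)^2 b = 1.  After the change
    # of variables u = (Kx + kappa I) a, v = (Ky + kappa I) b, the first
    # canonical correlation is the largest singular value of Rx @ Ry,
    # where R = (K + kappa I)^{-1} K.
    n = X.shape[0]
    Kx = center_gram(gaussian_gram(X, sigma))
    Ky = center_gram(gaussian_gram(Y, sigma))
    Rx = np.linalg.solve(Kx + kappa * np.eye(n), Kx)
    Ry = np.linalg.solve(Ky + kappa * np.eye(n), Ky)
    return float(np.linalg.svd(Rx @ Ry, compute_uv=False)[0])
```

Shrinking `sigma` drives both Gram matrices toward the identity regardless of the data, which is the degeneracy described in the abstract: the features collapse and CV alone cannot rule out such solutions, motivating the paper's additional fourth-order moment constraints.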

