Uncorrelated Multilinear Principal Component Analysis for Unsupervised Multilinear Subspace Learning

2009 ◽ Vol 20 (11) ◽ pp. 1820-1836
Author(s): Haiping Lu, K.N. Plataniotis, A.N. Venetsanopoulos
2013 ◽ Vol 2013 ◽ pp. 1-16
Author(s): Cong Liu, Xu Wei-sheng, Wu Qi-di

We propose Tensorial Kernel Principal Component Analysis (TKPCA) for dimensionality reduction and feature extraction from tensor objects. TKPCA extends conventional Principal Component Analysis (PCA) in two respects: it works directly with multidimensional data (tensors) in their native form, and it generalizes the linear technique to a nonlinear one via the kernel trick. The method aims to remedy the shortcomings of recently developed multilinear subspace learning (tensorial PCA) in modelling the nonlinear manifold of tensor objects, and it combines the desirable properties of kernel methods and tensor decompositions to obtain significant performance gains when the data are multidimensional and nonlinear dependencies exist. Our approach first formulates TKPCA as an optimization problem. We then develop a kernel function based on the Grassmann manifold that takes tensorial representations as inputs directly, rather than traditional vectorized representations. Furthermore, a TKPCA-based tensor object recognition scheme is proposed and applied to action recognition. Experiments on real action datasets show that the proposed method is insensitive to both noise and occlusion and performs well compared with state-of-the-art algorithms.
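
The abstract gives no implementation details, so the following is only a minimal sketch of how a Grassmann-based tensor kernel could be paired with kernel PCA: each tensor sample is summarized by an orthonormal basis of its mode-1 unfolding, a projection kernel ||U_i^T U_j||_F^2 is computed between those bases (one common Grassmann kernel; the paper's actual kernel may differ), and standard kernel PCA is run on the resulting Gram matrix. The function names, the choice of unfolding mode, the subspace rank, and the toy data are illustrative assumptions, not the authors' implementation.

```python
# Sketch: Grassmann projection kernel on tensor samples + standard kernel PCA.
# This is an assumed construction, not the authors' reference code.
import numpy as np

def grassmann_basis(tensor, rank):
    """Orthonormal basis of the column space of the mode-1 unfolding."""
    unfolding = tensor.reshape(tensor.shape[0], -1)   # mode-1 unfolding
    u, _, _ = np.linalg.svd(unfolding, full_matrices=False)
    return u[:, :rank]                                # point on the Grassmann manifold

def projection_kernel(bases):
    """Gram matrix K[i, j] = ||U_i^T U_j||_F^2 (projection kernel)."""
    n = len(bases)
    K = np.zeros((n, n))
    for i in range(n):
        for j in range(i, n):
            K[i, j] = K[j, i] = np.linalg.norm(bases[i].T @ bases[j], 'fro') ** 2
    return K

def kernel_pca(K, n_components):
    """Standard kernel PCA: double-center the Gram matrix, project on top eigenvectors."""
    n = K.shape[0]
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one        # double centering
    eigvals, eigvecs = np.linalg.eigh(Kc)
    idx = np.argsort(eigvals)[::-1][:n_components]    # largest eigenvalues first
    alphas = eigvecs[:, idx] / np.sqrt(np.maximum(eigvals[idx], 1e-12))
    return Kc @ alphas                                # nonlinear principal components

# Toy usage: 30 random 10x8x6 tensor samples reduced to 5 features each.
rng = np.random.default_rng(0)
tensors = [rng.standard_normal((10, 8, 6)) for _ in range(30)]
bases = [grassmann_basis(t, rank=4) for t in tensors]
features = kernel_pca(projection_kernel(bases), n_components=5)
print(features.shape)   # (30, 5)
```

Because the kernel is computed between subspaces rather than vectorized tensors, the Gram matrix (and hence the kernel PCA step) never requires flattening the samples, which is the property the abstract emphasizes.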

