Hilbert–Schmidt Independence Criterion Subspace Learning on Hybrid Region Covariance Descriptor for Image Classification

2021 · Vol 2021 · pp. 1-15
Author(s): Xi Liu, Peng Yang, Zengrong Zhan, Zhengming Ma

The region covariance descriptor (RCD), a symmetric positive definite (SPD) matrix, is commonly used in image representation. Because SPD manifolds have a non-Euclidean geometry, Euclidean machine learning methods are not directly applicable to them. In this work, an improved covariance descriptor, the hybrid region covariance descriptor (HRCD), is proposed. The HRCD incorporates mean feature information into the RCD to improve its discriminative performance. To address the non-Euclidean geometry of SPD manifolds, this study also proposes an algorithm called Hilbert–Schmidt independence criterion subspace learning (HSIC-SL) for SPD manifolds, aimed at improving classification accuracy. The algorithm uses a kernel function to embed SPD matrices into a reproducing kernel Hilbert space and then maps them to a linear space. To make the mapping account for the correlation between the SPD matrices and the linear projection, the method introduces global HSIC maximization into the model. Classification experiments with the HRCD and HSIC-SL on the COIL-20, ETH-80, QMUL, FERET face, and Brodatz datasets show that the proposed method is more accurate and effective than existing methods.
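The idea of fusing the feature mean into a region covariance matrix can be illustrated with a small sketch. The block below uses the classical Gaussian-embedding construction [[C + μμᵀ, μ], [μᵀ, 1]], which turns a d×d covariance C and a mean μ into a (d+1)×(d+1) SPD matrix; this is one standard way to build such a hybrid descriptor, and the paper's exact HRCD formula may differ. NumPy is assumed.

```python
import numpy as np

def hybrid_region_covariance(F, eps=1e-6):
    """Fuse the feature mean into the region covariance.

    F: (n, d) array, one d-dimensional feature vector per pixel.
    Returns a (d+1, d+1) SPD matrix using the Gaussian-embedding
    construction [[C + mu mu^T, mu], [mu^T, 1]]; the exact HRCD
    definition in the paper may differ (illustrative sketch only).
    """
    mu = F.mean(axis=0)                    # (d,) mean feature vector
    C = np.cov(F, rowvar=False)            # (d, d) region covariance
    C = C + eps * np.eye(C.shape[0])       # regularize -> strictly PD
    top = np.hstack([C + np.outer(mu, mu), mu[:, None]])
    bot = np.hstack([mu, [1.0]])
    return np.vstack([top, bot])

# Demo on random per-pixel features (200 pixels, 5 features each).
rng = np.random.default_rng(0)
F = rng.normal(size=(200, 5))
H = hybrid_region_covariance(F)
```

The embedding is SPD because it is congruent to diag(C, 1) via the invertible matrix [[I, μ], [0, 1]], so the hybrid descriptor can be handled with the same SPD-manifold machinery as the plain RCD.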

2019 · Vol 2019 · pp. 1-11
Author(s): Xi Liu, Zhengming Ma, Guo Niu

Covariance matrices, known as symmetric positive definite (SPD) matrices, are usually regarded as points lying on a Riemannian manifold. We describe a new covariance descriptor that improves the discriminative ability of the region covariance descriptor by taking into account the mean of the feature vectors. Owing to the specific geometry of Riemannian manifolds, classical learning methods cannot be applied to them directly. In this paper, we propose a subspace projection framework for classification tasks on Riemannian manifolds and give its mathematical derivation. It differs from the common technique for Riemannian manifolds, which explicitly projects points from a Riemannian manifold onto a Euclidean space under a linear hypothesis. Under the proposed framework, we define a Gaussian Radial Basis Function- (RBF-) based kernel with the Log-Euclidean Riemannian Metric (LERM) to embed a Riemannian manifold into a high-dimensional Reproducing Kernel Hilbert Space (RKHS) and then project it onto a subspace of the RKHS. Finally, a variant of Linear Discriminant Analysis (LDA) is recast on this subspace. Experiments demonstrate the considerable effectiveness of the mixed region covariance descriptor and the proposed method.
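The LERM-based Gaussian RBF kernel described above can be sketched concretely: under the Log-Euclidean metric, the distance between two SPD matrices A and B is the Frobenius norm of log(A) − log(B), and the kernel is the usual Gaussian of that squared distance. The snippet below is a minimal illustration (not the paper's implementation); the `gamma` bandwidth value is an arbitrary assumption, and NumPy is assumed.

```python
import numpy as np

def logm_spd(S):
    """Matrix logarithm of an SPD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(S)
    return (V * np.log(w)) @ V.T          # V diag(log w) V^T

def lerm_rbf_kernel(A, B, gamma=0.5):
    """Gaussian RBF kernel under the Log-Euclidean Riemannian Metric:
    k(A, B) = exp(-gamma * ||log(A) - log(B)||_F^2).
    """
    d = np.linalg.norm(logm_spd(A) - logm_spd(B), "fro")
    return np.exp(-gamma * d**2)

# Demo on two random SPD matrices.
rng = np.random.default_rng(1)
M, N = rng.normal(size=(4, 4)), rng.normal(size=(4, 4))
A = M @ M.T + np.eye(4)
B = N @ N.T + np.eye(4)
k_ab = lerm_rbf_kernel(A, B)
```

Because the matrix logarithm flattens the SPD manifold into the vector space of symmetric matrices, this kernel is positive definite, which is what licenses the RKHS embedding used by the framework.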


2013 · Vol 52 (2) · pp. 027207
Author(s): Serdar Cakir, Tayfun Aytaç, Alper Yildirim, Soosan Beheshti, Ö. Nezih Gerek, ...

2021 · Vol 2021 · pp. 1-11
Author(s): Xi Liu, Zengrong Zhan, Guo Niu

Image recognition tasks involve an increasing amount of symmetric positive definite (SPD) matrix data. SPD manifolds exhibit nonlinear geometry, so Euclidean machine learning methods cannot be applied to them directly. The kernel trick for SPD manifolds projects the data into a reproducing kernel Hilbert space. Unfortunately, existing kernel methods do not consider the connection between SPD matrices and linear projections. A framework that uses the correlation between SPD matrices and projections to model the kernel map is therefore proposed herein. To realize this, the paper formulates a Hilbert–Schmidt independence criterion (HSIC) regularization framework based on the kernel trick, where HSIC measures the dependence between two datasets. The proposed framework extends existing kernel methods to new HSIC-regularized kernel methods. Additionally, the paper proposes an algorithm called HSIC-regularized graph discriminant analysis (HRGDA) for SPD manifolds based on this framework. Experimental results on several classification tasks show that the HSIC regularization framework and HRGDA are accurate and effective.
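The HSIC quantity these papers regularize with has a simple empirical estimator: given Gram matrices K and L of two datasets, the (biased) estimate is tr(K H L H) / (n − 1)², where H is the centering matrix. The sketch below shows that estimator in isolation, not the full HSIC-SL or HRGDA algorithm; linear kernels and the random data are illustrative assumptions, and NumPy is assumed.

```python
import numpy as np

def hsic(K, L):
    """Biased empirical HSIC estimate from two n-by-n Gram matrices:
    HSIC = tr(K H L H) / (n - 1)^2, with H = I - (1/n) 11^T.
    Larger values indicate stronger dependence between the datasets.
    """
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

# Demo: HSIC between linearly dependent vs independent samples.
rng = np.random.default_rng(2)
X = rng.normal(size=(50, 3))
Y = X @ rng.normal(size=(3, 3))           # dependent on X
Z = rng.normal(size=(50, 3))              # independent of X
K = X @ X.T                               # linear Gram matrices
h_dep = hsic(K, Y @ Y.T)
h_ind = hsic(K, Z @ Z.T)
```

Since tr(K H L H) = tr((H K H)(H L H)) and both centered Grams are positive semidefinite, the estimate is nonnegative; maximizing it between the SPD data kernel and the projected features is what couples the projection to the data in the HSIC-regularized methods.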

