Region Covariance Descriptor
Recently Published Documents


TOTAL DOCUMENTS: 14 (FIVE YEARS: 6)

H-INDEX: 2 (FIVE YEARS: 1)

2021 ◽ Vol 2021 ◽ pp. 1-15
Author(s): Xi Liu ◽ Peng Yang ◽ Zengrong Zhan ◽ Zhengming Ma

The region covariance descriptor (RCD), a symmetric positive definite (SPD) matrix, is commonly used in image representation. Because SPD manifolds have a non-Euclidean geometry, Euclidean machine learning methods are not directly applicable to them. In this work, an improved covariance descriptor called the hybrid region covariance descriptor (HRCD) is proposed. The HRCD incorporates mean feature information into the RCD to improve the latter's discriminative performance. To address the non-Euclidean properties of SPD manifolds, this study also proposes an algorithm called Hilbert-Schmidt independence criterion subspace learning (HSIC-SL) for SPD manifolds, aimed at improving classification accuracy. The algorithm uses a kernel function to embed SPD matrices into a reproducing kernel Hilbert space and then maps them to a linear space. To make the mapping account for the correlation between the SPD matrices and their linear projections, the method introduces global HSIC maximization into the model. In classification experiments on the COIL-20, ETH-80, QMUL, FERET face, and Brodatz datasets, the HRCD and HSIC-SL are compared with existing methods and shown to be highly accurate.
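The paper's exact HRCD construction is its own, but the underlying region covariance descriptor is standard: stack per-pixel feature vectors and take their covariance. The sketch below (NumPy) computes an RCD from the common feature set [x, y, I, |Ix|, |Iy|] and augments it with the mean feature vector via one well-known block construction; both the feature set and the augmentation scheme are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

def region_covariance(image):
    """Region covariance descriptor (RCD) of a grayscale region.

    Each pixel contributes a feature vector [x, y, I, |Ix|, |Iy|];
    the descriptor is the covariance of these vectors -- an SPD matrix.
    """
    h, w = image.shape
    ys, xs = np.mgrid[0:h, 0:w]                 # pixel coordinates
    iy, ix = np.gradient(image.astype(float))   # intensity gradients
    feats = np.stack([xs.ravel(), ys.ravel(), image.ravel(),
                      np.abs(ix).ravel(), np.abs(iy).ravel()])  # (5, h*w)
    cov = np.cov(feats)                         # 5x5 SPD matrix: the RCD
    mean = feats.mean(axis=1)                   # mean feature vector mu
    # Mean-augmented ("hybrid") descriptor: embed (cov, mean) in one larger
    # SPD matrix [[C + mu mu^T, mu], [mu^T, 1]] -- one common construction;
    # the HRCD's exact formulation may differ.
    hybrid = np.block([[cov + np.outer(mean, mean), mean[:, None]],
                       [mean[None, :], np.ones((1, 1))]])
    return cov, hybrid
```

The block construction keeps the result SPD (its Schur complement with respect to the lower-right entry is the covariance itself), so manifold-based methods still apply to the augmented descriptor.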


Author(s): Mohd Fauzi Abu Hassan ◽ Azurahisham Sah Pri ◽ Zakiah Ahmad ◽ Tengku Mohd Azahar Tuan Dir

2019 ◽ Vol 2019 ◽ pp. 1-11
Author(s): Xi Liu ◽ Zhengming Ma ◽ Guo Niu

Covariance matrices, known as symmetric positive definite (SPD) matrices, are usually regarded as points lying on Riemannian manifolds. We describe a new covariance descriptor that improves the discriminative learning ability of the region covariance descriptor by taking into account the mean of the feature vectors. Owing to the specific geometry of Riemannian manifolds, classical learning methods cannot be used on them directly. In this paper, we propose a subspace projection framework for classification on Riemannian manifolds and give its mathematical derivation. It differs from the common technique for Riemannian manifolds, which explicitly projects points from a Riemannian manifold onto Euclidean space under a linear hypothesis. Under the proposed framework, we define a Gaussian Radial Basis Function (RBF) kernel with the Log-Euclidean Riemannian Metric (LERM) to embed a Riemannian manifold into a high-dimensional Reproducing Kernel Hilbert Space (RKHS) and then project it onto a subspace of the RKHS. Finally, a variant of Linear Discriminant Analysis (LDA) is recast on the subspace. Experiments demonstrate the considerable effectiveness of the mixed region covariance descriptor and the proposed method.
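The LERM-based Gaussian kernel described above has a simple closed form: map each SPD matrix through the matrix logarithm, which embeds it in the linear space of symmetric matrices, then apply an ordinary RBF kernel to the Frobenius distance between the logs. A minimal sketch, with a bandwidth parameter `gamma` chosen here for illustration:

```python
import numpy as np

def spd_logm(A):
    """Matrix logarithm of an SPD matrix via its eigendecomposition."""
    w, V = np.linalg.eigh(A)          # eigenvalues w > 0 for SPD input
    return (V * np.log(w)) @ V.T      # V diag(log w) V^T

def lerm_rbf_kernel(X, Y, gamma=1.0):
    """Gaussian RBF kernel under the Log-Euclidean Riemannian Metric:

        k(X, Y) = exp(-gamma * ||logm(X) - logm(Y)||_F^2)

    Positive definite because logm maps SPD matrices into a linear
    (Euclidean) space, where the Gaussian kernel is known to be PD.
    """
    d = np.linalg.norm(spd_logm(X) - spd_logm(Y), ord='fro')
    return float(np.exp(-gamma * d ** 2))
```

A Gram matrix built from this kernel can then feed any kernel method on SPD-valued data, such as the kernelized subspace projection and LDA variant the abstract describes.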

