A hierarchical weighted low-rank representation for image clustering and classification

2020 ◽  
pp. 107736
Author(s):  
Zhiqiang Fu ◽  
Yao Zhao ◽  
Dongxia Chang ◽  
Yiming Wang


2021 ◽ 
Vol 2021 ◽  
pp. 1-14
Author(s):  
Wenyun Gao ◽  
Xiaoyun Li ◽  
Sheng Dai ◽  
Xinghui Yin ◽  
Stanley Ebhohimhen Abhadiomhen

The low-rank representation (LRR) method has recently gained enormous popularity due to its robust approach to the subspace segmentation problem, particularly for corrupted data. In this paper, the recursive sample scaling low-rank representation (RSS-LRR) method is proposed. The advantage of RSS-LRR over traditional LRR is that a cosine scaling factor is introduced, which imposes a penalty on each sample to better suppress the influence of noise and outliers. Specifically, the cosine scaling factor is a similarity measure learned to capture each sample’s relationship with the principal components of the low-rank representation in the feature space. In other words, the smaller the angle between an individual data sample and the low-rank representation’s principal components, the more likely it is that the sample is clean. The proposed method can thus effectively obtain a good low-rank representation influenced mainly by clean data. Several experiments are performed with varying levels of corruption on ORL, CMU PIE, COIL20, COIL100, and LFW in order to evaluate RSS-LRR’s effectiveness over state-of-the-art low-rank methods. The experimental results show that RSS-LRR consistently performs better than the compared methods in image clustering and classification tasks.
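The cosine-scaling idea can be illustrated with a short sketch. The helper name `cosine_scaling` and the use of a plain truncated SVD are assumptions for illustration; the article learns the factor jointly with the representation rather than computing it once as here.

```python
import numpy as np

def cosine_scaling(X, k):
    """Sketch of a cosine scaling factor for the columns (samples) of X.

    For each sample, measure the cosine of the angle between the sample
    and its projection onto the top-k principal subspace. Samples lying
    almost inside the subspace (cosine near 1) are treated as clean;
    outliers far from it receive small weights.
    """
    # Top-k left singular vectors span the principal subspace.
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    Uk = U[:, :k]
    # Orthogonal projection of every sample onto that subspace.
    P = Uk @ (Uk.T @ X)
    # cos(theta_i) = ||proj(x_i)|| / ||x_i||, computed column-wise.
    num = np.linalg.norm(P, axis=0)
    den = np.linalg.norm(X, axis=0) + 1e-12  # guard against zero columns
    return num / den
```

A weight of 1 means the sample sits exactly in the principal subspace; weights near 0 flag likely outliers.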


2021 ◽  
Vol 15 ◽  
pp. 174830262199962
Author(s):  
Cong-Zhe You ◽  
Zhen-Qiu Shu ◽  
Hong-Hui Fan

Low-Rank Representation (LRR) and Sparse Subspace Clustering (SSC) are two of the most prominent subspace clustering algorithms. SSC induces sparsity by minimizing the ℓ1-norm of the representation matrix, while LRR promotes a low-rank structure by minimizing the nuclear norm. In this paper, considering the problem of fitting a union of subspaces to a collection of data points drawn from one or more subspaces and corrupted by noise, we pose this problem as a non-convex optimization problem whose goal is to decompose the corrupted data matrix into the sum of a clean, self-expressive dictionary and a noise matrix. We propose a new algorithm, named Low-Rank and Sparse Subspace Clustering with a Clean dictionary (LRS2C2), which combines SSC and LRR, as the representation is often both sparse and low-rank. The effectiveness of the proposed algorithm is demonstrated through experiments on motion segmentation and image clustering.
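The two regularizers the abstract combines have well-known proximal operators, which are the standard building blocks of solvers for such objectives. The sketch below shows only these two steps (singular value thresholding for the nuclear norm, soft thresholding for the ℓ1-norm), not the authors' full LRS2C2 algorithm:

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: the proximal operator of the nuclear
    norm, used for the low-rank (LRR-style) subproblem."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def soft(M, tau):
    """Entry-wise soft thresholding: the proximal operator of the ℓ1
    norm, used for the sparse (SSC-style) subproblem."""
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)
```

An ADMM-style solver for a combined sparse-plus-low-rank objective typically alternates these two updates with a multiplier step.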


Author(s):  
Xiangjun Shen ◽  
Jinghui Zhou ◽  
Zhongchen Ma ◽  
Bingkun Bao ◽  
Zhengjun Zha

Cross-domain data has become very popular recently, since multiple viewpoints and different sensors tend to facilitate better data representation. In this article, we propose a novel cross-domain object representation algorithm (RLRCA) which not only explores the complex relationships among variables through canonical correlation analysis (CCA) but also uses a low-rank model to reduce the effect of noisy data. To the best of our knowledge, this is the first attempt to seamlessly integrate CCA with a low-rank model to uncover correlated components across different domains while suppressing the effect of noisy or corrupted data. In order to improve the flexibility of the algorithm for various cross-domain object representation problems, two instantiations of RLRCA are proposed, formulated in feature space and in sample space, respectively. In this way, a better cross-domain object representation can be achieved by effectively learning the intrinsic CCA features and taking full advantage of cross-domain object alignment information while pursuing low-rank representations. Extensive experimental results on the CMU PIE, Office-Caltech, Pascal VOC 2007, and NUS-WIDE-Object datasets demonstrate that the designed models outperform several state-of-the-art cross-domain low-rank methods in image clustering and classification tasks at various corruption levels.
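The CCA ingredient can be sketched in a few lines via an SVD of the whitened cross-covariance. The helper name `cca_directions` and the small ridge `reg` are assumptions for illustration, and the low-rank coupling that distinguishes RLRCA is omitted:

```python
import numpy as np

def cca_directions(X, Y, reg=1e-6):
    """Leading canonical correlation pair for two views X, Y (samples in
    rows). `reg` is a small ridge keeping the covariances invertible."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    n = X.shape[0]
    Cxx = Xc.T @ Xc / n + reg * np.eye(X.shape[1])
    Cyy = Yc.T @ Yc / n + reg * np.eye(Y.shape[1])
    Cxy = Xc.T @ Yc / n
    # Whitening factors: inv(Cxx) = Lx @ Lx.T, hence Lx.T @ Cxx @ Lx = I.
    Lx = np.linalg.cholesky(np.linalg.inv(Cxx))
    Ly = np.linalg.cholesky(np.linalg.inv(Cyy))
    # Top singular value of the whitened cross-covariance is the largest
    # canonical correlation; the singular vectors give the directions.
    U, s, Vt = np.linalg.svd(Lx.T @ Cxy @ Ly)
    return Lx @ U[:, 0], Ly @ Vt[0], s[0]
```

When the two views share a perfectly correlated component, the returned correlation approaches 1.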


2020 ◽  
Vol 10 ◽  
Author(s):  
Conghai Lu ◽  
Juan Wang ◽  
Jinxing Liu ◽  
Chunhou Zheng ◽  
Xiangzhen Kong ◽  
...  
