low rank decomposition
Recently Published Documents


TOTAL DOCUMENTS

144
(FIVE YEARS 53)

H-INDEX

16
(FIVE YEARS 4)

2021 ◽  
Vol 7 (12) ◽  
pp. 279
Author(s):  
Jobin Francis ◽  
Baburaj Madathil ◽  
Sudhish N. George ◽  
Sony George

The massive generation of data, including images and videos, has made data management, analysis, and information extraction difficult in recent years. This large volume of data must be grouped to extract relevant information. Real-life data may be corrupted by noise during collection or transmission, and most of it is unlabeled, motivating robust unsupervised clustering techniques. Traditional clustering techniques, which vectorize the images, cannot preserve the geometrical structure of the images. Hence, a robust tensor-based submodule clustering method based on l12 regularization with improved clustering capability is formulated. The l12-induced tensor nuclear norm (TNN), integrated into the proposed method, offers better low-rankness while retaining the self-expressiveness property of submodules. Unlike existing methods, the proposed method performs simultaneous noise removal by twisting the lateral image slices of the input data tensor into frontal slices and eliminating the noise content in each image using the principles of sparse and low-rank decomposition. Experiments are carried out on three datasets with varying amounts of sparse, Gaussian, and salt-and-pepper noise. The experimental results demonstrate the superior performance of the proposed method over existing state-of-the-art methods.
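The noise-removal step above rests on the sparse plus low-rank decomposition principle. As a rough illustration of that principle only (a robust-PCA-style ADMM sketch in NumPy, not the paper's l12-regularized tensor method; all parameter choices here are assumptions):

```python
import numpy as np

def soft_threshold(x, tau):
    """Element-wise soft-thresholding (proximal operator of the l1 norm)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def sparse_low_rank_split(D, lam=None, mu=1.0, n_iter=200):
    """Split D into L (low-rank) + S (sparse) via a simple ADMM scheme.

    Illustrative robust-PCA-style sketch: minimize ||L||_* + lam*||S||_1
    subject to L + S = D. Not the paper's tensor formulation.
    """
    m, n = D.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))  # common default weight
    L = np.zeros_like(D)
    S = np.zeros_like(D)
    Y = np.zeros_like(D)  # scaled Lagrange multiplier
    for _ in range(n_iter):
        # Low-rank update: singular value thresholding of D - S + Y/mu
        U, sig, Vt = np.linalg.svd(D - S + Y / mu, full_matrices=False)
        L = U @ np.diag(soft_threshold(sig, 1.0 / mu)) @ Vt
        # Sparse update: element-wise soft-thresholding of D - L + Y/mu
        S = soft_threshold(D - L + Y / mu, lam / mu)
        # Dual ascent on the constraint residual
        Y = Y + mu * (D - L - S)
    return L, S
```

Applied to a clean low-rank matrix corrupted by a few large spikes, the scheme returns the smooth part in `L` and isolates the spikes in `S`, which is the separation the abstract's noise-removal step exploits per image slice.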


Journal of Computers (電腦學刊) ◽  
2021 ◽  
Vol 32 (6) ◽  
pp. 195-205
Author(s):  
Bin Chen ◽  
Jin-Ning Zhu ◽  
Yi-Zhou Dong


Mathematics ◽  
2021 ◽  
Vol 9 (11) ◽  
pp. 1189
Author(s):  
Xindi Ma ◽  
Jie Gao ◽  
Xiaoyu Liu ◽  
Taiping Zhang ◽  
Yuanyan Tang

Non-negative matrix factorization finds a basis matrix and a weight matrix whose product approximates a non-negative matrix. It has proven to be a powerful low-rank decomposition technique for non-negative multivariate data, but its performance largely depends on the assumption of a fixed number of features. This work proposes a new probabilistic non-negative matrix factorization which factorizes a non-negative matrix into a constrained low-rank factor matrix and a non-negative weight matrix. To automatically learn the latent binary features and the feature number, a deterministic Indian buffet process variational inference is introduced to obtain the binary factor matrix, and the weight matrix is given an exponential prior. To obtain the posterior distributions of the two factor matrices, a variational Bayesian exponential-Gaussian inference model is established. Comparative experiments on synthetic and real-world datasets show the efficacy of the proposed method.
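For contrast with the fixed-rank assumption this abstract criticizes, here is a minimal sketch of classical NMF with Lee-Seung multiplicative updates in NumPy. This is the baseline being extended, not the proposed probabilistic IBP model; the rank must be supplied by hand, which is exactly what the paper's inference avoids. All names and parameters are illustrative:

```python
import numpy as np

def nmf_multiplicative(V, rank, n_iter=500, eps=1e-9, seed=0):
    """Fixed-rank NMF via Lee-Seung multiplicative updates.

    Approximates V (m x n, non-negative) as W @ H with W (m x rank)
    the basis matrix and H (rank x n) the weight matrix, minimizing
    the Frobenius reconstruction error.
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank)) + eps  # basis matrix, kept non-negative
    H = rng.random((rank, n)) + eps  # weight matrix, kept non-negative
    for _ in range(n_iter):
        # Multiplicative updates preserve non-negativity by construction;
        # eps guards against division by zero.
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H
```

Because `rank` is a hard hyperparameter here, a wrong choice either discards features or wastes capacity; the paper's Indian buffet process prior instead lets the binary factor matrix grow its own number of active features during inference.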


2021 ◽  
Vol 11 (4) ◽  
pp. 1861
Author(s):  
Zihao Rong ◽  
Shaofan Wang ◽  
Dehui Kong ◽  
Baocai Yin

Vehicle detection, as a special case of object detection, has practical value but faces challenges such as the difficulty of detecting vehicles of various orientations, severe occlusion, and background clutter. In addition, existing effective approaches, such as deep-learning-based ones, demand large amounts of training time and data, which hinders their application. In this work, we propose a dictionary-learning-based vehicle detection approach that explicitly addresses these problems. Specifically, an ensemble of sparse-and-dense dictionaries (ESDD) is learned through supervised low-rank decomposition; each pair of sparse-and-dense dictionaries (SDD) in the ensemble is trained to represent either a subcategory of vehicles (corresponding to a certain orientation range or occlusion level) or a subcategory of background (corresponding to a cluster of background patterns), and it gives good reconstructions only to samples of the corresponding subcategory, making the ESDD capable of separating vehicles from background even though both exhibit varied appearances. We further organize the ESDD into a two-level cascade (CESDD) that performs coarse-to-fine two-stage classification for better performance and reduced computation. The CESDD is then coupled with a downstream AdaBoost process to generate robust classifications. The proposed CESDD model is used as a window classifier in a sliding-window scan over image pyramids to produce multi-scale detections, and an adapted mean-shift-like non-maximum suppression process removes duplicate detections. Our CESDD vehicle detection approach is evaluated on the KITTI dataset and compared with strong counterparts; the experimental results demonstrate the effectiveness of CESDD-based classification and detection, and training the CESDD demands only a small amount of time and data.
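The final duplicate-removal step above is an adapted mean-shift-like non-maximum suppression. For background, here is a sketch of the standard greedy IoU-based NMS that such variants adapt (not the paper's mean-shift-like version; the corner-coordinate box format and threshold value are assumptions):

```python
import numpy as np

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1 = max(a[0], b[0]); y1 = max(a[1], b[1])
    x2 = min(a[2], b[2]); y2 = min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def nms(boxes, scores, iou_thresh=0.5):
    """Greedy non-maximum suppression.

    Repeatedly keep the highest-scoring remaining box and discard every
    other remaining box whose IoU with it exceeds iou_thresh.
    Returns the indices of the kept boxes.
    """
    order = np.argsort(scores)[::-1]  # candidates, best score first
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(int(i))
        # Retain only candidates that do not overlap box i too much
        rest = order[1:]
        order = rest[[iou(boxes[i], boxes[j]) <= iou_thresh for j in rest]]
    return keep
```

In a sliding-window detector like the one described, many overlapping windows fire on the same vehicle at nearby positions and scales, so some form of this suppression is what collapses them into one detection per object.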

