Thresholding Approach for Low-Rank Correlation Matrix based on MM algorithm

2021
Author(s): Kensuke Tanioka, Yuki Furotani, Satoru Hiwa

Background: Low-rank approximation is a very useful approach for interpreting the features of a correlation matrix; however, a low-rank approximation may produce estimates far from zero even when the corresponding original values are close to zero, and such results lead to misinterpretation. Methods: To overcome this problem, we propose a new approach for estimating a sparse low-rank correlation matrix based on threshold values combined with cross-validation. In the proposed approach, the MM algorithm is used to estimate the sparse low-rank correlation matrix, and a grid search is performed to select the threshold values related to the sparse estimation. Results: Through numerical simulation, we found that the false positive rate (FPR) and average relative error of the proposed method were superior to those of the tandem approach. In an application to microarray gene expression data, the FPRs of the proposed approach with d = 2, 3, and 5 were 0.128, 0.139, and 0.197, respectively, while the FPR of the tandem approach was 0.285. Conclusions: We propose a novel approach for estimating a sparse low-rank correlation matrix. Its advantage is that it provides results that are easy to interpret and helps avoid misinterpretation. We demonstrated the superiority of the proposed method through both numerical simulations and real examples.
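As a rough illustration of the kind of pipeline this abstract describes, the Python sketch below builds a rank-d approximation of a correlation matrix via truncated eigendecomposition, hard-thresholds small off-diagonal entries, and chooses the threshold by a grid search on a simple data split. It is a simplified stand-in, not the authors' MM-based estimator or their cross-validation scheme; all function names, the split criterion, and the threshold grid are illustrative assumptions.

```python
# Minimal sketch (not the authors' MM algorithm): rank-d approximation of a
# correlation matrix, hard thresholding of small entries, and a grid search
# for the threshold against a held-out sample split.
import numpy as np

def low_rank_corr(R, d):
    """Rank-d approximation of a correlation matrix via truncated eigendecomposition."""
    w, V = np.linalg.eigh(R)
    idx = np.argsort(w)[::-1][:d]            # keep the d largest eigenvalues
    L = V[:, idx] * np.sqrt(np.clip(w[idx], 0, None))
    A = L @ L.T
    np.fill_diagonal(A, 1.0)                 # keep the unit diagonal
    return A

def threshold(A, tau):
    """Set off-diagonal entries with magnitude below tau to zero."""
    S = np.where(np.abs(A) >= tau, A, 0.0)
    np.fill_diagonal(S, 1.0)
    return S

def select_tau(X, d, taus):
    """Grid search: fit on one half of the rows, score Frobenius error on the other half."""
    n = X.shape[0]
    X_tr, X_te = X[: n // 2], X[n // 2 :]
    A = low_rank_corr(np.corrcoef(X_tr, rowvar=False), d)
    R_te = np.corrcoef(X_te, rowvar=False)
    errs = [np.linalg.norm(threshold(A, t) - R_te, "fro") for t in taus]
    return taus[int(np.argmin(errs))]

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))            # hypothetical data matrix (samples x variables)
tau = select_tau(X, d=2, taus=np.linspace(0.0, 0.5, 11))
S = threshold(low_rank_corr(np.corrcoef(X, rowvar=False), 2), tau)
```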

2020, Vol 34 (05), pp. 8204-8211
Author(s): Jian Li, Xing Wang, Baosong Yang, Shuming Shi, Michael R. Lyu, ...

Recent NLP studies reveal that substantial linguistic information can be attributed to single neurons, i.e., individual dimensions of the representation vectors. We hypothesize that modeling strong interactions among neurons helps to better capture complex information by composing the linguistic properties embedded in individual neurons. Starting from this intuition, we propose a novel approach to compose representations learned by different components in neural machine translation (e.g., multi-layer networks or multi-head attention), based on modeling strong interactions among neurons in the representation vectors. Specifically, we leverage bilinear pooling to model pairwise multiplicative interactions among individual neurons, and a low-rank approximation to make the model computationally feasible. We further propose extended bilinear pooling to incorporate first-order representations. Experiments on WMT14 English⇒German and English⇒French translation tasks show that our model consistently improves performance over the SOTA Transformer baseline. Further analyses demonstrate that our approach indeed captures more syntactic and semantic information, as expected.
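For readers unfamiliar with bilinear pooling, the NumPy sketch below shows the generic low-rank form (an elementwise product of two learned projections standing in for the full pairwise interaction tensor) and the "extended" variant that appends a constant 1 so first-order neuron activations are preserved alongside the second-order products. The weight matrices U, V, P and all sizes are placeholders; this is not the paper's Transformer integration.

```python
# Minimal sketch of low-rank bilinear pooling over a single representation
# vector, plus the extended variant that keeps first-order information.
# U, V, P are hypothetical weights, not the paper's parameters.
import numpy as np

def low_rank_bilinear(x, U, V, P):
    """Approximate the vector of pairwise interactions x^T W_k x with two low-rank maps."""
    return P @ ((U @ x) * (V @ x))            # elementwise product models second-order terms

def extended_bilinear(x, U, V, P):
    """Append 1 to x so the pooled output also carries first-order neuron activations."""
    x_ext = np.concatenate([x, [1.0]])
    return low_rank_bilinear(x_ext, U, V, P)

rng = np.random.default_rng(0)
d, r, out = 512, 64, 512                       # hidden size, rank, output size (illustrative)
x = rng.standard_normal(d)
U = rng.standard_normal((r, d + 1))
V = rng.standard_normal((r, d + 1))
P = rng.standard_normal((out, r))
z = extended_bilinear(x, U, V, P)              # composed representation of size `out`
```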


2019, Vol 69 (5), pp. 464-468
Author(s): Mandar K. Bivalkar, Bambam Kumar, Dharmendra Singh

Low dielectric materials, referred to as weak targets, are very difficult to detect behind a wall in through-wall imaging (TWI) because of strong reflections from the wall. In this work, TWI experimental data were collected for a low dielectric target placed behind the wall, with the transceiver on the other side. Recently, several researchers have used low-rank approximation (LRA) to reduce random noise in various kinds of data. We explore the possibility of applying LRA to TWI data to improve the detection of low dielectric materials. A novel approach is introduced that modifies LRA by exploiting the noise subspace of the singular value decomposition (SVD) to detect weak targets behind the wall. LRA assumes that the data have low rank in the f-x domain; for noisy data, local windows are applied so that the principal assumptions of the LRA algorithm are satisfied. The TWI data are then decomposed into the noise subspace of the SVD to detect the weak target adaptively. Results of the modified LRA for detecting a weak target behind the wall are very encouraging compared with standard LRA.
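The following Python sketch illustrates the general idea of window-wise, SVD-based LRA under simplifying assumptions: each local window of the data matrix is split into a rank-k "signal" reconstruction and the residual spanned by the remaining singular components, which is one plain way to expose a noise-subspace projection. It is not the authors' modified LRA; the window size, rank, and data shapes are hypothetical.

```python
# Illustrative sketch: window-wise SVD, keeping both the rank-k reconstruction
# (dominated by strong returns such as the wall) and the residual/noise-subspace
# part in which a weak target response may become easier to see.
import numpy as np

def lra_window(D, k):
    """Split D into a rank-k 'signal' part and the residual from the remaining components."""
    U, s, Vt = np.linalg.svd(D, full_matrices=False)
    signal = U[:, :k] @ np.diag(s[:k]) @ Vt[:k]
    noise = D - signal                         # projection onto the discarded components
    return signal, noise

def local_lra(D, k=1, win=32):
    """Apply the rank-k decomposition window by window along the trace axis."""
    signal = np.zeros_like(D)
    noise = np.zeros_like(D)
    for start in range(0, D.shape[1], win):
        sl = slice(start, min(start + win, D.shape[1]))
        signal[:, sl], noise[:, sl] = lra_window(D[:, sl], k)
    return signal, noise

rng = np.random.default_rng(0)
D = rng.standard_normal((256, 128))            # placeholder for a frequency x trace data matrix
sig, noi = local_lra(D, k=1, win=32)
```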


Author(s): Tingting Ren, Xiuyi Jia, Weiwei Li, Shu Zhao

Label distribution learning (LDL) can be viewed as a generalization of multi-label learning. This novel paradigm focuses on the relative importance of different labels to a particular instance. Most previous LDL methods either ignore the correlation among labels or exploit label correlations only in a global way. In this paper, we utilize both the global and local relevance among labels to provide more information for model training and propose a novel label distribution learning algorithm. In particular, a label correlation matrix based on low-rank approximation is applied to capture the global label correlations. In addition, the label correlation among local samples is used to modify the label correlation matrix. Experimental results on real-world data sets show that the proposed algorithm outperforms state-of-the-art LDL methods.
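As one possible reading of the two ingredients above, the sketch below approximates a global label correlation matrix at low rank via truncated SVD and blends it with a correlation matrix computed from a sample's nearest neighbours, standing in for "local" label correlation. The neighbourhood construction, the blend weight alpha, the rank, and all sizes are assumptions for illustration, not the algorithm proposed in the paper.

```python
# Minimal sketch under simplifying assumptions: global low-rank label correlation
# plus a locally computed correction from each sample's nearest neighbours.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def low_rank(C, r):
    """Rank-r approximation of a correlation matrix via truncated SVD."""
    U, s, Vt = np.linalg.svd(C)
    return U[:, :r] @ np.diag(s[:r]) @ Vt[:r]

def local_corr(D, i, nn, alpha=0.5, r=3):
    """Blend the global low-rank correlation with one computed from sample i's neighbours."""
    C_global = low_rank(np.corrcoef(D, rowvar=False), r)
    _, idx = nn.kneighbors(D[i : i + 1])
    C_local = np.corrcoef(D[idx[0]], rowvar=False)
    return alpha * C_global + (1 - alpha) * C_local

rng = np.random.default_rng(0)
D = rng.dirichlet(np.ones(6), size=100)        # toy label distributions: 100 samples, 6 labels
nn = NearestNeighbors(n_neighbors=10).fit(D)
C_0 = local_corr(D, i=0, nn=nn)                # modified label correlation matrix for sample 0
```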


2020, Vol 14 (12), pp. 2791-2798
Author(s): Xiaoqun Qiu, Zhen Chen, Saifullah Adnan, Hongwei He

2020, Vol 6, pp. 922-933
Author(s): M. Amine Hadj-Youcef, Francois Orieux, Alain Abergel, Aurelia Fraysse

2021, Vol 11 (10), pp. 4582
Author(s): Kensuke Tanioka, Satoru Hiwa

Background: In the domain of functional magnetic resonance imaging (fMRI) data analysis, given two correlation matrices between regions of interest (ROIs) for the same subject, it is important to reveal relatively large differences to ensure accurate interpretation. However, clustering results based only on the differences tend to be unsatisfactory, and interpreting the features tends to be difficult, because the differences likely suffer from noise. Therefore, to overcome these problems, we propose a new approach for dimensional reduction clustering. Methods: Our proposed dimensional reduction clustering approach consists of low-rank approximation and a clustering algorithm. The low-rank matrix, which reflects the difference, is estimated from the inner product of the difference matrix, not only from the difference itself. In addition, the low-rank matrix is calculated based on the majorize–minimization (MM) algorithm such that the difference is bounded within the range −1 to 1. For the clustering process, ordinary k-means is applied to the estimated low-rank matrix, which emphasizes the clustering structure. Results: Numerical simulations show that, compared with other approaches that are based only on the differences, the proposed method provides superior performance in recovering the true clustering structure. Moreover, as demonstrated through a real-data example of brain activity measured via fMRI during the performance of a working memory task, the proposed method can visually provide interpretable community structures consisting of well-known brain functional networks, which can be associated with the human working memory system. Conclusions: The proposed dimensional reduction clustering approach is a very useful tool for revealing and interpreting the differences between correlation matrices, even when the true differences tend to be relatively small.
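A minimal sketch of the overall pipeline, assuming a plain truncated eigendecomposition in place of the MM-based estimation (so, unlike the proposed method, the entries are not bounded to the range −1 to 1): embed the difference between two ROI correlation matrices through the eigenvectors of its inner product and cluster the embedding rows with ordinary k-means. The data shapes, embedding dimension, and number of clusters are illustrative.

```python
# Illustrative sketch, not the MM-based estimator: low-rank embedding of the
# difference matrix from the eigendecomposition of its inner product, followed
# by k-means on the ROI embedding.
import numpy as np
from sklearn.cluster import KMeans

def embed_difference(R1, R2, d=2):
    """Low-rank embedding of the difference matrix via its top eigencomponents."""
    D = R1 - R2
    w, V = np.linalg.eigh(D @ D.T)             # inner product of the difference matrix
    idx = np.argsort(w)[::-1][:d]
    return V[:, idx] * np.sqrt(np.clip(w[idx], 0, None))

rng = np.random.default_rng(0)
X1 = rng.standard_normal((200, 30))            # toy time series for 30 ROIs, condition 1
X2 = rng.standard_normal((200, 30))            # toy time series for 30 ROIs, condition 2
R1 = np.corrcoef(X1, rowvar=False)
R2 = np.corrcoef(X2, rowvar=False)
Z = embed_difference(R1, R2, d=2)              # one 2-D embedding point per ROI
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(Z)
```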

