Bayesian Low-Tubal-Rank Robust Tensor Factorization with Multi-Rank Determination

Author(s):  
Yang Zhou ◽  
Yiu-ming Cheung
Author(s):  
Prachi Jain ◽  
Shikhar Murty ◽  
Mausam ◽
Soumen Chakrabarti

This paper analyzes the varied performance of Matrix Factorization (MF) on the related tasks of relation extraction and knowledge-base completion, which have been unified recently into a single framework of knowledge-base inference (KBI) [Toutanova et al., 2015]. We first propose a new evaluation protocol that makes comparisons between MF and Tensor Factorization (TF) models fair. We find that this results in a steep drop in MF performance. Our analysis attributes this to the high out-of-vocabulary (OOV) rate of entity pairs in test folds of commonly-used datasets. To alleviate this issue, we propose three extensions to MF. Our best model is a TF-augmented MF model. This hybrid model is robust and obtains strong results across various KBI datasets.
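
To make the contrast concrete, the sketch below illustrates in Python why out-of-vocabulary entity pairs hurt MF but not TF: MF learns one embedding per (subject, object) pair and therefore cannot score a pair it never saw during training, whereas a TF model such as DistMult composes per-entity embeddings at scoring time. The toy vocabulary, dimensions, and scoring functions are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumed, not the paper's code): MF vs. TF scoring of a
# knowledge-base fact (subject, relation, object), and why OOV entity pairs
# are a problem for MF.
import numpy as np

rng = np.random.default_rng(0)
DIM = 8
N_REL = 3

# MF: one embedding per (subject, object) PAIR seen in training.
pair_vocab = {("paris", "france"): 0, ("tokyo", "japan"): 1}
mf_pair_emb = rng.normal(size=(len(pair_vocab), DIM))
mf_rel_emb = rng.normal(size=(N_REL, DIM))

# TF (DistMult-style): one embedding per ENTITY, composed at score time.
entity_vocab = {"paris": 0, "france": 1, "tokyo": 2, "japan": 3,
                "rome": 4, "italy": 5}
tf_ent_emb = rng.normal(size=(len(entity_vocab), DIM))
tf_rel_emb = rng.normal(size=(N_REL, DIM))

def mf_score(subj, obj, rel):
    """Dot product of the entity-pair embedding and the relation embedding.
    Returns None when the pair never appeared in training (OOV pair)."""
    idx = pair_vocab.get((subj, obj))
    if idx is None:
        return None  # MF has no representation for an unseen pair
    return float(mf_pair_emb[idx] @ mf_rel_emb[rel])

def tf_score(subj, obj, rel):
    """Trilinear DistMult-style score built from per-entity embeddings,
    so any pair of known entities can be scored, even if unseen as a pair."""
    s = tf_ent_emb[entity_vocab[subj]]
    o = tf_ent_emb[entity_vocab[obj]]
    return float(np.sum(s * tf_rel_emb[rel] * o))

print(mf_score("rome", "italy", rel=0))  # None: pair is OOV for MF
print(tf_score("rome", "italy", rel=0))  # finite score: TF composes entities
```

One plausible way to combine the two, in the spirit of the TF-augmented MF model mentioned above, is to back off to the TF score whenever the pair embedding is missing.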


2021 ◽  
Vol 216 ◽  
pp. 106657
Author(s):  
Jin-Ju Wang ◽  
Ding-Cheng Wang ◽  
Ting-Zhu Huang ◽  
Jie Huang ◽  
Xi-Le Zhao ◽  
...  

Author(s):  
Clément Luneau ◽  
Jean Barbier ◽  
Nicolas Macris

Abstract We consider a statistical model for finite-rank symmetric tensor factorization and prove a single-letter variational expression for its asymptotic mutual information when the tensor is of even order. The proof applies the adaptive interpolation method originally invented for rank-one factorization. Here we show how to extend the adaptive interpolation to finite-rank and even-order tensors. This requires new, non-trivial ideas compared with the existing analyses in the literature. We also underline where the proof falls short when dealing with odd-order tensors.
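
For readers unfamiliar with the setting, the display below sketches a standard formulation of a rank-k, order-p symmetric spiked tensor model and the normalized mutual information whose limit such single-letter results characterize. The notation, scaling, and Gaussian-noise assumption are standard in this literature and are illustrative assumptions, not quoted from the paper.

```latex
% Illustrative (assumed) rank-k, order-p symmetric spiked tensor model:
% one observes, for 1 <= i_1 <= ... <= i_p <= n,
\[
  Y_{i_1 \dots i_p}
  = \sqrt{\frac{\lambda}{n^{\,p-1}}} \sum_{r=1}^{k}
    x^{(r)}_{i_1} x^{(r)}_{i_2} \cdots x^{(r)}_{i_p}
  + Z_{i_1 \dots i_p},
  \qquad Z_{i_1 \dots i_p} \overset{\mathrm{iid}}{\sim} \mathcal{N}(0,1),
\]
% where the latent vectors x^{(r)} in R^n are drawn from a prior. A
% single-letter result expresses the limit of the normalized mutual information
\[
  \lim_{n \to \infty} \frac{1}{n}\, I\bigl(\{x^{(r)}\}_{r=1}^{k} ;\, Y\bigr)
\]
% as a finite-dimensional variational problem over a k x k overlap matrix,
% rather than an n-dimensional quantity.
```

Roughly speaking, the adaptive interpolation method proves such expressions by continuously deforming the tensor observation channel into a decoupled channel whose mutual information factorizes, choosing the interpolation path adaptively so that the remainder terms can be controlled.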


2017 ◽  
Vol 7 (1) ◽  
Author(s):  
Yejin Kim ◽  
Robert El-Kareh ◽  
Jimeng Sun ◽  
Hwanjo Yu ◽  
Xiaoqian Jiang