Rank, Trace-Norm and Max-Norm

Author(s):  
Nathan Srebro ◽  
Adi Shraibman
Author(s):  
Takeshi Teshima ◽  
Miao Xu ◽  
Issei Sato ◽  
Masashi Sugiyama

We consider the problem of recovering a low-rank matrix from its clipped observations. Clipping arises in many scientific areas and obstructs statistical analysis. Matrix completion (MC) methods can recover a low-rank matrix from various information deficits by exploiting the principle of low-rank completion. However, the current theoretical guarantees for low-rank MC do not apply to clipped matrices, because the deficit depends on the underlying values; the feasibility of clipped matrix completion (CMC) is therefore not trivial. In this paper, we first provide a theoretical guarantee for the exact recovery of CMC using a trace-norm minimization algorithm. Furthermore, we propose practical CMC algorithms by extending ordinary MC methods: we use the squared hinge loss in place of the squared loss to reduce the penalty for overestimation on clipped entries. We also propose a novel regularization term tailored to CMC. It combines two trace-norm terms, and we theoretically bound the recovery error under this regularization. We demonstrate the effectiveness of the proposed methods through experiments on both synthetic and benchmark data for recommendation systems.
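The loss substitution described in the abstract can be sketched as follows. This is a hedged illustration, not the authors' code: the function `clipped_loss`, the `ceiling` parameter, and the entrywise masking are names and assumptions of my own. The idea is that on entries observed at the clipping threshold, a prediction above the threshold should incur no penalty, so the squared loss is replaced by a one-sided squared hinge loss there:

```python
import numpy as np

def clipped_loss(pred, obs, ceiling):
    # Illustrative per-entry loss for clipped matrix completion (CMC).
    # Unclipped entries (obs < ceiling): ordinary squared loss.
    # Clipped entries (obs >= ceiling): squared hinge loss
    # max(0, ceiling - pred)^2, so overestimating a clipped entry
    # (pred >= ceiling) costs nothing.
    clipped = obs >= ceiling
    squared = (pred - obs) ** 2
    hinge = np.maximum(0.0, ceiling - pred) ** 2
    return np.where(clipped, hinge, squared).sum()

# Entry 0 is unclipped (2.0 < 5.0), entry 1 is clipped at the ceiling 5.0.
obs = np.array([2.0, 5.0])
loss_over = clipped_loss(np.array([2.0, 7.0]), obs, 5.0)   # overestimate on clipped entry: no penalty
loss_under = clipped_loss(np.array([3.0, 4.0]), obs, 5.0)  # underestimate on clipped entry: penalized
```

Predicting 7.0 for the clipped entry is consistent with a true value that was cut off at 5.0, so only underestimation is penalized there.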


Quantum ◽  
2018 ◽  
Vol 2 ◽  
pp. 51
Author(s):  
Daniel Puzzuoli

Given a linear map Φ : M_n → M_m, its multiplicity maps are defined as the family of linear maps Φ ⊗ id_k : M_n ⊗ M_k → M_m ⊗ M_k, where id_k denotes the identity on M_k. Let ‖·‖_1 denote the trace norm on matrices, as well as the induced trace norm on linear maps of matrices, i.e. ‖Φ‖_1 = max{‖Φ(X)‖_1 : X ∈ M_n, ‖X‖_1 = 1}. A fact of fundamental importance in both operator algebras and quantum information is that ‖Φ ⊗ id_k‖_1 can grow with k. In general, the rate of growth is bounded by ‖Φ ⊗ id_k‖_1 ≤ k‖Φ‖_1, and matrix transposition is the canonical example of a map achieving this bound. We prove that, up to an equivalence, the transpose is the unique map achieving this bound. The equivalence is given in terms of complete trace-norm isometries, and the proof relies on a particular characterization of complete trace-norm isometries regarding preservation of certain multiplication relations. We use this result to characterize the set of single-shot quantum channel discrimination games satisfying a norm relation that, operationally, implies that the game can be won with certainty using entanglement but is hard to win without entanglement. Specifically, we show that the well-known example of such a game, involving the Werner-Holevo channels, is essentially the unique game satisfying this norm relation. This constitutes a step towards a characterization of single-shot quantum channel discrimination games with maximal gap between the optimal performance of entangled and unentangled strategies.


2018 ◽  
Vol 67 (6) ◽  
pp. 1121-1131 ◽  
Author(s):  
Juan Monsalve ◽  
Juan Rada

2016 ◽  
Vol 28 (4) ◽  
pp. 686-715 ◽  
Author(s):  
Kishan Wimalawarne ◽  
Ryota Tomioka ◽  
Masashi Sugiyama

We theoretically and experimentally investigate tensor-based regression and classification. Our focus is regularization with various tensor norms, including the overlapped trace norm, the latent trace norm, and the scaled latent trace norm. We first give dual optimization methods using the alternating direction method of multipliers, which is computationally efficient when the number of training samples is moderate. We then theoretically derive an excess risk bound for each tensor norm and clarify their behavior. Finally, we perform extensive experiments using simulated and real data and demonstrate the superiority of tensor-based learning methods over vector- and matrix-based learning methods.
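As a concrete illustration of the first regularizer mentioned above: the overlapped trace norm of a tensor is the sum of the trace norms of its mode unfoldings. The sketch below uses hypothetical helper names and is not the authors' implementation:

```python
import numpy as np

def unfold(T, mode):
    # Mode-m unfolding: the chosen mode becomes the rows,
    # all remaining modes are flattened into the columns.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def trace_norm(M):
    # Trace norm (nuclear norm) = sum of singular values.
    return np.linalg.svd(M, compute_uv=False).sum()

def overlapped_trace_norm(T):
    # Sum of the trace norms of every mode unfolding.
    return sum(trace_norm(unfold(T, m)) for m in range(T.ndim))

# For a rank-1 tensor of unit vectors, each unfolding is rank-1 with
# singular value 1, so the overlapped trace norm equals the number of modes.
e = np.array([1.0, 0.0])
T = np.einsum('i,j,k->ijk', e, e, e)
value = overlapped_trace_norm(T)  # 3.0 for this 3-mode tensor
```

The latent and scaled latent trace norms instead infimize over additive decompositions of the tensor, penalizing each component through a single (possibly rescaled) unfolding.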


2015 ◽  
Vol 91 (4) ◽  
Author(s):  
Lian-He Shao ◽  
Zhengjun Xi ◽  
Heng Fan ◽  
Yongming Li

2015 ◽  
Vol 484 ◽  
pp. 396-408 ◽  
Author(s):  
Yuan Li ◽  
Yu-E Li

2016 ◽  
Vol 55 (11) ◽  
pp. 4866-4877 ◽  
Author(s):  
Yu-Xia Xie ◽  
Jing Liu ◽  
Hong Ma
