Improving the Incoherence of a Learned Dictionary via Rank Shrinkage

2017 ◽  
Vol 29 (1) ◽  
pp. 263-285 ◽  
Author(s):  
Shashanka Ubaru ◽  
Abd-Krim Seghouane ◽  
Yousef Saad

This letter considers the problem of learning dictionaries for sparse signal representation whose atoms have low mutual coherence. To learn such dictionaries, at each step we first update the dictionary using the method of optimal directions (MOD) and then apply a dictionary rank shrinkage step to decrease its mutual coherence. In the rank shrinkage step, we first compute a rank-1 decomposition of the column-normalized least squares estimate of the dictionary obtained from the MOD step. We then shrink the rank of this learned dictionary by recasting the rank-reduction problem as a nonnegative garrotte estimation problem and solving it with a path-wise coordinate descent approach. We establish theoretical results showing that the included rank shrinkage step reduces the coherence of the dictionary, which is further validated by experimental results. Numerical experiments comparing the performance of the proposed algorithm with several other well-known dictionary learning algorithms are also presented.
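As a rough illustration (not taken from the paper), the sketch below implements the MOD least-squares dictionary update with column normalization and the mutual-coherence measure the method aims to reduce; the garrotte-based rank shrinkage is replaced by a simple singular-value shrinkage stand-in, so the function `shrink_rank_svd` and its threshold `tau` are hypothetical.

```python
import numpy as np

def mod_update(Y, X, eps=1e-10):
    """One MOD step: least-squares dictionary for data Y (m x n) and codes X (k x n)."""
    D = Y @ X.T @ np.linalg.pinv(X @ X.T)                  # D = Y X^T (X X^T)^{-1}
    return D / (np.linalg.norm(D, axis=0, keepdims=True) + eps)  # column-normalize atoms

def mutual_coherence(D):
    """Largest absolute inner product between distinct (normalized) atoms."""
    G = np.abs(D.T @ D)
    np.fill_diagonal(G, 0.0)
    return G.max()

def shrink_rank_svd(D, tau=0.1):
    """Illustrative stand-in for rank shrinkage: soft-threshold the singular values
    (the paper uses a nonnegative garrotte solved by path-wise coordinate descent)."""
    U, s, Vt = np.linalg.svd(D, full_matrices=False)
    Ds = U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt
    return Ds / (np.linalg.norm(Ds, axis=0, keepdims=True) + 1e-10)
```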

2018 ◽  
Vol 26 (2) ◽  
pp. 171-184 ◽  
Author(s):  
Nianci Feng ◽  
Jianjun Wang ◽  
Wendong Wang

In this paper, the iterative reweighted least squares (IRLS) algorithm for sparse signal recovery with partially known support is studied. We establish a theoretical analysis of the IRLS algorithm that incorporates the known part of the support as a prior, and obtain an error estimate and a convergence result for this algorithm. Our results show that the error bound depends on the best (s+k)-term approximation and the regularization parameter λ, while the convergence result depends only on the regularization parameter λ. Finally, a series of numerical experiments is carried out to demonstrate the effectiveness of the algorithm for sparse signal recovery with partially known support; they show that an appropriate q (0 < q < 1) can lead to better recovery performance than the case q = 1.
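A minimal IRLS sketch for this setting, assuming the common variant in which indices in the known support are simply left unpenalized in the weighted objective; the function name `irls_partial_support`, the geometric decay of the smoothing parameter `eps`, and the equality-constrained formulation A x = b are illustrative choices, not the paper's exact algorithm.

```python
import numpy as np

def irls_partial_support(A, b, known_support, q=0.5, n_iter=50, eps=1.0):
    """
    IRLS for sparse recovery with partially known support (illustrative sketch).
    Weights are w_i = (x_i^2 + eps^2)^(q/2 - 1); indices in `known_support` are
    left (almost) unpenalized. Each iteration solves the weighted least-squares
    problem  min sum_i w_i x_i^2  subject to  A x = b  in closed form.
    """
    x = np.linalg.lstsq(A, b, rcond=None)[0]          # least-squares initialization
    for _ in range(n_iter):
        w = (x**2 + eps**2) ** (q / 2 - 1)            # l_q reweighting
        w[list(known_support)] = 1e-8                 # do not penalize the known support
        Winv = np.diag(1.0 / w)
        x = Winv @ A.T @ np.linalg.solve(A @ Winv @ A.T, b)
        eps *= 0.9                                    # slowly shrink the smoothing parameter
    return x
```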


2015 ◽  
Vol 39 (4) ◽  
pp. 537-554 ◽  
Author(s):  
Fatemeh Panjeh Ali Beik ◽  
Davod Khojasteh Salkuyeh

This paper deals with developing a robust iterative algorithm to find the least-squares (P, Q)-orthogonal symmetric and skew-symmetric solution sets of the generalized coupled matrix equations. To this end, some properties of these types of matrices are first established. Furthermore, an approach is offered to determine the optimal approximate (P, Q)-orthogonal (skew-)symmetric solution pair corresponding to a given arbitrary matrix pair. Some numerical experiments are reported to confirm the validity of the theoretical results and to illustrate the effectiveness of the proposed algorithm.
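For illustration only, assuming the convention that a matrix X is called (P, Q)-(skew-)symmetric when P X Q is (skew-)symmetric for orthogonal P and Q, the Frobenius-norm projection onto that set has a simple closed form, since X maps to P X Q isometrically in the Frobenius norm; the paper's own definition and iterative solver may differ.

```python
import numpy as np

def project_PQ_symmetric(X, P, Q, skew=False):
    """
    Frobenius-norm projection of X onto {X : P X Q is (skew-)symmetric},
    assuming P and Q are orthogonal so that X -> P X Q is an isometry.
    This uses an assumed, illustrative definition of (P, Q)-symmetry.
    """
    S = P @ X @ Q
    S = 0.5 * (S - S.T) if skew else 0.5 * (S + S.T)   # (skew-)symmetric part
    return P.T @ S @ Q.T                               # map back using P^{-1} = P^T, Q^{-1} = Q^T
```

A projection of this kind is the sort of building block an iterative least-squares solver over such a matrix class would apply repeatedly.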


Mathematics ◽  
2021 ◽  
Vol 9 (9) ◽  
pp. 994
Author(s):  
Elisa Alòs ◽  
Jorge A. León

Here, we review some results on fractional volatility models, where the volatility is driven by fractional Brownian motion (fBm). In these models, the future average volatility is not a process adapted to the underlying filtration, and fBm is in general not a semimartingale. Hence, classical Itô calculus cannot explain how the memory properties of fBm allow us to describe some empirical findings of the implied volatility surface through Hull and White type formulas. Malliavin calculus instead provides a natural approach to deal with the implied volatility without assuming any particular structure of the volatility. The aim of this paper is to provide the basic tools of Malliavin calculus for the study of fractional volatility models; that is, we explain how the long and short memory of fBm improves the description of the implied volatility. In particular, we consider in detail a model that combines the long and short memory properties of fBm as an example of the approach introduced in this paper. The theoretical results are tested with numerical experiments.
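As a small numerical aside (not from the paper), fBm sample paths can be simulated from the Cholesky factor of the covariance E[B_H(t) B_H(s)] = (t^{2H} + s^{2H} - |t - s|^{2H}) / 2, which makes the short-memory (H < 1/2) versus long-memory (H > 1/2) behaviour referred to above easy to visualize; the exponential volatility process at the end is a hypothetical example, not the model studied in the paper.

```python
import numpy as np

def fbm_path(n_steps, T=1.0, H=0.1, seed=0):
    """
    Sample a fractional Brownian motion path on [0, T] via the Cholesky factor
    of its covariance E[B_H(t) B_H(s)] = 0.5 * (t^{2H} + s^{2H} - |t - s|^{2H}).
    H < 1/2 gives rough, short-memory paths; H > 1/2 gives long-memory paths.
    """
    rng = np.random.default_rng(seed)
    t = np.linspace(T / n_steps, T, n_steps)
    cov = 0.5 * (t[:, None] ** (2 * H) + t[None, :] ** (2 * H)
                 - np.abs(t[:, None] - t[None, :]) ** (2 * H))
    L = np.linalg.cholesky(cov + 1e-12 * np.eye(n_steps))   # small jitter for stability
    return np.concatenate(([0.0], L @ rng.standard_normal(n_steps)))

# Hypothetical volatility driver: sigma_t = sigma_0 * exp(nu * B_H(t))
sigma = 0.2 * np.exp(1.5 * fbm_path(252, H=0.1))
```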


SIAM Review ◽  
1966 ◽  
Vol 8 (3) ◽  
pp. 384-386 ◽  
Author(s):  
J. L. Farrell ◽  
J. C. Stuelpnagel ◽  
R. H. Wessner ◽  
J. R. Velman ◽  
J. E. Brook
