Bregman Distances
Recently Published Documents

TOTAL DOCUMENTS: 36 (last five years: 8)
H-INDEX: 10 (last five years: 1)

Author(s): Xin Jiang, Lieven Vandenberghe

Abstract: We present a new variant of the Chambolle–Pock primal–dual algorithm with Bregman distances, analyze its convergence, and apply it to the centering problem in sparse semidefinite programming. The novelty in the method is a line search procedure for selecting suitable step sizes. The line search obviates the need for estimating the norm of the constraint matrix and the strong convexity constant of the Bregman kernel. As an application, we discuss the centering problem in large-scale semidefinite programming with sparse coefficient matrices. The logarithmic barrier function for the cone of positive semidefinite completable sparse matrices is used as the distance-generating kernel. For this distance, the complexity of evaluating the Bregman proximal operator is shown to be roughly proportional to the cost of a sparse Cholesky factorization. This is much cheaper than the standard proximal operator with Euclidean distances, which requires an eigenvalue decomposition.
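The distance-generating kernel referred to above induces the Bregman distance $D_h(x, y) = h(x) - h(y) - \langle \nabla h(y), x - y \rangle$. A minimal sketch of this definition (hypothetical helper names, not code from the paper), checked against two standard kernels:

```python
import numpy as np

def bregman_distance(h, grad_h, x, y):
    """Bregman distance D_h(x, y) = h(x) - h(y) - <grad h(y), x - y>."""
    return h(x) - h(y) - np.dot(grad_h(y), x - y)

# Squared Euclidean kernel: D_h reduces to (1/2) * ||x - y||^2.
sq = lambda v: 0.5 * np.dot(v, v)
grad_sq = lambda v: v

# Shannon-entropy kernel on the positive orthant: D_h is the
# (generalized) Kullback-Leibler divergence.
ent = lambda v: np.sum(v * np.log(v))
grad_ent = lambda v: np.log(v) + 1.0

x = np.array([0.2, 0.3, 0.5])
y = np.array([0.25, 0.25, 0.5])

d_euclid = bregman_distance(sq, grad_sq, x, y)  # equals 0.5 * ||x - y||^2
d_kl = bregman_distance(ent, grad_ent, x, y)    # equals sum x_i * log(x_i / y_i) here
```

The logarithmic barrier kernel used in the paper follows the same pattern, with $h(X) = -\log\det X$ and $\nabla h(X) = -X^{-1}$ on the relevant matrix cone.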


Author(s): Mahesh Chandra Mukkamala, Jalal Fadili, Peter Ochs

Abstract: Lipschitz continuity of the gradient mapping of a continuously differentiable function plays a crucial role in designing various optimization algorithms. However, many functions arising in practical applications, such as low-rank matrix factorization or deep neural network problems, do not have a Lipschitz continuous gradient. This led to the development of a generalized notion known as the L-smad property, which is based on generalized proximity measures called Bregman distances. However, the L-smad property cannot handle nonsmooth functions: even simple nonsmooth functions such as $|x^4 - 1|$, as well as many practical composite problems, are out of its scope. We fix this issue by proposing the MAP property, which generalizes the L-smad property and is also valid for a large class of structured nonconvex nonsmooth composite problems. Based on the proposed MAP property, we propose a globally convergent algorithm called Model BPG, which unifies several existing algorithms. The convergence analysis is based on a new Lyapunov function. We also numerically illustrate the superior performance of Model BPG on standard phase retrieval problems and Poisson linear inverse problems, compared to a state-of-the-art optimization method for generic nonconvex nonsmooth optimization problems.
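Methods in the Model BPG family take Bregman proximal gradient steps of the form $x^{+} = \operatorname{argmin}_u\, \langle \nabla f(x), u \rangle + \tfrac{1}{t} D_h(u, x)$. As a hedged illustration (not the paper's algorithm or code), with the Shannon-entropy kernel on the probability simplex this step has a closed-form multiplicative update:

```python
import numpy as np

def bpg_step_simplex(grad_f, x, step):
    """One Bregman proximal gradient (mirror descent) step on the probability
    simplex with the Shannon-entropy kernel.  The subproblem
        argmin_u  <grad_f(x), u> + (1/step) * D_h(u, x)
    has the closed-form "exponentiated gradient" solution below."""
    u = x * np.exp(-step * grad_f(x))
    return u / u.sum()

# Illustrative smooth objective f(x) = <c, x>, with a hypothetical cost vector c.
c = np.array([1.0, 2.0, 3.0])
grad_f = lambda x: c

x = np.full(3, 1.0 / 3.0)
for _ in range(200):
    x = bpg_step_simplex(grad_f, x, step=0.5)
# The iterates stay on the simplex and concentrate mass on the coordinate
# with the smallest cost.
```

The multiplicative form is exactly why a non-Euclidean kernel helps here: the update preserves positivity and the simplex constraint without any projection.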


Author(s): Juan Enrique Martínez-Legaz, Maryam Tamadoni Jahromi, Eskandar Naraghirad

Abstract: We investigate convergence properties of Bregman distances induced by convex representations of maximally monotone operators. We also introduce and study the projection mappings associated with such distances.


Author(s): Kevin P. Josey, Elizabeth Juarez-Colunga, Fan Yang, Debashis Ghosh

Optimization, 2019, Vol. 68(8), pp. 1599–1624
Author(s): Xian-Fa Luo, Li Meng, Ching-Feng Wen, Jen-Chih Yao

2018, Vol. 26(5), pp. 639–646
Author(s): Jens Flemming

Abstract: We consider Tikhonov-type variational regularization of ill-posed linear operator equations in Banach spaces with general convex penalty functionals. Upper bounds for certain error measures expressing the distance between exact and regularized solutions, especially for Bregman distances, can be obtained from variational source conditions. We prove that such bounds are optimal in the case of twisted Bregman distances, a specific a priori parameter choice, and low regularity of the exact solution; that is, the rate function is also an asymptotic lower bound for the error measure. This result extends existing converse results from Hilbert space settings to Banach spaces without recourse to spectral theory.


Author(s): Martin Burger, Tapio Helin, Hanne Kekkonen

Abstract: In this paper we consider variational regularization methods for inverse problems with large noise, which is in general unbounded in the image space of the forward operator. We introduce a Banach space setting that allows one to define a reasonable notion of solutions for more general noise in a larger space, provided the forward operators have sufficient mapping properties. A key observation, which guides us through the subsequent analysis, is that such a general noise model can be understood within the same setting as approximate source conditions (while a standard model of bounded noise is related directly to classical source conditions). Based on this insight we obtain a quite general existence result for regularized variational problems and derive error estimates in terms of Bregman distances. The latter is specialized for the particularly important cases of one- and $p$-homogeneous regularization functionals. As a natural further step we study stochastic noise models, in particular white noise, for which we derive error estimates in terms of the expectation of the Bregman distance. The finiteness of certain expectations leads to a novel class of abstract smoothness conditions on the forward operator, which can be easily interpreted in the Hilbert space case. We finally exemplify the approach, and in particular the conditions, for popular regularization functionals given by the squared norm, the Besov norm and the total variation.
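For a one-homogeneous functional such as the $\ell_1$ norm, the (generalized) Bregman distance mentioned above is taken with respect to a subgradient $p \in \partial h(y)$: $D_h^p(x, y) = h(x) - h(y) - \langle p, x - y \rangle$. A small sketch (assuming the common subgradient choice $p = \operatorname{sign}(y)$, valid where $y$ has no zero entries; not code from the paper):

```python
import numpy as np

def l1_bregman_distance(x, y):
    """Generalized Bregman distance of h = ||.||_1 at y, using the
    subgradient p = sign(y) (assumes y has no zero entries)."""
    p = np.sign(y)
    # By one-homogeneity, h(y) = <p, y>, so this simplifies to h(x) - <p, x>.
    return np.sum(np.abs(x)) - np.sum(np.abs(y)) - np.dot(p, x - y)

x = np.array([1.0, -2.0, -0.5])
y = np.array([0.5, -1.0, 1.0])

d = l1_bregman_distance(x, y)  # positive: x and y disagree in one sign
d_zero = l1_bregman_distance(np.array([3.0, -4.0, 2.0]), y)
# d_zero is 0: the l1 Bregman distance vanishes whenever the sign
# patterns of x and y agree, regardless of magnitudes.
```

This degeneracy (zero distance for matching sign patterns) is characteristic of one-homogeneous penalties and is why error estimates in such Bregman distances control the support/sign structure of the solution rather than its norm.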

