The Generalized Bregman Distance

2021, Vol 31 (1), pp. 404-424
Author(s): Regina S. Burachik, Minh N. Dao, Scott B. Lindstrom
2012, Vol 2012, pp. 1-12
Author(s): Tian-Yuan Kuo, Jyh-Chung Jeng, Young-Ye Huang, Chung-Chien Hong

We introduce the class of (α,β)-hybrid mappings relative to a Bregman distance D_f in a Banach space, and then we study the fixed point and weak convergence problems for such mappings.
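As background for entries like this one, the Bregman distance induced by a differentiable convex function f is D_f(x, y) = f(x) − f(y) − ⟨∇f(y), x − y⟩. A minimal numerical sketch (function names here are illustrative, not taken from the paper):

```python
import numpy as np

def bregman_distance(f, grad_f, x, y):
    """D_f(x, y) = f(x) - f(y) - <grad f(y), x - y>."""
    return f(x) - f(y) - np.dot(grad_f(y), x - y)

# With f(x) = 0.5 * ||x||^2, D_f reduces to the squared
# Euclidean distance 0.5 * ||x - y||^2.
f = lambda x: 0.5 * np.dot(x, x)
grad_f = lambda x: x
x = np.array([1.0, 2.0])
y = np.array([0.0, 0.0])
print(bregman_distance(f, grad_f, x, y))  # 2.5
```

For convex f, D_f(x, y) ≥ 0 always holds, but D_f is generally asymmetric and fails the triangle inequality, which is why it is a "distance" only in a loose sense.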


2018, Vol 32 (1), pp. 263-274
Author(s): Dan Ştefan Marinescu, Mihai Monea

Abstract: The aim of this paper is to extend a result presented by Roman Ger during the 15th International Conference on Functional Equations and Inequalities. First, we present some necessary and sufficient conditions for a continuous function to be convex. We then use these to extend Ger's result. Finally, we make some connections with other mathematical notions, such as g-convex dominated functions and the Bregman distance.
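A classical criterion of the kind this abstract refers to (a standard textbook fact; the paper's exact conditions may differ) is that continuity upgrades midpoint convexity to full convexity:

```latex
% Standard convexity criterion: for a continuous f on an interval I,
% f is convex on I  <=>  f is midpoint convex on I, i.e.
\[
  f\!\left(\frac{x+y}{2}\right) \;\le\; \frac{f(x)+f(y)}{2}
  \qquad \text{for all } x, y \in I .
\]
```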


Author(s): Hui Zhang, Yu-Hong Dai, Lei Guo, Wei Peng

We introduce a unified algorithmic framework, called the proximal-like incremental aggregated gradient (PLIAG) method, for minimizing the sum of a convex function that consists of additive relatively smooth convex components and a proper lower semicontinuous convex regularization function over an abstract feasible set whose geometry can be captured by the domain of a Legendre function. The PLIAG method includes many existing algorithms in the literature as special cases, such as the proximal gradient method, the Bregman proximal gradient method (also called the NoLips algorithm), the incremental aggregated gradient method, the incremental aggregated proximal method, and the proximal incremental aggregated gradient method. It also includes some novel iteration schemes of interest. First, we show that the PLIAG method is globally sublinearly convergent without requiring a growth condition, which extends the sublinear convergence result for the proximal gradient algorithm to incremental aggregated-type first-order methods. Then, by embedding a so-called Bregman distance growth condition into a descent-type lemma to construct a special Lyapunov function, we show that the PLIAG method is globally linearly convergent in terms of both function values and Bregman distances to the optimal solution set, provided that the step size does not exceed some positive constant. The convergence results derived in this paper are all established beyond the standard assumptions in the literature, i.e., without requiring strong convexity or Lipschitz gradient continuity of the smooth part of the objective. When specialized to many existing algorithms, our results recover or supplement their convergence results under strictly weaker conditions.
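To make the Bregman proximal gradient (NoLips) special case concrete, here is a minimal sketch, not the authors' implementation: with regularizer R ≡ 0 and Legendre function h, one step solves ∇h(x⁺) = ∇h(x) − λ∇g(x). Taking h(x) = Σᵢ xᵢ log xᵢ on the positive orthant yields a multiplicative update (all names below are illustrative):

```python
import numpy as np

def bregman_gradient_step(x, grad_g, step):
    """One Bregman (mirror) gradient step with h(x) = sum(x * log(x)):
    solving grad h(x_next) = grad h(x) - step * grad_g(x)
    gives x_next = x * exp(-step * grad_g(x))."""
    return x * np.exp(-step * grad_g(x))

# Minimize g(x) = 0.5 * ||x - c||^2 over the positive orthant;
# the iterates stay positive automatically and approach c.
c = np.array([0.5, 2.0])
grad_g = lambda x: x - c

x = np.ones(2)
for _ in range(200):
    x = bregman_gradient_step(x, grad_g, step=0.1)
print(x)  # close to c = [0.5, 2.0]
```

With h(x) = ½‖x‖² instead, ∇h is the identity and the same recursion reduces to the ordinary gradient step x⁺ = x − λ∇g(x), matching the claim that the classical proximal gradient method is a special case.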


2013, Vol 2013, pp. 1-12
Author(s): Li-Wei Kuo, D. R. Sahu

The purpose of this paper is to discuss some fundamental properties of the Bregman distance, generalized projection operators, firmly nonexpansive mappings, and resolvent operators of set-valued monotone operators corresponding to a functional Φ(∥·∥). We further study some proximal point algorithms for finding zeros of monotone operators and solving generalized mixed equilibrium problems in Banach spaces. Our results improve and extend some recent results concerning generalized projection operators corresponding to the Bregman distance.


2010, Vol 162 (6), pp. 1225-1244
Author(s): Heinz H. Bauschke, Mason S. Macklem, Jason B. Sewell, Xianfu Wang
