firmly nonexpansive
Recently Published Documents


TOTAL DOCUMENTS: 48 (FIVE YEARS: 11)
H-INDEX: 12 (FIVE YEARS: 0)

Author(s): Pontus Giselsson, Walaa M. Moursi

Abstract: Many iterative optimization algorithms involve compositions of special cases of Lipschitz continuous operators, namely firmly nonexpansive, averaged, and nonexpansive operators. The structure and properties of the compositions are of particular importance in the proofs of convergence of such algorithms. In this paper, we systematically study the compositions of further special cases of Lipschitz continuous operators. Applications of our results include compositions of scaled conically nonexpansive mappings, as well as the Douglas–Rachford and forward–backward operators, when applied to solve certain structured monotone inclusion and optimization problems. Several examples illustrate and tighten our conclusions.
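For orientation, the three operator classes named above have the following standard definitions on a real Hilbert space (the notation, an operator $T$ on a Hilbert space $\mathcal{H}$, is ours and not taken from the paper). $T$ is nonexpansive, firmly nonexpansive, or $\alpha$-averaged with $\alpha \in (0,1)$, respectively, if for all $x, y \in \mathcal{H}$

$$\|Tx - Ty\| \le \|x - y\|, \qquad \|Tx - Ty\|^{2} \le \langle x - y,\, Tx - Ty\rangle, \qquad T = (1-\alpha)\,\mathrm{Id} + \alpha N \ \text{ for some nonexpansive } N.$$

Firmly nonexpansive operators are exactly the $\tfrac{1}{2}$-averaged ones, and every averaged operator is nonexpansive, which is the nesting of special cases the abstract refers to.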


Mathematics, 2021, Vol. 9 (19), pp. 2418
Author(s): Afrah A. N. Abdou, Mohamed A. Khamsi

In this work, we investigate the existence of periodic points of mappings defined on nonconvex domains within the variable exponent sequence spaces ℓp(·). In particular, we consider the case of modular firmly nonexpansive and modular firmly asymptotically nonexpansive mappings. These kinds of results have never been obtained before.
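As a rough sketch of the setting (using one common convention for the modular; the authors' normalization may differ), the space ℓp(·) is built from a variable exponent sequence $(p_k)$ with $p_k \ge 1$ via

$$\rho(x) = \sum_{k} |x_k|^{p_k}, \qquad \ell_{p(\cdot)} = \big\{ x = (x_k) : \rho(\lambda x) < \infty \ \text{for some } \lambda > 0 \big\},$$

and modular nonexpansiveness is measured through $\rho$ rather than the norm, e.g. $\rho(Tx - Ty) \le \rho(x - y)$; a periodic point of $T$ is an $x$ with $T^{N}x = x$ for some $N \ge 1$.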


Author(s): Sorin-Mihai Grad, Felipe Lara

Abstract: We introduce and investigate a new generalized convexity notion for functions called prox-convexity. The proximity operator of such a function is single-valued and firmly nonexpansive. We provide examples of (strongly) quasiconvex, weakly convex, and DC (difference of convex) functions that are prox-convex; however, none of these classes fully contains the class of prox-convex functions, nor is contained in it. We show that the classical proximal point algorithm remains convergent when the convexity of the proper lower semicontinuous function to be minimized is relaxed to prox-convexity.
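For reference, the proximity operator and the proximal point iteration mentioned here are, in standard notation (the step size $\gamma > 0$ is our assumption; the paper may normalize differently),

$$\operatorname{prox}_{\gamma f}(x) \in \operatorname*{argmin}_{y} \Big\{ f(y) + \tfrac{1}{2\gamma}\|x - y\|^{2} \Big\}, \qquad x_{k+1} := \operatorname{prox}_{\gamma_k f}(x_k).$$

For a nonconvex $f$ the set of minimizers may be empty or contain several points; prox-convexity is, per the abstract, exactly the condition under which this operator is single-valued and firmly nonexpansive.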


Author(s): Nicholas Pischke, Ulrich Kohlenbach

Abstract: We use techniques originating from the subdiscipline of mathematical logic called 'proof mining' to provide rates of metastability and, under a metric regularity assumption, rates of convergence for a subgradient-type algorithm solving the equilibrium problem in convex optimization over fixed-point sets of firmly nonexpansive mappings. The algorithm is due to H. Iiduka and I. Yamada, who in 2009 gave a noneffective proof of its convergence. This case study illustrates the applicability of the logic-based abstract quantitative analysis of general forms of Fejér monotonicity as given by the second author in previous papers.
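For context, the two quantitative notions invoked above have the following standard meanings (notation ours). A sequence $(x_n)$ is Fejér monotone with respect to a nonempty set $C$ if

$$d(x_{n+1}, p) \le d(x_n, p) \quad \text{for all } p \in C \text{ and all } n \in \mathbb{N},$$

and $\Phi$ is a rate of metastability for $(x_n)$ if

$$\forall \varepsilon > 0\ \forall g : \mathbb{N} \to \mathbb{N}\ \exists n \le \Phi(\varepsilon, g)\ \forall i, j \in [n,\, n + g(n)] :\ d(x_i, x_j) \le \varepsilon.$$

Metastability is a finitary reformulation of the Cauchy property; uniform rates of convergence typically cannot be extracted from noneffective proofs without additional assumptions such as metric regularity, which is why the two kinds of rates are treated separately.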


2021, Vol. 76 (4)
Author(s): Arian Bërdëllima, Gabriele Steidl

Abstract: We introduce the class of $\alpha$-firmly nonexpansive and quasi $\alpha$-firmly nonexpansive operators on $r$-uniformly convex Banach spaces. This extends the existing notion from Hilbert spaces, where $\alpha$-firmly nonexpansive operators coincide with so-called $\alpha$-averaged operators. For our more general setting, we show that $\alpha$-averaged operators form a subset of $\alpha$-firmly nonexpansive operators. We develop some basic calculus rules for (quasi) $\alpha$-firmly nonexpansive operators. In particular, we show that their compositions and convex combinations are again (quasi) $\alpha$-firmly nonexpansive. Moreover, we will see that quasi $\alpha$-firmly nonexpansive operators enjoy the asymptotic regularity property. Then, based on Browder's demiclosedness principle, we prove for $r$-uniformly convex Banach spaces that the weak cluster points of the iterates $x_{n+1} := Tx_n$ belong to the fixed point set $\operatorname{Fix} T$ whenever the operator $T$ is nonexpansive and quasi $\alpha$-firmly. If additionally the space has a Fréchet differentiable norm or satisfies Opial's property, then these iterates converge weakly to some element in $\operatorname{Fix} T$. Further, the projections $P_{\operatorname{Fix} T}\, x_n$ converge strongly to this weak limit point. Finally, we give three illustrative examples where our theory can be applied, namely from infinite dimensional neural networks, semigroup theory, and contractive projections in $L_p$ spaces, $p \in (1, \infty) \setminus \{2\}$, on probability measure spaces.
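In the Hilbert-space setting that this work generalizes, a standard characterization of $\alpha$-averaged operators (hence, per the abstract, of $\alpha$-firmly nonexpansive ones), for $\alpha \in (0,1)$, reads

$$\|Tx - Ty\|^{2} + \frac{1-\alpha}{\alpha}\,\big\|(\mathrm{Id}-T)x - (\mathrm{Id}-T)y\big\|^{2} \le \|x - y\|^{2} \qquad \text{for all } x, y,$$

and asymptotic regularity of the Picard iterates $x_{n+1} := Tx_n$ means $\|x_{n+1} - x_n\| \to 0$. The notation here is ours; the Banach-space definitions used in the paper are not reproduced in this listing.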

