On compositions of special cases of Lipschitz continuous operators

Author(s):  
Pontus Giselsson ◽  
Walaa M. Moursi

Abstract Many iterative optimization algorithms involve compositions of special cases of Lipschitz continuous operators, namely firmly nonexpansive, averaged, and nonexpansive operators. The structure and properties of the compositions are of particular importance in the proofs of convergence of such algorithms. In this paper, we systematically study the compositions of further special cases of Lipschitz continuous operators. Applications of our results include compositions of scaled conically nonexpansive mappings, as well as the Douglas–Rachford and forward–backward operators, when applied to solve certain structured monotone inclusion and optimization problems. Several examples illustrate and tighten our conclusions.
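
The Douglas–Rachford operator mentioned above is built from firmly nonexpansive proximal maps, and its update operator is 1/2-averaged, which is exactly the structural property convergence proofs exploit. As a point of reference, here is a minimal Python sketch of the splitting iteration for minimizing f(x) + g(x); the concrete choices f = (1/2)||x - b||^2, g = ||x||_1, and the step size t are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def prox_l1(v, t):
    """Prox of t*||.||_1 (soft-thresholding); firmly nonexpansive."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_sq(v, t, b):
    """Prox of (t/2)*||x - b||^2; also firmly nonexpansive."""
    return (v + t * b) / (1.0 + t)

def douglas_rachford(z, t, b, iters=500):
    """Iterate z <- z + prox_g(2*prox_f(z) - z) - prox_f(z); the update
    operator is 1/2-averaged, and prox_f(z) converges to a minimizer."""
    for _ in range(iters):
        x = prox_sq(z, t, b)         # prox of f = (1/2)||. - b||^2
        y = prox_l1(2.0 * x - z, t)  # prox of g = ||.||_1 at the reflection
        z = z + y - x
    return prox_sq(z, t, b)

b = np.array([3.0, -0.5, 0.2])
print(douglas_rachford(np.zeros(3), t=1.0, b=b))  # ~ soft-threshold of b at 1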

2020 ◽  
Vol 2020 (1) ◽  
Author(s):  
Yuanheng Wang ◽  
Xiuping Wu ◽  
Chanjuan Pan

Abstract In this paper, we propose an iteration algorithm for finding a split common fixed point of an asymptotically nonexpansive mapping in the frameworks of two real Banach spaces. Under some suitable conditions imposed on the sequences of parameters, some strong convergence theorems are proved, which also solve some variational inequalities that are closely related to optimization problems. The results here generalize and improve the main results of other authors.
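
The paper's split common fixed point scheme in Banach spaces is more involved, but the strong convergence mechanism such schemes typically rely on can be illustrated in a Hilbert space with a Halpern-type anchored iteration. A minimal sketch; the operator T and the anchor u below are illustrative stand-ins, not the paper's algorithm:

```python
import numpy as np

def halpern(T, u, x0, iters=2000):
    """Halpern iteration x_{n+1} = a_n*u + (1 - a_n)*T(x_n) with a_n = 1/(n+2).
    For a nonexpansive T on a Hilbert space, the iterates converge strongly
    to the fixed point of T nearest to the anchor u."""
    x = x0.copy()
    for n in range(iters):
        a = 1.0 / (n + 2)
        x = a * u + (1.0 - a) * T(x)
    return x

# Example: T is the projection onto the unit ball (nonexpansive);
# the iterates approach the point of the ball nearest to u, here (0.6, 0.8).
T = lambda v: v / max(1.0, np.linalg.norm(v))
print(halpern(T, u=np.array([3.0, 4.0]), x0=np.zeros(2)))
```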


2021 ◽  
Vol 76 (4) ◽  
Author(s):  
Arian Bërdëllima ◽  
Gabriele Steidl

Abstract We introduce the class of $\alpha$-firmly nonexpansive and quasi $\alpha$-firmly nonexpansive operators on $r$-uniformly convex Banach spaces. This extends the existing notion from Hilbert spaces, where $\alpha$-firmly nonexpansive operators coincide with so-called $\alpha$-averaged operators. For our more general setting, we show that $\alpha$-averaged operators form a subset of $\alpha$-firmly nonexpansive operators. We develop some basic calculus rules for (quasi) $\alpha$-firmly nonexpansive operators. In particular, we show that their compositions and convex combinations are again (quasi) $\alpha$-firmly nonexpansive. Moreover, we show that quasi $\alpha$-firmly nonexpansive operators enjoy the asymptotic regularity property. Then, based on Browder's demiclosedness principle, we prove for $r$-uniformly convex Banach spaces that the weak cluster points of the iterates $x_{n+1} := Tx_n$ belong to the fixed point set $\operatorname{Fix} T$ whenever the operator $T$ is nonexpansive and quasi $\alpha$-firmly. If additionally the space has a Fréchet differentiable norm or satisfies Opial's property, then these iterates converge weakly to some element in $\operatorname{Fix} T$. Further, the projections $P_{\operatorname{Fix} T}\, x_n$ converge strongly to this weak limit point. Finally, we give three illustrative examples where our theory can be applied, namely from infinite-dimensional neural networks, semigroup theory, and contractive projections in $L_p$, $p \in (1,\infty) \setminus \{2\}$, spaces on probability measure spaces.
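
Asymptotic regularity ($\Vert x_{n+1} - x_n \Vert \to 0$) and convergence of the iterates can be observed numerically. The following is a toy Hilbert-space sketch in $\mathbb{R}^2$, not the Banach-space setting of the paper: it composes two projections, each 1/2-averaged (hence $\alpha$-firmly nonexpansive), so by the composition rule the composition is 2/3-averaged. The two sets below are illustrative choices:

```python
import numpy as np

# Projections onto two intersecting convex sets; their composition is
# 2/3-averaged, a special case of the composition calculus above.
P_halfspace = lambda v: v - max(0.0, v[0] + v[1] - 1) / 2 * np.array([1.0, 1.0])
P_ball = lambda v: v / max(1.0, np.linalg.norm(v))  # projection onto unit ball
T = lambda v: P_ball(P_halfspace(v))

x = np.array([5.0, -3.0])
for n in range(50):
    x_next = T(x)
    if n % 10 == 0:
        print(n, np.linalg.norm(x_next - x))  # residuals shrink: asymptotic regularity
    x = x_next
print("limit:", x)  # a point in Fix T = (unit ball) ∩ (halfspace)
```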


Author(s):  
C. R. Subramanian

We introduce and study an inductively defined analogue [Formula: see text] of any increasing graph invariant [Formula: see text]. An invariant [Formula: see text] is increasing if [Formula: see text] whenever [Formula: see text] is an induced subgraph of [Formula: see text]. This inductive analogue simultaneously generalizes and unifies known notions such as degeneracy and the inductive independence number into a single generic notion. For any given increasing [Formula: see text], this yields several new invariants, many of which are also increasing. We also show that [Formula: see text] is the minimum, over all vertex orderings, of a value associated with each ordering. We explore the possibility of computing [Formula: see text] (and a corresponding optimal vertex ordering) and identify some pairs [Formula: see text] for which [Formula: see text] can be computed efficiently for members of [Formula: see text]; in particular, this includes graphs of bounded [Formula: see text] values. Some specific examples (such as the class of chordal graphs) have already been studied extensively. We further extend this new notion by (i) allowing vertex-weighted graphs, (ii) allowing [Formula: see text] to take values from a totally ordered universe with a minimum, and (iii) allowing the consideration of [Formula: see text]-neighborhoods for arbitrary but fixed [Formula: see text]. This generalization is employed in designing efficient approximations for some graph optimization problems. Precisely, we obtain efficient algorithms (by generalizing the known algorithm of Ye and Borodin [Y. Ye and A. Borodin, Elimination graphs, ACM Trans. Algorithms 8(2) (2012) 1–23] for special cases) for approximating optimal weighted induced [Formula: see text]-subgraphs and optimal [Formula: see text]-colorings (for hereditary [Formula: see text]'s) within multiplicative factors of (essentially) [Formula: see text] and [Formula: see text], respectively, where [Formula: see text] denotes the inductive analogue (as defined in this work) of the optimal size of an unweighted induced [Formula: see text]-subgraph of the input and [Formula: see text] is the minimum size of a forbidden induced subgraph of [Formula: see text]. Our results generalize previous results on efficiently approximating maximum independent sets and minimum colorings on graphs of bounded inductive independence number to optimal [Formula: see text]-subgraphs and [Formula: see text]-colorings for arbitrary hereditary classes [Formula: see text]. As a corollary, we also show that any maximal [Formula: see text]-subgraph approximates an optimal solution within a factor of [Formula: see text] for unweighted graphs, where [Formula: see text] is the maximum size of any induced [Formula: see text]-subgraph in any local neighborhood [Formula: see text].
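
A concrete special case named above is degeneracy, the inductive analogue of the maximum-degree invariant: repeatedly delete a minimum-degree vertex and record the largest degree seen at deletion time. The resulting greedy elimination ordering is exactly the kind of optimal vertex ordering the abstract refers to for this particular invariant. A simple (quadratic-time) Python sketch on an adjacency-set representation; the example graph is an illustrative choice:

```python
def degeneracy(adj):
    """Degeneracy = the inductive analogue of the maximum-degree invariant:
    repeatedly delete a minimum-degree vertex; the answer is the largest
    degree observed at deletion time. Returns (degeneracy, ordering)."""
    deg = {v: len(nbrs) for v, nbrs in adj.items()}
    alive = set(adj)
    order, d = [], 0
    while alive:
        v = min(alive, key=deg.get)  # greedy minimum-degree choice
        d = max(d, deg[v])
        order.append(v)
        alive.remove(v)
        for u in adj[v]:
            if u in alive:
                deg[u] -= 1
    return d, order

# A 4-cycle (0-1-2-3) with a pendant vertex 4 attached to 3: degeneracy 2.
adj = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2, 4}, 4: {3}}
print(degeneracy(adj))
```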


2019 ◽  
Vol 28 (1) ◽  
pp. 19-26
Author(s):  
IOANNIS K. ARGYROS ◽  
SANTHOSH GEORGE

We present the local as well as the semi-local convergence of some derivative-free iterative methods for Banach space valued operators. These methods contain the secant and the Kurchatov method as special cases. The convergence is based on weak hypotheses that specialize to Lipschitz continuous or Hölder continuous hypotheses. The results are of theoretical and practical interest. In particular, the methods are compared favorably to other methods using concrete numerical examples for solving systems of equations containing a nondifferentiable term.
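
For orientation, here is a one-dimensional Python sketch of the secant iteration, the simplest of the derivative-free methods named above; the paper treats Banach space valued operators, and the test function below is an illustrative nondifferentiable example, not one of the paper's:

```python
def secant(F, x0, x1, tol=1e-12, max_iter=50):
    """Derivative-free secant iteration: replace F'(x_k) by the divided
    difference (F(x_k) - F(x_{k-1})) / (x_k - x_{k-1}). The Kurchatov
    method instead uses the difference over [2*x_k - x_{k-1}, x_{k-1}]."""
    f0, f1 = F(x0), F(x1)
    for _ in range(max_iter):
        if f1 == f0:
            break  # divided difference undefined; stop
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        x0, f0, x1, f1 = x1, f1, x2, F(x2)
        if abs(f1) < tol:
            break
    return x1

# No derivative needed, so a nondifferentiable term is unproblematic:
print(secant(lambda x: x**3 - 2 + 0.1 * abs(x - 1), 1.0, 2.0))
```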


Author(s):  
Ion Necoara ◽  
Martin Takáč

Abstract In this paper we consider large-scale smooth optimization problems with multiple linear coupled constraints. Because the constraints are non-separable, arbitrary random sketching is not guaranteed to work. We therefore first investigate necessary and sufficient conditions on the sketch sampling for the resulting algorithms to be well defined. Based on these sampling conditions we develop new sketch descent methods for solving general smooth linearly constrained problems, in particular the random sketch descent (RSD) and accelerated random sketch descent (A-RSD) methods. To our knowledge, this is the first convergence analysis of RSD algorithms for optimization problems with multiple non-separable linear constraints. In the general case, when the objective function is smooth and non-convex, we prove a sublinear rate in expectation for the non-accelerated variant with respect to an appropriate optimality measure. In the smooth convex case, we derive sublinear convergence rates in the expected objective values for both algorithms, RSD and A-RSD. Additionally, if the objective function satisfies a strong convexity type condition, both algorithms converge linearly in expectation. In special cases where complexity bounds are known for particular sketching algorithms, such as coordinate descent methods for optimization problems with a single linear coupled constraint, our theory recovers the best known bounds. Finally, we present several numerical examples to illustrate the performance of our new algorithms.
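
The single-constraint special case mentioned above gives a concrete picture of the sampling condition: a sampled sketch must intersect the null space of the constraint, which for sum(x) = c means moving at least two coordinates at once. A minimal Python sketch with pairwise sampling; the quadratic objective and the exact line-search step rule are illustrative assumptions, not the paper's general method:

```python
import numpy as np

def rsd_pairwise(grad, x, L, iters=2000, rng=np.random.default_rng(0)):
    """Sketch descent for one coupled constraint sum(x) = c: sample a pair
    (i, j) and step along e_i - e_j, which keeps the iterate feasible."""
    n = len(x)
    for _ in range(iters):
        i, j = rng.choice(n, size=2, replace=False)
        g = grad(x)
        step = (g[i] - g[j]) / (L[i] + L[j])  # exact minimization for quadratics
        x[i] -= step
        x[j] += step
    return x

# min 0.5 * sum(w_k * x_k^2) subject to sum(x) = 1.
w = np.array([1.0, 2.0, 4.0])
grad = lambda x: w * x
x = rsd_pairwise(grad, np.array([1.0, 0.0, 0.0]), L=w)
print(x, x.sum())  # approaches the KKT solution x_k proportional to 1/w_k
```

Sampling a single coordinate would leave no feasible direction at all, which is exactly the degenerate situation the sampling conditions rule out.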


Author(s):  
Abdelkrim El Mouatasim ◽  
Rachid Ellaia ◽  
Eduardo de Cursi

Random perturbation of the projected variable metric method for nonsmooth nonconvex optimization problems with linear constraints

We present a random perturbation of the projected variable metric method for solving linearly constrained nonsmooth (i.e., nondifferentiable) nonconvex optimization problems, and we establish the convergence to a global minimum for a locally Lipschitz continuous objective function which may be nondifferentiable on a countable set of points. Numerical results show the effectiveness of the proposed approach.
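
A stripped-down illustration of the idea, with the variable metric replaced by a plain subgradient step for brevity: project each perturbed step back onto the linear constraints, and let the perturbation decay so the iterates can escape poor local minima early on. All problem data below (objective, constraint, step and noise schedules) are illustrative assumptions:

```python
import numpy as np

def perturbed_projected_descent(subgrad, A, b, x0, iters=3000,
                                rng=np.random.default_rng(0)):
    """Projected subgradient step plus a vanishing random perturbation: the
    noise helps escape local minima of the nonconvex objective while the
    projection keeps A @ x = b satisfied at every iterate."""
    AAT_inv = np.linalg.inv(A @ A.T)
    project = lambda v: v - A.T @ (AAT_inv @ (A @ v - b))
    x = project(x0)
    for k in range(1, iters + 1):
        noise = rng.normal(scale=1.0 / k, size=x.shape)  # decaying perturbation
        x = project(x - subgrad(x) / np.sqrt(k) + noise)
    return x

# f(x) = sum_i |x_i| + 0.1*sin(5*x_i): nonsmooth (at 0) and nonconvex.
subgrad = lambda x: np.sign(x) + 0.5 * np.cos(5 * x)
A = np.array([[1.0, 1.0, 1.0]])
b = np.array([1.0])
print(perturbed_projected_descent(subgrad, A, b, np.array([1.0, 0.0, 0.0])))
```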

