Distributed algorithms for computing a fixed point of multi-agent nonexpansive operators

Automatica, 2020, Vol. 122, pp. 109286. Author(s): Xiuxian Li, Lihua Xie
Mathematics, 2020, Vol. 8 (3), pp. 378. Author(s): Adisak Hanjing, Suthep Suantai

In this paper, a new accelerated fixed point algorithm for finding a common fixed point of a family of nonexpansive operators is introduced and studied; a weak convergence theorem is then proven, and the convergence behavior of the proposed method is discussed. Using our main result, we obtain a new accelerated image restoration algorithm, called the forward-backward modified W-algorithm (FBMWA), for solving a minimization problem given as the sum of two proper, lower semicontinuous, convex functions. As an application, we apply the FBMWA algorithm to image restoration problems. We analyze the convergence behavior of our method and compare it with that of existing methods for image deblurring, and we find that our algorithm is more efficient than the others in the literature.
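
As background for the forward-backward structure invoked here, the following is a minimal Python sketch of a plain (non-accelerated) forward-backward splitting iteration for a problem of this form, with smooth term 0.5*||Ax - b||^2 and nonsmooth term lam*||x||_1. It is not the authors' FBMWA; the operator A, data b, weight lam, step size, and iteration count are assumptions made for illustration.

    import numpy as np

    def soft_threshold(x, t):
        # Proximal operator of t * ||.||_1 (elementwise soft thresholding).
        return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

    def forward_backward(A, b, lam, n_iter=200):
        # Minimize 0.5*||A x - b||^2 + lam*||x||_1 by forward-backward splitting:
        # a gradient (forward) step on the smooth term, followed by a proximal
        # (backward) step on the nonsmooth term.
        L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the gradient
        step = 1.0 / L
        x = np.zeros(A.shape[1])
        for _ in range(n_iter):
            grad = A.T @ (A @ x - b)               # forward step
            x = soft_threshold(x - step * grad, step * lam)  # backward step
        return x

    # Hypothetical example: recover a sparse signal from noisy linear measurements.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((60, 100))
    x_true = np.zeros(100)
    x_true[[5, 30, 70]] = [1.0, -2.0, 1.5]
    b = A @ x_true + 0.01 * rng.standard_normal(60)
    x_hat = forward_backward(A, b, lam=0.1)

The step size 1/L, with L the Lipschitz constant of the smooth term's gradient, is the standard choice guaranteeing convergence of this unaccelerated scheme; accelerated variants such as the one studied in the paper modify the update with extrapolation terms.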


2021, Vol. 2021 (1). Author(s): Tongxin Xu, Luoyi Shi

In this paper, we propose a new iterative algorithm for solving the multiple-sets split feasibility problem (MSSFP for short) and the split equality fixed point problem (SEFPP for short) with firmly quasi-nonexpansive or nonexpansive operators in real Hilbert spaces. Under mild conditions, we prove strong convergence theorems for the algorithm by using the projection method and the properties of projection operators. These results improve and extend corresponding results announced in the earlier and recent literature.
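
For intuition on how projection operators drive iterations of this kind, here is a minimal Python sketch of the classical CQ algorithm for the single-set split feasibility problem (find x in C with Ax in Q), not the algorithm proposed in this paper; the box-shaped sets C and Q, the matrix A, and the step size are assumptions made for illustration.

    import numpy as np

    def project_box(x, lo, hi):
        # Metric projection onto the box [lo, hi]^n, a simple closed convex set.
        return np.clip(x, lo, hi)

    def cq_algorithm(A, lo_C, hi_C, lo_Q, hi_Q, n_iter=500):
        # Classical CQ iteration for the split feasibility problem
        # "find x in C with A x in Q", with C and Q taken as boxes here:
        #   x_{k+1} = P_C( x_k - gamma * A^T (I - P_Q) A x_k ).
        gamma = 1.0 / np.linalg.norm(A, 2) ** 2    # step size in (0, 2 / ||A||^2)
        x = np.zeros(A.shape[1])
        for _ in range(n_iter):
            Ax = A @ x
            residual = Ax - project_box(Ax, lo_Q, hi_Q)          # (I - P_Q) A x
            x = project_box(x - gamma * A.T @ residual, lo_C, hi_C)
        return x

    # Hypothetical example on a small random instance.
    rng = np.random.default_rng(1)
    A = rng.standard_normal((4, 6))
    x_sol = cq_algorithm(A, lo_C=-1.0, hi_C=1.0, lo_Q=-0.5, hi_Q=0.5)

Multiple-sets and split-equality formulations generalize this template by combining several such projection steps per iteration.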


2021, Vol. 10 (1), pp. 1154-1177. Author(s): Patrick L. Combettes, Lilian E. Glaudin

Various strategies are available to iteratively construct a common fixed point of nonexpansive operators by activating only a block of operators at each iteration. In the more challenging class of composite fixed point problems involving operators that do not share common fixed points, current methods require the activation of all the operators at each iteration, and the question of maintaining convergence while updating only blocks of operators is open. We propose a method that achieves this goal and analyze its asymptotic behavior. Weak, strong, and linear convergence results are established by exploiting a connection with the theory of concentrating arrays. Applications to several nonlinear and nonsmooth analysis problems are presented, ranging from monotone inclusions and inconsistent feasibility problems to variational inequalities and minimization problems arising in data science.
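
To illustrate block activation in the simpler setting where the operators do share common fixed points, here is a minimal Python sketch of a Krasnosel'skii-Mann-type iteration that activates only a random block of nonexpansive operators (half-space projections) at each step; it is not the composite method analyzed in this paper, and the half-spaces, block size, relaxation parameter, and starting point are assumptions made for illustration.

    import numpy as np

    def make_halfspace_projector(a, beta):
        # Projection onto the half-space {x : <a, x> <= beta}; a (firmly) nonexpansive operator.
        a = np.asarray(a, dtype=float)
        def proj(x):
            viol = a @ x - beta
            return x if viol <= 0 else x - (viol / (a @ a)) * a
        return proj

    def block_km(operators, dim, block_size=2, n_iter=300, lam=0.5, seed=0):
        # Krasnosel'skii-Mann-type iteration that activates only a random block
        # of the operators at each step and averages their outputs:
        #   x_{k+1} = x_k + lam * ( mean_{i in block} T_i x_k - x_k ).
        rng = np.random.default_rng(seed)
        x = np.full(dim, 5.0)                      # deliberately infeasible starting point
        for _ in range(n_iter):
            block = rng.choice(len(operators), size=block_size, replace=False)
            avg = np.mean([operators[i](x) for i in block], axis=0)
            x = x + lam * (avg - x)
        return x

    # Hypothetical example: common point of three half-spaces in R^3.
    ops = [make_halfspace_projector(a, b) for a, b in
           [([1, 0, 0], 1.0), ([0, 1, 0], 2.0), ([1, 1, 1], 2.5)]]
    x_star = block_km(ops, dim=3)

The difficulty addressed in the paper is precisely that such block updates are no longer straightforward when the operators have no common fixed point and only the composite problem has a solution.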

