A proximal point method for nonsmooth convex optimization problems in Banach spaces

1997 ◽  
Vol 2 (1-2) ◽  
pp. 97-120 ◽  
Author(s):  
Y. I. Alber ◽  
R. S. Burachik ◽  
A. N. Iusem

In this paper we show the weak convergence and stability of the proximal point method when applied to constrained convex optimization problems in uniformly convex and uniformly smooth Banach spaces. In addition, we establish a nonasymptotic estimate of the convergence rate of the sequence of functional values for the unconstrained case. This estimate depends on a geometric characteristic of the dual Banach space, namely its modulus of convexity. We apply a new technique that combines Banach space geometry, estimates of duality mappings, nonstandard Lyapunov functionals, and generalized projection operators in Banach spaces.
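The classical Hilbert-space iteration underlying this analysis is x_{k+1} = argmin_x f(x) + (1/(2λ))‖x − x_k‖², i.e. x_{k+1} = prox_{λf}(x_k); the Banach-space version studied in the paper replaces the squared norm by a Lyapunov functional built from the duality mapping. As a minimal Euclidean sketch (not the paper's algorithm), here is the proximal point iteration for f(x) = |x|, whose proximal operator is soft-thresholding:

```python
import numpy as np

def prox_abs(v, lam):
    # Proximal operator of f(x) = |x| with step lam: soft-thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def proximal_point(prox, x0, lam=0.5, iters=50):
    # x_{k+1} = argmin_x f(x) + (1/(2*lam)) * (x - x_k)^2  =  prox_{lam f}(x_k)
    x = x0
    for _ in range(iters):
        x = prox(x, lam)
    return x

x_star = proximal_point(prox_abs, x0=3.0)
print(x_star)  # converges to 0, the unique minimizer of |x|
```

Each step moves the iterate a bounded distance toward the minimizer; for this example the iterates reach 0 exactly after finitely many steps.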

2009 ◽  
Vol 18 (1) ◽  
pp. 109-120 ◽  
Author(s):  
Alfredo N. Iusem ◽  
Elena Resmerita

2014 ◽  
Vol 2014 ◽  
pp. 1-8
Author(s):  
Yu-hua Zeng ◽  
Yu-fei Yang ◽  
Zheng Peng

We propose a line-search-based partial proximal alternating directions (LSPPAD) method for solving a class of separable convex optimization problems. The problems under consideration are common in practice. The proposed method solves two subproblems at each iteration: one is solved by a proximal point method, while the proximal term is absent from the other. Both subproblems admit inexact solutions. A line search technique is used to guarantee convergence. The convergence of the LSPPAD method is established under suitable conditions. The advantage of the proposed method is the tractability of the subproblem from which the proximal term is absent. Numerical tests show that the LSPPAD method outperforms the existing alternating projection based prediction-correction (APBPC) method when both are applied to the described problem.
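The separable structure targeted here (minimize f(x) + g(z) subject to a linking constraint, alternating between the two subproblems) can be illustrated with a generic alternating-directions splitting; this sketch is a standard ADMM-style scheme for the lasso problem, not the authors' LSPPAD algorithm, and all names in it are illustrative:

```python
import numpy as np

def soft_threshold(v, t):
    # Closed-form solution of the nonsmooth l1 subproblem.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_lasso(A, b, lam=0.1, rho=1.0, iters=200):
    # Alternating-directions splitting for
    #   min 0.5*||A x - b||^2 + lam*||z||_1  subject to  x = z.
    n = A.shape[1]
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    AtA, Atb = A.T @ A, A.T @ b
    M = np.linalg.inv(AtA + rho * np.eye(n))  # cached solve for the smooth subproblem
    for _ in range(iters):
        x = M @ (Atb + rho * (z - u))         # smooth subproblem (quadratic + coupling term)
        z = soft_threshold(x + u, lam / rho)  # nonsmooth subproblem, closed form
        u = u + x - z                         # multiplier (dual) update
    return z

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = A @ np.array([1.0, 0.0, -2.0, 0.0, 0.0])
print(admm_lasso(A, b))
```

The two-subproblem alternation per iteration mirrors the structure described in the abstract: one subproblem carries a quadratic coupling term, the other is solved in closed form without it.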


2021 ◽  
Vol 5 ◽  
pp. 82-92
Author(s):  
Sergei Denisov ◽  
Vladimir Semenov
Many problems of operations research and mathematical physics can be formulated as variational inequalities. The development and analysis of algorithms for solving variational inequalities is an active area of applied nonlinear analysis. Note that nonsmooth optimization problems can often be solved effectively by reformulating them as saddle point problems and applying algorithms for variational inequalities. Recently, there has been progress in the study of algorithms for problems in Banach spaces, driven by the extensive use of results and constructions from the geometry of Banach spaces. A new algorithm for solving variational inequalities in a Banach space is proposed and studied. The Alber generalized projection is used instead of the metric projection onto the feasible set. An attractive feature of the algorithm is that it requires only one projection onto the feasible set per iteration. For variational inequalities with monotone Lipschitz operators acting in a 2-uniformly convex and uniformly smooth Banach space, a theorem on the weak convergence of the method is proved.
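A well-known scheme with the same one-projection-per-iteration feature, in the Euclidean setting, is the forward-reflected-backward method x_{k+1} = P_C(x_k − λ(2F(x_k) − F(x_{k−1}))); the sketch below applies it to a monotone Lipschitz operator and is only an illustration of the general technique, not the algorithm of this paper (which works in Banach spaces with the Alber generalized projection):

```python
import numpy as np

def project_ball(x, r=1.0):
    # Euclidean projection onto the ball ||x|| <= r.
    n = np.linalg.norm(x)
    return x if n <= r else x * (r / n)

def forward_reflected(F, proj, x0, lam=0.2, iters=500):
    # One projection per iteration:
    #   x_{k+1} = P_C( x_k - lam * (2 F(x_k) - F(x_{k-1})) )
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(iters):
        v = x - lam * (2 * F(x) - F(x_prev))
        x_prev, x = x, proj(v)
    return x

# Skew-symmetric operator: monotone and Lipschitz, but not a gradient map,
# so plain projected gradient would not converge here.
A = np.array([[0.0, 1.0], [-1.0, 0.0]])
F = lambda x: A @ x
x = forward_reflected(F, project_ball, np.array([0.9, -0.4]))
print(x)  # approaches the solution x* = 0 of the variational inequality
```

The skew-symmetric example is the standard test case motivating such methods: it is the saddle-point operator of a bilinear min-max problem, exactly the kind of reformulated nonsmooth problem the abstract mentions.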

