Two modifications of the inertial Tseng extragradient method with self-adaptive step size for solving monotone variational inequality problems

2020, Vol. 53(1), pp. 208–224
Author(s): Timilehin Opeyemi Alakoya, Lateef Olakunle Jolaoso, Oluwatosin Temitope Mewomo

In this work, we introduce two new inertial-type algorithms for solving variational inequality problems (VIPs) with monotone and Lipschitz continuous mappings in real Hilbert spaces. The first algorithm requires only one projection onto the feasible set per iteration, while the second requires only one projection onto a half-space; neither relies on prior knowledge of the Lipschitz constant of the monotone mapping. Under mild assumptions, we prove that both algorithms converge strongly to a solution of the VIP. Finally, we provide numerical experiments to illustrate the efficiency and advantages of the proposed algorithms.
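
As a point of reference, the following is a minimal NumPy sketch of a generic inertial Tseng (forward-backward-forward) iteration with a self-adaptive step size. The names F, proj_C and all parameter values are hypothetical placeholders, only the variant that projects onto the feasible set is shown, and the sketch is not a reproduction of the authors' two algorithms.

```python
import numpy as np

def inertial_tseng(F, proj_C, x_prev, x, lam=1.0, mu=0.9, theta=0.3,
                   max_iter=500, tol=1e-8):
    # Illustrative placeholders: F is the monotone operator, proj_C the
    # metric projection onto the feasible set, mu in (0, 1), theta is the
    # inertial parameter.
    x_prev, x = np.asarray(x_prev, float), np.asarray(x, float)
    for _ in range(max_iter):
        w = x + theta * (x - x_prev)          # inertial extrapolation
        Fw = F(w)
        y = proj_C(w - lam * Fw)              # the single projection onto C
        if np.linalg.norm(w - y) <= tol:
            return y
        Fy = F(y)
        x_prev, x = x, y - lam * (Fy - Fw)    # forward-backward-forward update
        d = np.linalg.norm(Fw - Fy)
        if d > 0:                             # self-adaptive step size:
            lam = min(lam, mu * np.linalg.norm(w - y) / d)
    return x
```

The half-space variant mentioned in the abstract would replace proj_C by an explicit (closed-form) projection onto a suitably constructed half-space.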

2016, Vol. 21(4), pp. 478–501
Author(s): Dang Van Hieu

In this paper, we introduce two parallel extragradient-proximal methods for solving split equilibrium problems. The algorithms combine the extragradient method, the proximal method, and the shrinking projection method. Weak and strong convergence theorems for the iterative sequences generated by the algorithms are established under widely used assumptions on the equilibrium bifunctions. We also present an application to split variational inequality problems and a numerical example to illustrate the convergence of the proposed algorithms.
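
For orientation only, here is a rough sketch of the extragradient-type step applied to an equilibrium bifunction f(x, y), with the two proximal subproblems solved numerically. As a simplifying assumption the constraint set is taken to be the whole space, and the parallel and shrinking-projection components of the paper's algorithms are not reproduced.

```python
import numpy as np
from scipy.optimize import minimize

def extragradient_equilibrium_step(f, x, lam=0.5):
    # One extragradient-type step for an equilibrium bifunction f(x, y).
    # Simplifying assumption: the feasible set is all of R^n, so the
    # regularized subproblems are solved without constraints.
    x = np.asarray(x, float)
    # y = argmin_y { lam * f(x, y) + 0.5 * ||y - x||^2 }
    y = minimize(lambda y: lam * f(x, y) + 0.5 * np.sum((y - x) ** 2), x).x
    # z = argmin_z { lam * f(y, z) + 0.5 * ||z - x||^2 }
    z = minimize(lambda z: lam * f(y, z) + 0.5 * np.sum((z - x) ** 2), x).x
    return y, z

# Taking f(u, v) = <F(u), v - u> recovers a variational inequality step.
```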


Author(s): Zhongbing Xie, Gang Cai, Xiaoxiao Li, Qiao-Li Dong

The purpose of this paper is to study a new Tseng extragradient method with two different step size rules for solving pseudomonotone variational inequalities in real Hilbert spaces. We prove a strong convergence theorem for the proposed algorithm under suitable conditions imposed on the parameters. We also present numerical experiments to demonstrate the performance of our algorithm.
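
A generic Tseng extragradient loop with a pluggable step size rule, sketched below, illustrates how two different rules can be compared in practice. The paper's specific rules are not reproduced; the names and the sample rule are illustrative assumptions.

```python
import numpy as np

def tseng_extragradient(F, proj_C, x0, lam0, step_rule,
                        max_iter=500, tol=1e-8):
    # F, proj_C and step_rule are user-supplied placeholders; step_rule
    # returns the next step size from local iterate information.
    x, lam = np.asarray(x0, float), lam0
    for n in range(max_iter):
        Fx = F(x)
        y = proj_C(x - lam * Fx)
        if np.linalg.norm(x - y) <= tol:
            return y
        Fy = F(y)
        x_next = y - lam * (Fy - Fx)
        lam = step_rule(lam, n, x, y, Fx, Fy)   # rule-specific update
        x = x_next
    return x

def nonincreasing_rule(mu=0.9):
    # One possible self-adaptive rule (assumed, with mu in (0, 1)):
    # shrink the step whenever the local operator variation demands it.
    def rule(lam, n, x, y, Fx, Fy):
        d = np.linalg.norm(Fx - Fy)
        return min(lam, mu * np.linalg.norm(x - y) / d) if d > 0 else lam
    return rule
```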


Symmetry, 2021, Vol. 13(3), pp. 489
Author(s): Aviv Gibali, Olaniyi S. Iyiola, Lanre Akinyemi, Yekini Shehu

Our main focus in this work is the classical variational inequality problem with a Lipschitz continuous and pseudo-monotone mapping in real Hilbert spaces. An adaptive reflected subgradient-extragradient method is presented along with its weak convergence analysis. The novelty of the proposed method lies in the fact that only one projection onto the feasible set is required in each iteration, and there is no need to know or approximate the Lipschitz constant of the cost operator a priori. To illustrate and emphasize the potential applicability of the new scheme, several numerical experiments and comparisons in tomography reconstruction, Nash–Cournot oligopolistic equilibrium, and more are presented.
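
The subgradient-extragradient template that such methods build on can be sketched as follows: the second projection is onto a half-space and has a closed form, so only one projection onto the feasible set is needed per iteration. The reflection/extrapolation term of the paper's reflected variant is not reproduced, and all names and parameter values are placeholders.

```python
import numpy as np

def subgradient_extragradient(F, proj_C, x0, lam=1.0, mu=0.9,
                              max_iter=500, tol=1e-8):
    # Illustrative sketch; F, proj_C and the parameters are assumptions.
    x = np.asarray(x0, float)
    for _ in range(max_iter):
        Fx = F(x)
        y = proj_C(x - lam * Fx)              # the only projection onto C
        if np.linalg.norm(x - y) <= tol:
            return y
        # Second projection is onto the half-space
        #   T = {w : <x - lam*Fx - y, w - y> <= 0},
        # which is available in closed form (no projection onto C).
        a = x - lam * Fx - y
        Fy = F(y)
        z = x - lam * Fy
        aa = a @ a
        x_next = z - (max(0.0, a @ (z - y)) / aa) * a if aa > 0 else z
        d = np.linalg.norm(Fx - Fy)
        if d > 0:                             # adaptive step, no Lipschitz constant
            lam = min(lam, mu * np.linalg.norm(x - y) / d)
        x = x_next
    return x
```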


2021, Vol. 2021(1)
Author(s): Ming Tian, Gang Xu

The objective of this article is to solve pseudomonotone variational inequality problems in a real Hilbert space. We introduce an inertial algorithm with a new self-adaptive step size rule based on the projection and contraction method. Only one projection step is used per iteration, and strong convergence of the iterative sequence is obtained under appropriate conditions. The main advantage of the algorithm is that convergence is proved without prior knowledge of the Lipschitz constant of the cost operator. Numerical experiments are also presented to support the theoretical analysis and to provide comparisons with related algorithms.
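
A minimal sketch of an inertial projection-and-contraction step with a self-adaptive step size is given below. Whatever additional anchoring term the paper uses to obtain strong convergence is not reproduced here, and all names and parameter values are illustrative assumptions.

```python
import numpy as np

def inertial_projection_contraction(F, proj_C, x_prev, x, lam=1.0, mu=0.7,
                                    gamma=1.8, alpha=0.3,
                                    max_iter=500, tol=1e-8):
    # Placeholders: gamma in (0, 2), mu in (0, 1), alpha is the inertial
    # parameter; F is the cost operator, proj_C the projection onto C.
    x_prev, x = np.asarray(x_prev, float), np.asarray(x, float)
    for _ in range(max_iter):
        w = x + alpha * (x - x_prev)          # inertial extrapolation
        Fw = F(w)
        y = proj_C(w - lam * Fw)              # the single projection step
        r = w - y
        if np.linalg.norm(r) <= tol:
            return y
        Fy = F(y)
        d = r - lam * (Fw - Fy)               # contraction direction
        beta = (r @ d) / (d @ d)              # relaxation length
        x_prev, x = x, w - gamma * beta * d   # projection-and-contraction update
        diff = np.linalg.norm(Fw - Fy)
        if diff > 0:                          # self-adaptive step size
            lam = min(lam, mu * np.linalg.norm(r) / diff)
    return x
```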


Symmetry, 2021, Vol. 13(2), pp. 182
Author(s): Kanikar Muangchoo, Nasser Aedh Alreshidi, Ioannis K. Argyros

In this paper, we introduce two novel extragradient-like methods to solve variational inequalities in a real Hilbert space. The variational inequality problem is a general mathematical problem in the sense that it unifies several mathematical models, such as optimization problems, Nash equilibrium models, fixed point problems, and saddle point problems. The designed methods are analogous to the previously established two-step extragradient method for solving variational inequality problems in real Hilbert spaces. The proposed iterative methods use a step size rule based on local operator information rather than the operator's Lipschitz constant or any line search procedure. Under mild conditions, such as Lipschitz continuity and monotonicity (including pseudo-monotonicity) of the bifunction, strong convergence results for the described methods are established. Finally, we provide numerical experiments to demonstrate the performance and superiority of the designed methods.
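
For context, the classical two-step (Korpelevich-type) extragradient iteration with a step size computed from local operator information can be sketched as follows. This is the common template rather than either of the paper's two methods, and all names and parameter values are assumptions.

```python
import numpy as np

def extragradient(F, proj_C, x0, lam=1.0, mu=0.5, max_iter=500, tol=1e-8):
    # Illustrative sketch; F, proj_C and the parameters are placeholders.
    x = np.asarray(x0, float)
    for _ in range(max_iter):
        Fx = F(x)
        y = proj_C(x - lam * Fx)              # prediction (first) step
        if np.linalg.norm(x - y) <= tol:
            return y
        Fy = F(y)
        x_next = proj_C(x - lam * Fy)         # correction (second) step
        d = np.linalg.norm(Fx - Fy)
        if d > 0:                             # step size from local operator information
            lam = min(lam, mu * np.linalg.norm(x - y) / d)
        x = x_next
    return x
```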

