line search
Recently Published Documents


TOTAL DOCUMENTS: 913 (FIVE YEARS: 204)

H-INDEX: 36 (FIVE YEARS: 7)

2022 ◽  
Vol 143 ◽  
pp. 104592
Author(s):  
Xin Zhou ◽  
Dechun Lu ◽  
Cancan Su ◽  
Zhiwei Gao ◽  
Xiuli Du

Mathematics ◽  
2021 ◽  
Vol 9 (24) ◽  
pp. 3322
Author(s):  
Lu-Chuan Ceng ◽  
Jen-Chih Yao ◽  
Yekini Shehu

We propose two Mann-type subgradient-like extragradient iterations with a line-search procedure for hierarchical variational inequalities (HVI) with a common fixed-point problem (CFPP) constraint involving a finite family of nonexpansive mappings and an asymptotically nonexpansive mapping in a real Hilbert space. Our methods combine the Mann iteration, the subgradient extragradient method with a line-search process, and the viscosity approximation method. Under suitable assumptions, we prove strong convergence of the sequences of iterates generated by our methods to a solution of the HVI with the CFPP constraint.
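As a rough, generic illustration of the kind of line-search extragradient step this abstract refers to (omitting the Mann and viscosity components), here is a minimal sketch for a variational inequality over a closed convex set; `F`, `project_C`, and all parameter values are assumptions, not the authors' scheme.

```python
# Minimal sketch (assumed names and parameters, not the authors' Mann-type
# scheme): one extragradient step with a backtracking line search for the
# variational inequality VI(F, C): find x* in C with <F(x*), x - x*> >= 0
# for all x in C.
import numpy as np

def extragradient_step(x, F, project_C, gamma=1.0, beta=0.5, mu=0.9):
    """One extragradient iteration with a backtracking step-size search."""
    lam = gamma
    while True:
        y = project_C(x - lam * F(x))               # extrapolation (trial) point
        # Accept the step when it is small relative to the local variation of F.
        if np.allclose(x, y) or lam * np.linalg.norm(F(x) - F(y)) <= mu * np.linalg.norm(x - y):
            break
        lam *= beta                                 # shrink the step and retry
    return project_C(x - lam * F(y)), lam           # correction step
```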


Author(s):  
Xin Jiang ◽  
Lieven Vandenberghe

We present a new variant of the Chambolle–Pock primal–dual algorithm with Bregman distances, analyze its convergence, and apply it to the centering problem in sparse semidefinite programming. The novelty in the method is a line search procedure for selecting suitable step sizes. The line search obviates the need for estimating the norm of the constraint matrix and the strong convexity constant of the Bregman kernel. As an application, we discuss the centering problem in large-scale semidefinite programming with sparse coefficient matrices. The logarithmic barrier function for the cone of positive semidefinite completable sparse matrices is used as the distance-generating kernel. For this distance, the complexity of evaluating the Bregman proximal operator is shown to be roughly proportional to the cost of a sparse Cholesky factorization. This is much cheaper than the standard proximal operator with Euclidean distances, which requires an eigenvalue decomposition.
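The paper works with Bregman distances; as a rough Euclidean illustration of how a backtracking line search can replace an estimate of the operator norm in a primal–dual method, here is a minimal sketch in the spirit of a Malitsky–Pock-style search. The callables `prox_g`, `prox_fconj`, the matrix `K`, and all constants are assumptions, not this paper's Bregman algorithm.

```python
# Minimal sketch in the Euclidean setting (not the Bregman variant analyzed
# in the paper), in the spirit of a Malitsky-Pock-style line search for the
# Chambolle-Pock method applied to min_x g(x) + f(Kx).
# prox_g(v, t), prox_fconj(v, s), K, and all constants are assumptions.
import numpy as np

def pd_linesearch_step(x, y, tau, K, prox_g, prox_fconj,
                       beta=1.0, delta=0.99, mu=0.7):
    """One primal-dual step with a backtracking search on the step size."""
    x_new = prox_g(x - tau * (K.T @ y), tau)        # primal update
    tau_new = tau * np.sqrt(2.0)                    # tentative growth (previous ratio assumed 1)
    while True:
        theta = tau_new / tau
        x_bar = x_new + theta * (x_new - x)         # extrapolated primal point
        sigma = beta * tau_new
        y_new = prox_fconj(y + sigma * (K @ x_bar), sigma)   # dual update
        # Backtracking test: the local action of K^T on the dual change
        # replaces any global estimate of ||K||.
        lhs = np.sqrt(beta) * tau_new * np.linalg.norm(K.T @ (y_new - y))
        if lhs <= delta * np.linalg.norm(y_new - y):
            return x_new, y_new, tau_new
        tau_new *= mu                               # shrink and retry
```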


2021 ◽  
Vol 2021 (1) ◽  
Author(s):  
Wenting Chen ◽  
Meixia Li

The multiple-sets split feasibility problem is a generalization of the split feasibility problem and has been widely used in fuzzy image reconstruction and sparse signal processing systems. In this paper, we present an inertial relaxed algorithm for the multiple-sets split feasibility problem that uses an alternating inertial step. The advantage of this algorithm is that the step size is determined by an Armijo-type line search, which avoids computing operator norms. The weak convergence of the sequence generated by our algorithm is proved under mild conditions. In addition, numerical experiments are given to verify the convergence and effectiveness of the algorithm.
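As a rough illustration of an Armijo-type line search that avoids operator norms (single-set case, without the inertial and relaxation steps of the paper), here is a minimal sketch; `A`, `proj_C`, `proj_Q`, and all constants are assumptions.

```python
# Minimal sketch (assumed names, single-set case, not the authors' inertial
# relaxed multiple-sets algorithm): one projected-gradient step for the split
# feasibility problem "find x in C with Ax in Q", with an Armijo-type
# backtracking search so that ||A|| never has to be estimated.
import numpy as np

def sfp_armijo_step(x, A, proj_C, proj_Q, lam0=1.0, rho=0.5, sigma=1e-4):
    def f(u):                                       # proximity function 0.5*||Au - P_Q(Au)||^2
        r = A @ u - proj_Q(A @ u)
        return 0.5 * float(r @ r)

    grad = A.T @ (A @ x - proj_Q(A @ x))            # gradient of f at x
    fx = f(x)
    lam = lam0
    while True:
        z = proj_C(x - lam * grad)                  # trial iterate
        # Armijo sufficient decrease along the projection arc.
        if np.allclose(x, z) or f(z) <= fx + sigma * float(grad @ (z - x)):
            return z, lam
        lam *= rho                                  # shrink the step and retry
```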


Author(s):  
Ghada M. Al-Naemi ◽  
Ahmed H. Sheekoo

A new scaled conjugate gradient (SCG) method is proposed in this paper. The SCG technique is an important generalization of the conjugate gradient (CG) method and an efficient numerical approach to large-scale nonlinear unconstrained optimization. We propose a new SCG method with a strong Wolfe condition (SWC) line search. The descent property of the proposed method, as well as its global convergence, is established under suitable assumptions, independently of the line search used. Numerical experiments comparing the proposed technique with traditional CG methods support its efficiency and feasibility.
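For reference, a minimal sketch of the generic strong Wolfe condition test that such CG-type line searches rely on; `f`, `grad`, and the constants `c1`, `c2` are illustrative assumptions, not the SCG-specific update of the paper.

```python
# Generic strong Wolfe condition (SWC) test for a step length alpha along a
# search direction d; the constants c1, c2 and the callables f, grad are
# illustrative, not the authors' SCG-specific choices.
import numpy as np

def satisfies_strong_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.1):
    """Return True if alpha satisfies the strong Wolfe conditions."""
    slope0 = float(grad(x) @ d)                     # directional derivative at x (negative for a descent direction)
    x_new = x + alpha * d
    sufficient_decrease = f(x_new) <= f(x) + c1 * alpha * slope0
    strong_curvature = abs(float(grad(x_new) @ d)) <= c2 * abs(slope0)
    return sufficient_decrease and strong_curvature
```

In nonlinear CG methods a relatively strict curvature constant (around c2 = 0.1) is commonly used, and in practice a bracketing and zoom procedure, rather than plain backtracking, locates a step satisfying both conditions.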


2021 ◽  
Vol 38 (1) ◽  
pp. 249-262
Author(s):  
Pongsakorn Yotkaew ◽  
Habib Ur Rehman ◽  
Bancha Panyanak ◽  
Nuttapol Pakkaranang ◽  
...  

In this paper, we study the numerical solution of variational inequalities involving quasimonotone operators in infinite-dimensional Hilbert spaces. We prove that the iterative sequence generated by the proposed algorithm converges strongly to a solution of the quasimonotone variational inequality. The main advantage of the proposed schemes is that they use monotone and non-monotone step size rules based on evaluations of the operator rather than on its Lipschitz constant or on a line search.
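As a rough illustration of step-size rules that rely on evaluated operator values rather than a Lipschitz constant or a line search, here is a minimal sketch of a monotone and a non-monotone update commonly paired with extragradient-type methods; the names and constants are assumptions, not necessarily the authors' exact rules.

```python
# Common self-adaptive step-size updates used with extragradient-type methods
# for variational inequalities: both use only already-computed operator values
# (no Lipschitz constant, no line search). Names and constants are assumptions.
import numpy as np

def step_monotone(lam, x, y, Fx, Fy, mu=0.5):
    """Non-increasing update: the step can only shrink."""
    d = np.linalg.norm(Fx - Fy)
    return lam if d == 0 else min(lam, mu * np.linalg.norm(x - y) / d)

def step_nonmonotone(lam, x, y, Fx, Fy, p_k, mu=0.5):
    """Non-monotone update: the step may grow by a summable perturbation p_k >= 0."""
    d = np.linalg.norm(Fx - Fy)
    return lam + p_k if d == 0 else min(lam + p_k, mu * np.linalg.norm(x - y) / d)
```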


2021 ◽  
Vol 2021 ◽  
pp. 1-9
Author(s):  
Zhujun Wang ◽  
Li Cai

We propose a class of inexact secant methods combined with the line search filter technique for solving nonlinear equality constrained optimization. Compared with other filter methods that incorporate line searches and are applied to most large-scale optimization problems, the inexact line search filter algorithm is more flexible and easier to implement. In this paper, we focus on the analysis of the local superlinear convergence rate of the algorithms; their global convergence properties can be obtained by analogy with our previous work. The methods have been implemented in a Matlab code, and detailed numerical results indicate that the proposed algorithms are efficient on 43 problems from the CUTEr test set.
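As a rough illustration of the filter acceptance test that line search filter methods are built around, here is a minimal sketch for equality-constrained problems; the margin parameters and function names are assumptions, not the authors' inexact secant algorithm.

```python
# Generic filter acceptance test for line search filter methods applied to
# min f(x) subject to c(x) = 0: a trial point must sufficiently reduce either
# the constraint violation theta or the objective f relative to every stored
# filter entry. The margins gamma_theta, gamma_f are illustrative assumptions.
import numpy as np

def constraint_violation(c, x):
    """theta(x) = ||c(x)||."""
    return float(np.linalg.norm(c(x)))

def acceptable_to_filter(f_trial, theta_trial, filter_pairs,
                         gamma_theta=1e-5, gamma_f=1e-5):
    """filter_pairs: list of (theta_j, f_j) pairs from previously accepted points."""
    for theta_j, f_j in filter_pairs:
        improves_theta = theta_trial <= (1.0 - gamma_theta) * theta_j
        improves_f = f_trial <= f_j - gamma_f * theta_j
        if not (improves_theta or improves_f):
            return False                             # dominated by an existing entry
    return True
```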

