Local Convergence Analysis of the Levenberg–Marquardt Framework for Nonzero-Residue Nonlinear Least-Squares Problems Under an Error Bound Condition

2019 ◽  
Vol 183 (3) ◽  
pp. 1099-1122 ◽  
Author(s):  
Roger Behling ◽  
Douglas S. Gonçalves ◽  
Sandra A. Santos

2021 ◽  
Vol 13 (2) ◽  
pp. 305-314
Author(s):  
S.M. Shakhno ◽  
H.P. Yarmola

We investigate the local convergence of the Gauss-Newton-Kurchatov method for solving nonlinear least-squares problems. This method combines the Gauss-Newton and Kurchatov methods and is suited to problems in which the operator admits a decomposition. The convergence analysis is performed under generalized Lipschitz conditions, and the conditions for convergence, the convergence radius, and the convergence order of the method are established. Numerical examples confirm the theoretical results.
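The idea in this abstract can be illustrated with a minimal sketch: for a one-variable least-squares problem with F = G + H, where G is smooth and H is nonsmooth, the Jacobian of H is replaced by the Kurchatov divided difference built on the points 2x_k - x_{k-1} and x_{k-1}. The names (`G`, `dG`, `H`) and the toy problem are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Hedged sketch of a Gauss-Newton-Kurchatov step for a one-variable
# least-squares problem: minimize ||G(x) + H(x)||^2, where G is smooth
# (derivative dG known) and H is nonsmooth. The Kurchatov divided
# difference of H uses u = 2*x_k - x_{k-1} and v = x_{k-1}:
#   H[u, v] = (H(u) - H(v)) / (u - v).
def gauss_newton_kurchatov(G, dG, H, x0, x1, iters=20):
    x_prev, x = x0, x1
    for _ in range(iters):
        u, v = 2.0 * x - x_prev, x_prev
        if u == v:                            # divided difference undefined
            break
        A = dG(x) + (H(u) - H(v)) / (u - v)   # dG(x) + Kurchatov dd of H
        F = G(x) + H(x)
        denom = A @ A                         # scalar unknown: normal equation
        if denom == 0:
            break
        x_prev, x = x, x - (A @ F) / denom    # x+ = x - (A^T F) / (A^T A)
    return x

# Toy nonzero-residue problem: residuals r_i(x) = (x - t_i) + 0.1*|x|.
t = np.array([1.0, 1.2, 0.8])
G = lambda x: x - t
dG = lambda x: np.ones_like(t)
H = lambda x: 0.1 * np.abs(x) * np.ones_like(t)
x_star = gauss_newton_kurchatov(G, dG, H, 0.5, 0.6)
```

For positive iterates the residuals reduce to 1.1x - t_i, so the least-squares minimizer is mean(t)/1.1 ≈ 0.909, which the iteration reaches in a single step here.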


Axioms ◽  
2021 ◽  
Vol 10 (3) ◽  
pp. 158
Author(s):  
Ioannis K. Argyros ◽  
Stepan Shakhno ◽  
Roman Iakymchuk ◽  
Halyna Yarmola ◽  
Michael I. Argyros

We develop a local convergence analysis of an iterative method for solving nonlinear least squares problems with operator decomposition under the classical and generalized Lipschitz conditions. We consider the cases of both zero and nonzero residuals and determine the corresponding convergence orders. We use two types of Lipschitz conditions (center and restricted region conditions) to study the convergence of the method. Moreover, we obtain a larger radius of convergence and tighter error estimates than in previous works. Hence, we extend the applicability of this method under the same computational effort.


2019 ◽  
Vol 74 (2) ◽  
pp. 547-582 ◽  
Author(s):  
Jifeng Bao ◽  
Carisa Kwok Wai Yu ◽  
Jinhua Wang ◽  
Yaohua Hu ◽  
Jen-Chih Yao

Mathematics ◽  
2019 ◽  
Vol 7 (1) ◽  
pp. 99 ◽  
Author(s):  
Ioannis Argyros ◽  
Stepan Shakhno ◽  
Yurii Shunkin

We study an iterative differential-difference method for solving nonlinear least squares problems that uses, instead of the Jacobian, the sum of the derivative of the differentiable part of the operator and a divided difference of the nondifferentiable part. We also introduce a method that uses only the derivative of the differentiable part in place of the Jacobian. Results from earlier work establishing the convergence conditions, the convergence radius, and the convergence order of the proposed methods are presented. Numerical examples illustrate the theoretical results.


2018 ◽  
Vol 34 (2) ◽  
pp. 135-142
Author(s):  
IOANNIS K. ARGYROS ◽  
YEOL JE CHO ◽  
SANTHOSH GEORGE ◽  
...  

The aim of this paper is to extend the applicability of the Gauss-Newton method for solving nonlinear least squares problems using our new idea of restricted convergence domains. The new technique uses tighter Lipschitz functions than earlier papers, leading to a tighter ball convergence analysis.
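For reference, the basic Gauss-Newton iteration analyzed here solves the linearized least-squares subproblem J(x) d ≈ -F(x) at each step. The sketch below is a minimal generic implementation; the exponential-fit test problem is an illustrative assumption, not from the paper.

```python
import numpy as np

# Minimal Gauss-Newton sketch for min ||F(x)||^2: each iteration solves
# the linearized least-squares problem J(x) d = -F(x) and updates x.
def gauss_newton(F, J, x0, iters=50, tol=1e-12):
    x = np.asarray(x0, float)
    for _ in range(iters):
        d, *_ = np.linalg.lstsq(J(x), -F(x), rcond=None)
        x = x + d
        if np.linalg.norm(d) < tol:
            break
    return x

# Zero-residual test problem: fit a*exp(b*t) to exact data with a=2, b=1.
t = np.array([0.0, 1.0, 2.0])
y = np.array([2.0, 2.0 * np.e, 2.0 * np.e**2])
F = lambda x: x[0] * np.exp(x[1] * t) - y
J = lambda x: np.column_stack([np.exp(x[1] * t),
                               x[0] * t * np.exp(x[1] * t)])
sol = gauss_newton(F, J, [1.5, 0.8])
```

Since the residual is zero at the solution, the iteration converges quadratically from a nearby starting point, recovering (a, b) = (2, 1).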


2021 ◽  
Vol 7 (1) ◽  
pp. 1241-1256
Author(s):  
Lin Zheng ◽  
Liang Chen ◽  
Yanfang Ma ◽  
...  

The Levenberg-Marquardt method is one of the most important methods for solving systems of nonlinear equations and nonlinear least-squares problems. It enjoys a quadratic convergence rate under the local error bound condition. Recently, to solve nonzero-residue nonlinear least-squares problems, Behling et al. proposed a modified Levenberg-Marquardt method with at least superlinear convergence under a new error bound condition [3]. To extend their results to systems of nonlinear equations, we propose an efficient variant of the Levenberg-Marquardt method that chooses the LM parameter adaptively, and we prove its quadratic convergence under the new error bound condition. We also investigate its global convergence using the Wolfe line search. The effectiveness of the new method is validated by numerical experiments.
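The core LM iteration with an adaptively chosen damping parameter can be sketched as follows. The rule mu_k = ||F(x_k)||^delta is one common adaptive choice in the error-bound literature; the exact parameter rule and line-search safeguards of the paper differ, and the test system is an illustrative assumption.

```python
import numpy as np

# Hedged sketch of a Levenberg-Marquardt iteration with an adaptive
# damping parameter mu_k = ||F(x_k)||^delta (one common choice; not
# necessarily the paper's exact rule). Each step solves
#   (J^T J + mu I) d = -J^T F.
def levenberg_marquardt(F, J, x0, delta=2.0, iters=50, tol=1e-10):
    x = np.asarray(x0, float)
    for _ in range(iters):
        Fx, Jx = F(x), J(x)
        mu = np.linalg.norm(Fx) ** delta          # adaptive LM parameter
        d = np.linalg.solve(Jx.T @ Jx + mu * np.eye(x.size), -Jx.T @ Fx)
        x = x + d
        if np.linalg.norm(d) < tol:
            break
    return x

# Toy system of nonlinear equations with solution (1, 1).
F = lambda x: np.array([x[0]**2 - 1.0, x[0] * x[1] - 1.0])
J = lambda x: np.array([[2.0 * x[0], 0.0],
                        [x[1], x[0]]])
root = levenberg_marquardt(F, J, [2.0, 0.5])
```

As the iterates approach the solution, ||F|| shrinks, so mu vanishes and the step approaches the Gauss-Newton step, which is what yields the fast local rate.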

