On limited memory SQP methods for large scale constrained nonlinear least squares problems

2000 · Vol. 42 · pp. 900
Author(s): Z. F. Li

2020 · Vol. 48 (4) · pp. 987–1003
Author(s): Hans Georg Bock, Jürgen Gutekunst, Andreas Potschka, María Elena Suárez Garcés

Abstract: Just as the damped Newton method for the numerical solution of nonlinear algebraic problems can be interpreted as forward Euler timestepping on the Newton flow equations, the damped Gauß–Newton method for nonlinear least squares problems is equivalent to forward Euler timestepping on the corresponding Gauß–Newton flow equations. We highlight the advantages of the Gauß–Newton flow and the Gauß–Newton method from a statistical and a numerical perspective in comparison with the Newton method, steepest descent, and the Levenberg–Marquardt method, which correspond respectively to forward Euler on the Newton flow, forward Euler on the gradient flow, and backward Euler on the gradient flow. Finally, we show an unconditional descent property for a generalized Gauß–Newton flow, which is linked to Krylov–Gauß–Newton methods for large-scale nonlinear least squares problems. We provide numerical results for large-scale problems: an academic generalized Rosenbrock function and a real-world bundle adjustment problem from 3D reconstruction based on 2D images.
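
To make the timestepping correspondences concrete, the following NumPy sketch (not the authors' code) applies both ideas to the two-dimensional Rosenbrock function written in residual form: a forward Euler step of size h on the Gauß–Newton flow reproduces the damped Gauß–Newton step, while a linearly implicit backward Euler step of size h on the gradient flow yields a Levenberg–Marquardt step with regularization parameter 1/h. The residual, step size, and iteration count are illustrative choices.

import numpy as np

# Residual form of the 2-D Rosenbrock function: r(x) = (10*(x2 - x1^2), 1 - x1),
# so that f(x) = 0.5*||r(x)||^2 is minimized at (1, 1) with zero residual.
def residual(x):
    return np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]])

def jacobian(x):
    return np.array([[-20.0 * x[0], 10.0],
                     [-1.0, 0.0]])

def damped_gauss_newton_step(x, h):
    # Forward Euler step of size h on the Gauss-Newton flow
    # dx/dt = -(J^T J)^{-1} J^T r, i.e. a damped Gauss-Newton step.
    J, r = jacobian(x), residual(x)
    return x + h * np.linalg.solve(J.T @ J, -J.T @ r)

def levenberg_marquardt_step(x, h):
    # Linearly implicit backward Euler step of size h on the gradient flow
    # dx/dt = -J^T r; lam = 1/h plays the role of the Levenberg-Marquardt parameter.
    J, r = jacobian(x), residual(x)
    lam = 1.0 / h
    return x + np.linalg.solve(J.T @ J + lam * np.eye(x.size), -J.T @ r)

x_gn = x_lm = np.array([-1.2, 1.0])   # classical Rosenbrock starting point
for _ in range(200):
    x_gn = damped_gauss_newton_step(x_gn, h=0.5)
    x_lm = levenberg_marquardt_step(x_lm, h=0.5)
print(x_gn, x_lm)   # both sequences approach the minimizer (1, 1)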


2021 ◽  
Author(s):  
Morteza Kimiaei ◽  
Arnold Neumaier

Abstract: This paper suggests a new limited memory trust region algorithm for large unconstrained black box least squares problems, called LMLS. The main features of LMLS are a new non-monotone technique, a new adaptive radius strategy, a new Broyden-like algorithm based on previous good points, and a heuristic estimate of the Jacobian matrix in a subspace with random basis indices. Our numerical results show that LMLS is robust and efficient, especially in comparison with solvers that use traditional limited memory and standard quasi-Newton approximations.
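
The abstract does not spell out formulas, but the flavor of a non-monotone trust-region test can be sketched as follows: the reduction ratio is measured against the worst of the last few accepted objective values rather than the current one, and the radius is adapted from the outcome. The code below is a generic illustration under these assumptions, with placeholder names and parameters; it is not the LMLS algorithm or interface.

import numpy as np
from collections import deque

def nonmonotone_ratio(f_new, model_decrease, history):
    # Reduction ratio measured against the worst of the recently accepted
    # objective values (non-monotone) instead of the current value alone.
    f_ref = max(history)
    return (f_ref - f_new) / max(model_decrease, 1e-16)

def update_radius(rho, radius, step_norm, eta1=0.1, eta2=0.75):
    # A simple adaptive radius rule: shrink after poor model agreement,
    # enlarge after very good agreement (a stand-in for LMLS's strategy).
    if rho < eta1:
        return 0.5 * step_norm
    if rho > eta2:
        return max(radius, 2.0 * step_norm)
    return radius

# Tiny illustration on f(x) = 0.5*||x||^2 with a steepest-descent trial step.
f = lambda x: 0.5 * float(x @ x)
x, radius = np.array([3.0, -4.0]), 1.0
history = deque([f(x)], maxlen=10)
for _ in range(20):
    g = x                                   # gradient of f at x
    if np.linalg.norm(g) < 1e-12:
        break
    step = -min(radius / np.linalg.norm(g), 1.0) * g
    model_decrease = -(g @ step) - 0.5 * float(step @ step)
    rho = nonmonotone_ratio(f(x + step), model_decrease, history)
    radius = update_radius(rho, radius, np.linalg.norm(step))
    if rho >= 0.1:                          # accept the trial point
        x = x + step
        history.append(f(x))
print(x)   # iterates shrink toward the minimizer at the origin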


Heliyon · 2021 · pp. e07499
Author(s): Mahmoud Muhammad Yahaya, Poom Kumam, Aliyu Muhammed Awwal, Sani Aji

Axioms · 2021 · Vol. 10 (3) · pp. 158
Author(s): Ioannis K. Argyros, Stepan Shakhno, Roman Iakymchuk, Halyna Yarmola, Michael I. Argyros

We develop a local convergence analysis of an iterative method for solving nonlinear least squares problems with operator decomposition under classical and generalized Lipschitz conditions. We consider the cases of both zero and nonzero residuals and determine the corresponding convergence orders. We use two types of Lipschitz conditions (center and restricted-region conditions) to study the convergence of the method. Moreover, we obtain a larger radius of convergence and tighter error estimates than in previous works. Hence, we extend the applicability of this method under the same computational effort.
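
For orientation only, here is a minimal sketch of one Gauss–Newton-secant type step for a decomposed residual F + G, in which the differentiable part contributes its Jacobian and the possibly nondifferentiable part contributes a first-order divided difference. The componentwise divided-difference formula, the test functions, and all names below are illustrative assumptions, not the paper's exact operators, conditions, or convergence analysis.

import numpy as np

def divided_difference(G, x, y):
    # Componentwise first-order divided difference [x, y; G]:
    # column j compares two points that differ only in the j-th coordinate.
    m, n = G(x).size, x.size
    A = np.zeros((m, n))
    for j in range(n):
        u = np.concatenate([x[:j + 1], y[j + 1:]])
        v = np.concatenate([x[:j], y[j:]])
        d = x[j] - y[j]
        A[:, j] = (G(u) - G(v)) / d if d != 0 else 0.0
    return A

def gn_secant_step(F, JF, G, x_prev, x):
    # One Gauss-Newton-type step for min ||F(x) + G(x)||^2 using the
    # linearization A = F'(x) + [x, x_prev; G] (illustrative sketch).
    A = JF(x) + divided_difference(G, x, x_prev)
    r = F(x) + G(x)
    dx, *_ = np.linalg.lstsq(A, -r, rcond=None)
    return x + dx

# Tiny illustration: a smooth part F and a small nondifferentiable part G.
F = lambda x: np.array([x[0] + x[1] - 3.0, x[0] ** 2 - 2.0])
JF = lambda x: np.array([[1.0, 1.0], [2.0 * x[0], 0.0]])
G = lambda x: np.array([0.1 * abs(x[0] - 1.0), 0.0])

x_prev, x = np.array([2.0, 2.0]), np.array([1.5, 1.8])
for _ in range(15):
    x_prev, x = x, gn_secant_step(F, JF, G, x_prev, x)
print(x)   # approaches the zero-residual solution of F(x) + G(x) = 0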

