Nonlinear-Least-Squares Analysis of Slow-Motion EPR Spectra in One and Two Dimensions Using a Modified Levenberg–Marquardt Algorithm

1996 ◽  
Vol 120 (2) ◽  
pp. 155-189 ◽  
Author(s):  
David E. Budil ◽  
Sanghyuk Lee ◽  
Sunil Saxena ◽  
Jack H. Freed

2019 ◽  
Vol 74 (2) ◽  
pp. 547-582 ◽  
Author(s):  
Jifeng Bao ◽  
Carisa Kwok Wai Yu ◽  
Jinhua Wang ◽  
Yaohua Hu ◽  
Jen-Chih Yao

2019 ◽  
Vol 21 (3) ◽  
pp. 471-501 ◽  
Author(s):  
Michael Kommenda ◽  
Bogdan Burlacu ◽  
Gabriel Kronberger ◽  
Michael Affenzeller

Abstract. In this paper we analyze the effects of using nonlinear least squares for parameter identification of symbolic regression models and integrate it as a local search mechanism in tree-based genetic programming. We employ the Levenberg–Marquardt algorithm for parameter optimization and calculate gradients via automatic differentiation. We provide examples where the parameter identification succeeds and where it fails, and highlight its computational overhead. Using an extensive suite of symbolic regression benchmark problems, we demonstrate the increased performance obtained by incorporating nonlinear least squares within genetic programming. Our results are compared with recently published results from several genetic programming variants and state-of-the-art machine learning algorithms. Genetic programming with nonlinear least squares performs among the best on the benchmark suite, and the local search can easily be integrated into different genetic programming algorithms as long as only differentiable functions are used within the models.
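As an illustration of the parameter-identification step described in the abstract, the following is a minimal sketch, not the authors' implementation: it fits the numeric coefficients of one fixed model structure (standing in for a tree produced by genetic programming) with the Levenberg–Marquardt algorithm, assuming SciPy's least_squares wrapper around MINPACK's LM and finite-difference gradients rather than the automatic differentiation used in the paper.

import numpy as np
from scipy.optimize import least_squares

def model(theta, x):
    # Hypothetical symbolic structure: theta0 * sin(theta1 * x) + theta2 * x,
    # standing in for a tree evolved by genetic programming.
    return theta[0] * np.sin(theta[1] * x) + theta[2] * x

def residuals(theta, x, y):
    return model(theta, x) - y

rng = np.random.default_rng(0)
x = np.linspace(0.0, 4.0, 200)
y = 1.5 * np.sin(2.0 * x) + 0.3 * x + 0.05 * rng.standard_normal(x.size)

theta0 = np.array([1.0, 1.8, 0.0])                 # rough initial coefficients
fit = least_squares(residuals, theta0, args=(x, y), method="lm")
print(fit.x)                                       # refined coefficients, near [1.5, 2.0, 0.3]

In a genetic programming system this refinement would be applied to each candidate model during evaluation, which is the source of the computational overhead the abstract mentions.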


Author(s):  
Huawei Wang ◽  
De Xu

In the novel method we propose for determining extrinsic parameters for active stereovision, we first map the relationship between the rotational and yaw angles by least-squares fitting, and then optimize the rotational axis between the two cameras using the Levenberg–Marquardt algorithm. The extrinsic parameters are then easily derived for active stereovision from the mapping model without complex recalibration. Experimental results confirm the feasibility of the proposed method.
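A minimal sketch of the two-stage idea, with hypothetical data and a parameterization chosen here rather than taken from the paper: stage one fits a linear mapping from the commanded yaw angle to the measured rotation angle by ordinary least squares; stage two refines the direction of the rotation axis with the Levenberg–Marquardt algorithm (via SciPy) by minimizing the misfit between rotated reference points and observed points.

import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

# Stage 1: linear least-squares fit of measured rotation angle vs. commanded yaw
# (hypothetical calibration data, in degrees): angle = a * yaw + b.
yaw_cmd = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
rot_meas = np.array([0.1, 5.2, 10.1, 15.3, 20.2])
A = np.column_stack([yaw_cmd, np.ones_like(yaw_cmd)])
(a, b), *_ = np.linalg.lstsq(A, rot_meas, rcond=None)

# Stage 2: Levenberg-Marquardt refinement of the rotation-axis direction,
# parameterized by spherical angles (theta, phi).
def axis_from(params):
    theta, phi = params
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])

def residuals(params, pts_ref, pts_obs, angle_deg):
    R = Rotation.from_rotvec(np.deg2rad(angle_deg) * axis_from(params))
    return (R.apply(pts_ref) - pts_obs).ravel()

true_axis = axis_from([0.3, 0.8])              # hypothetical ground-truth axis
angle = a * 10.0 + b                           # mapped angle for a 10-degree yaw command
pts_ref = np.random.default_rng(1).standard_normal((20, 3))
pts_obs = Rotation.from_rotvec(np.deg2rad(angle) * true_axis).apply(pts_ref)

fit = least_squares(residuals, x0=[0.25, 0.7], args=(pts_ref, pts_obs, angle), method="lm")
print(axis_from(fit.x))                        # should recover true_axis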


2016 ◽  
Vol 23 (2) ◽  
pp. 59-73 ◽  
Author(s):  
J. Mandel ◽  
E. Bergou ◽  
S. Gürol ◽  
S. Gratton ◽  
I. Kasanický

Abstract. The ensemble Kalman smoother (EnKS) is used as a linear least-squares solver in the Gauss–Newton method for the large nonlinear least-squares system in incremental 4DVAR. The ensemble approach is naturally parallel over the ensemble members and no tangent or adjoint operators are needed. Furthermore, adding a regularization term results in replacing the Gauss–Newton method, which may diverge, by the Levenberg–Marquardt method, which is known to be convergent. The regularization is implemented efficiently as an additional observation in the EnKS. The method is illustrated on the Lorenz 63 model and a two-level quasi-geostrophic model.
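The equivalence asserted in the last two sentences of the abstract can be stated in one formula. In notation chosen here rather than taken from the paper (J_k for the linearized observation operator at iterate k, r_k for the innovation, R for the observation-error covariance, λ for the damping parameter), the Levenberg–Marquardt subproblem at each outer iteration satisfies

\[
\min_{\delta x}\;\|J_k\,\delta x - r_k\|_{R^{-1}}^{2} + \lambda\,\|\delta x\|^{2}
\;=\;
\min_{\delta x}\;
\left\|
\begin{bmatrix} J_k \\ I \end{bmatrix}\delta x
-
\begin{bmatrix} r_k \\ 0 \end{bmatrix}
\right\|_{\operatorname{diag}(R,\;\lambda^{-1}I)^{-1}}^{2},
\]

i.e. the damping term is the same as appending the artificial observation \(\delta x \approx 0\) with error covariance \(\lambda^{-1} I\), which is exactly the extra observation that the EnKS can assimilate alongside the real ones.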


2015 ◽  
Vol 2 (3) ◽  
pp. 865-902 ◽  
Author(s):  
J. Mandel ◽  
E. Bergou ◽  
S. Gürol ◽  
S. Gratton

Abstract. We propose to use the ensemble Kalman smoother (EnKS) as the linear least-squares solver in the Gauss–Newton method for the large nonlinear least-squares problem in incremental 4DVAR. The ensemble approach is naturally parallel over the ensemble members and no tangent or adjoint operators are needed. Further, adding a regularization term results in replacing the Gauss–Newton method, which may diverge, by the Levenberg–Marquardt method, which is known to be convergent. The regularization is implemented efficiently as an additional observation in the EnKS. The method is illustrated on the Lorenz 63 and the two-level quasi-geostrophic model problems.
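The regularization-as-observation trick claimed here can be checked numerically in a few lines. The following is a generic illustration with made-up dimensions, not the EnKS implementation: stacking sqrt(lambda) * I as additional rows of the linear least-squares problem reproduces the Levenberg–Marquardt step obtained from the damped normal equations.

import numpy as np

rng = np.random.default_rng(0)
J = rng.standard_normal((50, 5))        # linearized observation operator (toy size)
r = rng.standard_normal(50)             # innovation vector
lam = 0.5                               # damping parameter

# Levenberg-Marquardt step from the damped normal equations.
dx_lm = np.linalg.solve(J.T @ J + lam * np.eye(5), J.T @ r)

# The same step from the augmented linear least-squares problem, where the
# damping appears as an extra "observation" delta_x = 0.
J_aug = np.vstack([J, np.sqrt(lam) * np.eye(5)])
r_aug = np.concatenate([r, np.zeros(5)])
dx_aug, *_ = np.linalg.lstsq(J_aug, r_aug, rcond=None)

print(np.allclose(dx_lm, dx_aug))       # True: identical steps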

