A dynamical systems perspective to convergence rate analysis of proximal algorithms

Author(s): Mahyar Fazlyab, Alejandro Ribeiro, Manfred Morari, Victor M. Preciado
Mathematics, 2020, Vol. 8 (4), pp. 608
Author(s): Pornsarp Pornsawad, Parada Sungcharoen, Christine Böckmann

In this paper, we present a convergence rate analysis of the modified Landweber method under a logarithmic source condition for nonlinear ill-posed problems. The regularization parameter is chosen according to the discrepancy principle. Reconstructions of the shape of an unknown domain for an inverse potential problem using the modified Landweber method are presented.
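For orientation, the following Python sketch implements the classical Landweber iteration x_{k+1} = x_k + omega * F'(x_k)^* (y_delta - F(x_k)) with the discrepancy principle ||F(x_k) - y_delta|| <= tau * delta as the stopping rule. The paper's specific modification of the method is not reproduced here; the toy operator, step size omega, and constant tau > 1 below are illustrative assumptions.

    import numpy as np

    def landweber(F, Fp_adjoint, y_delta, x0, delta, tau=1.1, omega=0.1,
                  max_iter=10000):
        # Iterate x <- x + omega * F'(x)^* (y_delta - F(x)) and stop as
        # soon as ||F(x) - y_delta|| <= tau * delta (discrepancy principle).
        x = x0.copy()
        for _ in range(max_iter):
            residual = y_delta - F(x)
            if np.linalg.norm(residual) <= tau * delta:
                break  # discrepancy principle reached: stop iterating
            x = x + omega * Fp_adjoint(x, residual)
        return x

    # Toy linear test problem F(x) = A x, for which F'(x)^* r = A^T r.
    # The step size must satisfy omega * ||A||^2 < 2 for convergence.
    A = np.array([[2.0, 1.0], [1.0, 3.0]])
    x_true = np.array([1.0, -1.0])
    delta = 1e-3                                    # noise level
    y_delta = A @ x_true + delta * np.array([1.0, -1.0]) / np.sqrt(2.0)
    x_rec = landweber(lambda x: A @ x, lambda x, r: A.T @ r,
                      y_delta, np.zeros(2), delta)
    print(x_rec)  # close to x_true; early stopping limits noise amplification

On this linear toy problem the residual contracts geometrically, and stopping as soon as it falls below tau * delta keeps the data noise from being amplified back into the reconstruction.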


Author(s): Ran Gu, Qiang Du

How to choose the step size of the gradient descent method has been a popular subject of research. In this paper, we propose a modified limited memory steepest descent method (MLMSD). In each iteration, a selection rule picks a unique step size from a candidate set computed by Fletcher's limited memory steepest descent method (LMSD), instead of sweeping through all the step sizes as in Fletcher's original LMSD algorithm. MLMSD is motivated by an inexact super-linear convergence rate analysis. The R-linear convergence of MLMSD is proved for strictly convex quadratic minimization problems. Numerical tests are presented to show that our algorithm is efficient and robust.
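To make the candidate-set idea concrete, here is a minimal Python sketch of a limited-memory steepest-descent loop on a strictly convex quadratic f(x) = 0.5 x^T H x - b^T x: the Ritz values of H on the span of the last m gradients supply the candidate step sizes (their reciprocals), and a greedy rule, standing in for the paper's selection rule (which is not reproduced here), picks one per iteration. Forming the Ritz values directly from H is an illustrative shortcut; Fletcher's LMSD recovers them from the stored gradients alone.

    import numpy as np

    def mlmsd_sketch(H, b, x0, m=5, tol=1e-8, max_iter=1000):
        # Minimize f(x) = 0.5 x^T H x - b^T x for symmetric positive
        # definite H, picking each step from an LMSD-style candidate set.
        f = lambda z: 0.5 * z @ H @ z - b @ z
        x = x0.copy()
        grads = []
        for _ in range(max_iter):
            g = H @ x - b                     # gradient of f at x
            if np.linalg.norm(g) <= tol:
                break
            grads = (grads + [g])[-m:]        # keep the last m gradients
            Q, _ = np.linalg.qr(np.stack(grads, axis=1))
            ritz = np.linalg.eigvalsh(Q.T @ H @ Q)  # Ritz values of H
            candidates = 1.0 / ritz                 # candidate step sizes
            # Greedy stand-in for the paper's selection rule: take the
            # candidate that most reduces f along the steepest descent
            # direction -g.
            step = min(candidates, key=lambda a: f(x - a * g))
            x = x - step * g
        return x

    H = np.diag([1.0, 10.0, 100.0])
    b = np.ones(3)
    print(mlmsd_sketch(H, b, np.zeros(3)))  # approx. H^{-1} b = [1, 0.1, 0.01]

The greedy choice is safe on a quadratic: since the current gradient lies in the span used for the Ritz computation, the smallest candidate step is at most the exact line-search step, so at least one candidate decreases f in every iteration.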

