Convergence results of a modified regularized gradient method for nonlinear ill-posed problems

2009 · Vol 34 (1-2) · pp. 233-250
Author(s): Yehui Peng, Zhenhai Liu, Heying Feng
2017 · Vol 11 (4) · pp. 703-720
Author(s): Stefan Kindermann

2012 · Vol 476-478 · pp. 2292-2295
Author(s): Zhen Chen

Three identification methods, the time domain method (TDM), the conjugate gradient method (CGM), and the preconditioned conjugate gradient method (PCGM), are employed for moving force identification. Related research shows that, among the three, the PCGM achieves higher identification accuracy and more robust noise immunity, and it produces acceptable solutions to ill-posed cases to some extent. However, the choice of preconditioning matrix is crucial to the PCGM, because it affects the identification accuracy, the noise immunity, and the handling of ill-posed cases. The theoretical results are of practical significance for properly selecting the preconditioning matrix.
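To make the role of the preconditioner concrete, here is a minimal sketch of a preconditioned conjugate gradient solver applied to a Tikhonov-regularized normal-equations system, the kind of ill-posed linear system that arises in force identification. The matrix A, data b, regularization parameter lam, and the Jacobi (diagonal) preconditioner are illustrative assumptions, not the formulation of the cited paper.

```python
# Sketch: Jacobi-preconditioned CG for (A^T A + lam*I) x = A^T b.
# All names and the diagonal preconditioner are illustrative assumptions.
import numpy as np

def pcg_tikhonov(A, b, lam=1e-3, tol=1e-8, max_iter=500):
    """Solve the Tikhonov-regularized normal equations with preconditioned CG."""
    AtA = A.T @ A + lam * np.eye(A.shape[1])
    rhs = A.T @ b
    M_inv = 1.0 / np.diag(AtA)          # Jacobi (diagonal) preconditioner
    x = np.zeros_like(rhs)
    r = rhs - AtA @ x                   # residual
    z = M_inv * r                       # preconditioned residual
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = AtA @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p       # update search direction
        rz = rz_new
    return x

# Tiny usage example on a mildly ill-conditioned synthetic system
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20)) @ np.diag(np.logspace(0, -4, 20))
x_true = rng.standard_normal(20)
b = A @ x_true + 1e-4 * rng.standard_normal(50)
x_hat = pcg_tikhonov(A, b, lam=1e-6)
```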


Author(s): Touraj Nikazad, Mokhtar Abbasi, Tommy Elfving

We study error minimizing relaxation (EMR) strategies for use in Landweber and Kaczmarz type iterations applied to linear systems with or without convex constraints. Convergence results based on operator theory are given, assuming exact data. The advantages and disadvantages of these relaxation strategies on a noisy and ill-posed problem are illustrated using examples taken from the field of image reconstruction from projections. We also consider combining EMR with penalization.
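For intuition, the following is a hedged sketch of a Landweber iteration with one classical error-minimizing relaxation. Under exact data b = A x*, the error e_k = x_k - x* satisfies e_{k+1} = e_k - lam_k A^T A e_k, and minimizing ||e_{k+1}|| over lam_k gives lam_k = ||r_k||^2 / ||A^T r_k||^2 with r_k = A x_k - b, which is computable from the data alone. This is a generic illustration derived from that identity, not necessarily the paper's exact EMR scheme, and it omits the constrained (projected) variants.

```python
# Sketch: Landweber iteration with an error-minimizing relaxation parameter.
# The stopping rule and iteration count are illustrative assumptions.
import numpy as np

def landweber_emr(A, b, n_iter=200):
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        r = A @ x - b                  # residual r_k = A x_k - b
        g = A.T @ r                    # gradient direction A^T r_k
        gn = g @ g
        if gn == 0.0:                  # normal equations satisfied exactly
            break
        lam = (r @ r) / gn             # error-minimizing relaxation (exact data)
        x -= lam * g
    return x
```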


Author(s): Hui Zhang, Yu-Hong Dai, Lei Guo, Wei Peng

We introduce a unified algorithmic framework, called the proximal-like incremental aggregated gradient (PLIAG) method, for minimizing the sum of a convex function that consists of additive relatively smooth convex components and a proper lower semicontinuous convex regularization function over an abstract feasible set whose geometry can be captured by using the domain of a Legendre function. The PLIAG method includes many existing algorithms in the literature as special cases, such as the proximal gradient method, the Bregman proximal gradient method (also called the NoLips algorithm), the incremental aggregated gradient method, the incremental aggregated proximal method, and the proximal incremental aggregated gradient method. It also includes some novel and interesting iteration schemes. First, we show that the PLIAG method is globally sublinearly convergent without requiring a growth condition, which extends the sublinear convergence result for the proximal gradient algorithm to incremental aggregated-type first-order methods. Then, by embedding a so-called Bregman distance growth condition into a descent-type lemma to construct a special Lyapunov function, we show that the PLIAG method is globally linearly convergent in terms of both function values and Bregman distances to the optimal solution set, provided that the step size is not greater than some positive constant. The convergence results derived in this paper are all established beyond the standard assumptions in the literature (i.e., without requiring the strong convexity and the Lipschitz gradient continuity of the smooth part of the objective). When specialized to many existing algorithms, our results recover or supplement their convergence results under strictly weaker conditions.
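As a concrete anchor for the framework, here is a minimal Euclidean sketch of one special case the abstract lists: an incremental aggregated proximal-gradient loop for a sum of least-squares components plus an l1 regularizer. The component matrices As, data bs, the cyclic update order, and the conservative step-size rule are illustrative assumptions, not the PLIAG method's exact formulation or its admissible step range (which the paper states in terms of Bregman distances and a Legendre function).

```python
# Sketch: incremental aggregated proximal gradient for
#   min_x  sum_i 0.5*||A_i x - b_i||^2 + tau*||x||_1
# using a table of possibly outdated per-component gradients.
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def iapg_sketch(As, bs, tau=0.1, n_epochs=100):
    m, n = len(As), As[0].shape[1]
    L = sum(np.linalg.norm(A, 2) ** 2 for A in As)  # global Lipschitz bound
    step = 1.0 / (L * (m + 1))         # conservative: accounts for gradient delay
    x = np.zeros(n)
    # table of the most recent gradient of each component (evaluated at old iterates)
    grads = [A.T @ (A @ x - b) for A, b in zip(As, bs)]
    agg = sum(grads)                   # aggregated (possibly stale) gradient
    for _ in range(n_epochs):
        for i in range(m):             # refresh one component per inner step
            g_new = As[i].T @ (As[i] @ x - bs[i])
            agg += g_new - grads[i]
            grads[i] = g_new
            x = soft_threshold(x - step * agg, step * tau)
    return x
```

The aggregated gradient `agg` mixes one fresh component gradient with m-1 stale ones, which is exactly the trade-off (cheap iterations versus delayed information) that incremental aggregated methods exploit.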

