Convergence Rate Analysis of the Proximal Difference of the Convex Algorithm

2021, Vol 2021, pp. 1-5
Author(s): Xueyong Wang, Ying Zhang, Haibin Chen, Xipeng Kou

In this paper, we study the convergence rate of the proximal difference-of-convex algorithm for a problem whose objective involves a strongly convex function and two convex functions. By making full use of the special structure of the difference-of-convex decomposition, we prove that the proximal difference-of-convex algorithm converges linearly, with the rate measured in terms of the objective function value.
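The paper's setting (a strongly convex part plus two convex parts) has its own specific splitting; as a minimal illustration only, below is a sketch of the generic proximal DC iteration x_{k+1} = argmin_y g(y) - <xi_k, y> + (1/2t)||y - x_k||^2 with xi_k in the subdifferential of h at x_k, applied to a made-up quadratic-minus-l1 objective. All instance data, parameter values, and names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Illustrative instance (not from the paper): minimize
#   f(x) = g(x) - h(x),  g(x) = 0.5 x'Ax + b'x  (strongly convex),
#   h(x) = lam * ||x||_1                        (convex).
rng = np.random.default_rng(0)
n = 20
M = rng.standard_normal((n, n))
A = M @ M.T + np.eye(n)          # positive definite, so g is strongly convex
b = rng.standard_normal(n)
lam = 0.1                        # assumed regularization weight
t = 1.0                          # assumed proximal step size

def f(x):
    return 0.5 * x @ A @ x + b @ x - lam * np.abs(x).sum()

x = np.zeros(n)
for k in range(200):
    xi = lam * np.sign(x)        # a subgradient of h at the current point
    # Proximal DC step: x+ = argmin_y g(y) + ||y - x||^2 / (2t) - <xi, y>;
    # for quadratic g this reduces to one linear solve.
    x_new = np.linalg.solve(A + np.eye(n) / t, xi - b + x / t)
    if np.linalg.norm(x_new - x) < 1e-10:
        break
    x = x_new

print(f"f(x*) = {f(x):.6f} after {k + 1} iterations")
```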

Mathematics, 2020, Vol 8 (4), pp. 608
Author(s): Pornsarp Pornsawad, Parada Sungcharoen, Christine Böckmann

In this paper, we present a convergence rate analysis of the modified Landweber method under a logarithmic source condition for nonlinear ill-posed problems. The regularization parameter is chosen according to the discrepancy principle. Reconstructions of the shape of an unknown domain in an inverse potential problem using the modified Landweber method are exhibited.
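As a hedged illustration of the stopping rule the abstract mentions, here is a sketch of the classical (linear) Landweber iteration stopped by the discrepancy principle ||A x_k - y|| <= tau * delta. The paper's method is a modified, nonlinear variant; this is only the standard prototype on a synthetic instance, and all parameters are assumptions.

```python
import numpy as np

# Synthetic linear test problem (illustrative, not from the paper).
rng = np.random.default_rng(1)
m, n = 50, 30
A = rng.standard_normal((m, n)) / np.sqrt(m)    # forward operator
x_true = rng.standard_normal(n)
delta = 1e-2                                    # assumed noise level
y = A @ x_true + delta * rng.standard_normal(m) / np.sqrt(m)

omega = 0.9 / np.linalg.norm(A, 2) ** 2         # step size below 1/||A||^2
tau = 1.5                                       # discrepancy parameter > 1

x = np.zeros(n)
k = 0
while np.linalg.norm(A @ x - y) > tau * delta:  # discrepancy principle
    x = x + omega * A.T @ (y - A @ x)           # Landweber update
    k += 1
    if k > 100000:                              # safety cap
        break

print(f"stopped after {k} iterations, residual {np.linalg.norm(A @ x - y):.3e}")
```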


Author(s): Ran Gu, Qiang Du

Abstract. How to choose the step size of the gradient descent method has been a popular subject of research. In this paper we propose a modified limited memory steepest descent method (MLMSD). In each iteration, a selection rule picks a unique step size from a candidate set computed by Fletcher's limited memory steepest descent method (LMSD), instead of sweeping through all the step sizes as in Fletcher's original LMSD algorithm. MLMSD is motivated by an inexact super-linear convergence rate analysis. The R-linear convergence of MLMSD is proved for strictly convex quadratic minimization problems. Numerical tests are presented to show that our algorithm is efficient and robust.
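A rough sketch of the idea on a toy strictly convex quadratic: compute Ritz values of the Hessian on the span of the last m gradients (Fletcher's LMSD candidate set) and pick a single step size per iteration. The selection rule below (always the most conservative candidate) is a placeholder, not the rule proposed in the paper, and the Ritz values are formed with the Hessian explicitly for brevity, which Fletcher's method avoids.

```python
import numpy as np

# Toy instance (illustrative): f(x) = 0.5 x'Ax - b'x, A positive definite.
rng = np.random.default_rng(2)
n = 10
M = rng.standard_normal((n, n))
A = M @ M.T + np.eye(n)                  # strictly convex quadratic
b = rng.standard_normal(n)
grad = lambda x: A @ x - b

m = 5                                    # memory: number of back gradients
x = np.zeros(n)
g_hist = []
alpha = 1.0 / np.linalg.norm(A, 2)       # safe step until memory fills

for k in range(2000):
    g = grad(x)
    if np.linalg.norm(g) < 1e-6:
        break
    g_hist = (g_hist + [g])[-m:]
    if len(g_hist) == m:
        # Ritz values of A on the span of the recent gradients; LMSD
        # recovers these from gradients alone, we use A for brevity.
        Q, _ = np.linalg.qr(np.array(g_hist).T)
        ritz = np.linalg.eigvalsh(Q.T @ A @ Q)
        # Placeholder selection rule: the most conservative candidate step
        # (reciprocal of the largest Ritz value); the paper's rule differs.
        alpha = 1.0 / ritz.max()
    x = x - alpha * g

print(f"stopped at iteration {k}, ||grad|| = {np.linalg.norm(grad(x)):.2e}")
```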


2013, Vol 365-366, pp. 182-185
Author(s): Hong Gang Xia, Qing Liang Wang

In this paper, a modified harmony search (MHS) algorithm is presented for solving 0-1 knapsack problems. MHS employs a position-update strategy for generating new solution vectors, which enhances the accuracy and convergence rate of the harmony search (HS) algorithm. In addition, the harmony memory consideration rate (HMCR) is dynamically adapted to the objective function values in the current harmony memory, and the key parameters PAR (pitch adjustment rate) and BW (bandwidth) are dynamically adjusted with the generation number. In experiments on ten classic 0-1 knapsack problems, MHS demonstrated stronger convergence and stability than the original harmony search (HS) algorithm and two of its improved variants (IHS and NGHS).
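A minimal sketch of a binary harmony search on a small 0-1 knapsack instance. The linear HMCR/PAR schedules below merely stand in for the adaptive rules the paper describes, the BW adaptation is omitted (it has no direct binary analogue here), and the instance data are illustrative.

```python
import random

# Small 10-item knapsack instance (illustrative test data).
values  = [55, 10, 47, 5, 4, 50, 8, 61, 85, 87]
weights = [95, 4, 60, 32, 23, 72, 80, 62, 65, 46]
capacity = 269
n = len(values)

def fitness(x):
    w = sum(wi for wi, xi in zip(weights, x) if xi)
    v = sum(vi for vi, xi in zip(values, x) if xi)
    return v if w <= capacity else 0          # infeasible -> worthless

HMS, iters = 20, 2000                         # harmony memory size, budget
memory = [[random.randint(0, 1) for _ in range(n)] for _ in range(HMS)]

for t in range(iters):
    # Linear HMCR/PAR schedules (placeholders for the paper's adaptive rules).
    hmcr = 0.70 + 0.25 * t / iters
    par = 0.45 - 0.35 * t / iters
    new = []
    for j in range(n):
        if random.random() < hmcr:
            bit = random.choice(memory)[j]    # harmony memory consideration
            if random.random() < par:
                bit = 1 - bit                 # pitch adjustment = bit flip
        else:
            bit = random.randint(0, 1)        # random selection
        new.append(bit)
    worst = min(range(HMS), key=lambda i: fitness(memory[i]))
    if fitness(new) > fitness(memory[worst]):
        memory[worst] = new                   # replace the worst harmony

best = max(memory, key=fitness)
print("best value:", fitness(best), "items:", best)
```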

