Nonmonotone Adaptive Barzilai-Borwein Gradient Algorithm for Compressed Sensing

2014 ◽  
Vol 2014 ◽  
pp. 1-6 ◽  
Author(s):  
Yuanying Qiu ◽  
Jianlei Yan ◽  
Fanyong Xu

We study a nonmonotone adaptive Barzilai-Borwein gradient algorithm for l1-norm minimization problems arising from compressed sensing. At each iteration, the generated search direction enjoys the descent property and can be easily derived by minimizing a local approximate quadratic model while exploiting the favorable structure of the l1-norm. Under suitable conditions, a global convergence result is established. Numerical results illustrate that the proposed method is promising and competitive with the existing algorithms NBBL1 and TwIST.
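The abstract does not spell out the update, so the following is only a minimal sketch of the general idea: a Barzilai-Borwein step length inside a proximal-gradient (soft-thresholding) iteration for the l1-regularized least-squares problem, with a simple step-size safeguard standing in for the paper's nonmonotone strategy. The problem sizes and the value of `lam` are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||x||_1: shrink each entry toward zero by t.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def bb_l1_solve(A, b, lam, iters=500):
    # Minimize 0.5 * ||Ax - b||^2 + lam * ||x||_1 using a Barzilai-Borwein
    # (BB1) step length inside a proximal-gradient iteration.
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    g = A.T @ (A @ x - b)                  # gradient of the smooth term at x
    alpha = 1.0 / L
    for _ in range(iters):
        x_new = soft_threshold(x - alpha * g, alpha * lam)
        g_new = A.T @ (A @ x_new - b)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:
            alpha = (s @ s) / sy           # BB1 step: s^T s / s^T y
        # Simple safeguard keeping the step in the stable range (0, 2/L);
        # in the paper this role is played by the nonmonotone strategy.
        alpha = min(max(alpha, 0.1 / L), 1.9 / L)
        x, g = x_new, g_new
    return x

# Small compressed-sensing-style instance: recover a 3-sparse signal
# from 40 random measurements of a length-100 vector.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100)) / np.sqrt(40)
x_true = np.zeros(100)
x_true[[3, 50, 97]] = [2.0, -1.5, 1.0]
b = A @ x_true
x_hat = bb_l1_solve(A, b, lam=0.01)
```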

2019 ◽  
Vol 24 (1) ◽  
pp. 115
Author(s):  
Hind H. Mohammed

In this paper, we present different types of conjugate gradient (CG) algorithms based on Perry's conjugacy condition. The new conjugate gradient training (GDY) algorithm is used to train multilayer feed-forward neural networks (MFNNs); we prove its descent property and global convergence, and then test its behavior in training artificial neural networks, comparing it with well-known algorithms in this field on two types of problems. http://dx.doi.org/10.25130/tjps.24.2019.020
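The paper's exact (GDY) formula is not reproduced in the abstract; as a point of reference, the sketch below shows the classical Perry choice of beta and numerically checks the conjugacy condition it enforces.

```python
import numpy as np

# Perry's beta: beta_k = g_{k+1}^T (y_k - s_k) / (d_k^T y_k), where
# s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k.  The resulting direction
# d_{k+1} = -g_{k+1} + beta_k * d_k satisfies Perry's conjugacy condition
#   d_{k+1}^T y_k = -g_{k+1}^T s_k
# identically, which the check below confirms on random data.

def perry_direction(g_new, d_old, s, y):
    beta = (g_new @ (y - s)) / (d_old @ y)
    return -g_new + beta * d_old

rng = np.random.default_rng(1)
g_new, d_old, s, y = rng.standard_normal((4, 6))
d_new = perry_direction(g_new, d_old, s, y)
print(np.isclose(d_new @ y, -(g_new @ s)))  # True
```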


Mathematics ◽  
2021 ◽  
Vol 9 (12) ◽  
pp. 1353
Author(s):  
Qi Tian ◽  
Xiaoliang Wang ◽  
Liping Pang ◽  
Mingkun Zhang ◽  
Fanyun Meng

Three-term conjugate gradient methods have attracted much attention for large-scale unconstrained problems in recent years, since they have attractive practical features such as simple computation, low memory requirements, good descent properties, and strong global convergence. In this paper, a hybrid three-term conjugate gradient algorithm is proposed that possesses the sufficient descent property independent of any line search technique. Under some mild conditions, the proposed method is globally convergent for uniformly convex objective functions. Meanwhile, by using the modified secant equation, the proposed method is also globally convergent without a convexity assumption on the objective function. Numerical results indicate that the proposed algorithm is more efficient and reliable than the other methods on the test problems.
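The paper's hybrid formula is not given in the abstract; as an illustration of how a three-term direction can guarantee sufficient descent with no line search involved, the sketch below uses a Zhang-Zhou-Li-style three-term PRP direction, for which the identity g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 holds by pure algebra.

```python
import numpy as np

# Three-term PRP direction (Zhang-Zhou-Li style; illustrative, not the
# paper's hybrid formula):
#   d_{k+1} = -g_{k+1} + beta_k * d_k - theta_k * y_k
#   beta_k  = g_{k+1}^T y_k / ||g_k||^2
#   theta_k = g_{k+1}^T d_k / ||g_k||^2
# The beta and theta terms cancel in g_{k+1}^T d_{k+1}, so the sufficient
# descent identity g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 holds for ANY inputs,
# independent of the line search.

def three_term_direction(g_new, g_old, d_old):
    y = g_new - g_old
    beta = (g_new @ y) / (g_old @ g_old)
    theta = (g_new @ d_old) / (g_old @ g_old)
    return -g_new + beta * d_old - theta * y

rng = np.random.default_rng(1)
g_old, g_new, d_old = rng.standard_normal((3, 6))
d_new = three_term_direction(g_new, g_old, d_old)
print(np.isclose(g_new @ d_new, -(g_new @ g_new)))  # True
```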


2020 ◽  
Vol 2020 ◽  
pp. 1-9
Author(s):  
Zhou Sheng ◽  
Dan Luo

In this paper, a Cauchy point direction trust region algorithm is presented to solve nonlinear equations. The search direction is an optimal convex combination of the trust region direction and the Cauchy point direction, and it possesses the sufficient descent property and the automatic trust region property. The global convergence of the proposed algorithm is proven under some conditions. Preliminary numerical results demonstrate that the proposed algorithm is promising and has better convergence behavior than two existing algorithms for solving large-scale nonlinear equations.
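The abstract does not state the optimal combination rule, so the sketch below uses the classical dogleg construction of the same idea: a convex combination of the Cauchy-point step and the full model-minimizing step, chosen so the step respects the trust-region radius. The test model `B`, `g`, and radii are illustrative assumptions.

```python
import numpy as np

def dogleg_step(g, B, delta):
    # Step as a convex combination of the Cauchy-point direction and the
    # unconstrained model minimizer (classical dogleg sketch).
    p_c = -((g @ g) / (g @ B @ g)) * g      # Cauchy point of the model
    p_n = -np.linalg.solve(B, g)            # unconstrained model minimizer
    if np.linalg.norm(p_n) <= delta:
        return p_n                          # full step fits the region
    if np.linalg.norm(p_c) >= delta:
        return (delta / np.linalg.norm(p_c)) * p_c   # truncated Cauchy step
    # Convex combination p_c + t * (p_n - p_c) hitting the boundary ||p|| = delta.
    d = p_n - p_c
    a, bq, c = d @ d, 2.0 * (p_c @ d), p_c @ p_c - delta ** 2
    t = (-bq + np.sqrt(bq * bq - 4.0 * a * c)) / (2.0 * a)
    return p_c + t * d

def model(g, B, p):
    # Quadratic model m(p) = g^T p + 0.5 * p^T B p, with m(0) = 0.
    return g @ p + 0.5 * p @ B @ p

B = np.array([[3.0, 0.0], [0.0, 1.0]])
g = np.array([1.0, 2.0])
for delta in (0.5, 1.8, 3.0):   # exercises all three branches
    p = dogleg_step(g, B, delta)
    # every branch stays inside the region and decreases the model
    assert np.linalg.norm(p) <= delta + 1e-9 and model(g, B, p) < 0
```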


2020 ◽  
Vol 25 (1) ◽  
pp. 128
Author(s):  
Shaher Qahtan Hussein ◽  
Ghassan Ezzulddin Arif ◽  
Yoksal Abdll Sattar

In this paper we derive a new search direction for the conjugate gradient method associated with the Dai-Liao method; the new algorithm is shown to converge under some hypotheses. We also prove the descent property for the new method, and numerical results show that the proposed method is effective compared with the FR, HS, and DY methods. http://dx.doi.org/10.25130/tjps.25.2020.019
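The paper's modified direction is not given in the abstract; for reference, the sketch below shows the baseline Dai-Liao beta it builds on and numerically checks the conjugacy condition that choice enforces.

```python
import numpy as np

# Baseline Dai-Liao beta (illustrative; not the paper's modified version):
#   beta_k = (g_{k+1}^T y_k - t * g_{k+1}^T s_k) / (d_k^T y_k),  t > 0.
# The direction d_{k+1} = -g_{k+1} + beta_k * d_k then satisfies the
# Dai-Liao conjugacy condition d_{k+1}^T y_k = -t * g_{k+1}^T s_k
# (t = 1 recovers Perry's condition; t = 0 gives pure conjugacy d^T y = 0).

def dai_liao_direction(g_new, d_old, s, y, t=0.1):
    beta = (g_new @ y - t * (g_new @ s)) / (d_old @ y)
    return -g_new + beta * d_old

rng = np.random.default_rng(2)
g_new, d_old, s, y = rng.standard_normal((4, 5))
d_new = dai_liao_direction(g_new, d_old, s, y, t=0.1)
print(np.isclose(d_new @ y, -0.1 * (g_new @ s)))  # True
```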


2015 ◽  
Vol 08 (02) ◽  
pp. 1550036
Author(s):  
H. Zegeye ◽  
O. A. Daman

We introduce an iterative process which converges strongly to the minimum-norm fixed point of a Lipschitzian pseudocontractive mapping. As a consequence, a convergence result to the minimum-norm zero of monotone mappings is proved. In addition, applications to convexly constrained linear inverse problems and convex minimization problems are included. Our theorems improve and unify most of the results that have been proved for this important class of nonlinear operators.
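The theorems concern Lipschitzian pseudocontractive mappings; as a much simpler illustration of convergence to a minimum-norm fixed point, the sketch below runs a Halpern-type iteration with anchor 0 on a nonexpansive mapping (projection onto a line). The mapping and step sequence are illustrative assumptions, not the paper's scheme.

```python
import numpy as np

def T(x):
    # Projection onto the line {(2, t) : t real} -- a nonexpansive mapping
    # whose fixed-point set is the whole line; the fixed point of minimum
    # norm is the point of the line closest to the origin, namely (2, 0).
    return np.array([2.0, x[1]])

# Halpern-type iteration with anchor u = 0:
#   x_{n+1} = alpha_n * 0 + (1 - alpha_n) * T(x_n),  alpha_n = 1/(n+2).
# Anchoring at 0 steers the iterates toward the minimum-norm fixed point.
x = np.array([0.0, 1.0])
for n in range(2000):
    alpha = 1.0 / (n + 2)
    x = (1.0 - alpha) * T(x)
print(np.round(x, 2))  # approximately [2. 0.]
```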


2021 ◽  
pp. 1-1
Author(s):  
Xiaomei Yang ◽  
Yubo Mei ◽  
Xunyong Hu ◽  
Ruiseng Luo ◽  
Kai Liu
