A New Conjugate Gradient Method with Smoothing $$L_{1/2}$$ Regularization Based on a Modified Secant Equation for Training Neural Networks

2017, Vol 48(2), pp. 955-978. Author(s): Wenyu Li, Yan Liu, Jie Yang, Wei Wu

2018, Vol 275, pp. 308-316. Author(s): Jian Wang, Bingjie Zhang, Zhanquan Sun, Wenxue Hao, Qingying Sun

2012, Vol 2012, pp. 1-8. Author(s): Ioannis E. Livieris, Panagiotis Pintelas

We propose a conjugate gradient method based on the Dai-Liao conjugate gradient method. An important property of the proposed method is that it ensures sufficient descent independent of the accuracy of the line search. Moreover, it achieves high-order accuracy in approximating the second-order curvature of the objective function by utilizing the modified secant condition proposed by Babaie-Kafaki et al. (2010). Under mild conditions, we establish that the proposed method is globally convergent for general functions provided that the line search satisfies the Wolfe conditions. Numerical experiments are also presented.
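To illustrate the family of methods the abstract refers to, the following is a minimal sketch of a Dai-Liao-type conjugate gradient iteration with a Wolfe line search. It uses the classical Dai-Liao coefficient $$\beta_k = \frac{g_{k+1}^T (y_k - t\, s_k)}{d_k^T y_k}$$, not the paper's modified-secant variant; the function name `dai_liao_cg`, the parameter `t`, and the restart safeguard are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np
from scipy.optimize import line_search

def dai_liao_cg(f, grad, x0, t=0.1, tol=1e-6, max_iter=500):
    """Minimize f via a Dai-Liao-type conjugate gradient iteration.

    Sketch only: uses the classical Dai-Liao beta, not the paper's
    modified-secant variant; t, tol, and the restart rule are
    illustrative choices, not the authors' settings.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Step length satisfying the strong Wolfe conditions.
        alpha = line_search(f, grad, x, d)[0]
        if alpha is None:                    # line search failed: restart
            d = -g
            alpha = line_search(f, grad, x, d)[0] or 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g          # step s_k and gradient change y_k
        denom = d @ y
        # Dai-Liao coefficient: beta_k = g_{k+1}^T (y_k - t s_k) / (d_k^T y_k)
        beta = (g_new @ (y - t * s)) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        if g_new @ d >= 0:                   # safeguard: enforce a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function.
from scipy.optimize import rosen, rosen_der
print(dai_liao_cg(rosen, rosen_der, np.array([-1.2, 1.0])))  # approx [1., 1.]
```

Note the explicit descent safeguard: the plain Dai-Liao formula does not guarantee descent on its own, whereas the sufficient-descent property claimed in the abstract is precisely what would make such a safeguard unnecessary.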
