secant condition
Recently Published Documents


TOTAL DOCUMENTS: 25 (FIVE YEARS: 4)

H-INDEX: 7 (FIVE YEARS: 1)

Author(s):  
Muna M. M. Ali

The self-scaling Broyden-Fletcher-Goldfarb-Shanno (BFGS) method is very efficient for solving large-scale optimization problems. In this paper, we present a new algorithm that modifies the self-scaling BFGS algorithm. Based on notable nonmonotone line search properties, we also introduce and employ a new nonmonotone idea. First, an update formula is proposed for the approximate Hessian matrix so that the secant condition is satisfied; second, we establish the global convergence properties of the algorithm under some mild conditions, without a convexity hypothesis on the objective function. Numerical results are also reported and show the promising behavior of the new algorithm.
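The secant condition mentioned above is the defining property of quasi-Newton updates. As a minimal sketch (using the standard BFGS formula, not the paper's self-scaling modification, which is not reproduced here), one can verify that the updated matrix maps the step to the gradient difference:

```python
import numpy as np

def bfgs_update(B, s, y):
    """Standard BFGS update of an approximate Hessian B.

    Illustrative helper only; the paper's self-scaling variant
    modifies this formula.
    """
    Bs = B @ s
    return (B
            - np.outer(Bs, Bs) / (s @ Bs)   # remove old curvature along s
            + np.outer(y, y) / (y @ s))     # add curvature from gradient change

# The updated matrix satisfies the secant condition B_{k+1} s = y.
rng = np.random.default_rng(0)
n = 4
B = np.eye(n)
s = rng.standard_normal(n)
y = s + 0.1 * rng.standard_normal(n)  # keeps the curvature s^T y positive here
B_new = bfgs_update(B, s, y)
assert np.allclose(B_new @ s, y)
```

The assertion holds exactly by construction of the BFGS formula, which is why the update is said to "achieve" the secant condition.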


Author(s):  
Fahimeh Abdollahi ◽  
M. Fatemi

We propose an effective conjugate gradient method belonging to the class of Dai-Liao methods for solving unconstrained optimization problems. We employ a variant of the modified secant condition and introduce a new conjugate gradient parameter by solving an optimization problem that combines the well-known features of the linear conjugate gradient method with some penalty functions. This new parameter exploits function values as well as gradient information to generate the iterations. The proposed method is globally convergent under mild assumptions. We examine its ability to solve some real-world problems from the image processing field; numerical results show that it is efficient in terms of the PSNR test. We also compare the proposed method with some well-known existing algorithms on a collection of CUTEr problems to demonstrate its efficiency.
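For context, the Dai-Liao class replaces the classical conjugacy requirement with d_{k+1}^T y_k = -t g_{k+1}^T s_k. The sketch below uses the generic Dai-Liao parameter (the paper derives its own parameter from a penalized subproblem, which is not reproduced here) and checks that the condition holds by construction on a quadratic:

```python
import numpy as np

def dai_liao_beta(g_new, d, s, y, t=0.1):
    """Generic Dai-Liao conjugacy parameter.

    beta_k = (g_{k+1}^T y_k - t * g_{k+1}^T s_k) / (d_k^T y_k)
    """
    return (g_new @ y - t * (g_new @ s)) / (d @ y)

# One exact-line-search step on f(x) = 0.5 x^T A x.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
x = np.array([1.0, 1.0])
g = A @ x                          # gradient of the quadratic
d = -g                             # initial steepest-descent direction
alpha = (g @ g) / (d @ (A @ d))    # exact line search step for a quadratic
x_new = x + alpha * d
g_new = A @ x_new
s, y = x_new - x, g_new - g

beta = dai_liao_beta(g_new, d, s, y, t=0.1)
d_new = -g_new + beta * d
# Dai-Liao conjugacy condition: d_{k+1}^T y_k = -t * g_{k+1}^T s_k
assert np.isclose(d_new @ y, -0.1 * (g_new @ s))
```

The check succeeds identically for any t, since the parameter is defined precisely to enforce this relation.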


2018 ◽  
Vol 103 (12) ◽  
pp. 1889-1902
Author(s):  
Usman Abbas Yakubu ◽  
Mustafa Mamat ◽  
Mohamad Afendee Mohamad ◽  
Mohd Rivaie ◽  
Rabi’u Bashir Yunus

Filomat ◽  
2016 ◽  
Vol 30 (5) ◽  
pp. 1283-1296
Author(s):  
Keyvan Amini ◽  
Somayeh Bahrami ◽  
Shadi Amiri

In this paper, a modified BFGS algorithm is proposed to solve unconstrained optimization problems. First, based on a modified secant condition, an update formula is recommended to approximate the Hessian matrix. Then, thanks to the remarkable nonmonotone line search properties, an appropriate nonmonotone idea is employed. Under some mild conditions, the global convergence properties of the algorithm are established without a convexity assumption on the objective function. Preliminary numerical experiments are also reported, which indicate the promising behavior of the new algorithm.


Author(s):  
Minghou Cheng ◽  
Yu-Hong Dai ◽  
Rui Diao

Based on the idea of maximum determinant positive definite matrix completion, Yamashita proposed a sparse quasi-Newton update, called MCQN, for unconstrained optimization problems with sparse Hessian structures. Such an MCQN update keeps the sparsity structure of the Hessian while relaxing the secant condition. In this paper, we propose an alternative to the MCQN update, in which the quasi-Newton matrix satisfies the secant condition, but does not have the same sparsity structure as the Hessian in general. Our numerical results demonstrate the usefulness of the new MCQN update with the BFGS formula for a collection of test problems. A local and superlinear convergence analysis is also provided for the new MCQN update with the DFP formula.  
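The trade-off described in this abstract can be made concrete: the standard BFGS update enforces the secant condition exactly but, in general, fills in the sparsity pattern of a sparse Hessian approximation. A minimal sketch with illustrative matrices (not the MCQN update itself):

```python
import numpy as np

def bfgs_update(B, s, y):
    """Standard BFGS update; enforces B_{k+1} s = y exactly."""
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

n = 5
# Sparse (tridiagonal) symmetric positive definite starting matrix.
B = np.eye(n) + 0.5 * np.eye(n, k=1) + 0.5 * np.eye(n, k=-1)
rng = np.random.default_rng(0)
s = rng.standard_normal(n)
y = (B + 0.01 * np.eye(n)) @ s   # guarantees positive curvature s^T y > 0
B_new = bfgs_update(B, s, y)

assert np.allclose(B_new @ s, y)                       # secant condition holds
assert np.count_nonzero(B_new) > np.count_nonzero(B)   # tridiagonal pattern lost
```

The rank-two correction terms are dense outer products, so zero entries of B generally become nonzero; this is exactly the structure the original MCQN update preserves by relaxing the secant condition instead.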


2012 ◽  
Vol 2012 ◽  
pp. 1-8 ◽  
Author(s):  
Ioannis E. Livieris ◽  
Panagiotis Pintelas

We propose a conjugate gradient method which is based on the study of the Dai-Liao conjugate gradient method. An important property of our proposed method is that it ensures sufficient descent independent of the accuracy of the line search. Moreover, it achieves a high-order accuracy in approximating the second-order curvature information of the objective function by utilizing the modified secant condition proposed by Babaie-Kafaki et al. (2010). Under mild conditions, we establish that the proposed method is globally convergent for general functions provided that the line search satisfies the Wolfe conditions. Numerical experiments are also presented.

