New Variants of Newton’s Method for Nonlinear Unconstrained Optimization Problems

2010
Vol 02 (01)
pp. 40-45
Author(s):
V. Kanwar
Kapil K. Sharma
Ramandeep Behl

One-parameter families of Newton's iterative method for the solution of nonlinear equations, together with their extension to unconstrained optimization problems, are presented in this paper. These methods are derived by implementing approximations through a straight line and through a parabolic curve in the vicinity of the root. The presented variants are found to perform better than Newton's method and, in addition, to overcome its limitations.
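For orientation, here is a minimal Python sketch of the classical Newton iteration for one-variable unconstrained minimization, together with a hypothetical one-parameter modification of the denominator. The parameter p, the tolerance, and the example function are illustrative assumptions, not the authors' exact family.

```python
import math

def newton_min(df, d2f, x0, p=0.0, tol=1e-10, max_iter=100):
    """Find a stationary point of f from its derivative df and second derivative d2f.

    With p = 0 this is classical Newton's method for optimization; p != 0 gives
    a simple one-parameter modification of the denominator (illustrative only).
    """
    x = x0
    for _ in range(max_iter):
        g = df(x)
        if abs(g) < tol:
            break
        x = x - g / (d2f(x) - p * g)  # p = 0 recovers the classical Newton step
    return x

# Example: minimize f(x) = x**2 + exp(x), so f'(x) = 2x + e^x and f''(x) = 2 + e^x.
x_star = newton_min(lambda x: 2 * x + math.exp(x),
                    lambda x: 2 + math.exp(x), x0=0.0)  # approx. -0.3517
```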


2019
Vol 53 (2)
pp. 657-666
Author(s):  
Mohammad Afzalinejad

A difficulty with rapidly convergent methods for unconstrained optimization, such as Newton's method, is the computational cost arising especially from the second derivative. In this paper, a class of methods for solving unconstrained optimization problems is proposed which implicitly applies approximations to derivatives. This class of methods is based on a modified Steffensen method for finding roots of a function and builds a quadratic model of the function without using the second derivative. Two computationally inexpensive methods of this kind are proposed which use only the first derivative of the function. Derivative-free versions of these methods are also suggested for cases where gradient formulas are unavailable or difficult to evaluate. Both the theory and numerical experiments confirm the rapid convergence of this class of methods.
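As context, the sketch below shows the classical Steffensen root-finding step applied to the stationarity equation f'(x) = 0, with a hypothetical forward-difference helper fd_grad for the derivative-free case; the paper's modified Steffensen method and its quadratic model are not reproduced here.

```python
def fd_grad(f, x, h=1e-6):
    """Forward-difference approximation of f'(x) (hypothetical helper)."""
    return (f(x + h) - f(x)) / h

def steffensen_root(g, x0, tol=1e-10, max_iter=100):
    """Classical Steffensen iteration for g(x) = 0; needs no derivatives of g."""
    x = x0
    for _ in range(max_iter):
        gx = g(x)
        if abs(gx) < tol:
            break
        denom = g(x + gx) - gx        # divided-difference surrogate for g'(x)
        x = x - gx * gx / denom
    return x

# Derivative-free minimization: apply Steffensen to the finite-difference gradient.
f = lambda x: (x - 2.0) ** 2
x_star = steffensen_root(lambda x: fd_grad(f, x), x0=0.0)  # approx. 2.0
```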


2016
Vol 78 (6-4)
Author(s):
Nur Syarafina Mohamed
Mustafa Mamat
Fatma Susilawati Mohamad
Mohd Rivaie

Conjugate gradient (CG) methods are widely used for solving nonlinear unconstrained optimization problems in design, economics, physics, and engineering because of their low memory requirements. In this paper, a new modification of the CG coefficient that possesses global convergence properties is proposed under the exact line search. Based on the number of iterations and central processing unit (CPU) time, the numerical results show that the new coefficient performs better than some other well-known CG methods on a set of standard test functions.
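For illustration only, here is a generic nonlinear CG loop in which the classical Fletcher-Reeves coefficient stands in for the paper's new coefficient, and a simple Armijo backtracking search stands in for the exact line search; all names and constants are assumptions.

```python
import numpy as np

def cg_minimize(f, grad, x0, tol=1e-6, max_iter=500):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0:                     # safeguard: restart if not a descent direction
            d = -g
        alpha = 1.0                           # Armijo backtracking (exact search in the paper)
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * g.dot(d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = g_new.dot(g_new) / g.dot(g)    # Fletcher-Reeves coefficient (stand-in)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example on a convex quadratic:
x_star = cg_minimize(lambda x: x.dot(x), lambda x: 2.0 * x, x0=np.array([3.0, -4.0]))
```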


2021
Vol 2021 (1)
Author(s):
Shashi Kant Mishra
Suvra Kanti Chakraborty
Mohammad Esmael Samei
Bhagwat Ram

The Polak–Ribière–Polyak (PRP) algorithm is one of the oldest and most popular conjugate gradient algorithms for solving nonlinear unconstrained optimization problems. In this paper, we present a q-variant of the PRP method (q-PRP) for which both the sufficient descent condition and the conjugacy condition are satisfied at every iteration. The proposed method is globally convergent under the standard Wolfe conditions and the strong Wolfe conditions. The numerical results show that the proposed method is promising for a set of given test problems with different starting points. Moreover, the method reduces to the classical PRP method as the parameter q approaches 1.
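As a sketch of the two ingredients the abstract combines, the snippet below computes the classical PRP coefficient and the Jackson q-derivative on which q-calculus variants are typically built; the paper's exact q-PRP update is not reproduced, and the example only illustrates that the q-derivative approaches the ordinary derivative as q approaches 1.

```python
import numpy as np

def prp_beta(g_new, g_old):
    """Classical Polak-Ribiere-Polyak coefficient."""
    return g_new.dot(g_new - g_old) / g_old.dot(g_old)

def q_derivative(f, x, q=0.9):
    """Jackson q-derivative of a one-variable f: (f(qx) - f(x)) / ((q - 1) x)."""
    return (f(q * x) - f(x)) / ((q - 1.0) * x)

# As q -> 1 the q-derivative approaches f'(x); e.g. for f(x) = x**2, f'(2) = 4:
print(q_derivative(lambda x: x * x, 2.0, q=0.999))   # approx. 3.998
```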


2020
Vol 9 (2)
pp. 101-105
Author(s):
Hussein Ageel Khatab
Salah Gazi Shareef

In this paper, we propose a new conjugate gradient method for solving nonlinear unconstrained optimization problems. The new method consists of three parts, the first of which is the Hestenes-Stiefel (HS) parameter. The proposed method satisfies the descent condition, the sufficient descent condition, and the conjugacy condition. We give some numerical results to show the efficiency of the suggested method.
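For reference, the Hestenes-Stiefel parameter that the abstract names as the first building block has the standard form below, where g_k is the current gradient and d_{k-1} the previous search direction; the other two parts of the proposed method are not specified here.

```latex
\beta_k^{\mathrm{HS}} = \frac{g_k^{\top}\,(g_k - g_{k-1})}{d_{k-1}^{\top}\,(g_k - g_{k-1})}
```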


2015
Vol 2015
pp. 1-7
Author(s):
Guanghui Zhou
Qin Ni

A new spectral conjugate gradient method (SDYCG) for solving unconstrained optimization problems is presented in this paper. Our method provides a new expression for the spectral parameter, and this formula ensures that the sufficient descent condition holds. The search direction in SDYCG can be viewed as a combination of the spectral gradient and the Dai-Yuan conjugate gradient direction. The global convergence of SDYCG is also established. Numerical results show that SDYCG may be capable of solving large-scale nonlinear unconstrained optimization problems.
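For context, the Dai-Yuan coefficient and the generic spectral CG search direction have the standard forms below; the paper's specific expression for the spectral parameter \theta_k is its new contribution and is not reproduced here.

```latex
\beta_k^{\mathrm{DY}} = \frac{\lVert g_k\rVert^{2}}{d_{k-1}^{\top}\,(g_k - g_{k-1})},
\qquad
d_k = -\theta_k\, g_k + \beta_k\, d_{k-1}
```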


2014
Vol 8 (1)
pp. 218-221
Author(s):
Ping Hu
Zong-yao Wang

We propose a non-monotone line search combination rule for unconstrained optimization problems, establish the corresponding non-monotone line search algorithm, and prove its global convergence. Finally, we present some numerical experiments to illustrate the effectiveness of the new non-monotone search algorithm.
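As background, a standard non-monotone Armijo test in the Grippo-Lampariello-Lucidi style is sketched below; the paper's particular combination rule is not reproduced, and the function and parameter names are illustrative.

```python
def nonmonotone_armijo(f, x, d, g_dot_d, memory, delta=1e-4, alpha=1.0, shrink=0.5):
    """Backtrack until f(x + alpha*d) <= max(recent f-values) + delta*alpha*g'd.

    `d` must be a descent direction (g_dot_d < 0); `memory` holds the last M
    objective values, which is what makes the acceptance test non-monotone.
    """
    f_ref = max(memory)                  # reference value: worst of the last M iterates
    while f(x + alpha * d) > f_ref + delta * alpha * g_dot_d:
        alpha *= shrink
    return alpha

# Usage inside a descent loop (sketch): keep a sliding window of f-values, e.g.
#   memory = collections.deque([f(x0)], maxlen=10)
#   alpha = nonmonotone_armijo(f, x, d, g.dot(d), memory)
#   memory.append(f(x + alpha * d))
```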

