Modified Conjugate Gradient Method via DFP Update for Solving Systems of Nonlinear Equations

2020, Vol. 3(1), pp. 43-49
Author(s): M K Dauda

In this study, a fully derivative-free method for solving large-scale nonlinear systems of equations via a memoryless DFP update is presented. The proposed method is an enhanced DFP (Davidon-Fletcher-Powell) update that is both matrix-free and derivative-free, and therefore requires little memory storage. Under suitable conditions, the proposed method converges globally. Numerical comparisons using a set of large-scale test problems show that the proposed method is efficient.
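
The memoryless form of the DFP update is what makes a matrix-free implementation possible: applying the update to the identity gives an inverse-Jacobian approximation whose action on a vector needs only inner products. The sketch below illustrates that idea; the residual-norm backtracking rule, the safeguard thresholds, and the test function are illustrative assumptions, not the paper's exact algorithm.

```python
# Sketch of a matrix-free iteration built on the memoryless DFP update.
# All parameter values and the test problem are illustrative assumptions.
import numpy as np

def memoryless_dfp(F, x0, tol=1e-8, max_iter=500):
    x = np.asarray(x0, dtype=float)
    Fx = F(x)
    s = y = None
    for _ in range(max_iter):
        if np.linalg.norm(Fx) <= tol:
            break
        if s is None:
            d = -Fx                                   # first (or restarted) step: residual direction
        else:
            # Memoryless DFP applied to the identity:
            #   H = I + s s^T / (s^T y) - y y^T / (y^T y),
            # so H @ Fx needs only inner products and no stored matrix.
            d = -(Fx + s * (s @ Fx) / (s @ y) - y * (y @ Fx) / (y @ y))
        # Derivative-free backtracking on the residual norm (illustrative rule).
        alpha, nFx = 1.0, np.linalg.norm(Fx)
        while np.linalg.norm(F(x + alpha * d)) > (1.0 - 1e-4 * alpha) * nFx and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        F_new = F(x_new)
        s, y = x_new - x, F_new - Fx
        if abs(s @ y) < 1e-12:                        # guard against breakdown of the update
            s = y = None
        x, Fx = x_new, F_new
    return x

# Componentwise test problem F_i(x) = x_i + sin(x_i) - 1 (an assumed example).
F = lambda x: x + np.sin(x) - 1.0
print(np.linalg.norm(F(memoryless_dfp(F, np.zeros(1000)))))
```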

2021, Vol. 4(4), pp. 382-390
Author(s): Muhammad Kabir Dauda

Nonlinear problems mostly emanate from the work of engineers, physicists, mathematicians and many other scientists. A variety of iterative methods have been developed for solving large-scale nonlinear systems of equations. A prominent method for solving such equations is the classical Newton's method, but it has shortcomings, including the computation of the Jacobian inverse, which sometimes fails. To overcome such drawbacks, an approximation with a derivative-free line search is applied to an existing method. The method uses the PSB (Powell-Symmetric-Broyden) update. The aim of this research is to improve the efficiency of the method in terms of the number of iterations and CPU time. The preliminary numerical results show that the proposed method is practically efficient when applied to some benchmark problems.
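
The derivative-free globalisation referred to here is typically a Li-Fukushima-type line search, which accepts a step from residual values alone and uses a positive relaxation term so that backtracking is guaranteed to stop. The sketch below shows such a rule in isolation; the function name `df_line_search`, its parameter values, and the toy driver are illustrative assumptions rather than the paper's PSB-based algorithm.

```python
# Sketch of a Li-Fukushima-type derivative-free line search, the kind of rule used
# to globalise quasi-Newton (e.g. PSB-based) iterations without Jacobian products.
# Parameter values and the toy driver are illustrative assumptions.
import numpy as np

def df_line_search(F, x, d, Fx_norm, eta, sigma=1e-2, rho=0.5):
    """Largest alpha in {1, rho, rho^2, ...} with
       ||F(x + alpha d)|| <= (1 + eta) ||F(x)|| - sigma alpha^2 ||d||^2.
    The positive relaxation eta * ||F(x)|| guarantees the loop terminates."""
    alpha, d2 = 1.0, d @ d
    while np.linalg.norm(F(x + alpha * d)) > (1.0 + eta) * Fx_norm - sigma * alpha ** 2 * d2:
        alpha *= rho
    return alpha

# Toy driver using the residual direction d = -F(x); the relaxations eta_k are
# summable, which is what the convergence theory for such searches requires.
F = lambda x: np.tanh(x) + 0.5 * x
x = np.ones(5)
for k in range(100):
    Fx = F(x)
    nrm = np.linalg.norm(Fx)
    if nrm <= 1e-8:
        break
    alpha = df_line_search(F, x, -Fx, nrm, eta=1.0 / (k + 1) ** 2)
    x = x - alpha * Fx
print(np.linalg.norm(F(x)))
```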


2017, Vol. 95(3), pp. 500-511
Author(s): Xiaowei Fang, Qin Ni

We propose a new derivative-free conjugate gradient method for large-scale nonlinear systems of equations. The method combines the Rivaie–Mustafa–Ismail–Leong conjugate gradient method for unconstrained optimisation problems and a new nonmonotone line-search method. The global convergence of the proposed method is established under some mild assumptions. Numerical results using 104 test problems from the CUTEst test problem library show that the proposed method is promising.
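
In the derivative-free setting the residual F(x) plays the role of the gradient, so an RMIL-type coefficient becomes beta_k = F_k^T (F_k - F_{k-1}) / ||d_{k-1}||^2. The sketch below combines that coefficient with a simple nonmonotone backtracking rule; the clipping of beta at zero, the line-search constants, and the test problem are illustrative assumptions, not the authors' exact scheme.

```python
# Sketch of an RMIL-type conjugate gradient iteration with residuals in place of
# gradients and a simple nonmonotone backtracking rule. The clipping of beta, the
# constants, and the test problem are illustrative assumptions.
import numpy as np

def rmil_df_cg(F, x0, tol=1e-8, max_iter=1000, memory=5):
    x = np.asarray(x0, dtype=float)
    Fx = F(x)
    d = -Fx
    hist = [np.linalg.norm(Fx)]            # recent residual norms for the nonmonotone test
    for _ in range(max_iter):
        if hist[-1] <= tol:
            break
        ref = max(hist[-memory:])           # compare against the worst recent norm
        alpha = 1.0
        while np.linalg.norm(F(x + alpha * d)) > ref - 1e-4 * alpha ** 2 * (d @ d) and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        F_new = F(x_new)
        # RMIL-type coefficient, beta = F_k^T (F_k - F_{k-1}) / ||d_{k-1}||^2,
        # clipped at zero as a common convergence safeguard.
        beta = max(0.0, F_new @ (F_new - Fx) / (d @ d))
        d = -F_new + beta * d
        x, Fx = x_new, F_new
        hist.append(np.linalg.norm(Fx))
    return x

# Componentwise exponential test problem F_i(x) = exp(x_i) - 1 (an assumed example).
F = lambda x: np.exp(x) - 1.0
print(np.linalg.norm(F(rmil_df_cg(F, 0.5 * np.ones(10000)))))
```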


Author(s): Alessandra Papini, Margherita Porcelli, Cristina Sgattoni

We present a derivative-free method for solving systems of nonlinear equations that belongs to the class of spectral residual methods. We will show that by endowing a previous version of the algorithm with a suitable new linesearch strategy, standard global convergence results can be attained under mild general assumptions. The robustness of the new method is therefore potentially improved with respect to the previous version, as shown by the reported numerical experiments.
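
Spectral residual methods of this family (DF-SANE being the best-known example) step along +/- sigma_k F(x_k) with a Barzilai-Borwein spectral coefficient and accept steps through a nonmonotone, derivative-free linesearch. The sketch below follows that generic pattern; the paper's specific linesearch strategy is not reproduced, and the forcing sequence, safeguards, and test function are illustrative assumptions.

```python
# Sketch of a spectral residual iteration in the DF-SANE style: steps +/- sigma_k F(x_k)
# with a Barzilai-Borwein coefficient and a nonmonotone, derivative-free linesearch.
# The forcing sequence, constants, and test problem are illustrative assumptions,
# and the paper's specific linesearch strategy is not reproduced here.
import numpy as np

def spectral_residual(F, x0, tol=1e-8, max_iter=1000, memory=10, gamma=1e-4):
    x = np.asarray(x0, dtype=float)
    Fx = F(x)
    f = Fx @ Fx                             # merit value ||F(x)||^2
    sigma = 1.0
    hist = [f]
    for k in range(max_iter):
        if np.sqrt(f) <= tol:
            break
        d = -sigma * Fx                     # spectral residual direction
        ref = max(hist[-memory:])
        eta = hist[0] / (k + 1) ** 2        # summable relaxation keeps the search finite
        alpha, accepted = 1.0, False
        while not accepted:
            for step in (alpha, -alpha):    # probe both directions, as spectral methods do
                x_t = x + step * d
                F_t = F(x_t)
                if F_t @ F_t <= ref + eta - gamma * step ** 2 * f:
                    s, y = x_t - x, F_t - Fx
                    x, Fx, f = x_t, F_t, F_t @ F_t
                    accepted = True
                    break
            alpha *= 0.5
        sty = s @ y
        sigma = (s @ s) / sty if abs(sty) > 1e-12 else 1.0   # Barzilai-Borwein coefficient
        hist.append(f)
    return x

# Componentwise monotone test problem F_i(x) = 2 x_i - sin(x_i) (an assumed example).
F = lambda x: 2.0 * x - np.sin(x)
print(np.linalg.norm(F(spectral_residual(F, np.linspace(-1.0, 1.0, 5000)))))
```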


Author(s): Mohammed Yusuf Waziri, Jamilu Sabi'u

We suggest a conjugate gradient (CG) method for solving symmetric systems of nonlinear equations without computing the Jacobian or the gradient, by exploiting the special structure of the underlying function. This derivative-free feature gives the proposed method an advantage in solving relatively large-scale problems (500,000 variables) with a lower storage requirement than some existing methods. Under appropriate conditions, the global convergence of our method is established. Numerical results on some benchmark test problems show that the proposed method is practically effective.
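
The special structure exploited by such methods is the symmetry of the Jacobian: the gradient of the merit function f(x) = (1/2)||F(x)||^2 is J(x)^T F(x) = J(x) F(x), which one extra residual evaluation can approximate without forming J. The sketch below shows that ingredient inside a simple CG loop; the PRP+-type coefficient, the backtracking rule, and the test problem are illustrative assumptions, not the authors' exact formulas.

```python
# Sketch of the derivative-free ingredient such schemes rely on: with a symmetric
# Jacobian, grad(0.5 ||F(x)||^2) = J(x)^T F(x) = J(x) F(x), which a single extra
# residual evaluation approximates without forming J. The CG recursion (a PRP+-type
# coefficient), the line search, and the test problem are illustrative assumptions.
import numpy as np

def approx_gradient(F, x, Fx, t=1e-7):
    """Forward-difference approximation of J(x) F(x) along the normalised residual."""
    nrm = np.linalg.norm(Fx)
    if nrm == 0.0:
        return np.zeros_like(Fx)
    w = Fx / nrm
    return nrm * (F(x + t * w) - Fx) / t

def symmetric_df_cg(F, x0, tol=1e-8, max_iter=2000):
    x = np.asarray(x0, dtype=float)
    Fx = F(x)
    g = approx_gradient(F, x, Fx)
    d = -g
    for _ in range(max_iter):
        nFx = np.linalg.norm(Fx)
        if nFx <= tol:
            break
        alpha = 1.0                         # derivative-free backtracking on the residual norm
        while np.linalg.norm(F(x + alpha * d)) > nFx - 1e-4 * alpha ** 2 * (d @ d) and alpha > 1e-12:
            alpha *= 0.5
        if alpha <= 1e-12:                  # search stalled: restart along the approximate gradient
            d = -g
            continue
        x = x + alpha * d
        Fx = F(x)
        g_new = approx_gradient(F, x, Fx)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))    # PRP+-type coefficient (assumed)
        d = -g_new + beta * d
        g = g_new
    return x

# Test problem with a symmetric Jacobian: F(x) = A x + x^3, A symmetric tridiagonal.
n = 200
A = 3.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
F = lambda x: A @ x + x ** 3
print(np.linalg.norm(F(symmetric_df_cg(F, np.ones(n)))))
```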


2020, Vol. 2020, pp. 1-10
Author(s): Zhenhua Su, Min Li

In this paper, a descent Liu–Storey conjugate gradient method is extended to solve large-scale nonlinear systems of equations. Under certain assumptions, the global convergence property is obtained with a nonmonotone line search. The proposed method is suitable for large-scale problems owing to its low storage requirement. Numerical results show that the new method is practically effective.
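
With residuals standing in for gradients, a Liu-Storey coefficient reads beta_k = F_k^T (F_k - F_{k-1}) / (-d_{k-1}^T F_{k-1}), and a standard three-term correction makes the direction satisfy F_k^T d_k = -||F_k||^2, which is one common way to obtain a descent property. The sketch below uses that correction together with a nonmonotone backtracking rule; both choices, and the test problem, are illustrative assumptions rather than the paper's exact method.

```python
# Sketch of a Liu-Storey-type direction with residuals in place of gradients, plus a
# three-term correction that enforces the sufficient-descent identity
# F_k^T d_k = -||F_k||^2. The correction, the nonmonotone backtracking rule, and the
# test problem are illustrative assumptions, not the paper's exact formulas.
import numpy as np

def descent_ls_cg(F, x0, tol=1e-8, max_iter=1000, memory=5):
    x = np.asarray(x0, dtype=float)
    Fx = F(x)
    d = -Fx
    hist = [np.linalg.norm(Fx)]
    for _ in range(max_iter):
        if hist[-1] <= tol:
            break
        ref = max(hist[-memory:])           # nonmonotone reference value
        alpha = 1.0
        while np.linalg.norm(F(x + alpha * d)) > ref - 1e-4 * alpha ** 2 * (d @ d) and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        F_new = F(x_new)
        # Liu-Storey coefficient with residuals: beta = F_k^T (F_k - F_{k-1}) / (-d^T F_{k-1}).
        denom = -(d @ Fx)
        fn2 = F_new @ F_new
        beta = F_new @ (F_new - Fx) / denom if abs(denom) > 1e-12 else 0.0
        # Three-term correction: makes F_new^T d_new = -||F_new||^2 by construction.
        theta = beta * (F_new @ d) / fn2 if fn2 > 1e-12 else 0.0
        d = -F_new + beta * d - theta * F_new
        x, Fx = x_new, F_new
        hist.append(np.linalg.norm(Fx))
    return x

# Componentwise test problem F_i(x) = x_i - cos(x_i) (an assumed example).
F = lambda x: x - np.cos(x)
print(np.linalg.norm(F(descent_ls_cg(F, np.zeros(20000)))))
```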

