Efficient matrix-free direction method with line search for solving large-scale system of nonlinear equations

2020 ◽  
Vol 30 (4) ◽  
pp. 399-412
Author(s):  
Abubakar Halilu ◽  
Mohammed Waziri ◽  
Ibrahim Yusuf

We propose a matrix-free direction method with an inexact line search technique for solving systems of nonlinear equations via a double direction approach. In this article, the Jacobian matrix is approximated by an appropriately constructed matrix-free scheme based on an acceleration parameter. The global convergence of our method is established under mild conditions. Numerical comparisons reported in this paper are based on a set of large-scale test problems and show that the proposed method is efficient for large-scale problems.
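The double direction idea described above can be sketched as follows. This is a minimal illustration, assuming the Jacobian is replaced by a scalar multiple of the identity, J_k ≈ γ_k I, so that both directions are matrix-free; the spectral-style refresh of the acceleration parameter and all names are illustrative choices, not the authors' exact formulas.

```python
import numpy as np

def double_direction_solve(F, x0, tol=1e-8, max_iter=500):
    """Matrix-free double-direction iteration with a backtracking line search.

    Iterate: x_{k+1} = x_k + alpha_k*d_k + alpha_k^2*b_k, where
    d_k = -F(x_k) and b_k = -F(x_k)/gamma_k, so the Jacobian enters only
    through the scalar acceleration parameter gamma_k (J_k ~ gamma_k*I).
    """
    x = np.asarray(x0, dtype=float)
    gamma = 1.0
    Fx = F(x)
    for _ in range(max_iter):
        norm0 = np.linalg.norm(Fx)
        if norm0 < tol:
            break
        d = -Fx                # first direction
        b = -Fx / gamma        # second (accelerated) direction
        alpha = 1.0
        # inexact line search: backtrack until the residual norm decreases
        while alpha > 1e-12:
            s = alpha * d + alpha**2 * b
            if np.linalg.norm(F(x + s)) <= (1.0 - 1e-4 * alpha) * norm0:
                break
            alpha *= 0.5
        x_new = x + s
        F_new = F(x_new)
        y = F_new - Fx
        sy = s @ y
        # spectral-style refresh of the acceleration parameter
        gamma = (y @ y) / sy if sy > 1e-12 else 1.0
        x, Fx = x_new, F_new
    return x
```

On a small smooth system such as F(x) = x − cos(x), the iteration drives the residual norm to zero without ever forming or storing a Jacobian.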

2020 ◽  
Vol 8 (1) ◽  
pp. 165-174 ◽  
Author(s):  
Abubakar Sani Halilu ◽  
Mohammed Yusuf Waziri ◽  
Yau Balarabe Musa

In this paper, a single direction method with double step lengths for solving systems of nonlinear equations is presented. The main idea of the algorithm is to approximate the Jacobian via an acceleration parameter; the two step lengths are then calculated using an inexact line search procedure. The method is matrix-free, which is advantageous when solving large-scale problems. The proposed method is proven to be globally convergent under appropriate conditions. The preliminary numerical results reported in this paper, on a set of large-scale benchmark test problems, show that the proposed method is practically quite effective.


2017 ◽  
Vol 6 (4) ◽  
pp. 147 ◽  
Author(s):  
Abubakar Sani Halilu ◽  
H. Abdullahi ◽  
Mohammed Yusuf Waziri

A variant method for solving systems of nonlinear equations is presented. The method uses a special form of iteration with two step length parameters: we suggest a derivative-free scheme that avoids computing the Jacobian by means of an acceleration parameter together with an inexact line search procedure. The proposed method is proven to be globally convergent under mild conditions. The preliminary numerical comparisons reported in this paper, on a set of large-scale benchmark test problems, show that the proposed method is practically quite effective.


2019 ◽  
Vol 2 (3) ◽  
pp. 1-4
Author(s):  
Abubakar Sani Halilu ◽  
M K Dauda ◽  
M Y Waziri ◽  
M Mamat

An algorithm for solving large-scale systems of nonlinear equations, based on transforming the Newton method with line search into a derivative-free descent method, is introduced. The main idea in the algorithm's construction is to approximate the Jacobian by an appropriate diagonal matrix. Furthermore, the step length is calculated using an inexact line search procedure. The proposed method is proved to be globally convergent under mild conditions. The numerical results presented show the efficiency of the proposed method.
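A minimal sketch of the diagonal-approximation idea above, assuming each diagonal entry is refreshed with a componentwise secant quotient y_i/s_i; the safeguarding bounds and the update rule are illustrative choices, not necessarily the authors' exact scheme.

```python
import numpy as np

def diag_jacobian_solve(F, x0, tol=1e-8, max_iter=500):
    """Derivative-free iteration with a diagonal Jacobian approximation.

    The Jacobian is approximated by D_k = diag(d_1, ..., d_n); each entry
    is refreshed with a componentwise secant quotient y_i / s_i, kept
    inside safeguarding bounds (illustrative choices).
    """
    x = np.asarray(x0, dtype=float)
    D = np.ones_like(x)
    Fx = F(x)
    for _ in range(max_iter):
        norm0 = np.linalg.norm(Fx)
        if norm0 < tol:
            break
        p = -Fx / D                     # solve D p = -F(x) componentwise
        alpha = 1.0
        # inexact line search: backtrack until the residual norm decreases
        while alpha > 1e-12:
            if np.linalg.norm(F(x + alpha * p)) <= (1.0 - 1e-4 * alpha) * norm0:
                break
            alpha *= 0.5
        s = alpha * p
        x = x + s
        F_new = F(x)
        y = F_new - Fx
        safe = np.abs(s) > 1e-12
        # componentwise secant update, bounded away from 0 and infinity
        D[safe] = np.clip(y[safe] / s[safe], 1e-4, 1e4)
        Fx = F_new
    return x
```

Because only a vector of diagonal entries is stored, the cost per iteration is O(n) beyond the function evaluations.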


2018 ◽  
Vol 7 (3.28) ◽  
pp. 36
Author(s):  
Norrlaili Shapiee ◽  
Mohd Rivaie ◽  
Mustafa Mamat ◽  
Puspa Liza Ghazali

Conjugate gradient (CG) methods are well known for their use in solving unconstrained optimization problems, particularly large-scale ones, and have attracted growing interest in fields such as engineering. In this paper, we propose a new family of CG coefficients and apply it to regression analysis. Global convergence is established under both exact and inexact line searches. Numerical results are presented based on the number of iterations and CPU time. The findings show that our method is more efficient than some previous CG methods on a set of standard test problems and successfully solves a real-life problem.


2020 ◽  
Vol 25 (2) ◽  
pp. 27
Author(s):  
Aliyu Muhammed Awwal ◽  
Lin Wang ◽  
Poom Kumam ◽  
Hassan Mohammad ◽  
Wiboonsak Watthayu

A number of practical problems in science and engineering can be converted into a system of nonlinear equations, and it is therefore imperative to develop efficient methods for solving such equations. Due to their nice convergence properties and low storage requirements, conjugate gradient methods are considered among the most efficient for solving large-scale nonlinear equations. In this paper, a modified conjugate gradient method is proposed based on a projection technique and a suitable line search strategy. The proposed method is matrix-free, and its sequence of search directions satisfies a sufficient descent condition. Under the assumption that the underlying function is monotone and Lipschitz continuous, the global convergence of the proposed method is established. The method is applied to solve some benchmark monotone nonlinear equations and is also extended to solve ℓ1-norm regularized problems to reconstruct a sparse signal in compressive sensing. Numerical comparison with some existing methods shows that the proposed method is competitive, efficient and promising.
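The projection technique for monotone equations can be sketched in isolation. The code below follows the classical hyperplane-projection pattern (Solodov–Svaiter style) but simplifies the search direction to −F(x); the paper combines the projection with a conjugate gradient direction instead, so parameters and names here are illustrative.

```python
import numpy as np

def projection_solve(F, x0, tol=1e-8, max_iter=1000):
    """Hyperplane-projection method for monotone equations F(x) = 0.

    A line search locates z_k = x_k + alpha*d_k satisfying
    -F(z_k)^T d_k >= sigma*alpha*||d_k||^2; the next iterate is the
    projection of x_k onto the separating hyperplane
    {x : F(z_k)^T (x - z_k) = 0}.
    """
    x = np.asarray(x0, dtype=float)
    sigma = 1e-4
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        d = -Fx                          # simplified search direction
        alpha = 1.0
        while alpha > 1e-12:
            z = x + alpha * d
            Fz = F(z)
            if -(Fz @ d) >= sigma * alpha * (d @ d):
                break
            alpha *= 0.5
        if np.linalg.norm(Fz) < tol:     # trial point already solves F
            return z
        # project x onto the hyperplane separating x from the solution set
        x = x - ((Fz @ (x - z)) / (Fz @ Fz)) * Fz
    return x
```

Monotonicity of F guarantees that the hyperplane strictly separates the current iterate from the solution set, which is what drives global convergence.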


2014 ◽  
Vol 2014 ◽  
pp. 1-12 ◽  
Author(s):  
Huiping Cao

Schubert’s method is an extension of Broyden’s method for solving sparse nonlinear equations, which can preserve the zero-nonzero structure defined by the sparse Jacobian matrix and can retain many good properties of Broyden’s method. In particular, Schubert’s method has been proved to be locally and q-superlinearly convergent. In this paper, we globalize Schubert’s method by using a nonmonotone line search. Under appropriate conditions, we show that the proposed algorithm converges globally and superlinearly. Some preliminary numerical experiments are presented, which demonstrate that our algorithm is effective for large-scale problems.
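The nonmonotone globalization can be illustrated on its own. The sketch below applies a Grippo–Lampariello–Lucidi-style nonmonotone Armijo test, where sufficient decrease is measured against the maximum of the last M merit values, to a plain steepest-descent direction rather than to Schubert's sparse quasi-Newton update; M and the Armijo constant are illustrative.

```python
import numpy as np

def nonmonotone_descent(f, grad, x0, M=10, tol=1e-6, max_iter=2000):
    """Steepest descent globalized by a nonmonotone Armijo line search.

    Sufficient decrease is tested against max of the last M merit values,
    so occasional increases in f are tolerated, which helps the iteration
    escape narrow curved regions where monotone searches take tiny steps.
    """
    x = np.asarray(x0, dtype=float)
    hist = [f(x)]                      # recent merit values
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -g
        fmax = max(hist[-M:])          # nonmonotone reference value
        alpha = 1.0
        while alpha > 1e-12 and f(x + alpha * d) > fmax + 1e-4 * alpha * (g @ d):
            alpha *= 0.5
        x = x + alpha * d
        hist.append(f(x))
    return x
```

Setting M = 1 recovers the usual monotone Armijo backtracking as a special case.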


2019 ◽  
Vol 15 (1) ◽  
pp. 117-120 ◽  
Author(s):  
Muhammad Kabir Dauda ◽  
Mustafa Mamat ◽  
Mohamad Afendee Mohamed ◽  
Mahammad Yusuf Waziri

Systems of nonlinear equations emerge from many areas of computing, scientific, and engineering research applications. A variety of iterative methods for solving such systems have been developed, including the famous Newton method. Unfortunately, the Newton method suffers setbacks, which include storing a matrix at each iteration and computing the Jacobian matrix, which may be difficult or even impossible. To overcome the drawbacks bedeviling the Newton method, a modification of the SR1 update is proposed in this study. With the aid of the inexact line search procedure of Li and Fukushima, the modification is achieved by simply approximating the inverse Hessian matrix with an identity matrix, without computing the Jacobian. Unlike the classical SR1 method, the modification neither requires storing a matrix at each iteration nor computing the Jacobian matrix. In finding solutions to nonlinear problems of the form considered, 40 benchmark test problems were solved. A comparison was made with two other methods based on CPU time and number of iterations. The proposed method solved 37 of the problems effectively in terms of number of iterations, and it also outperformed the existing methods in terms of CPU time. The methodology yields a method suitable for solving symmetric systems of nonlinear equations. The derivative-free feature of the proposed method gives it an advantage in solving relatively large-scale problems (10,000 variables) compared to the existing methods. From the preliminary numerical results, the proposed method turned out to be significantly faster, effective, and suitable for solving large-scale symmetric nonlinear equations.
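A minimal sketch of the matrix-free idea described above: when the inverse Hessian/Jacobian approximation is taken to be the identity, the direction reduces to −F(x), and a Li–Fukushima-type derivative-free line search supplies the step length. The forcing-sequence and constants below are illustrative, not the authors' exact parameters.

```python
import numpy as np

def identity_qn_solve(F, x0, tol=1e-8, max_iter=1000):
    """Matrix-free iteration with the inverse Jacobian replaced by I.

    The direction is simply d_k = -F(x_k), and the step length comes from
    a Li-Fukushima-type derivative-free line search,
        ||F(x + a*d)|| <= ||F(x)|| - s1*||a*d||**2 + eps_k*||F(x)||,
    where eps_k is a summable forcing sequence (constants illustrative).
    """
    x = np.asarray(x0, dtype=float)
    s1, rho = 1e-4, 0.5
    for k in range(max_iter):
        Fx = F(x)
        nFx = np.linalg.norm(Fx)
        if nFx < tol:
            break
        d = -Fx
        eps_k = 1.0 / (k + 1) ** 2       # summable: sum(eps_k) < inf
        a = 1.0
        while a > 1e-12:
            lhs = np.linalg.norm(F(x + a * d))
            if lhs <= nFx - s1 * (a * np.linalg.norm(d)) ** 2 + eps_k * nFx:
                break
            a *= rho
        x = x + a * d
    return x
```

Because the test allows a small, summable increase in the residual norm, the search accepts unit steps far more often than a strictly monotone rule would, while the quadratic penalty term still forces convergence of the steps.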


2018 ◽  
Vol 2018 ◽  
pp. 1-11 ◽  
Author(s):  
Xiangrong Li ◽  
Songhua Wang ◽  
Zhongzhou Jin ◽  
Hongtruong Pham

This paper gives a modified Hestenes–Stiefel (HS) conjugate gradient algorithm under the Yuan–Wei–Lu inexact line search technique for large-scale unconstrained optimization problems. The proposed algorithm has the following properties: (1) the new search direction possesses not only a sufficient descent property but also a trust region feature; (2) the algorithm is globally convergent for nonconvex functions; (3) numerical experiments show that the new algorithm is more effective than similar algorithms.


Author(s):  
Mohammed Yusuf Waziri ◽  
Jamilu Sabi’u

We suggest a conjugate gradient (CG) method for solving symmetric systems of nonlinear equations without computing the Jacobian or the gradient, by exploiting the special structure of the underlying function. This derivative-free feature of the proposed method gives it an advantage in solving relatively large-scale problems (500,000 variables) with lower storage requirements compared to some existing methods. Under appropriate conditions, the global convergence of our method is established. Numerical results on some benchmark test problems show that the proposed method is practically effective.

