An Efficient Conjugate Gradient Method for Convex Constrained Monotone Nonlinear Equations with Applications

Mathematics ◽  
2019 ◽  
Vol 7 (9) ◽  
pp. 767 ◽  
Author(s):  
Abubakar ◽  
Kumam ◽  
Mohammad ◽  
Awwal

This research paper proposes a derivative-free method for solving systems of nonlinear equations with closed and convex constraints, where the functions under consideration are continuous and monotone. Given an initial iterate, the process first generates a specific direction and then employs a line search strategy along the direction to calculate a new iterate. If the new iterate solves the problem, the process stops. Otherwise, the projection of the new iterate onto the closed convex set (the constraint set) determines the next iterate. In addition, the direction satisfies the sufficient descent condition, and the global convergence of the method is established under suitable assumptions. Finally, some numerical experiments are presented to show the performance of the proposed method in solving nonlinear equations and its application in image recovery problems.
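The framework described above (a descent direction, a derivative-free line search along it, then a projection onto the constraint set) can be sketched as follows. This is a minimal illustration in the classical projection style: the simplest sufficient-descent direction d = -F(x), a box constraint set, and the test mapping are illustrative assumptions, not the authors' exact conjugate gradient scheme.

```python
import numpy as np

def project_box(x, lo, hi):
    """Projection onto the box [lo, hi]^n, a simple closed convex set."""
    return np.clip(x, lo, hi)

def solve_projection(F, x0, lo=-2.0, hi=2.0, sigma=1e-4, rho=0.5,
                     tol=1e-8, max_iter=500):
    """Derivative-free projection method for a monotone mapping F on a box."""
    x = project_box(np.asarray(x0, float), lo, hi)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) <= tol:
            break
        d = -Fx                          # simplest sufficient-descent direction
        t = 1.0                          # backtracking line search along d
        while -F(x + t * d) @ d < sigma * t * (d @ d):
            t *= rho
        z = x + t * d                    # trial point
        Fz = F(z)
        if np.linalg.norm(Fz) <= tol:    # trial point already solves the problem
            return z
        lam = Fz @ (x - z) / (Fz @ Fz)   # step onto the separating hyperplane
        x = project_box(x - lam * Fz, lo, hi)
    return x

# Monotone, Lipschitz continuous test mapping F(x) = x + sin(x), root at the origin.
x = solve_projection(lambda v: v + np.sin(v), x0=[1.5, -1.0, 0.5])
```

The projection step uses the standard fact that, by monotonicity, the solution set lies on the far side of the hyperplane through z with normal F(z).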

2021 ◽  
Vol 0 (0) ◽  
pp. 0
Author(s):  
Abdulkarim Hassan Ibrahim ◽  
Poom Kumam ◽  
Min Sun ◽  
Parin Chaipunya ◽  
Auwal Bala Abubakar

In this paper, using the concept of inertial extrapolation, we introduce a globally convergent inertial extrapolation method for solving nonlinear equations with convex constraints for which the underlying mapping is monotone and Lipschitz continuous. The method can be viewed as a combination of the efficient three-term derivative-free method of Gao and He [Calcolo. 55(4), 1-17, 2018] with the inertial extrapolation step. Moreover, the algorithm is designed such that at every iteration, the method is free from derivative evaluations. Under standard assumptions, we establish the global convergence results for the proposed method. Numerical implementations illustrate the performance and advantage of this new method. Moreover, we also extend this method to solve the LASSO problems to decode a sparse signal in compressive sensing. Performance comparisons illustrate the effectiveness and competitiveness of our algorithm.
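The inertial extrapolation step itself is simple: before each update, the current iterate is extrapolated along the direction of the previous step. A hedged sketch follows, wrapping the inertial step around a basic derivative-free projection update with direction d = -F(y); the fixed inertial factor, line search parameters, and test mapping are illustrative assumptions, not the three-term scheme of Gao and He.

```python
import numpy as np

def inertial_projection_solve(F, x0, theta=0.1, sigma=1e-4, rho=0.5,
                              tol=1e-8, max_iter=500):
    """Inertial extrapolation wrapped around a derivative-free projection step."""
    x_prev = x = np.asarray(x0, float)
    for _ in range(max_iter):
        y = x + theta * (x - x_prev)      # inertial extrapolation step
        Fy = F(y)
        if np.linalg.norm(Fy) <= tol:
            return y
        d = -Fy                           # derivative-free descent direction
        t = 1.0                           # backtracking line search from y
        while -F(y + t * d) @ d < sigma * t * (d @ d):
            t *= rho
        z = y + t * d
        Fz = F(z)
        if np.linalg.norm(Fz) <= tol:
            return z
        lam = Fz @ (y - z) / (Fz @ Fz)    # hyperplane projection update
        x_prev, x = x, y - lam * Fz
    return x

# Monotone, Lipschitz continuous test mapping with root at the origin.
x = inertial_projection_solve(lambda v: v + np.sin(v), x0=[1.0, -0.5])
```

No derivative evaluations occur anywhere in the loop; only values of F are used, which is the property the abstract emphasizes.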


2019 ◽  
Vol 2 (3) ◽  
pp. 1-4
Author(s):  
Abubakar Sani Halilu ◽  
M K Dauda ◽  
M Y Waziri ◽  
M Mamat

An algorithm for solving large-scale systems of nonlinear equations, based on transforming the Newton method with line search into a derivative-free descent method, is introduced. The main idea in the algorithm's construction is to approximate the Jacobian by an appropriate diagonal matrix. Furthermore, the step length is calculated using an inexact line search procedure. The proposed method is proved to be globally convergent under mild conditions. The numerical results presented show the efficiency of the proposed method.
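The core idea (replace the Jacobian in Newton's step by a diagonal matrix and globalize with an inexact line search) can be sketched as follows. The specific diagonal update used here, a safeguarded secant quotient y_i/s_i, together with the norm-decrease line search and the test system, are illustrative assumptions rather than the authors' exact formulas.

```python
import numpy as np

def diag_newton_solve(F, x0, sigma=1e-4, rho=0.5, tol=1e-8, max_iter=200):
    """Newton-like iteration with a diagonal Jacobian approximation D_k."""
    x = np.asarray(x0, float)
    Fx = F(x)
    D = np.ones_like(x)                   # D_0 = identity, stored as a diagonal
    for _ in range(max_iter):
        if np.linalg.norm(Fx) <= tol:
            break
        d = -Fx / D                       # derivative-free "Newton" direction
        t = 1.0                           # inexact line search on the residual norm
        while np.linalg.norm(F(x + t * d)) > np.linalg.norm(Fx) - sigma * t**2 * (d @ d):
            t *= rho
            if t < 1e-12:                 # guard: accept a tiny step rather than loop
                break
        x_new = x + t * d
        F_new = F(x_new)
        s, y = x_new - x, F_new - Fx
        quot = y / np.where(np.abs(s) > 1e-12, s, 1.0)  # secant quotients y_i / s_i
        D = np.clip(quot, 1e-4, 1e4)      # safeguarded diagonal Jacobian estimate
        x, Fx = x_new, F_new
    return x

# Test system whose Jacobian really is diagonal: F_i(x) = x_i + x_i**3, root at 0.
x = diag_newton_solve(lambda v: v + v**3, x0=[1.0, -0.5])
```

For systems whose Jacobian is strongly diagonally dominant, the quotient y_i/s_i tracks the diagonal entries well, which is what makes the cheap O(n) storage viable at large scale.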


Mathematics ◽  
2020 ◽  
Vol 8 (2) ◽  
pp. 168 ◽  
Author(s):  
Zhifeng Dai ◽  
Huan Zhu

The goal of this paper is to extend the modified Hestenes-Stiefel method to solve large-scale nonlinear monotone equations. The method combines the hyperplane projection method (Solodov, M.V.; Svaiter, B.F. A globally convergent inexact Newton method for systems of monotone equations, in: M. Fukushima, L. Qi (Eds.), Reformulation: Nonsmooth, Piecewise Smooth, Semismooth and Smoothing Methods, Kluwer Academic Publishers, 1998, 355-369) with the modified Hestenes-Stiefel method of Dai and Wen (Dai, Z.; Wen, F. Global convergence of a modified Hestenes-Stiefel nonlinear conjugate gradient method with Armijo line search. Numer. Algor. 2012, 59, 79-93). In addition, we propose a new line search for the derivative-free method. Global convergence of the proposed method is established provided that the system of nonlinear equations is Lipschitz continuous and monotone. Preliminary numerical results are given to test the effectiveness of the proposed method.
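The building block is the Hestenes-Stiefel direction, with F(x) playing the role of the gradient, plus a safeguard that restores the sufficient descent condition when the raw CG formula violates it. The restart safeguard below is a common illustrative device, not necessarily the exact modification of Dai and Wen:

```python
import numpy as np

def mhs_direction(Fk, F_prev, d_prev, c=0.6, eps=1e-12):
    """Hestenes-Stiefel-type direction with a sufficient-descent safeguard."""
    y = Fk - F_prev                       # gradient-difference surrogate y_{k-1}
    denom = d_prev @ y
    beta = (Fk @ y) / denom if abs(denom) > eps else 0.0
    d = -Fk + beta * d_prev               # classical HS direction
    if Fk @ d > -c * (Fk @ Fk):           # sufficient descent: F_k^T d_k <= -c ||F_k||^2
        d = -Fk                           # restart with steepest descent if violated
    return d

# Exercise the safeguard on arbitrary (seeded) data.
rng = np.random.default_rng(0)
Fk, F_prev, d_prev = rng.standard_normal((3, 5))
d = mhs_direction(Fk, F_prev, d_prev)
```

By construction the returned direction always satisfies the sufficient descent condition, which is the property the hyperplane projection framework needs for global convergence.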


Mathematics ◽  
2021 ◽  
Vol 9 (6) ◽  
pp. 583
Author(s):  
Beny Neta

A new high-order derivative-free method for the solution of a nonlinear equation is developed. The novelty is the use of Traub's method as a first step. The order is proven and demonstrated. It is also shown that the method has far fewer divergent points and runs faster than an optimal eighth-order derivative-free method.
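A common derivative-free (Steffensen-type) realization of Traub's two-step method, usable as the first step of higher-order schemes, replaces the derivative by the divided difference f[x, w] with w = x + f(x). The sketch below is this classical building block, not the new method of the paper:

```python
def traub_step(f, x):
    """One Traub step with the derivative replaced by the divided difference f[x, w]."""
    fx = f(x)
    w = x + fx                         # auxiliary point for the divided difference
    dd = (f(w) - fx) / (w - x)         # f[x, w], a derivative-free slope estimate
    y = x - fx / dd                    # first (Steffensen) sub-step
    return y - f(y) / dd               # second sub-step reuses the frozen slope

def traub_solve(f, x0, tol=1e-14, max_iter=20):
    x = x0
    for _ in range(max_iter):
        if abs(f(x)) <= tol:
            break
        x = traub_step(f, x)
    return x

root = traub_solve(lambda t: t * t - 2.0, 1.5)   # converges to sqrt(2)
```

Each step costs three function evaluations and no derivatives, which is why it is a popular base step for building optimal higher-order derivative-free families.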


2013 ◽  
Vol 7 (2) ◽  
pp. 390-403 ◽  
Author(s):  
Janak Sharma ◽  
Himani Arora

We present a derivative-free method of fourth-order convergence for solving systems of nonlinear equations. The method consists of two steps, the first of which is the well-known Traub's method. The first-order divided difference operator for functions of several variables and direct computation by Taylor expansion are used to prove the local convergence order. The computational efficiency of the new method in its general form is discussed and compared with existing methods of a similar nature. It is proved that for large systems the new method is more efficient. Some numerical tests are performed to compare the proposed method with existing methods and to confirm the theoretical results.
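The first-order divided difference operator for several variables, and the two-step Traub-type iteration built on it, can be sketched as follows. The column-wise construction follows the standard definition of [u, v; F]; the test system and stopping rule are illustrative assumptions:

```python
import numpy as np

def divided_difference(F, u, v):
    """First-order divided difference operator [u, v; F] for F: R^n -> R^n."""
    n = len(u)
    M = np.empty((n, n))
    for j in range(n):
        hi = np.concatenate([u[:j + 1], v[j + 1:]])   # (u_1,..,u_j, v_{j+1},..,v_n)
        lo = np.concatenate([u[:j], v[j:]])           # (u_1,..,u_{j-1}, v_j,..,v_n)
        M[:, j] = (F(hi) - F(lo)) / (u[j] - v[j])
    return M

def traub_system_solve(F, x0, tol=1e-10, max_iter=20):
    """Two-step Traub-type iteration; both steps reuse one divided difference."""
    x = np.asarray(x0, float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) <= tol:
            break
        M = divided_difference(F, x + Fx, x)          # [x + F(x), x; F]
        y = x - np.linalg.solve(M, Fx)                # first (Traub) step
        x = y - np.linalg.solve(M, F(y))              # second step, frozen operator
    return x

# Illustrative 2x2 test system (circle / exponential curve intersection).
F = lambda v: np.array([v[0] ** 2 + v[1] ** 2 - 4.0,
                        np.exp(v[0]) + v[1] - 1.0])
x = traub_system_solve(F, [1.0, -1.7])
```

Reusing a single divided difference for both sub-steps is what keeps the per-iteration cost low for large systems, at the price of only local convergence from a good starting point.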

