Global Convergence of Schubert’s Method for Solving Sparse Nonlinear Equations

2014 ◽  
Vol 2014 ◽  
pp. 1-12 ◽  
Author(s):  
Huiping Cao

Schubert’s method is an extension of Broyden’s method for solving sparse nonlinear equations; it preserves the zero-nonzero structure defined by the sparse Jacobian matrix and retains many good properties of Broyden’s method. In particular, Schubert’s method has been proved to be locally and q-superlinearly convergent. In this paper, we globalize Schubert’s method by using a nonmonotone line search. Under appropriate conditions, we show that the proposed algorithm converges globally and superlinearly. Some preliminary numerical experiments are presented, which demonstrate that our algorithm is effective for large-scale problems.
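
A minimal sketch of the row-wise sparse secant correction underlying Schubert's update (the globalizing nonmonotone line search analyzed in the paper is not shown; the helper name and tolerance are illustrative):

```python
import numpy as np

def schubert_update(B, pattern, s, y):
    """One Schubert (sparse Broyden) correction of the Jacobian approximation B.

    B       : (n, n) current Jacobian approximation
    pattern : (n, n) boolean mask of the structural nonzeros of the Jacobian
    s       : step x_new - x_old
    y       : residual difference F(x_new) - F(x_old)

    Each row is corrected only in its structurally nonzero positions, so the
    sparsity pattern of the Jacobian is preserved.
    """
    B_new = B.copy()
    for i in range(B.shape[0]):
        s_i = np.where(pattern[i], s, 0.0)   # zero out entries outside row i's pattern
        denom = s_i @ s_i
        if denom > 1e-14:                    # skip rows the step carries no information for
            B_new[i] += ((y[i] - B[i] @ s) / denom) * s_i
    return B_new
```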

2012 ◽  
Vol 2012 ◽  
pp. 1-15 ◽  
Author(s):  
Zhong Jin ◽  
Yuqing Wang

An improved line search filter algorithm for systems of nonlinear equations is presented. We divide the equations into two groups: one group contains the equations treated as equality constraints, while the sum of squares of the remaining equations is taken as the objective function. In the works by Nie (2004, 2006, and 2006), by Nie et al. (2008), and by Gu (2011), the two groups of equations are updated at every iteration, whereas we update them only at the iterations where it is actually needed. As a consequence, the amount of computation is reduced to a certain degree. Under suitable conditions, global convergence is established. Finally, numerical experiments show that the proposed method is effective.
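
A minimal illustration of the grouping described above, assuming a residual function F and a chosen index set of constraint equations; the filter line search itself is not reproduced:

```python
import numpy as np

def split_system(F, x, constraint_idx):
    """Evaluate the two groups used by the filter approach: the equations in
    `constraint_idx` act as equality constraints c(x) = 0, and the sum of
    squares of the remaining equations acts as the objective f(x)."""
    r = np.asarray(F(x))
    mask = np.zeros(r.size, dtype=bool)
    mask[constraint_idx] = True
    c = r[mask]                       # constraint group
    f = float(np.sum(r[~mask] ** 2))  # objective group: sum of squares
    return c, f
```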


Author(s):  
Jamilu Sabi'u

In this article, an enhanced conjugate gradient approach for solving symmetric nonlinear equations is proposed without computing the Jacobian matrix. The approach is completely derivative-free and matrix-free. Under classical assumptions, the proposed method achieves global convergence with a nonmonotone line search. Reported numerical results show that the approach is promising.
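
The abstract does not state the update formulas, but the matrix-free ingredient such approaches typically rely on is a directional finite difference in place of Jacobian-vector products; a small sketch under that assumption (step size and names are illustrative):

```python
import numpy as np

def jac_vec(F, x, d, eps=1e-8):
    """Forward-difference approximation of the Jacobian-vector product J(x) d,
    the standard device that lets CG-type iterations for nonlinear equations
    run without forming or storing the Jacobian."""
    return (np.asarray(F(x + eps * d)) - np.asarray(F(x))) / eps
```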


2013 ◽  
Vol 30 (01) ◽  
pp. 1250043
Author(s):  
LIANG YIN ◽  
XIONGDA CHEN

The conjugate gradient method is widely used in unconstrained optimization, especially for large-scale problems. Recently, Zhang et al. proposed a three-term PRP method (TTPRP) and a three-term HS method (TTHS), both of which satisfy the sufficient descent condition. In this paper, the global convergence of the TTPRP and TTHS methods is studied when the line search procedure is replaced by a fixed stepsize formula. This feature is significant in applications where the line search is expensive. In addition, relevant computational results are presented.
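
As a reference, the three-term PRP direction of Zhang et al. is commonly written as follows; the fixed stepsize formula analyzed in the paper is not reproduced here, and the helper name is ours:

```python
import numpy as np

def ttprp_direction(g, g_prev, d_prev):
    """Three-term PRP (TTPRP) search direction:

        d_k = -g_k + beta_k * d_{k-1} - theta_k * y_{k-1},  y_{k-1} = g_k - g_{k-1},
        beta_k  = g_k^T y_{k-1} / ||g_{k-1}||^2,
        theta_k = g_k^T d_{k-1} / ||g_{k-1}||^2,

    which yields d_k^T g_k = -||g_k||^2 (sufficient descent) by construction.
    """
    y = g - g_prev
    denom = g_prev @ g_prev
    beta = (g @ y) / denom
    theta = (g @ d_prev) / denom
    return -g + beta * d_prev - theta * y
```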


Author(s):  
Mohammed Yusuf Waziri ◽  
Jamilu Sabi’u

We suggest a conjugate gradient (CG) method for solving symmetric systems of nonlinear equations that requires neither the Jacobian nor the gradient, exploiting instead the special structure of the underlying function. This derivative-free feature gives the proposed method an advantage in solving relatively large-scale problems (500,000 variables) with lower storage requirements than some existing methods. Under appropriate conditions, the global convergence of our method is established. Numerical results on some benchmark test problems show that the proposed method is practically effective.
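
One standard way such methods exploit the symmetry is to approximate the gradient of the merit function f(x) = 0.5*||F(x)||^2 with a single extra residual evaluation, since grad f = J^T F = J F when the Jacobian is symmetric; a sketch under that assumption (step size and name are illustrative, not necessarily the authors' exact scheme):

```python
import numpy as np

def approx_gradient(F, x, eps=1e-8):
    """Gradient of f(x) = 0.5*||F(x)||^2 without Jacobians: for symmetric
    systems grad f = J(x) F(x), approximated by a forward difference along
    the residual direction F(x)."""
    Fx = np.asarray(F(x))
    return (np.asarray(F(x + eps * Fx)) - Fx) / eps
```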


2021 ◽  
Vol 2021 ◽  
pp. 1-6
Author(s):  
Masoud Hatamian ◽  
Mahmoud Paripour ◽  
Farajollah Mohammadi Yaghoobi ◽  
Nasrin Karamikabir

In this article, a new nonmonotone line search technique is proposed for solving systems of nonlinear equations. We attempt to answer the question of how to control the degree of nonmonotonicity of line search rules in order to obtain a more efficient algorithm. To this end, we present a novel algorithm that avoids an increase in unsuccessful iterations. We demonstrate the robust behavior of the proposed algorithm on several numerical examples. Under suitable assumptions, the global convergence of our strategy is proved.
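
A sketch of a generic nonmonotone Armijo-type backtracking in which a memory length M controls the degree of nonmonotonicity; this illustrates the idea, not the authors' specific rule:

```python
import numpy as np

def nonmonotone_armijo(merit, x, d, hist, M=5, sigma=1e-4, rho=0.5, max_back=30):
    """Backtracking with a nonmonotone (derivative-free) Armijo-type test.

    The trial point is accepted against the maximum of the last M stored merit
    values `hist`, so M controls the degree of nonmonotonicity: M = 1 recovers
    the usual monotone rule, larger M tolerates temporary increases.
    """
    f_ref = max(hist[-M:])
    alpha = 1.0
    for _ in range(max_back):
        if merit(x + alpha * d) <= f_ref - sigma * alpha**2 * float(d @ d):
            break
        alpha *= rho
    return alpha  # last trial step if the test never passed
```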


2020 ◽  
Vol 30 (4) ◽  
pp. 399-412
Author(s):  
Abubakar Halilu ◽  
Mohammed Waziri ◽  
Ibrahim Yusuf

We propose a matrix-free direction with an inexact line search technique to solve systems of nonlinear equations using a double direction approach. In this article, the Jacobian matrix is approximated by an appropriately constructed matrix-free scheme via an acceleration parameter. The global convergence of our method is established under mild conditions. Numerical comparisons reported in this paper are based on a set of large-scale test problems and show that the proposed method is efficient.
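
A rough sketch of the two ingredients named above, a scalar acceleration parameter standing in for the Jacobian and a step that combines two directions; the specific formulas below are illustrative choices, not the authors' method:

```python
import numpy as np

def acceleration_parameter(s, y, fallback=1.0):
    """Scalar gamma such that gamma*I stands in for the Jacobian, fitted to the
    secant pair s = x_{k+1} - x_k, y = F(x_{k+1}) - F(x_k) by a
    Barzilai-Borwein-type quotient (illustrative choice)."""
    denom = s @ s
    return (y @ s) / denom if denom > 1e-14 else fallback

def double_direction_step(x, Fx, gamma, alpha):
    """x_+ = x + alpha*d1 + alpha^2*d2, with both directions built from the
    residual and the acceleration parameter; here d1 = d2 = -Fx/gamma as a
    simple illustrative choice."""
    d = -Fx / gamma
    return x + alpha * d + alpha**2 * d
```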


2014 ◽  
Vol 8 (1) ◽  
pp. 218-221 ◽  
Author(s):  
Ping Hu ◽  
Zong-yao Wang

We propose a non-monotone line search combination rule for unconstrained optimization problems, establish the corresponding non-monotone search algorithm, and prove its global convergence. Finally, numerical experiments illustrate the effectiveness of the new combined non-monotone search algorithm.
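
One plausible reading of a "combination rule" is a convex mix of monotone and non-monotone reference values for the acceptance test; the sketch below is illustrative only, not the authors' rule:

```python
def combined_reference(f_hist, eta=0.5, M=5):
    """Convex combination of the latest merit value and the maximum over the
    last M values: eta = 0 gives the usual monotone reference, eta = 1 a
    purely non-monotone (max-type) one."""
    return eta * max(f_hist[-M:]) + (1.0 - eta) * f_hist[-1]
```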


Algorithms ◽  
2021 ◽  
Vol 14 (5) ◽  
pp. 146
Author(s):  
Aleksei Vakhnin ◽  
Evgenii Sopov

Modern real-valued optimization problems are complex and high-dimensional, and they are known as “large-scale global optimization (LSGO)” problems. Classic evolutionary algorithms (EAs) perform poorly on this class of problems because of the curse of dimensionality. Cooperative Coevolution (CC) is a high-performing framework for decomposing large-scale problems into smaller and easier subproblems by grouping the objective variables. The efficiency of CC strongly depends on the size of the groups and on the grouping approach. In this study, an improved CC (iCC) approach for solving LSGO problems is proposed and investigated. iCC changes the number of variables in the subcomponents dynamically during the optimization process. The SHADE algorithm is used as the subcomponent optimizer. We have investigated the performance of iCC-SHADE and CC-SHADE on fifteen problems from the LSGO CEC’13 benchmark set provided by the IEEE Congress on Evolutionary Computation. The results of numerical experiments show that iCC-SHADE outperforms, on average, CC-SHADE with a fixed number of subcomponents. We have also compared iCC-SHADE with some state-of-the-art LSGO metaheuristics; the experimental results show that the proposed algorithm is competitive with other efficient metaheuristics.
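
A schematic of the CC decomposition loop described above; the subcomponent optimizer (SHADE in the paper) is abstracted behind a callback, and the dynamic regrouping performed by iCC is not shown:

```python
import numpy as np

def cooperative_coevolution(f, x0, group_size, optimize_group, cycles=10):
    """Skeleton of the Cooperative Coevolution framework: the n variables are
    split into consecutive groups of `group_size`, and each group is optimized
    in turn while the remaining variables stay frozen in the context vector x.

    optimize_group(f_restricted, values) must return improved values for the
    group; in the paper this role is played by SHADE.
    """
    x = x0.copy()
    n = x.size
    groups = [np.arange(i, min(i + group_size, n)) for i in range(0, n, group_size)]
    for _ in range(cycles):
        for idx in groups:
            def f_restricted(vals, idx=idx):
                trial = x.copy()
                trial[idx] = vals
                return f(trial)
            x[idx] = optimize_group(f_restricted, x[idx])
    return x
```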


Author(s):  
Jamilu Sabi'u ◽  
Abdullah Shah

In this article, we propose two conjugate gradient (CG) parameters based on the modified Dai-Liao condition and the descent three-term CG search direction. Both parameters are incorporated with the projection technique for solving large-scale monotone nonlinear equations. Using the Lipschitz continuity and monotonicity assumptions, the global convergence of the methods is proved. Finally, numerical results are provided to illustrate the robustness of the proposed methods.
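
The projection technique referred to above is usually the Solodov-Svaiter hyperplane projection; a minimal sketch of that step, assuming the trial point produced by the line search and its residual are already available:

```python
import numpy as np

def projection_step(x, z, Fz):
    """Project the current iterate x onto the hyperplane
    {v : F(z)^T (v - z) = 0}, which separates x from the solution set of a
    monotone system when F(z)^T (x - z) > 0.

    z  : trial point from the line search along the CG direction
    Fz : residual F(z)
    """
    return x - (Fz @ (x - z)) / (Fz @ Fz) * Fz
```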


Author(s):  
Nelson Butuk ◽  
JeanPaul Pemba

This paper discusses an accurate numerical approach for computing the Jacobian matrix in the calculation of low-dimensional manifolds for kinetic chemical mechanism reduction. The approach is suitable for numerical computations of large-scale problems and is more accurate than the finite difference approach to computing Jacobians. The method is demonstrated on a highly stiff reaction mechanism for the synthesis of hydrogen bromide and on an H2/Air mechanism, using a modified CHEMKIN package. The hydrogen bromide mechanism consists of five species participating in six elementary chemical reactions, and the H2/Air mechanism consists of 11 species and 23 reactions. In both cases it is shown that the method is superior to the finite difference approach to computing derivatives with an arbitrary computational step size, h.
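
For context, the finite difference baseline the paper improves upon looks as follows; its truncation error is O(h) while roundoff grows as h shrinks, which is the step-size sensitivity mentioned above (the authors' more accurate approach is not reproduced here):

```python
import numpy as np

def fd_jacobian(F, x, h=1e-6):
    """Forward-difference Jacobian: column j is (F(x + h*e_j) - F(x)) / h.
    Accuracy depends strongly on the step size h, which is the drawback the
    paper's approach addresses."""
    Fx = np.asarray(F(x))
    J = np.empty((Fx.size, x.size))
    for j in range(x.size):
        xp = x.copy()
        xp[j] += h
        J[:, j] = (np.asarray(F(xp)) - Fx) / h
    return J
```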

