Conjugate Gradient Algorithms
Recently Published Documents


TOTAL DOCUMENTS

149
(FIVE YEARS 30)

H-INDEX

19
(FIVE YEARS 2)

Symmetry ◽  
2022 ◽  
Vol 14 (1) ◽  
pp. 80
Author(s):  
Jun Huo ◽  
Jielan Yang ◽  
Guoxin Wang ◽  
Shengwei Yao

In this paper, a three-parameter subspace conjugate gradient method is proposed for solving large-scale unconstrained optimization problems. By minimizing the quadratic approximation model of the objective function on a new, special three-dimensional subspace, the embedded parameters are determined and the corresponding algorithm is obtained. The global convergence of the proposed method for general nonlinear functions is established under mild assumptions. In numerical experiments, the proposed algorithm is compared with SMCG_NLS and SMCG_Conic; the results show that it is robust and efficient.
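The core step of such subspace methods, minimizing the quadratic model of the objective over a small subspace built from the current gradient and recent directions, can be illustrated with a short sketch. The code below is a generic illustration on a quadratic test function using the classical two-dimensional span{-g_k, d_{k-1}}, not the authors' three-dimensional SMCG variant; the basis choice and the availability of the exact Hessian are illustrative assumptions.

```python
import numpy as np

def subspace_step(A, b, x, d_prev):
    """One illustrative step: minimize the quadratic model
    m(d) = g^T d + 0.5 d^T A d over span{-g, d_prev} (exact Hessian assumed)."""
    g = A @ x - b                                    # gradient of 0.5 x^T A x - b^T x
    cols = [v for v in (-g, d_prev) if np.linalg.norm(v) > 1e-12]
    if not cols:                                     # already at the minimizer
        return x, d_prev
    V = np.column_stack(cols)
    H = V.T @ A @ V                                  # reduced Hessian
    z = np.linalg.lstsq(H, -V.T @ g, rcond=None)[0]  # minimizer of the reduced model
    d = V @ z                                        # step in the full space
    return x + d, d

# Usage on a small random SPD quadratic.
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 20))
A = M @ M.T + 20 * np.eye(20)
b = rng.standard_normal(20)
x, d = np.zeros(20), np.zeros(20)
for _ in range(25):
    x, d = subspace_step(A, b, x, d)
print("residual norm:", np.linalg.norm(A @ x - b))
```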


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Abubakar Sani Halilu ◽  
Arunava Majumder ◽  
Mohammed Yusuf Waziri ◽  
Kabiru Ahmed ◽  
Aliyu Muhammed Awwal

Purpose: The purpose of this research is to propose a new choice of the nonnegative parameter t in the Dai–Liao conjugate gradient method.

Design/methodology/approach: Conjugate gradient algorithms are used to solve both constrained monotone and general systems of nonlinear equations. This is made possible by combining the conjugate gradient method with the Newton method approach via an acceleration parameter in order to obtain a derivative-free method.

Findings: A conjugate gradient method is presented by proposing a new Dai–Liao nonnegative parameter. Furthermore, the proposed method is successfully applied to the motion control of two-joint planar robotic manipulators.

Originality/value: The proposed algorithm is a new approach that has not been submitted or published elsewhere.
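For context, the Dai–Liao update computes the conjugate gradient parameter from the gradient change y_{k-1} and the step s_{k-1}, with a nonnegative parameter t weighting the second term. The sketch below shows the standard Dai–Liao formula with an arbitrary placeholder value of t; it is not the new choice proposed in the paper.

```python
import numpy as np

def dai_liao_beta(g_new, g_old, d_prev, s_prev, t):
    """Standard Dai-Liao parameter:
    beta_k = g_k^T (y_{k-1} - t * s_{k-1}) / (d_{k-1}^T y_{k-1})."""
    y = g_new - g_old                      # gradient change y_{k-1}
    denom = d_prev @ y
    if abs(denom) < 1e-16:                 # guard against breakdown
        return 0.0
    return (g_new @ (y - t * s_prev)) / denom

# Usage with a placeholder t (the paper proposes its own choice).
rng = np.random.default_rng(1)
g_old, g_new = rng.standard_normal(5), rng.standard_normal(5)
d_prev = -g_old
s_prev = 0.1 * d_prev
beta = dai_liao_beta(g_new, g_old, d_prev, s_prev, t=1.0)
d_new = -g_new + beta * d_prev             # next search direction
print(beta, d_new)
```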


Author(s):  
Aseel M. Qasim ◽  
Zinah F. Salih ◽  
Basim A. Hassan

The primary objective of this paper, which lies in the field of conjugate gradient algorithms for unconstrained optimization problems, is to show the advantage of the newly proposed algorithm over the standard Hestenes–Stiefel method. Since the conjugate gradient coefficient is crucial to performance, we propose a simple modification of it, which is used to derive the new formula for the conjugate gradient update parameter described in this paper. The modification is based on the conjugacy condition for nonlinear conjugate gradient methods, with a nonnegative parameter added to obtain the new extension of the method. Under mild Wolfe conditions, the global convergence theorem and supporting lemmas are stated and proved. The proposed method's efficiency is demonstrated by numerical experiments, whose results are very encouraging.
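As background, the Hestenes–Stiefel baseline that the paper modifies can be written as a compact nonlinear CG loop. The sketch below uses SciPy's Wolfe line search and the Rosenbrock test function purely for illustration; the steepest-descent fallback step of 1e-4 is an arbitrary safeguard and nothing here reproduces the authors' modified parameter.

```python
import numpy as np
from scipy.optimize import line_search, rosen, rosen_der

def hs_beta(g_new, g_old, d_prev):
    """Hestenes-Stiefel parameter: beta_k = g_k^T y_{k-1} / (d_{k-1}^T y_{k-1})."""
    y = g_new - g_old
    denom = d_prev @ y
    return 0.0 if abs(denom) < 1e-16 else (g_new @ y) / denom

def cg_hs(f, grad, x0, iters=200, tol=1e-6):
    """Baseline nonlinear CG with the HS update and a Wolfe line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:                 # line search failed; crude restart
            d, alpha = -g, 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        d = -g_new + hs_beta(g_new, g, d) * d
        x, g = x_new, g_new
    return x

print(cg_hs(rosen, rosen_der, np.array([-1.2, 1.0])))  # should approach [1, 1]
```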


2021 ◽  
Vol 29 (3) ◽  
pp. 183-200
Author(s):  
Gh. Juncu ◽  
C. Popa ◽  
Gh. Sarbu

This work continues our previous analysis of the numerical solution of the multi-component mass transfer equations. The present test problems are two-dimensional, parabolic, non-linear diffusion-reaction equations. An implicit finite difference method was used to discretize the mathematical model equations. The non-linear system arising at each time step is solved with the modified Picard iteration. The numerical performance of the preconditioned conjugate gradient-type algorithms (BiCGSTAB and GMRES) in solving the linear systems of the modified Picard iteration was analysed in detail. The results obtained show good numerical performance.
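The linear-algebra kernel of such a scheme, solving the sparse linear system of each modified Picard iteration with preconditioned Krylov solvers, can be sketched with SciPy; the 1-D diffusion-reaction stencil and the ILU preconditioner below are illustrative stand-ins, not the actual discretized multi-component model.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Illustrative sparse system: 1-D diffusion stencil plus a linearized reaction
# term (stand-in for one linear system of the modified Picard iteration).
n = 200
main = (2.0 + 0.1) * np.ones(n)
A = sp.diags([-np.ones(n - 1), main, -np.ones(n - 1)], [-1, 0, 1], format="csc")
b = np.ones(n)

# Incomplete-LU preconditioner wrapped as a LinearOperator.
ilu = spla.spilu(A)
M = spla.LinearOperator(A.shape, ilu.solve)

x_bicg, info_bicg = spla.bicgstab(A, b, M=M)
x_gmres, info_gmres = spla.gmres(A, b, M=M)
print(info_bicg, info_gmres)                   # 0 means the solver converged
print(np.linalg.norm(A @ x_bicg - b), np.linalg.norm(A @ x_gmres - b))
```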


2021 ◽  
Vol 11 (1) ◽  
pp. 1-9
Author(s):  
Ahmed Anwer Mustafa ◽  
Salah Gazi Shareef

In this paper, a new formula for β_k is suggested for the conjugate gradient method for solving unconstrained optimization problems, based on three terms and a cubic step size. The proposed CG method satisfies the descent condition, the sufficient descent condition and the conjugacy condition, and it has global convergence properties. Numerical comparisons with two standard conjugate gradient algorithms show that the algorithm is very effective in terms of the number of iterations and the number of function evaluations.
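The general shape of a three-term CG direction, together with a quick numerical check of the descent and conjugacy conditions, can be sketched as follows. The β_k and θ_k below follow the classical three-term PRP pattern (which gives g_k^T d_k = -||g_k||^2 by construction) and are only placeholders, not the three-term, cubic-step-size formula proposed here.

```python
import numpy as np

def three_term_direction(g_new, g_old, d_prev, beta, theta):
    """Generic three-term CG direction: d_k = -g_k + beta*d_{k-1} + theta*y_{k-1}."""
    return -g_new + beta * d_prev + theta * (g_new - g_old)

rng = np.random.default_rng(2)
g_old, g_new = rng.standard_normal(4), rng.standard_normal(4)
d_prev = -g_old
y = g_new - g_old

# Placeholder coefficients (classical three-term PRP pattern, illustrative only).
beta = (g_new @ y) / (g_old @ g_old)
theta = -(g_new @ d_prev) / (g_old @ g_old)

d = three_term_direction(g_new, g_old, d_prev, beta, theta)
print("descent check  g_k^T d_k (should be -||g_k||^2):", g_new @ d)
print("conjugacy residual d_k^T y_{k-1}:", d @ y)
```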


Mathematics ◽  
2021 ◽  
Vol 9 (16) ◽  
pp. 2015
Author(s):  
Jose Giovany Babativa-Márquez ◽  
José Luis Vicente-Villardón

Multivariate binary data are increasingly frequent in practice. Although some adaptations of principal component analysis are used to reduce dimensionality for this kind of data, none of them provide a simultaneous representation of rows and columns (biplot). Recently, a technique named logistic biplot (LB) has been developed to represent the rows and columns of a binary data matrix simultaneously, even though the algorithm used to fit the parameters is too computationally demanding to be useful in the presence of sparsity or when the matrix is large. We propose the fitting of an LB model using nonlinear conjugate gradient (CG) or majorization–minimization (MM) algorithms, and a cross-validation procedure is introduced to select the hyperparameter that represents the number of dimensions in the model. A Monte Carlo study that considers scenarios with several sparsity levels and different dimensions of the binary data set shows that the procedure based on cross-validation is successful in the selection of the model for all algorithms studied. The comparison of the running times shows that the CG algorithm is more efficient in the presence of sparsity and when the matrix is not very large, while the performance of the MM algorithm is better when the binary matrix is balanced or large. As a complement to the proposed methods and to give practical support, a package has been written in the R language called BiplotML. To complete the study, real binary data on gene expression methylation are used to illustrate the proposed methods.
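A stripped-down version of the estimation problem, maximizing the Bernoulli log-likelihood of a low-rank logit structure logit(π_ij) = μ_j + a_i^T b_j by nonlinear CG, could look like the sketch below. It relies on SciPy's generic CG routine and omits regularization and the cross-validation step for the number of dimensions, so it is not the BiplotML implementation.

```python
import numpy as np
from scipy.optimize import minimize

def fit_logistic_biplot(X, k=2, seed=0):
    """Fit logit(P[x_ij = 1]) = mu_j + a_i . b_j with rank k by nonlinear CG.
    Minimal sketch: no regularization or cross-validation."""
    n, p = X.shape
    rng = np.random.default_rng(seed)
    theta0 = 0.1 * rng.standard_normal(p + n * k + p * k)

    def unpack(theta):
        mu = theta[:p]
        A = theta[p:p + n * k].reshape(n, k)        # row markers
        B = theta[p + n * k:].reshape(p, k)         # column markers
        return mu, A, B

    def negloglik(theta):
        mu, A, B = unpack(theta)
        Z = mu + A @ B.T                            # logits
        # Bernoulli negative log-likelihood in a numerically stable form.
        return np.sum(np.logaddexp(0.0, Z) - X * Z)

    def grad(theta):
        mu, A, B = unpack(theta)
        Z = mu + A @ B.T
        R = 1.0 / (1.0 + np.exp(-Z)) - X            # residuals: pi - x
        return np.concatenate([R.sum(axis=0), (R @ B).ravel(), (R.T @ A).ravel()])

    res = minimize(negloglik, theta0, jac=grad, method="CG")
    return unpack(res.x)

# Usage on a small random binary matrix.
X = (np.random.default_rng(1).random((50, 10)) < 0.4).astype(float)
mu, A, B = fit_logistic_biplot(X, k=2)
print(mu.shape, A.shape, B.shape)
```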


Author(s):  
Alaa Saad Ahmed ◽  
Hisham M. Khudhur ◽  
Mohammed S. Najmuldeen

In this study, we develop a new parameter for a three-term conjugate gradient method. The scheme depends principally on the pure conjugacy condition (PCC), which is an important condition in unconstrained non-linear optimization in general and in conjugate gradient methods in particular. Under mild assumptions, the proposed method is convergent and satisfies the descent property. The numerical results show the effectiveness of the new method on unconstrained non-linear optimization test problems compared with other conjugate gradient algorithms, such as the Fletcher–Reeves (FR) algorithm and the three-term Fletcher–Reeves (TTFR) algorithm, as reported in Table 1 (number of iterations and function evaluations) and in Figures 1, 2 and 3 (comparisons of the number of iterations, the number of function evaluations and the time taken).
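What the pure conjugacy condition d_k^T y_{k-1} = 0 means numerically is easy to illustrate with the classical FR and HS parameters; the new parameter proposed in this study is not reproduced in the sketch below.

```python
import numpy as np

rng = np.random.default_rng(3)
g_old, g_new = rng.standard_normal(6), rng.standard_normal(6)
d_prev = -g_old
y = g_new - g_old                                    # gradient change y_{k-1}

# Classical parameters (the new parameter of this study is not reproduced here).
beta_fr = (g_new @ g_new) / (g_old @ g_old)          # Fletcher-Reeves
beta_hs = (g_new @ y) / (d_prev @ y)                 # Hestenes-Stiefel

d_fr = -g_new + beta_fr * d_prev
d_hs = -g_new + beta_hs * d_prev

# Pure conjugacy condition: d_k^T y_{k-1} = 0.
print("FR:", d_fr @ y)    # generally nonzero without exact line searches
print("HS:", d_hs @ y)    # zero by construction
```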


2021 ◽  
Vol 2021 (1) ◽  
Author(s):  
Shashi Kant Mishra ◽  
Suvra Kanti Chakraborty ◽  
Mohammad Esmael Samei ◽  
Bhagwat Ram

The Polak–Ribière–Polyak (PRP) algorithm is one of the oldest and most popular conjugate gradient algorithms for solving nonlinear unconstrained optimization problems. In this paper, we present a q-variant of the PRP method (q-PRP) for which both the sufficient descent condition and the conjugacy condition are satisfied at every iteration. The proposed method is globally convergent under standard Wolfe conditions and strong Wolfe conditions. The numerical results show that the proposed method is promising for a set of given test problems with different starting points. Moreover, the method reduces to the classical PRP method as the parameter q approaches 1.
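For reference, the classical PRP parameter and the kind of q-gradient a q-variant builds on can be sketched as follows. The Jackson-type q-partial derivative below is the textbook definition (assumed here, and requiring nonzero coordinates); the paper's q-PRP iteration itself is not reproduced.

```python
import numpy as np

def prp_beta(g_new, g_old):
    """Classical PRP parameter: beta_k = g_k^T (g_k - g_{k-1}) / ||g_{k-1}||^2."""
    return (g_new @ (g_new - g_old)) / (g_old @ g_old)

def q_gradient(f, x, q=0.9):
    """Jackson-type q-gradient: i-th component (f(..., q*x_i, ...) - f(x)) / ((q-1)*x_i).
    Tends to the ordinary gradient as q -> 1 (x_i assumed nonzero)."""
    g = np.empty_like(x, dtype=float)
    fx = f(x)
    for i in range(len(x)):
        xq = x.copy()
        xq[i] *= q
        g[i] = (f(xq) - fx) / ((q - 1.0) * x[i])
    return g

f = lambda x: 0.5 * x @ x + np.sin(x[0])          # simple smooth test function
x = np.array([1.3, -0.7, 2.1])
exact = x + np.array([np.cos(x[0]), 0.0, 0.0])    # analytic gradient
for q in (0.5, 0.9, 0.999):
    # Error against the analytic gradient shrinks as q -> 1.
    print(q, np.linalg.norm(q_gradient(f, x, q) - exact))
```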

