sufficient descent condition: Recently Published Documents

Total documents: 45 (five years: 20)
H-index: 8 (five years: 1)
2021 · Vol 40 (3) · pp. 64-75
Author(s): Kanikar Muangchoo

In this paper, by combining the Solodov and Svaiter projection technique with the conjugate gradient method for unconstrained optimization proposed by Mohamed et al. (2020), we develop a derivative-free conjugate gradient method to solve nonlinear equations with convex constraints. The proposed method involves a spectral parameter chosen so that the search direction satisfies the sufficient descent condition. Global convergence is proved under the assumption that the underlying mapping is Lipschitz continuous and satisfies a weaker monotonicity condition. Numerical experiments show that the proposed method is efficient.
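As orientation for this class of methods, the sketch below outlines the Solodov-Svaiter projection framework in Python; the mapping F, the projection operator project, and the steepest-descent-like direction are generic placeholders, not the specific spectral scheme of Mohamed et al. (2020).

```python
import numpy as np

def projection_method(F, project, x0, tol=1e-6, max_iter=1000,
                      sigma=1e-4, rho=0.5):
    """Sketch of a derivative-free Solodov-Svaiter-type projection method
    for F(x) = 0 over a convex set C, where `project` maps onto C.
    The direction update is a generic placeholder, not the spectral
    formula of Mohamed et al. (2020)."""
    x = project(np.asarray(x0, dtype=float))
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        d = -Fx                                # placeholder descent direction
        # Derivative-free backtracking: find t with -F(x+t d)^T d >= sigma*t*||d||^2
        t = 1.0
        while -(F(x + t * d) @ d) < sigma * t * (d @ d) and t > 1e-12:
            t *= rho
        z = x + t * d                          # trial point
        Fz = F(z)
        lam = (Fz @ (x - z)) / (Fz @ Fz)       # hyperplane projection step length
        x = project(x - lam * Fz)              # project back onto the feasible set
    return x
```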


Author(s): Amina Boumediene, Tahar Bechouat, Rachid Benzine, Ghania Hadji

The nonlinear conjugate gradient method (CGM) is a very effective way of solving large-scale optimization problems. Zhang et al. proposed a new CG coefficient, defined by [Formula: see text]. They proved the sufficient descent condition and global convergence for nonconvex minimization under the strong Wolfe line search. In this paper, we prove that this CG coefficient retains the sufficient descent condition and the global convergence properties under the exact line search.
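For reference, the sufficient descent condition invoked throughout these abstracts is the standard requirement that, for some constant c > 0 independent of k, the CG directions satisfy

```latex
g_k^{\top} d_k \le -c\,\lVert g_k \rVert^{2}\quad\text{for all }k,
\qquad g_k = \nabla f(x_k),\qquad
d_k = -g_k + \beta_k d_{k-1},
```

whereas the plain descent condition only asks for g_k^T d_k < 0.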


2021 · Vol 11 (1) · pp. 1-9
Author(s): Ahmed Anwer Mustafa, Salah Gazi Shareef

In this paper, a new formula for β_k is suggested for the conjugate gradient method for solving unconstrained optimization problems, based on a three-term direction and a cubic step size. The proposed CG method satisfies the descent condition, the sufficient descent condition, and the conjugacy condition, and it has global convergence properties. Numerical comparisons with two standard conjugate gradient algorithms show that this algorithm is very effective in terms of the number of iterations and the number of function evaluations.
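Since the abstract does not reproduce the specific β_k or the cubic step-size rule, the Python sketch below shows only the generic three-term CG skeleton such a method instantiates; beta_fn and theta_fn are hypothetical caller-supplied coefficients, and the Armijo backtracking stands in for the paper's cubic step size.

```python
import numpy as np

def armijo(f, grad, x, d, c1=1e-4, rho=0.5, alpha=1.0):
    """Simple backtracking line search (stand-in for the paper's cubic rule)."""
    fx, slope = f(x), grad(x) @ d
    while f(x + alpha * d) > fx + c1 * alpha * slope and alpha > 1e-12:
        alpha *= rho
    return alpha

def three_term_cg(f, grad, x0, beta_fn, theta_fn, tol=1e-6, max_iter=500):
    """Generic three-term CG skeleton: d = -g_new + beta*d + theta*y.
    beta_fn/theta_fn are hypothetical placeholders; the paper's specific
    beta_k is not reproduced here."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = armijo(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                      # gradient difference
        d = -g_new + beta_fn(g_new, g, d, y) * d + theta_fn(g_new, g, d, y) * y
        x, g = x_new, g_new
    return x
```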


2021 · Vol 2 (2) · pp. 69
Author(s): Nasiru Salihu, Mathew Remilekun Odekunle, Mohammed Yusuf Waziri, Abubakar Sani Halilu, Suraj Salihu

One of today's best-performing CG methods is the Dai-Liao (DL) method, which depends on a non-negative parameter and a conjugacy condition for its computation. Although numerous optimal selections for the parameter have been suggested, its best choice remains a subject of consideration. The pure conjugacy condition presumes an exact line search in both numerical experiments and convergence analysis, whereas practical computation relies on an inexact line search to find the step size. To avoid this drawback, Dai and Liao replaced the earlier conjugacy condition with an extended one. This paper therefore suggests a new hybrid CG method that combines the strengths of the Liu-Storey and Conjugate Descent CG methods while retaining an optimal choice of the Dai-Liao parameter. The theoretical analysis indicates that the search direction of the new CG scheme is descent and satisfies the sufficient descent condition when the iterates jam under the strong Wolfe line search. The algorithm is shown to converge globally using standard assumptions. Numerical experimentation with the Dolan and Moré performance profile on 250 unconstrained problems demonstrates that the proposed method is more robust and promising than some known methods. The tested CG algorithms are further assessed numerically on sparse signal reconstruction and image restoration in compressive sensing problems, as well as file restoration, image and video coding, and other applications. The results show that these CG schemes are comparable and can be applied in different fields, such as temperature, fire, seismic, and humidity sensing in forests using wireless sensor network techniques.
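For context, with s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k, the pure and extended conjugacy conditions mentioned above read

```latex
d_{k+1}^{\top} y_k = 0 \quad\text{(pure)},
\qquad
d_{k+1}^{\top} y_k = -t\, g_{k+1}^{\top} s_k,\ t \ge 0 \quad\text{(Dai--Liao)},
```

the latter yielding the Dai-Liao coefficient

```latex
\beta_k^{DL} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k}
             - t\,\frac{g_{k+1}^{\top} s_k}{d_k^{\top} y_k}.
```

The hybrid Liu-Storey/Conjugate Descent coefficient proposed in the paper itself is not reproduced here.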


Author(s): Ibrahim Mohammed Sulaiman, Norsuhaily Abu Bakar, Mustafa Mamat, Basim A. Hassan, Maulana Malik, ...

The hybrid conjugate gradient (CG) method is among the efficient variants of the CG method for solving optimization problems, owing to its low memory requirements and nice convergence properties. In this paper, we present an efficient hybrid CG method for solving unconstrained optimization models and show that the method satisfies the sufficient descent condition. The global convergence proof of the proposed method is established under an inexact line search. An application of the proposed method to the well-known statistical regression model describing the global outbreak of the novel COVID-19 is presented. The study parameterized the model using the weekly increase/decrease of recorded cases from December 30, 2019 to March 30, 2020. Preliminary numerical results on some unconstrained optimization problems show that the proposed method is efficient and promising. Furthermore, the proposed method produced a good regression equation for COVID-19 confirmed cases globally.
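The abstract casts the regression fit as an unconstrained problem for a CG-type solver; the Python sketch below illustrates that setup with synthetic data and SciPy's generic nonlinear CG standing in for the paper's hybrid method (the actual COVID-19 series and hybrid coefficient are not reproduced).

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic stand-in data: these case counts are NOT the study's actual series.
rng = np.random.default_rng(0)
t = np.arange(14, dtype=float)                       # weeks since Dec 30, 2019
y = 50 * np.exp(0.35 * t) + rng.normal(0, 25, t.size)

def loss(w):
    a, b = w
    r = a * np.exp(b * t) - y                        # residuals of exponential model
    return 0.5 * (r @ r)

def loss_grad(w):
    a, b = w
    e = np.exp(b * t)
    r = a * e - y
    return np.array([e @ r, a * ((t * e) @ r)])      # analytic gradient

# SciPy's nonlinear CG plays the role of the paper's hybrid CG solver here.
res = minimize(loss, x0=np.array([10.0, 0.2]), jac=loss_grad, method="CG")
print(res.x)                                         # fitted parameters (a, b)
```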


Author(s): Morteza Kimiaei

Abstract This paper discusses an active-set trust-region algorithm for bound-constrained optimization problems. A sufficient descent condition is used as a computational measure to identify whether the function value is reduced or not. To obtain our complexity result, a criticality measure is used which is computationally better than the other known criticality measures. Under positive definiteness of the approximated Hessian matrices restricted to the subspace of non-active variables, it is shown that unlimited zigzagging cannot occur. Our algorithm is shown to be competitive with state-of-the-art solvers on an ill-conditioned bound-constrained least-squares problem.
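The abstract does not state the authors' computational measure; for orientation only, the textbook trust-region acceptance test that such measures refine looks as follows (a sketch, not the paper's actual criterion).

```python
def accept_step(f_x, f_trial, predicted_decrease, eta=0.1):
    """Textbook trust-region test: ratio of actual to model-predicted
    reduction. A sketch for orientation; the paper's sufficient-descent
    measure may differ."""
    rho = (f_x - f_trial) / max(predicted_decrease, 1e-16)  # guard tiny denominators
    return rho >= eta  # accept the trial point when the model was trustworthy
```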


2021 · Vol 2021 · pp. 1-15
Author(s): Ahmad Alhawarat, Zabidin Salleh, Ibtisam A. Masmali

The conjugate gradient method is a useful tool for solving large- and small-scale unconstrained optimization problems, and it can be applied in many fields, such as engineering, medical research, and computer science. In this paper, a convex combination of two different search directions is proposed. The new combination satisfies the sufficient descent condition, and its convergence analysis is given. Moreover, a new conjugate gradient formula is proposed; the new formula satisfies the convergence properties along with the descent property related to the Hestenes–Stiefel conjugate gradient formula. The numerical results show that the new search direction outperforms both of the search directions from which it is formed as a convex combination, in terms of the number of iterations, function evaluations, and central processing unit time. Finally, we present some examples of image restoration as an application of the proposed conjugate gradient method.
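Generically, a direction of the kind described here takes the convex-combination form

```latex
d_k = \theta_k\, d_k^{(1)} + (1-\theta_k)\, d_k^{(2)},
\qquad \theta_k \in [0,1],
```

with θ_k chosen so that the sufficient descent condition holds; the Hestenes–Stiefel coefficient referenced above is the standard

```latex
\beta_k^{HS} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k},
\qquad y_k = g_{k+1} - g_k.
```

The abstract does not specify the two constituent directions, so they are left abstract here.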


2021 · Vol 5 (1) · pp. 47
Author(s): Sindy Devila, Maulana Malik, Wed Giyarti

In this paper, we propose a new hybrid coefficient for the conjugate gradient (CG) method for solving unconstrained optimization models. The new coefficient combines parts of the MMSIS (Malik et al., 2020) and PRP (Polak, Ribière & Polyak, 1969) coefficients. Under the exact line search, the search direction of the new method satisfies the sufficient descent condition, and under certain assumptions we establish its global convergence properties. Numerical results on a set of test functions show that the proposed method is more efficient than the MMSIS method. Moreover, the new method can be used to solve the problem of minimizing portfolio selection risk.
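Of the two constituent coefficients, only the classical PRP formula is standard enough to quote; the MMSIS coefficient is specific to Malik et al. (2020) and is not reproduced here:

```latex
\beta_k^{PRP} = \frac{g_{k+1}^{\top}\,(g_{k+1} - g_k)}{\lVert g_k \rVert^{2}}.
```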


Author(s): Chergui Ahmed, Bouali Tahar

In this paper, we propose a new nonlinear conjugate gradient method (FRA) that satisfies the sufficient descent condition and achieves global convergence under the inexact strong Wolfe-Powell line search. Our numerical experiments show the efficiency of the new method in solving a set of problems from the CUTEst package; the proposed formula gives excellent numerical results in terms of CPU time, number of iterations, and number of gradient evaluations when compared to the WYL, DY, PRP, and FR methods.
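For reference, the strong Wolfe-Powell conditions on the step size α_k, and the classical FR coefficient used in the comparison, are the standard

```latex
f(x_k + \alpha_k d_k) \le f(x_k) + \delta\,\alpha_k\, g_k^{\top} d_k,
\qquad
\lvert g(x_k + \alpha_k d_k)^{\top} d_k \rvert \le \sigma\,\lvert g_k^{\top} d_k \rvert,
\qquad 0 < \delta < \sigma < 1,
\qquad
\beta_k^{FR} = \frac{\lVert g_{k+1} \rVert^{2}}{\lVert g_k \rVert^{2}}.
```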


Author(s): Ladan Arman, Yuanming Xu, Long Liping

Abstract In this paper, based on the efficient Conjugate Descent (CD) method, two generalized CD algorithms are proposed to solve unconstrained optimization problems. These are three-term conjugate gradient methods whose generated directions, built from the conjugate gradient parameters and independently of the line search, satisfy the sufficient descent condition. Furthermore, under the strong Wolfe line search, the global convergence of the proposed methods is proved. Preliminary numerical results on the CUTEst collection are also presented to show the effectiveness of our methods.
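The Conjugate Descent coefficient of Fletcher that these generalizations build on, together with a generic three-term direction of the kind described, reads

```latex
\beta_k^{CD} = \frac{\lVert g_{k+1} \rVert^{2}}{-\,d_k^{\top} g_k},
\qquad
d_{k+1} = -g_{k+1} + \beta_k d_k + \theta_k y_k,
```

where θ_k is the method-specific third-term parameter; the paper's particular choices are not reproduced here.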

