exact line search
Recently Published Documents


TOTAL DOCUMENTS: 49 (FIVE YEARS 18)

H-INDEX: 5 (FIVE YEARS 1)

Author(s):  
Amina Boumediene ◽  
Tahar Bechouat ◽  
Rachid Benzine ◽  
Ghania Hadji

The nonlinear conjugate gradient method (CGM) is a very effective approach for solving large-scale optimization problems. Zhang et al. proposed a new CG coefficient defined by [Formula: see text]. They proved the sufficient descent condition and global convergence for nonconvex minimization under the strong Wolfe line search. In this paper, we prove that this CG coefficient possesses the sufficient descent condition and global convergence properties under the exact line search.
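As a point of reference for the exact line search the abstract relies on, here is a minimal sketch of the closed-form exact step for a convex quadratic, together with a check of the orthogonality property it yields; the function name and the small test problem are illustrative and are not taken from the paper.

```python
import numpy as np

def exact_step_quadratic(A, g, d):
    """Exact line search step for f(x) = 0.5 x'Ax - b'x along direction d:
    alpha = argmin_a f(x + a*d) = -(g'd) / (d'Ad), where g is the gradient
    at the current point.  After this step the new gradient is orthogonal
    to d, which is the property many CG convergence proofs rely on."""
    return -(g @ d) / (d @ A @ d)

# Small check of the orthogonality property on a 2x2 SPD quadratic.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = np.zeros(2)
g = A @ x - b
d = -g
alpha = exact_step_quadratic(A, g, d)
g_new = A @ (x + alpha * d) - b
print(abs(g_new @ d) < 1e-12)   # True: exact line search makes g_{k+1}'d_k = 0
```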


2021 ◽  
Vol 2 (2) ◽  
pp. 69
Author(s):  
Nasiru Salihu ◽  
Mathew Remilekun Odekunle ◽  
Mohammed Yusuf Waziri ◽  
Abubakar Sani Halilu ◽  
Suraj Salihu

One of today's best-performing CG methods is the Dai-Liao (DL) method, which depends on a non-negative parameter and a conjugacy condition for its computation. Although numerous optimal selections for the parameter have been suggested, its best choice remains a subject of consideration. The pure conjugacy condition adopts an exact line search for numerical experiments and convergence analysis, whereas practical computation typically uses an inexact line search to find the step size. To avoid such drawbacks, Dai and Liao replaced the earlier conjugacy condition with an extended conjugacy condition. This paper therefore suggests a new hybrid CG method that combines the strengths of the Liu-Storey (LS) and Conjugate Descent (CD) CG methods while retaining an optimal choice of the Dai-Liao parameter. The theoretical analysis shows that the search direction of the new CG scheme is descent and satisfies the sufficient descent condition when the iterates jam, under the strong Wolfe line search. The algorithm is shown to converge globally under standard assumptions. Numerical experiments using the Dolan and Moré performance profile on 250 unconstrained problems demonstrate that the proposed method is more robust and promising than some known methods. The tested CG algorithms were also assessed numerically on sparse signal reconstruction and image restoration in compressive sensing, file restoration, image and video coding, and other applications. The results show that these CG schemes are comparable and can be applied in different settings, such as temperature, fire, seismic, and humidity sensors deployed in forests using wireless sensor network techniques.
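The abstract does not reproduce the hybrid formula itself; the sketch below only illustrates one common way of combining the Liu-Storey (LS) and Conjugate Descent (CD) coefficients, with the clipping rule max(0, min(LS, CD)) assumed purely for illustration rather than taken from the paper.

```python
import numpy as np

def hybrid_ls_cd_beta(g_new, g_old, d_old):
    """Illustrative hybrid of the Liu-Storey (LS) and Conjugate Descent (CD)
    coefficients, which share the denominator -d_{k-1}'g_{k-1}.  The clipping
    rule max(0, min(LS, CD)) is an assumed hybridization pattern, not the
    paper's Dai-Liao-type combination."""
    y = g_new - g_old
    denom = -(d_old @ g_old)            # common denominator of LS and CD
    beta_ls = (g_new @ y) / denom       # Liu-Storey coefficient
    beta_cd = (g_new @ g_new) / denom   # Conjugate Descent coefficient
    return max(0.0, min(beta_ls, beta_cd))

# Example call with toy gradients and a steepest-descent initial direction.
g0 = np.array([1.0, -2.0]); g1 = np.array([0.5, -1.0]); d0 = -g0
print(hybrid_ls_cd_beta(g1, g0, d0))
```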


2021 ◽  
Vol 5 (1) ◽  
pp. 47
Author(s):  
Sindy Devila ◽  
Maulana Malik ◽  
Wed Giyarti

In this paper, we propose a new hybrid coefficient for the conjugate gradient (CG) method for solving unconstrained optimization models. The new coefficient combines parts of the MMSIS (Malik et al., 2020) and PRP (Polak, Ribière & Polyak, 1969) coefficients. Under the exact line search, the search direction of the new method satisfies the sufficient descent condition and, under certain assumptions, we establish its global convergence properties. Using some test functions, numerical results show that the proposed method is more efficient than the MMSIS method. Besides, the new method can be used to solve the problem of minimizing portfolio selection risk.
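For context, a minimal sketch of the PRP ingredient of the hybrid is given below; the MMSIS coefficient of Malik et al. (2020) and the exact way the two coefficients are combined are not given in the abstract, so only the classical PRP piece is shown.

```python
import numpy as np

def beta_prp(g_new, g_old):
    """Polak-Ribiere-Polyak (PRP) coefficient, one ingredient of the hybrid:
    beta_k = g_k'(g_k - g_{k-1}) / ||g_{k-1}||^2."""
    return (g_new @ (g_new - g_old)) / (g_old @ g_old)

# Example call with toy gradients.
g0 = np.array([1.0, -2.0]); g1 = np.array([0.5, -1.0])
print(beta_prp(g1, g0))
```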


MATEMATIKA ◽  
2020 ◽  
Vol 36 (3) ◽  
pp. 197-207
Author(s):  
Nurul Hafawati Fadhilah ◽  
Mohd Rivaie ◽  
Fuziyah Ishak ◽  
Nur Idalisa

Conjugate Gradient (CG) methods have an important role in solving large-scale unconstrained optimization problems. Nowadays, the Three-Term CG method has become a research trend among CG methods. However, the existing Three-Term CG methods could only be used with an inexact line search; when the exact line search is applied, a Three-Term CG method reduces to the standard CG method. Hence, in this paper, a new Three-Term CG method that can be used with the exact line search is proposed. This new Three-Term CG method satisfies the descent condition under the exact line search. Performance profiles based on numerical results show that the proposed method outperforms the well-known classical CG method and some related hybrid methods. In addition, the proposed method is also robust in terms of number of iterations and CPU time.
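For orientation, a generic three-term direction of the form d_k = -g_k + beta_k d_{k-1} + theta_k y_{k-1} is sketched below; the specific coefficients that keep the third term alive under the exact line search are the paper's contribution and are not reproduced in the abstract, so beta and theta are left as inputs.

```python
import numpy as np

def three_term_direction(g_new, g_old, d_old, beta, theta):
    """Generic three-term CG direction
        d_k = -g_k + beta * d_{k-1} + theta * y_{k-1},  y_{k-1} = g_k - g_{k-1}.
    beta and theta are caller-supplied placeholders for the paper's coefficients."""
    y = g_new - g_old
    return -g_new + beta * d_old + theta * y

# Example call with toy gradients and placeholder coefficients.
g0 = np.array([1.0, -2.0]); g1 = np.array([0.5, -1.0]); d0 = -g0
print(three_term_direction(g1, g0, d0, beta=0.2, theta=0.1))
```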


2020 ◽  
Vol 64 (4) ◽  
pp. 483-503
Author(s):  
Xiaona Ma ◽  
Guanghe Liang ◽  
Shanhui Xu ◽  
Zhiyuan Li ◽  
Haixin Feng

Mathematics ◽  
2020 ◽  
Vol 8 (6) ◽  
pp. 977
Author(s):  
Siti Farhana Husin ◽  
Mustafa Mamat ◽  
Mohd Asrul Hery Ibrahim ◽  
Mohd Rivaie

This study employs exact line search iterative algorithms for solving large-scale unconstrained optimization problems in which the direction is a three-term modification of an iterative method with two different scaled parameters. The objective of this research is to identify the effectiveness of the new directions both theoretically and numerically. The sufficient descent property and global convergence analysis of the suggested methods are established. For numerical experiment purposes, the methods are compared with a previous well-known three-term iterative method, and each method is evaluated over the same set of test problems with different initial points. Numerical results show that the proposed three-term methods are more efficient than, and superior to, the existing method. These methods can also produce an approximate linear regression equation to solve a regression model. The findings of this study can help provide a better understanding of the applicability of numerical algorithms in estimating regression models.
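To illustrate the regression application mentioned above, the sketch below fits a least-squares line with a CG-type iteration and an exact line search, which has a closed form for this quadratic objective; the Fletcher-Reeves coefficient stands in for the paper's three-term scaled directions, which are not given in the abstract, and the synthetic data are illustrative.

```python
import numpy as np

def fit_linear_regression_cg(X, y, max_iter=50, tol=1e-10):
    """Least-squares regression via a CG-type iteration with an exact line
    search.  The objective 0.5*||Xw - y||^2 is quadratic, so the exact step
    length is -(g'd)/||Xd||^2; the Fletcher-Reeves coefficient is used as a
    stand-in for the paper's three-term scaled directions."""
    w = np.zeros(X.shape[1])
    g = X.T @ (X @ w - y)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        Xd = X @ d
        alpha = -(g @ d) / (Xd @ Xd)        # exact line search for least squares
        w = w + alpha * d
        g_new = X.T @ (X @ w - y)
        beta = (g_new @ g_new) / (g @ g)    # Fletcher-Reeves placeholder
        d = -g_new + beta * d
        g = g_new
    return w

# Usage: recover intercept and slope from noisy synthetic data.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 100)
y = 2.0 + 0.5 * x + rng.normal(0, 0.1, 100)
X = np.column_stack([np.ones_like(x), x])
print(fit_linear_regression_cg(X, y))        # approximately [2.0, 0.5]
```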


Author(s):  
Sulaiman Mohammed Ibrahim ◽  
Usman Abbas Yakubu ◽  
Mustafa Mamat

Conjugate gradient (CG) methods are among the most efficient numerical methods for solving unconstrained optimization problems, owing to their simplicity and low computational cost in solving large-scale nonlinear problems. In this paper, we propose some spectral CG methods built on the classical CG search direction. The proposed methods are applied to real-life problems in regression analysis. Their convergence proof is established under the exact line search. Numerical results show that the proposed methods are efficient and promising.
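A generic spectral CG direction d_k = -theta_k g_k + beta_k d_{k-1} is sketched below; the Barzilai-Borwein-type spectral parameter is an assumption used only for illustration and is not the paper's actual choice.

```python
import numpy as np

def spectral_cg_direction(g_new, g_old, d_old, s_old, beta):
    """Generic spectral CG direction d_k = -theta_k * g_k + beta_k * d_{k-1}.
    theta_k = (s's)/(s'y), a Barzilai-Borwein-type value, is used purely as
    an illustrative spectral parameter; s_{k-1} = x_k - x_{k-1}."""
    y = g_new - g_old
    theta = (s_old @ s_old) / (s_old @ y)   # illustrative spectral parameter
    return -theta * g_new + beta * d_old

# Example call with toy quantities.
g0 = np.array([1.0, -2.0]); g1 = np.array([0.5, -1.0])
d0 = -g0; s0 = 0.1 * d0                     # previous step x_k - x_{k-1}
print(spectral_cg_direction(g1, g0, d0, s0, beta=0.2))
```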


Author(s):  
Nur Syarafina Mohamed ◽  
Mustafa Mamat ◽  
Mohd Rivaie ◽  
Shazlyn Milleana Shaharudin

Hybridization is one of the popular approaches to modifying the conjugate gradient method. In this paper, a new hybrid conjugate gradient method is suggested and analyzed in which the parameter [Formula: see text] is evaluated as a convex combination of [Formula: see text] while using the exact line search. The proposed method is shown to possess both the sufficient descent and global convergence properties. Numerical performance shows that the proposed method is promising and has outperformed other hybrid conjugate gradient methods in its number of iterations and central processing unit (CPU) time.
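Since the two coefficients entering the convex combination are not recoverable from the abstract, the sketch below only illustrates the convex-combination pattern itself; beta_a, beta_b, and the mixing parameter lam are placeholders rather than the paper's quantities.

```python
def convex_combination_beta(beta_a, beta_b, lam):
    """Hybrid coefficient beta = (1 - lam)*beta_a + lam*beta_b with lam in [0, 1].
    beta_a and beta_b stand for two classical CG coefficients; how lam is
    chosen in the paper is not reproduced here."""
    lam = min(max(lam, 0.0), 1.0)    # keep the combination convex
    return (1.0 - lam) * beta_a + lam * beta_b

# Example: mix two placeholder coefficient values.
print(convex_combination_beta(0.3, 0.7, lam=0.25))   # 0.4
```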


2020 ◽  
Vol 41 ◽  
pp. 101073 ◽  
Author(s):  
Wentao Xiang ◽  
Ahmad Karfoul ◽  
Chunfeng Yang ◽  
Huazhong Shu ◽  
Régine Le Bouquin Jeannès
