inexact line search
Recently Published Documents

TOTAL DOCUMENTS: 58 (FIVE YEARS 24)
H-INDEX: 10 (FIVE YEARS 2)

2022 ◽  
Vol 2022 (1) ◽  
Author(s):  
Ibrahim Mohammed Sulaiman ◽  
Maulana Malik ◽  
Aliyu Muhammed Awwal ◽  
Poom Kumam ◽  
Mustafa Mamat ◽  
...  

Abstract: The three-term conjugate gradient (CG) algorithms are among the efficient variants of CG algorithms for solving optimization models, owing to their simplicity and low memory requirements. The regression model, on the other hand, is a statistical relationship model whose solution is obtained by a least-squares method, including CG-like methods. In this paper, we present a modification of a three-term conjugate gradient method for unconstrained optimization models and establish its global convergence under an inexact line search. The proposed method is then extended to formulate a regression model for the novel coronavirus (COVID-19); the study uses the globally infected cases from January to October 2020 to parameterize the model. Preliminary results show that the proposed method is promising and produces an efficient regression model for the COVID-19 pandemic. The method is also extended to solve a motion control problem involving a two-joint planar robot.
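A generic three-term CG iteration with a backtracking (Armijo) inexact line search can be sketched as follows. This is a minimal sketch, not the authors' specific modification: the `beta`/`theta` formulas below follow a standard Hestenes-Stiefel-type three-term scheme (which guarantees the sufficient descent property g'd = -||g||²), and `three_term_cg` and `armijo` are illustrative names.

```python
import numpy as np

def armijo(f, g, x, d, c1=1e-4, rho=0.5, alpha=1.0):
    """Backtracking (inexact) line search satisfying the Armijo condition."""
    fx, gxd = f(x), g(x) @ d
    while f(x + alpha * d) > fx + c1 * alpha * gxd and alpha > 1e-12:
        alpha *= rho
    return alpha

def three_term_cg(f, g, x0, tol=1e-8, max_iter=500):
    """Generic three-term CG: d_{k+1} = -g_{k+1} + beta*d_k - theta*y_k."""
    x = x0.copy()
    gk = g(x)
    d = -gk
    for _ in range(max_iter):
        if np.linalg.norm(gk) < tol:
            break
        alpha = armijo(f, g, x, d)
        x_new = x + alpha * d
        g_new = g(x_new)
        y = g_new - gk
        denom = d @ y
        if abs(denom) < 1e-12:
            beta, theta = 0.0, 0.0          # restart with steepest descent
        else:
            beta = (g_new @ y) / denom      # Hestenes-Stiefel-type coefficient
            theta = (g_new @ d) / denom     # third-term coefficient
        d = -g_new + beta * d - theta * y   # satisfies g_new @ d = -||g_new||^2
        x, gk = x_new, g_new
    return x

# Smoke test on a strictly convex quadratic f(x) = 0.5 x'Ax - b'x.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
g = lambda x: A @ x - b
x_star = three_term_cg(f, g, np.zeros(2))
```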


2021 ◽  
Vol 2021 ◽  
pp. 1-9
Author(s):  
Zhujun Wang ◽  
Li Cai

We propose a class of inexact secant methods combined with the line search filter technique for solving nonlinear equality constrained optimization. Compared with other filter methods that incorporate the line search techniques used in most large-scale optimization problems, the inexact line search filter algorithm is more flexible and practical to implement. In this paper, we focus on analyzing the local superlinear convergence rate of the algorithms, while their global convergence properties can be obtained by analogy with our previous work. These methods have been implemented in a Matlab code, and detailed numerical results indicate that the proposed algorithms are efficient on 43 problems from the CUTEr test set.


2021 ◽  
Vol 2 (2) ◽  
pp. 69
Author(s):  
Nasiru Salihu ◽  
Mathew Remilekun Odekunle ◽  
Mohammed Yusuf Waziri ◽  
Abubakar Sani Halilu ◽  
Suraj Salihu

One of today's best-performing CG methods is the Dai-Liao (DL) method, which depends on a non-negative parameter and a conjugacy condition for its computation. Although numerous optimal selections for the parameter have been suggested, its best choice remains a subject of study. The pure conjugacy condition assumes an exact line search in the numerical experiments and convergence analysis, whereas practical computation uses an inexact line search to find the step size. To avoid this drawback, Dai and Liao replaced the earlier conjugacy condition with an extended conjugacy condition. This paper therefore suggests a new hybrid CG method that combines the strengths of the Liu-Storey and Conjugate Descent CG methods while retaining an optimal choice of the Dai-Liao parameter. The theoretical analysis shows that the search direction of the new CG scheme is a descent direction and satisfies the sufficient descent condition when the iterates jam under the strong Wolfe line search. The algorithm is shown to converge globally under standard assumptions. Numerical experiments using the Dolan-Moré performance profile on 250 unconstrained problems demonstrate that the proposed method is more robust and promising than some known methods. The tested CG algorithms were also assessed on sparse signal reconstruction and image restoration in compressive sensing problems, file restoration, image and video coding, and other applications. The results show that these CG schemes are comparable and can be applied in different fields, such as temperature, fire, seismic, and humidity sensors in forests, using wireless sensor network techniques.
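The Dai-Liao search direction referred to above can be sketched directly from the extended conjugacy condition d'y = -t g's. This is a generic illustration (the paper's specific hybrid β and optimal t are not reproduced); `beta_dai_liao` and `dl_direction` are hypothetical names.

```python
import numpy as np

def beta_dai_liao(g_new, y, s, d, t=0.1):
    """Dai-Liao CG parameter derived from the extended conjugacy condition
        d_{k+1}^T y_k = -t * g_{k+1}^T s_k:
        beta_DL = (g_{k+1}^T y_k - t * g_{k+1}^T s_k) / (d_k^T y_k),
    where y_k = g_{k+1} - g_k, s_k = x_{k+1} - x_k, and t >= 0 is the
    (method-dependent) Dai-Liao parameter."""
    return (g_new @ y - t * (g_new @ s)) / (d @ y)

def dl_direction(g_new, y, s, d, t=0.1):
    """Next search direction d_{k+1} = -g_{k+1} + beta_DL * d_k."""
    return -g_new + beta_dai_liao(g_new, y, s, d, t) * d
```

By construction the returned direction satisfies the extended conjugacy condition exactly, which is easy to verify numerically for any vectors with d'y ≠ 0.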


Author(s):  
Ibrahim Mohammed Sulaiman ◽  
Norsuhaily Abu Bakar ◽  
Mustafa Mamat ◽  
Basim A. Hassan ◽  
Maulana Malik ◽  
...  

The hybrid conjugate gradient (CG) method is among the efficient variants of the CG method for solving optimization problems, owing to its low memory requirements and nice convergence properties. In this paper, we present an efficient hybrid CG method for solving unconstrained optimization models and show that the method satisfies the sufficient descent condition. The global convergence proof of the proposed method is established under an inexact line search. An application of the proposed method to the well-known statistical regression model describing the global outbreak of the novel COVID-19 is presented. The study parameterized the model using the weekly increase/decrease of recorded cases from December 30, 2019 to March 30, 2020. Preliminary numerical results on some unconstrained optimization problems show that the proposed method is efficient and promising. Furthermore, the proposed method produced a good regression equation for confirmed COVID-19 cases globally.
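Fitting a regression model with a CG-type method amounts to minimizing a least-squares objective. A minimal sketch, assuming synthetic data rather than the paper's COVID-19 dataset or its specific hybrid CG variant, is the linear conjugate gradient applied to the normal equations of a quadratic trend fit (`cg_least_squares` is an illustrative name):

```python
import numpy as np

def cg_least_squares(X, y, tol=1e-10, max_iter=1000):
    """Fit w minimizing ||X w - y||^2 by the linear conjugate gradient
    method applied to the normal equations X^T X w = X^T y."""
    A, b = X.T @ X, X.T @ y
    w = np.zeros(X.shape[1])
    r = b - A @ w          # residual of the normal equations
    d = r.copy()           # initial search direction
    for _ in range(max_iter):
        if np.linalg.norm(r) <= tol * max(np.linalg.norm(b), 1.0):
            break
        Ad = A @ d
        alpha = (r @ r) / (d @ Ad)         # exact step along d
        w += alpha * d
        r_new = r - alpha * Ad
        beta = (r_new @ r_new) / (r @ r)   # Fletcher-Reeves-type update
        d = r_new + beta * d
        r = r_new
    return w

# Hypothetical illustration: quadratic trend fitted to synthetic counts.
t = np.linspace(0.0, 1.0, 12)
counts = 5.0 + 2.0 * t + 0.3 * t**2
X = np.column_stack([np.ones_like(t), t, t**2])
w = cg_least_squares(X, counts)   # recovers [5.0, 2.0, 0.3]
```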


Author(s):  
Chergui Ahmed ◽  
Bouali Tahar

In this paper, we propose a new nonlinear conjugate gradient method (FRA) that satisfies the sufficient descent condition and achieves global convergence under the strong Wolfe-Powell inexact line search. Our numerical experiments show the efficiency of the new method in solving a set of problems from the CUTEst package; the proposed formula gives excellent numerical results in terms of CPU time, number of iterations, and number of gradient evaluations when compared with the WYL, DY, PRP, and FR methods.
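The strong Wolfe-Powell conditions mentioned above are a pair of checks on a trial step size: a sufficient-decrease (Armijo) inequality and a curvature bound on the absolute directional derivative. A minimal checker can be sketched as follows (`satisfies_strong_wolfe` is an illustrative name; c1 and c2 are the usual Wolfe constants with 0 < c1 < c2 < 1):

```python
import numpy as np

def satisfies_strong_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.1):
    """Check the strong Wolfe(-Powell) conditions for step size alpha:
       f(x + a d) <= f(x) + c1 * a * grad(x)^T d    (sufficient decrease)
       |grad(x + a d)^T d| <= c2 * |grad(x)^T d|    (curvature)"""
    gd0 = grad(x) @ d
    sufficient_decrease = f(x + alpha * d) <= f(x) + c1 * alpha * gd0
    curvature = abs(grad(x + alpha * d) @ d) <= c2 * abs(gd0)
    return sufficient_decrease and curvature

# On f(x) = ||x||^2 from x = (1, 1) along the steepest descent direction,
# the exact one-dimensional minimizer alpha = 0.5 satisfies both conditions.
f = lambda x: x @ x
grad = lambda x: 2.0 * x
x0 = np.array([1.0, 1.0])
d = -grad(x0)
ok = satisfies_strong_wolfe(f, grad, x0, d, alpha=0.5)
```

An overlong step such as alpha = 1.0 overshoots the minimizer and fails both inequalities here.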


2021 ◽  
Vol 2 (1) ◽  
pp. 33
Author(s):  
Nasiru Salihu ◽  
Mathew Remilekun Odekunle ◽  
Also Mohammed Saleh ◽  
Suraj Salihu

Some problems have no analytical solution or are too difficult for scientists, engineers, and mathematicians to solve directly, so the development of numerical methods to obtain approximate solutions became necessary. Gradient methods are most efficient when the function to be minimized is continuously differentiable. This article therefore presents a new hybrid Conjugate Gradient (CG) method for solving unconstrained optimization problems. The method requires only first-order derivatives, overcoming the steepest descent method's shortcoming of slow convergence, and it need not store or compute the second-order derivatives required by Newton's method. The CG update parameter is derived from the Dai-Liao conjugacy condition as a convex combination of the Hestenes-Stiefel and Fletcher-Reeves algorithms, employing an optimal modulating choice of parameter to avoid matrix storage. The numerical computation adopts an inexact line search to obtain a step size that generates the descent property, showing that the algorithm is robust and efficient. The scheme converges globally under the Wolfe line search, and it is likewise suitable for compressive sensing problems and M-tensor systems.
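A convex combination of the Hestenes-Stiefel and Fletcher-Reeves parameters, as described above, can be sketched as follows. The optimal modulating choice of theta derived in the article from the Dai-Liao conjugacy condition is not reproduced here; theta is left as a free input and `beta_hybrid` is an illustrative name.

```python
import numpy as np

def beta_hybrid(g_new, g_old, d, theta):
    """Convex combination of the Hestenes-Stiefel and Fletcher-Reeves
    CG parameters, with modulating parameter 0 <= theta <= 1:
        beta = theta * beta_HS + (1 - theta) * beta_FR,
        beta_HS = g_{k+1}^T y_k / (d_k^T y_k),   y_k = g_{k+1} - g_k,
        beta_FR = ||g_{k+1}||^2 / ||g_k||^2."""
    y = g_new - g_old
    beta_hs = (g_new @ y) / (d @ y)
    beta_fr = (g_new @ g_new) / (g_old @ g_old)
    return theta * beta_hs + (1.0 - theta) * beta_fr
```

The two endpoints recover the pure methods: theta = 1 gives Hestenes-Stiefel and theta = 0 gives Fletcher-Reeves.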


2021 ◽  
Vol 0 (0) ◽  
pp. 0
Author(s):  
Gonglin Yuan ◽  
Zhan Wang ◽  
Pengyuan Li

<p style='text-indent:20px;'>The Broyden family method is one of the most effective methods for solving unconstrained optimization problems. However, the global convergence of the Broyden family method has not been studied sufficiently. In this paper, a new Broyden family method is proposed based on the BFGS formula of Yuan and Wei (Comput. Optim. Appl. 47: 237-255, 2010). The designed algorithm has the following features: (1) a modified Broyden family formula is given; (2) every matrix in the sequence <inline-formula><tex-math id="M1">\begin{document}$ \{B_k\} $\end{document}</tex-math></inline-formula> generated by the new algorithm is positive definite; and (3) the global convergence of the newly presented Broyden family algorithm with the Y-W-L inexact line search is obtained for general functions. Numerical performance shows that the modified Broyden family method is competitive with the classical Broyden family method.</p>
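The classical Broyden family update, of which the paper's formula is a modification not reproduced here, can be sketched as a one-parameter blend of BFGS and DFP. When y'ᵏsᵏ > 0 and Bₖ is positive definite, the BFGS member preserves positive definiteness and every member satisfies the secant condition B_{k+1} s_k = y_k (`broyden_family_update` is an illustrative name):

```python
import numpy as np

def broyden_family_update(B, s, y, phi=0.0):
    """Classical Broyden family quasi-Newton update:
       B+ = B - (B s s^T B)/(s^T B s) + (y y^T)/(y^T s)
            + phi * (s^T B s) * v v^T,
       v  = y/(y^T s) - (B s)/(s^T B s).
    phi = 0 gives BFGS, phi = 1 gives DFP."""
    Bs = B @ s
    sBs = s @ Bs
    ys = y @ s
    v = y / ys - Bs / sBs
    return (B - np.outer(Bs, Bs) / sBs + np.outer(y, y) / ys
            + phi * sBs * np.outer(v, v))

# Since v^T s = 0, the secant condition B+ s = y holds for every phi.
rng = np.random.default_rng(2)
s = rng.standard_normal(3)
y = s + 0.1 * rng.standard_normal(3)   # keeps y^T s > 0 for this sketch
B_bfgs = broyden_family_update(np.eye(3), s, y, phi=0.0)
```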


Author(s):  
Awad Abdelrahman ◽  
Osman Yousif ◽  
Mogtaba Mhammed ◽  
Murtada Elbashir

Nonlinear conjugate gradient (CG) methods are significant for solving large-scale unconstrained optimization problems, providing a vital means of determining the minimum point of the objective function. Many modifications of nonlinear CG methods have been proposed to improve numerical performance and to establish global convergence properties. One of these is the modified CG method proposed by Rivaie et al. (2015). In this paper, we modify their work so as to obtain efficient numerical performance and global convergence properties. Owing to the widespread practical use of the strong Wolfe line search, our modified method employs it. A numerical experiment is performed to show the performance of the modified method in practice.

KEYWORDS: Unconstrained optimization; conjugate gradient method; sufficient descent property; global convergence

