A comparative study of the PL homotopy and BFGS methods for some nonsmooth optimization problems

2019 ◽  
Vol 28 (2) ◽  
pp. 97-104
Author(s):  
ANDREI BOZANTAN ◽  
VASILE BERINDE

We consider some non-smooth functions and investigate the numerical behavior of the Piecewise Linear Homotopy (PLH) method ([Bozântan, A., An implementation of the piecewise-linear homotopy algorithm for the computation of fixed points, Creat. Math. Inform., 19 (2010), No. 2, 140–148] and [Bozântan, A. and Berinde, V., Applications of the PL homotopy algorithm for the computation of fixed points to unconstrained optimization problems, Creat. Math. Inform., 22 (2013), No. 1, 41–46]). We compare the PLH method with BFGS with inexact line search, a quasi-Newton method whose behavior on nonsmooth problems is reported in [Lewis, A. S. and Overton, M. L., Nonsmooth optimization via BFGS, submitted to SIAM J. Optim., (2009)]. For most of the cases considered, the characteristics of the PLH method are quite similar to those of the BFGS method: the PLH method converges to local minimum values and the convergence rate appears to be linear with respect to the number of function evaluations. However, we also identify some issues with the PLH method.
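
For context, the BFGS iteration the abstract compares against can be sketched as follows. This is a minimal textbook BFGS with an Armijo backtracking (inexact) line search, not the implementation of Lewis and Overton nor the code used in the study; the Rosenbrock test function is an illustrative choice.

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-8, max_iter=200):
    """Textbook BFGS with a backtracking (Armijo) inexact line search."""
    x = np.asarray(x0, dtype=float)
    n = len(x)
    H = np.eye(n)                      # inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                     # quasi-Newton search direction
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + 1e-4 * t * (g @ d):   # Armijo condition
            t *= 0.5
        s = t * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:                 # curvature condition; skip update otherwise
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Example: the (smooth) Rosenbrock function, minimized at (1, 1)
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
x_star = bfgs(f, grad, [-1.2, 1.0])
```

On nonsmooth functions the line search and curvature condition can both fail near a kink, which is exactly the regime the comparison above probes.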

2013 ◽  
Vol 22 (1) ◽  
pp. 41-46
Author(s):  
ANDREI BOZANTAN ◽  
VASILE BERINDE

This paper describes the main aspects of the "piecewise-linear homotopy method" for fixed point approximation proposed by Eaves and Saigal [Eaves, B. C. and Saigal, R., Homotopies for computation of fixed points on unbounded regions, Mathematical Programming, 3 (1972), No. 1, 225–237]. An implementation of the method is developed using the modern programming language C# and is then used for solving some unconstrained optimization problems. The PL homotopy algorithm appears to be more reliable than the classical Newton method for the problem of finding a local minimum of Schwefel's function and for other optimization problems.


2013 ◽  
Vol 2013 ◽  
pp. 1-10
Author(s):  
Hamid Reza Erfanian ◽  
M. H. Noori Skandari ◽  
A. V. Kamyad

We present a new approach, based on the generalized derivative, for solving nonsmooth optimization problems and systems of nonsmooth equations. For this purpose, we introduce the first-order generalized Taylor expansion of nonsmooth functions and replace the nonsmooth functions with smooth ones. In other words, a nonsmooth function is approximated by a piecewise linear function based on the generalized derivative. In the next step, we solve a smooth linear optimization problem whose optimal solution is an approximate solution of the main problem. We then apply the results to solving systems of nonsmooth equations. Finally, some numerical examples are presented to demonstrate the efficiency of our approach.
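
The core idea, replacing a nonsmooth objective by a piecewise-linear model and minimizing the model instead, can be illustrated in a few lines. This is only a sketch of the general principle on a toy one-dimensional function; it is not the authors' generalized-derivative construction, and the grid-based model is an assumption made for illustration.

```python
import numpy as np

# Toy nonsmooth objective: f(x) = |x - 1| + |x|, minimized on [0, 1] with value 1.
f = lambda x: np.abs(x - 1.0) + np.abs(x)

# Piecewise-linear surrogate: linear interpolation between grid breakpoints.
grid = np.linspace(-2.0, 3.0, 501)     # breakpoints of the PL model
values = f(grid)
x_model = grid[np.argmin(values)]      # minimizer of the piecewise-linear model
```

Minimizing such a PL model reduces to a linear program over the breakpoint segments, which is the smooth linear optimization problem the abstract refers to.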


2018 ◽  
Vol 7 (3.28) ◽  
pp. 36
Author(s):  
Norrlaili Shapiee ◽  
Mohd Rivaie ◽  
Mustafa Mamat ◽  
Puspa Liza Ghazali

Conjugate gradient (CG) methods are widely used for solving unconstrained optimization problems, particularly large-scale problems, and have attracted growing interest in fields such as engineering. In this paper, we propose a new family of CG coefficients and apply it to regression analysis. Global convergence is established under both exact and inexact line searches. Numerical results are presented in terms of the number of iterations and CPU time. The findings show that our method is more efficient than some previous CG methods on a set of standard test problems and successfully solves the real-life problem.
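
The generic nonlinear CG iteration that such coefficients plug into can be sketched as follows. Since the abstract does not give the new coefficient, the classical Fletcher–Reeves formula is used here purely as a stand-in, with a simple backtracking line search; this is not the proposed family.

```python
import numpy as np

def cg_fr(f, grad, x0, tol=1e-6, max_iter=5000):
    """Nonlinear CG with the Fletcher-Reeves coefficient as a stand-in."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + 1e-4 * t * (g @ d) and t > 1e-12:
            t *= 0.5                      # backtracking (Armijo) step
        x = x + t * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)  # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        if g_new @ d >= 0:                # restart if not a descent direction
            d = -g_new
        g = g_new
    return x

# Example: an ill-conditioned convex quadratic, minimized at the origin
A = np.diag([1.0, 10.0])
quad = lambda x: 0.5 * x @ A @ x
quad_grad = lambda x: A @ x
x_min = cg_fr(quad, quad_grad, [3.0, -2.0])
```

A new CG family typically differs only in the `beta` line; the rest of the iteration is unchanged.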


2019 ◽  
Vol 38 (7) ◽  
pp. 227-231
Author(s):  
Huda Younus Najm ◽  
Eman T. Hamed ◽  
Huda I. Ahmed

In this study, we propose a new parameter for the conjugate gradient method. It is shown that the new method fulfils the sufficient descent condition under the strong Wolfe conditions when an inexact line search is used. Numerical results also show that the suggested method outperforms other standard conjugate gradient methods.


2021 ◽  
Vol 0 (0) ◽  
pp. 0
Author(s):  
Gonglin Yuan ◽  
Zhan Wang ◽  
Pengyuan Li

The Broyden family method is one of the most effective methods for solving unconstrained optimization problems. However, the global convergence of the Broyden family method has not been sufficiently studied. In this paper, a new Broyden family method is proposed based on the BFGS formula of Yuan and Wei (Comput. Optim. Appl. 47: 237–255, 2010). The designed algorithm has the following features: (1) a modified Broyden family formula is given; (2) every matrix sequence $\{B_k\}$ generated by the new algorithm is positive definite; and (3) the global convergence of the new Broyden family algorithm with the Y-W-L inexact line search is obtained for general functions. Numerical performance shows that the modified Broyden family method is competitive with the classical Broyden family method.
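
For reference, the classical Broyden-family update that the modified formula builds on can be written as a one-parameter blend of BFGS ($\phi = 0$) and DFP ($\phi = 1$). The sketch below is the standard textbook family, not the paper's modified formula; the example vectors are arbitrary choices satisfying the curvature condition $s^\top y > 0$.

```python
import numpy as np

def broyden_family_update(B, s, y, phi=0.0):
    """One classical Broyden-family update of the Hessian approximation B.
    phi = 0 gives BFGS, phi = 1 gives DFP."""
    Bs = B @ s
    sBs = s @ Bs
    sy = s @ y
    v = y / sy - Bs / sBs
    return (B - np.outer(Bs, Bs) / sBs
              + np.outer(y, y) / sy
              + phi * sBs * np.outer(v, v))

# For phi in [0, 1] the update preserves positive-definiteness whenever
# the curvature condition s^T y > 0 holds, and it satisfies the secant
# equation B_new @ s == y for every phi.
B = np.eye(3)
s = np.array([1.0, 0.5, -0.2])
y = np.array([0.8, 0.7, 0.1])      # s @ y = 1.13 > 0
B_new = broyden_family_update(B, s, y, phi=0.5)
```

The paper's contribution is a modification of this formula that keeps $\{B_k\}$ positive definite and yields global convergence for general functions.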


Author(s):  
Fanar N. Jardow ◽  
Ghada M. Al-Naemi

Many researchers are interested in developing and improving the conjugate gradient method for solving large-scale unconstrained optimization problems. In this work, a new parameter is presented as a convex combination of RMIL and MMWU. The suggested method always produces a descent search direction at each iteration. Under the strong Wolfe-Powell (SWP) line search conditions, the global convergence of the proposed method is established. Preliminary numerical comparisons with some other CG methods show that the new method is efficient and robust in solving all of the given problems.


Author(s):  
Ibrahim Mohammed Sulaiman ◽  
Norsuhaily Abu Bakar ◽  
Mustafa Mamat ◽  
Basim A. Hassan ◽  
Maulana Malik ◽  
...  

The hybrid conjugate gradient (CG) method is among the efficient variants of the CG method for solving optimization problems, owing to its low memory requirements and nice convergence properties. In this paper, we present an efficient hybrid CG method for solving unconstrained optimization models and show that the method satisfies the sufficient descent condition. The global convergence of the proposed method is established under an inexact line search. An application of the proposed method to the well-known statistical regression model describing the global outbreak of the novel COVID-19 is presented. The study parameterized the model using the weekly increase/decrease of recorded cases from December 30, 2019 to March 30, 2020. Preliminary numerical results on some unconstrained optimization problems show that the proposed method is efficient and promising. Furthermore, the proposed method produced a good regression equation for COVID-19 confirmed cases globally.


MATEMATIKA ◽  
2020 ◽  
Vol 36 (3) ◽  
pp. 197-207
Author(s):  
Nurul Hafawati Fadhilah ◽  
Mohd Rivaie ◽  
Fuziyah Ishak ◽  
Nur Idalisa

Conjugate gradient (CG) methods have an important role in solving large-scale unconstrained optimization problems. Nowadays, the three-term CG method has become a research trend among CG methods. However, the existing three-term CG methods could only be used with an inexact line search; when the exact line search is applied, the three-term CG method reduces to the standard CG method. Hence, in this paper, a new three-term CG method that can be used with the exact line search is proposed. This new three-term CG method satisfies the descent condition under the exact line search. Performance profiles based on numerical results show that the proposed method outperforms the well-known classical CG method and some related hybrid methods. In addition, the proposed method is also robust in terms of the number of iterations and CPU time.
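
To make the "three-term" structure concrete, here is one well-known construction in the spirit of Zhang-Zhou-Li, where the direction $d = -g + \beta d_{\text{old}} - \theta y$ is chosen so that $g^\top d = -\|g\|^2$ holds regardless of the line search. This is an illustrative example of the class, not the new method proposed in the paper; the example vectors are arbitrary.

```python
import numpy as np

def three_term_direction(g_new, g_old, d_old):
    """Three-term CG direction d = -g + beta*d_old - theta*y, with
    PRP-type beta and theta chosen so that g_new @ d = -||g_new||^2."""
    y = g_new - g_old
    gg = g_old @ g_old
    beta = (g_new @ y) / gg          # PRP-type coefficient
    theta = (g_new @ d_old) / gg     # third-term coefficient
    return -g_new + beta * d_old - theta * y

g_old = np.array([1.0, -2.0])
g_new = np.array([0.5, 0.3])
d_old = np.array([-1.0, 2.0])
d = three_term_direction(g_new, g_old, d_old)
# Descent holds by construction: g_new @ d equals -||g_new||^2.
```

Note that under an exact line search `g_new @ d_old = 0`, so `theta` vanishes and this particular direction collapses to a standard two-term CG direction, which is exactly the limitation the paper addresses.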

