A Novel Value for the Parameter in the Dai-Liao-Type Conjugate Gradient Method

2021, Vol. 2021, pp. 1-10
Author(s): Branislav Ivanov, Predrag S. Stanimirović, Bilall I. Shaini, Hijaz Ahmad, Miao-Kun Wang

A new rule is presented for calculating the parameter t used in each iteration of the MHSDL (Dai-Liao) conjugate gradient (CG) method. The new value of the parameter yields a more efficient and robust variant of the Dai-Liao algorithm. Under proper conditions, theoretical analysis shows that the proposed method, combined with a backtracking line search, is globally convergent. Numerical experiments are also presented, confirming the influence of the new value of the parameter t on the behavior of the underlying CG optimization method. Numerical comparisons and an analysis of the obtained results using Dolan and Moré’s performance profile show better performance of the novel method with respect to all three analyzed characteristics: number of iterations, number of function evaluations, and CPU time.
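The abstract does not state the new rule for t, so the sketch below only shows how a Dai-Liao-type search direction is typically assembled once a value of t is available; the function name, the safeguard, and the default value of t are illustrative assumptions, not the rule proposed in the paper.

```python
import numpy as np

def dai_liao_direction(g_new, g_old, d_old, s_old, t=0.1):
    """Assemble a Dai-Liao-type search direction (illustrative sketch only).

    g_new, g_old : gradients at the current and previous iterates
    d_old        : previous search direction
    s_old        : previous step, s_k = x_{k+1} - x_k
    t            : Dai-Liao parameter; the paper proposes a new rule for it,
                   which is not reproduced here (0.1 is a placeholder).
    """
    y = g_new - g_old                    # gradient difference y_k
    denom = d_old @ y
    if abs(denom) < 1e-12:               # safeguard: fall back to steepest descent
        return -g_new
    beta_dl = (g_new @ (y - t * s_old)) / denom   # Dai-Liao coefficient
    return -g_new + beta_dl * d_old
```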

2015, Vol. 2015, pp. 1-19
Author(s): Huaiyuan Li, Hongfu Zuo, Dan Lei, Kun Liang, Tingting Lu

Combining maintenance tasks into work packages is not only necessary for arranging maintenance activities but also critical for reducing maintenance cost. To optimize the combination of maintenance tasks with a fuzzy C-means clustering algorithm, an improved fuzzy C-means clustering model is introduced in this paper. To reduce the dimension, the variables representing the clustering centers are eliminated in the improved model, so the clustering model can be solved directly by an optimization method. To optimize the clustering model, a novel nonlinear simplex optimization method is also proposed. The novel method searches along all rays emitted from the center toward each vertex, and these n + 1 search directions form a positive basis. The algorithm is both theoretically convergent and experimentally effective. Taking the optimal combination of maintenance tasks of a certain aircraft as an example, the novel simplex optimization method and the clustering model both exhibit excellent performance.
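As a rough illustration of the search pattern described above, and not the authors' exact algorithm, one iteration probing along the n + 1 rays from the simplex centroid toward each vertex might look like the sketch below; the fixed step length and the accept-best rule are simplifying assumptions.

```python
import numpy as np

def ray_search_iteration(f, vertices, alpha=1.0):
    """One illustrative iteration of a simplex search along the n + 1 rays
    from the centroid to each vertex (a sketch, not the paper's algorithm).

    f        : objective function mapping an n-vector to a scalar
    vertices : (n + 1, n) array holding the current simplex
    alpha    : trial step length along each ray (assumed fixed here)
    """
    center = vertices.mean(axis=0)        # simplex centroid
    directions = vertices - center        # n + 1 rays from the centroid
    best_x, best_f = center, f(center)
    for d in directions:
        trial = center + alpha * d        # probe along each ray
        f_trial = f(trial)
        if f_trial < best_f:
            best_x, best_f = trial, f_trial
    return best_x, best_f
```

Because the n + 1 centroid-to-vertex vectors of a nondegenerate simplex sum to zero and span the space, they form a positive basis, so at least one ray is a descent direction whenever the gradient is nonzero.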


2012, Vol. 4 (2), pp. 238-249
Author(s): Qiaolin He

In this paper, we propose a new two-level preconditioned C-G method that uses quadratic smoothing and linear correction on distorted but topologically structured grids. The CPU time of this method is less than that of the multigrid preconditioned C-G method (MGCG) using quadratic elements, while their accuracy is almost the same. Numerical experiments and an eigenvalue analysis are given, and the results show that the proposed two-level preconditioned method is efficient.
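The two-level preconditioner itself (quadratic smoothing followed by linear correction) is not spelled out in the abstract; the sketch below shows only the generic preconditioned CG loop such a preconditioner would plug into, with apply_preconditioner left as an assumed user-supplied callable approximating the inverse of A.

```python
import numpy as np

def preconditioned_cg(A, b, apply_preconditioner, x0=None, tol=1e-8, max_iter=500):
    """Generic preconditioned CG loop (sketch; not the paper's two-level scheme).

    apply_preconditioner : callable r -> z approximating A^{-1} r; a two-level
                           scheme (smoothing plus coarse correction) would go here.
    """
    x = np.zeros_like(b, dtype=float) if x0 is None else np.array(x0, dtype=float)
    r = b - A @ x                          # initial residual
    z = apply_preconditioner(r)
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        if np.linalg.norm(r) < tol:        # stop on small residual norm
            break
        z = apply_preconditioner(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p          # update the search direction
        rz = rz_new
    return x
```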


2020, Vol. 1 (1), pp. 12-17
Author(s): Yasir Salih, Mustafa Mamat, Sukono Sukono

The conjugate gradient (CG) method is a technique used for solving nonlinear unconstrained optimization problems. In this paper, we analyse the performance of two modifications and compare the results with classical conjugate gradient methods. The proposed methods possess global convergence properties for general functions under an exact line search. Numerical experiments show that the two modifications are more efficient on the test problems than the classical CG coefficients.
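The abstract does not say which classical formulas the two modifications are compared against; for reference, the usual baselines are the Fletcher-Reeves, Polak-Ribière-Polyak, and Hestenes-Stiefel coefficients, with g_k the gradient at x_k, y_{k-1} = g_k - g_{k-1}, and the exact line search choosing the step length that minimizes f along d_k:

```latex
d_k = -g_k + \beta_k d_{k-1}, \qquad
\beta_k^{FR} = \frac{\lVert g_k \rVert^2}{\lVert g_{k-1} \rVert^2}, \qquad
\beta_k^{PRP} = \frac{g_k^{\top} y_{k-1}}{\lVert g_{k-1} \rVert^2}, \qquad
\beta_k^{HS} = \frac{g_k^{\top} y_{k-1}}{d_{k-1}^{\top} y_{k-1}}, \qquad
\alpha_k = \arg\min_{\alpha > 0} f(x_k + \alpha d_k).
```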


2018, Vol. 7 (3.28), pp. 12
Author(s): Wan Khadijah, Mohd Rivaie, Mustafa Mamat, Nurul Hajar, Nurul ‘Aini, ...

The conjugate gradient (CG) method is one of the most prominent methods for solving linear and nonlinear optimization problems. In this paper, we propose a CG method with the sufficient descent property under a strong Wolfe line search. The proposed CG method is then applied to solve systems of linear equations. The numerical results obtained from the tests are evaluated based on the number of iterations and CPU time and then analyzed through performance profiles. To examine its efficiency, the performance of our CG formula is compared with that of other CG methods. The results show that the proposed CG formula performs better than the other tested CG methods.
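For context, the standard strong Wolfe conditions on the step length and the sufficient descent property mentioned in the abstract take the following generic form (the constants δ, σ, and c are generic; the paper's specific coefficient formula is not reproduced here):

```latex
\begin{aligned}
& f(x_k + \alpha_k d_k) \le f(x_k) + \delta \alpha_k g_k^{\top} d_k, \qquad
\lvert g(x_k + \alpha_k d_k)^{\top} d_k \rvert \le \sigma \lvert g_k^{\top} d_k \rvert,
\qquad 0 < \delta < \sigma < 1, \\
& g_k^{\top} d_k \le -c \lVert g_k \rVert^2 \ \text{for some constant } c > 0
\quad \text{(sufficient descent)}.
\end{aligned}
```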


Author(s): Ghada M. Al-Naemi, Ahmed H. Sheekoo

A new scaled conjugate gradient (SCG) method is proposed in this paper. The SCG technique is an important generalization of the conjugate gradient (CG) method and an efficient numerical method for large-scale nonlinear unconstrained optimization. The new SCG method is proposed with a strong Wolfe condition (SWC) line search. Under some suitable assumptions, the descent property and the global convergence property of the proposed technique are satisfied without relying on any particular line search. The efficiency and feasibility of the proposed technique are supported by numerical experiments comparing it with traditional CG techniques.
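The abstract does not give the paper's specific scaling; as background, scaled CG methods generally replace the standard direction with a scaled one of the form below, where the scaling parameter multiplies the gradient term (the spectral choice shown on the right is a common option and only an assumption here, not the paper's formula):

```latex
d_{k+1} = -\theta_{k+1} g_{k+1} + \beta_k d_k, \qquad
s_k = x_{k+1} - x_k, \quad y_k = g_{k+1} - g_k, \qquad
\theta_{k+1} \approx \frac{s_k^{\top} y_k}{y_k^{\top} y_k}
\ \text{(one common spectral choice, assumed here)}.
```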


2021, Vol. 2021, pp. 1-17
Author(s): Jing Meng, Xian-Ming Gu, Wei-Hua Luo, Liang Fang

In this paper, we focus on the development and study of a new global GCRO-DR method that allows both flexible preconditioning and subspace recycling for sequences of shifted linear systems. The novel method presented here has two main advantages: first, it does not require the right-hand sides to be related, and second, it is also compatible with general preconditioning. We apply the new algorithm to solve general coupled matrix equations. Moreover, an error analysis shows that a much looser tolerance can be applied to save computation by limiting the flexible preconditioning work without sacrificing the closeness of the computed and true residuals. Finally, numerical experiments demonstrate that the proposed method can be more competitive than other global GMRES-type methods.
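As background for why dropping the related right-hand-side requirement matters: classical shifted Krylov methods handle a family of shifted systems with a shared right-hand side by exploiting the shift invariance of the Krylov subspace, an invariance that breaks down as soon as the right-hand sides differ.

```latex
(A + \sigma_j I)\, x^{(j)} = b^{(j)}, \quad j = 1, \dots, s, \qquad
\mathcal{K}_m(A, b) = \operatorname{span}\{ b, Ab, \dots, A^{m-1} b \}
= \mathcal{K}_m(A + \sigma I, b) \ \text{for any shift } \sigma.
```

A single Krylov basis therefore serves every shift only when all right-hand sides coincide, which is exactly the restriction the method above is designed to avoid.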


MATEMATIKA, 2020, Vol. 36 (3), pp. 197-207
Author(s): Nurul Hafawati Fadhilah, Mohd Rivaie, Fuziyah Ishak, Nur Idalisa

Conjugate gradient (CG) methods play an important role in solving large-scale unconstrained optimization problems. Nowadays, three-term CG methods have become a research trend within CG methods. However, the existing three-term CG methods can only be used with an inexact line search; when an exact line search is applied, a three-term CG method reduces to the standard CG method. Hence, in this paper, a new three-term CG method that can be used with an exact line search is proposed. The new three-term CG method satisfies the descent condition under the exact line search. Performance profiles based on numerical results show that the proposed method outperforms the well-known classical CG method and some related hybrid methods. In addition, the proposed method is robust in terms of the number of iterations and CPU time.
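The abstract does not specify the third term; as a generic template, three-term CG methods extend the two-term direction with an extra correction term and are required to satisfy the descent condition, for example:

```latex
d_{k+1} = -g_{k+1} + \beta_k d_k + \theta_k y_k, \qquad
y_k = g_{k+1} - g_k, \qquad
g_{k+1}^{\top} d_{k+1} < 0 \quad \text{(descent condition)},
```

where the particular choices of the two coefficients (and whether the third term uses the gradient difference or the step) distinguish the individual methods; the coefficients proposed in the paper are not reproduced here.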


2018, Vol. 7 (2.14), pp. 25
Author(s): Syazni Shoid, Norrlaili Shapiee, Norhaslinda Zull, Nur Hamizah Abdul Ghani, Nur Syarafina Mohamed, ...

Many researchers aim to improve conjugate gradient (CG) methods as well as their applications in real life. CG methods have become increasingly useful in many disciplines and play an important role in solving large-scale optimization problems. In this paper, three new types of CG coefficients are presented, with an application to data estimation. Numerical experiments show that the proposed methods succeed in solving the test problems under strong Wolfe-Powell line search conditions.

