A Novel Stochastic Stratified Average Gradient Method: Convergence Rate and Its Complexity

Author(s): Aixiang Andy Chen, Xiaolong Chai, Bingchuan Chen, Rui Bian, Qingliang Chen
2011, Vol 1 (1), pp. 82-88
Author(s): Hong-Kui Pang, Ying-Ying Zhang, Xiao-Qing Jin

Abstract: We consider a nonsymmetric Toeplitz system which arises in the discretization of a partial integro-differential equation in option pricing problems. The preconditioned conjugate gradient method with a tridiagonal preconditioner is used to solve this system. Theoretical analysis shows that, under certain conditions, the tridiagonal preconditioner leads to a superlinear convergence rate. Numerical results exemplify our theoretical analysis.
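The abstract's superlinearity result is specific to the paper's nonsymmetric Toeplitz system, but the mechanics of preconditioned conjugate gradient with a tridiagonal preconditioner can be sketched generically. The sketch below is an assumption-laden illustration, not the paper's scheme: it uses a symmetric positive definite Toeplitz test matrix (so plain CG applies) and takes its tridiagonal part as the preconditioner.

```python
import numpy as np

def pcg(A, b, M_solve, tol=1e-10, maxit=500):
    # Generic preconditioned conjugate gradient (assumes A is SPD).
    # M_solve(r) applies the inverse of the preconditioner to r.
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_solve(r)
    p = z.copy()
    rz = r @ z
    for k in range(maxit):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            return x, k + 1
        z = M_solve(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, maxit

# Hypothetical SPD banded Toeplitz test matrix (NOT the paper's system).
n = 200
A = np.zeros((n, n))
i = np.arange(n)
A[i, i] = 2.0
A[i[:-1], i[1:]] = A[i[1:], i[:-1]] = -1.0
A[i[:-2], i[2:]] = A[i[2:], i[:-2]] = 0.05

# Tridiagonal part of A serves as the preconditioner.
T = np.triu(np.tril(A, 1), -1)
b = np.ones(n)
x, iters = pcg(A, b, lambda r: np.linalg.solve(T, r))
```

Because A here differs from its tridiagonal part only in a small second off-diagonal band, the preconditioned iteration converges in far fewer steps than the problem dimension.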


Author(s): Hadi Abbaszadehpeivasti, Etienne de Klerk, Moslem Zamani

Abstract: In this paper, we study the convergence rate of the gradient (or steepest-descent) method with fixed step lengths for finding a stationary point of an L-smooth function. We establish a new convergence rate and show that the bound may be exact in some cases, in particular when all step lengths lie in the interval (0, 1/L]. In addition, we derive an optimal step length with respect to the new bound.
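The setting this abstract analyzes, gradient descent with a fixed step length in (0, 1/L], is easy to instantiate. The sketch below is a minimal illustration, not the paper's analysis: it runs the iteration x_{k+1} = x_k - (1/L) grad f(x_k) on a hypothetical quadratic whose smoothness constant L is the largest eigenvalue, and records the gradient norms, the stationarity measure the convergence rate is stated in.

```python
import numpy as np

def gradient_descent_fixed_step(grad, x0, L, steps=100):
    # Gradient method with the fixed step length 1/L, the right endpoint
    # of the interval (0, 1/L] discussed in the abstract.
    x = np.asarray(x0, dtype=float)
    norms = []  # gradient norms along the trajectory
    for _ in range(steps):
        g = grad(x)
        norms.append(np.linalg.norm(g))
        x = x - (1.0 / L) * g
    return x, norms

# Hypothetical example: f(x) = 0.5 * x^T A x, so grad f(x) = A x and
# L equals the largest eigenvalue of A.
A = np.diag([1.0, 4.0, 9.0])
L = 9.0
x, norms = gradient_descent_fixed_step(lambda v: A @ v, np.ones(3), L)
```

On this convex quadratic the iterates converge to the unique stationary point x = 0, so the recorded gradient norms shrink toward zero.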


2013, Vol 380-384, pp. 1444-1447
Author(s): Min Zhang, Xue Zhang, Yun Tao Zhang, Jin Ming

To address the poor local search performance and lack of population diversity in the niche genetic algorithm (NGA), an improved niche genetic algorithm (I-NGA) is presented in this paper. The algorithm adopts a migration strategy, a grading gradient method, and a gene equilibrium strategy, which together accelerate the convergence rate and increase population diversity. The approach is validated on the Camel function. Comparisons between NGA and I-NGA show that the improved algorithm achieves better optimization performance.
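The abstract names the paper's three improvements but not their details, so only the baseline can be sketched here. The code below is a generic niche GA with fitness sharing and elitism on the six-hump Camel function, the benchmark the abstract mentions; the migration, grading gradient, and gene equilibrium strategies of I-NGA are not reproduced, and all parameter values are illustrative assumptions.

```python
import numpy as np

def camel(p):
    # Six-hump camel benchmark; global minimum is about -1.0316.
    x, y = p
    return (4 - 2.1 * x**2 + x**4 / 3) * x**2 + x * y + (-4 + 4 * y**2) * y**2

def niche_ga(f, bounds, pop_size=60, gens=80, sigma=0.5, seed=0):
    # Baseline niche GA: fitness sharing penalizes crowded niches,
    # roulette selection, arithmetic crossover, Gaussian mutation, elitism.
    # This is NOT the paper's I-NGA; it is a generic sketch.
    rng = np.random.default_rng(seed)
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    pop = rng.uniform(lo, hi, size=(pop_size, len(bounds)))
    best = min(pop, key=f).copy()
    for _ in range(gens):
        cost = np.array([f(p) for p in pop])
        i = int(np.argmin(cost))
        if cost[i] < f(best):
            best = pop[i].copy()
        # Fitness sharing: divide shifted fitness by niche crowding.
        d = np.linalg.norm(pop[:, None, :] - pop[None, :, :], axis=-1)
        crowd = np.maximum(0.0, 1.0 - d / sigma).sum(axis=1)
        fitness = (cost.max() - cost + 1e-9) / crowd
        prob = fitness / fitness.sum()
        # Roulette selection of parent pairs, arithmetic crossover.
        parents = pop[rng.choice(pop_size, size=(pop_size, 2), p=prob)]
        w = rng.uniform(size=(pop_size, 1))
        pop = w * parents[:, 0] + (1 - w) * parents[:, 1]
        # Gaussian mutation, clipped back into the search box.
        pop = np.clip(pop + rng.normal(scale=0.1, size=pop.shape), lo, hi)
        pop[0] = best  # elitism: the best individual always survives
    return best

best = niche_ga(camel, [(-3, 3), (-2, 2)])
```

Because elitism never discards the incumbent, the returned point is at least as good as the best individual evaluated in any generation; a comparison against a sharing-free GA is the kind of experiment the paper runs against plain NGA.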

