Approximation of the steepest descent direction for the O-D matrix adjustment problem

2006 ◽ Vol 144 (1) ◽ pp. 329-362
Author(s): Esteve Codina, Lídia Montero
10.29007/2sdc ◽ 2018
Author(s): Zeyu Feng, Chang Xu, Dacheng Tao

We introduce the Historical Gradient Boosting Machine, which aims to improve the convergence speed of gradient boosting. We analyze our approach from the perspective of numerical optimization in function space, drawing on gradients from previous steps, which traditional methods have rarely exploited. To make better use of this historical gradient information, we incorporate both the accumulated previous gradients and the current gradient into the descent direction computed in function space. By fitting this descent direction, the weak learner benefits from historical gradients, which mitigate the greediness of the steepest descent direction. Experimental results show that our approach improves the convergence speed of gradient boosting without a significant loss in accuracy.
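A minimal sketch of the idea described above, assuming squared loss and regression stumps as weak learners: each round fits the stump not to the current negative gradient alone but to a blend of the current gradient and an exponentially accumulated history of past gradients. The blend weight `beta`, the stump learner, and all hyperparameters are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def fit_stump(x, target):
    """Fit a one-split regression stump on a 1-D feature (illustrative weak learner)."""
    best = None
    for thr in np.unique(x):
        left, right = target[x <= thr], target[x > thr]
        if len(left) == 0 or len(right) == 0:
            continue
        pred = np.where(x <= thr, left.mean(), right.mean())
        sse = np.sum((target - pred) ** 2)
        if best is None or sse < best[0]:
            best = (sse, thr, left.mean(), right.mean())
    _, thr, lv, rv = best
    return lambda z: np.where(z <= thr, lv, rv)

def historical_gbm(x, y, n_rounds=50, lr=0.1, beta=0.5):
    """Boosting sketch: the weak learner fits a descent direction that mixes
    the current negative gradient with accumulated past gradients."""
    pred = np.zeros_like(y)
    accum = np.zeros_like(y)
    learners = []
    for _ in range(n_rounds):
        grad = y - pred                 # negative gradient of squared loss
        accum = beta * accum + grad     # history plus current gradient
        stump = fit_stump(x, accum)     # fit the blended descent direction
        pred = pred + lr * stump(x)
        learners.append(stump)
    return pred, learners
```

With `beta = 0`, this reduces to ordinary gradient boosting on the residuals; a positive `beta` lets past gradients temper the greediness of each step, in the spirit of momentum methods.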


2014 ◽ Vol 2014 ◽ pp. 1-6
Author(s): Zhijun Luo, Lirong Wang

A new parallel variable distribution algorithm, based on an interior-point sequential system of linear equations (SSLE) method, is proposed for inequality-constrained optimization problems whose constraints are block-separable. Each iteration of the algorithm needs to solve only three systems of linear equations, all sharing the same coefficient matrix, to obtain the descent direction. Furthermore, under certain conditions, global convergence is established.
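The cost pattern described above, several linear systems with one shared coefficient matrix, means the matrix can be factorized once and reused for all right-hand sides. A hedged sketch of that reuse, where the matrix `W` and the right-hand sides are generic placeholders rather than the paper's actual SSLE systems:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
# Placeholder coefficient matrix, made diagonally dominant so it is well conditioned.
W = rng.standard_normal((n, n)) + n * np.eye(n)
b1, b2, b3 = (rng.standard_normal(n) for _ in range(3))

# Stacking the right-hand sides as columns lets a single factorization
# (performed internally by np.linalg.solve) serve all three solves.
B = np.column_stack([b1, b2, b3])
D = np.linalg.solve(W, B)
d1, d2, d3 = D.T  # the three direction components
```

The per-iteration work is thus dominated by one O(n^3) factorization plus three cheap back-substitutions, rather than three independent factorizations.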

