A New Inexact Line Search Method for Convex Optimization Problems

The line search procedure for the step length and the choice of search direction are two key elements of a line search algorithm. The line search procedure deserves particular attention because of its far-reaching implications for the robustness and efficiency of the algorithm. The purpose of this paper is to propose a simple yet effective line search strategy for solving unconstrained convex optimization problems. The procedure does not require evaluation of the objective function; instead, it forces a reduction in the gradient norm along each search direction, which makes it suitable for problems in which function evaluations are very costly. To illustrate the effectiveness of the procedure, we combine it with the symmetric rank-one quasi-Newton update and test it against the same quasi-Newton method with the well-known Armijo line search. Numerical results on a set of standard unconstrained optimization problems show that the proposed procedure is superior to the Armijo line search.
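The abstract does not spell out the acceptance test, so the following is a minimal sketch of one plausible reading: a backtracking line search that never evaluates the objective and instead accepts a step once the directional derivative along the search direction has been sufficiently reduced, paired with the symmetric rank-one (SR1) quasi-Newton update mentioned above. The function names, the constants sigma and beta, and the acceptance condition |grad(x + t d)·d| <= sigma |grad(x)·d| are illustrative assumptions, not the authors' exact rule.

```python
import numpy as np

def gradient_only_line_search(grad, x, d, sigma=0.9, beta=0.5, t0=1.0, max_steps=30):
    """Backtracking step-length search that never evaluates the objective.

    Hypothetical acceptance test (an assumption, not the paper's exact rule):
    accept t once |grad(x + t*d) . d| <= sigma * |grad(x) . d|.
    """
    g0_d = grad(x) @ d
    t = t0
    for _ in range(max_steps):
        if abs(grad(x + t * d) @ d) <= sigma * abs(g0_d):
            return t
        t *= beta
    return t

def sr1_minimize(grad, x0, tol=1e-6, max_iters=200):
    """Symmetric rank-one (SR1) quasi-Newton iteration using the
    gradient-only line search above to choose the step length."""
    x = np.asarray(x0, dtype=float).copy()
    g = grad(x)
    H = np.eye(x.size)                      # inverse-Hessian approximation
    for _ in range(max_iters):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                          # quasi-Newton search direction
        t = gradient_only_line_search(grad, x, d)
        s = t * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        v = s - H @ y
        denom = v @ y
        # Standard SR1 safeguard: skip the update when the denominator is tiny.
        if abs(denom) > 1e-8 * np.linalg.norm(v) * np.linalg.norm(y):
            H = H + np.outer(v, v) / denom
        x, g = x_new, g_new
    return x

# Example: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose gradient is A x - b (no objective evaluations are needed).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = sr1_minimize(lambda x: A @ x - b, np.zeros(2))
```

Swapping gradient_only_line_search for a standard Armijo backtracking on f(x + t d) <= f(x) + c1 t grad(x)·d would reproduce the kind of baseline the abstract compares against.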

2014 · Vol. 8 (1) · pp. 218–221
Author(s): Ping Hu, Zong-yao Wang

We propose a non-monotone line search combination rule for unconstrained optimization problems, establish the corresponding non-monotone search algorithm, and prove its global convergence. Finally, numerical experiments illustrate the effectiveness of the new non-monotone search algorithm based on the combination rule.
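The abstract does not state the combination rule itself. The sketch below shows the classical Grippo-Lampariello-Lucidi (GLL) non-monotone Armijo condition, a standard building block for such rules, inside a plain steepest-descent driver; the memory length, the constants c1 and beta, and the driver are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np
from collections import deque

def nonmonotone_armijo(f, grad, x, d, recent_f, c1=1e-4, beta=0.5, t0=1.0, max_steps=50):
    """GLL-style non-monotone Armijo backtracking: accept t once
    f(x + t*d) <= max(recent f-values) + c1 * t * grad(x) . d."""
    f_ref = max(recent_f)          # reference value over the last M iterates
    slope = grad(x) @ d            # directional derivative, negative for a descent d
    t = t0
    for _ in range(max_steps):
        if f(x + t * d) <= f_ref + c1 * t * slope:
            return t
        t *= beta
    return t

def nonmonotone_gradient_descent(f, grad, x0, memory=10, tol=1e-6, max_iters=500):
    """Steepest-descent driver that keeps the last `memory` objective values."""
    x = np.asarray(x0, dtype=float).copy()
    recent_f = deque([f(x)], maxlen=memory)
    for _ in range(max_iters):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        t = nonmonotone_armijo(f, grad, x, -g, recent_f)
        x = x - t * g
        recent_f.append(f(x))
    return x
```

With memory=1 the condition reduces to the ordinary monotone Armijo rule, which is why non-monotone variants are often described as relaxations of it.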


2021 · Vol. 0 (0)
Author(s): Darina Dvinskikh, Alexander Gasnikov

Abstract. We introduce primal and dual stochastic gradient oracle methods for decentralized convex optimization problems. For both the primal and the dual oracle, the proposed methods are optimal in terms of the number of communication steps. However, for all classes of objective, optimality in terms of the number of oracle calls per node holds only up to a logarithmic factor and up to the notion of smoothness. Using a mini-batching technique, we show that the proposed methods with a stochastic oracle can additionally be parallelized at each node. The considered algorithms can be applied to many data science problems and inverse problems.
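As an illustration of the setting rather than of the authors' optimal primal/dual methods, the sketch below shows a generic decentralized SGD in which each node alternates one gossip (communication) step over a mixing matrix W with a mini-batched stochastic gradient (oracle) step. The per-node oracle interface stoch_grads[i] and all parameter names are hypothetical assumptions.

```python
import numpy as np

def decentralized_sgd(stoch_grads, W, X0, lr=0.1, batch_size=16, n_rounds=100, rng=None):
    """Illustrative decentralized SGD (not the abstract's optimal methods).

    stoch_grads[i](x, rng, batch_size) is assumed to return a mini-batch
    gradient estimate of node i's local objective at x.  W is a doubly
    stochastic mixing matrix respecting the communication graph.
    """
    rng = rng or np.random.default_rng(0)
    X = X0.copy()                       # X[i] is node i's local iterate
    n_nodes = X.shape[0]
    for _ in range(n_rounds):
        X = W @ X                       # communication: one consensus (gossip) step
        for i in range(n_nodes):        # local computation, parallel across nodes
            g = stoch_grads[i](X[i], rng, batch_size)
            X[i] -= lr * g              # mini-batched stochastic gradient step
    return X.mean(axis=0)               # consensus estimate of the minimizer
```

Increasing batch_size reduces the variance of each oracle call and lets the per-node work be parallelized, which is the role mini-batching plays in the abstract.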

