Decentralized and parallel primal and dual accelerated methods for stochastic convex programming problems

2021
Vol 0 (0)
Author(s):
Darina Dvinskikh
Alexander Gasnikov

Abstract We introduce primal and dual stochastic gradient oracle methods for decentralized convex optimization problems. For both primal and dual oracles, the proposed methods are optimal in terms of the number of communication steps. However, for all classes of objectives, optimality in terms of the number of oracle calls per node holds only up to a logarithmic factor and up to the notion of smoothness. Using the mini-batching technique, we show that the proposed methods with a stochastic oracle can be additionally parallelized at each node. The considered algorithms can be applied to many data science problems and inverse problems.
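
A minimal sketch of the decentralized setting described above, using plain mini-batch SGD with a gossip (mixing) matrix rather than the authors' accelerated primal and dual methods; the ring topology, problem sizes, and local least-squares objectives are all illustrative assumptions:

```python
import numpy as np

# Sketch: decentralized mini-batch SGD over a ring of nodes (illustrative,
# not the paper's accelerated method). Each node i holds a local objective
# f_i(x) = 0.5 * ||A_i x - b_i||^2 and mixes its iterate with neighbors
# through a doubly stochastic matrix W (one communication step per round).
rng = np.random.default_rng(0)
n_nodes, dim, m, batch = 4, 5, 50, 8      # hypothetical sizes

A = [rng.normal(size=(m, dim)) for _ in range(n_nodes)]
b = [rng.normal(size=m) for _ in range(n_nodes)]

# Mixing matrix for an assumed ring topology: self-weight 1/2, neighbors 1/4.
W = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    W[i, i] = 0.5
    W[i, (i + 1) % n_nodes] = 0.25
    W[i, (i - 1) % n_nodes] = 0.25

def stochastic_grad(i, x):
    """Mini-batch stochastic gradient of f_i at x (sampled rows of A_i)."""
    idx = rng.choice(m, size=batch, replace=False)
    Ai, bi = A[i][idx], b[i][idx]
    return Ai.T @ (Ai @ x - bi) / batch

X = np.zeros((n_nodes, dim))              # one local iterate per node
step = 1e-2
for _ in range(500):
    X = W @ X                             # communication: neighbor averaging
    for i in range(n_nodes):              # local computation at each node
        X[i] -= step * stochastic_grad(i, X[i])

print("distance to consensus:", np.linalg.norm(X - X.mean(axis=0)))
```

The mini-batch size `batch` is where the per-node parallelism mentioned in the abstract enters: the sampled gradients within a batch can be computed concurrently before they are averaged.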

Author(s):  
Yi Xu
Zhuoning Yuan
Sen Yang
Rong Jin
Tianbao Yang

Extrapolation is a well-known technique for solving convex optimization problems and variational inequalities, and it has recently attracted attention in non-convex optimization. Several recent works have empirically shown its success in some machine learning tasks. However, it has not been analyzed for non-convex minimization, and a gap remains between theory and practice. In this paper, we analyze gradient descent and stochastic gradient descent with extrapolation for finding an approximate first-order stationary point in smooth non-convex optimization problems. Our convergence upper bounds show that the algorithms with extrapolation converge faster than their counterparts without extrapolation.
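
A minimal sketch of gradient descent with extrapolation on a smooth non-convex test function; the step size, extrapolation weight, and the choice of the Rosenbrock function are illustrative assumptions, not the exact scheme analyzed in the paper:

```python
import numpy as np

def grad(x):
    """Gradient of the (smooth, non-convex) Rosenbrock function."""
    return np.array([
        -2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0] ** 2),
        200.0 * (x[1] - x[0] ** 2),
    ])

eta, beta = 1e-3, 0.9                 # step size and extrapolation weight
x_prev = x = np.array([-1.0, 1.0])
for _ in range(20000):
    y = x + beta * (x - x_prev)       # extrapolate along the last displacement
    x_prev, x = x, x - eta * grad(y)  # gradient step using the gradient at y

print("iterate:", x, "gradient norm:", np.linalg.norm(grad(x)))
```

Evaluating the gradient at the extrapolated point `y` rather than at `x` is the only change relative to plain gradient descent; the stochastic variant replaces `grad` with a mini-batch estimate.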


2018
Vol 5 (1)
pp. 42-60
Author(s):  
Akshay Agrawal
Robin Verschueren
Steven Diamond
Stephen Boyd
