A comparative study of primal and dual approaches for solving separable and partially-separable nonlinear optimization problems

1989 · Vol 1 (2) · pp. 73-79 · Author(s): F. A. Lootsma

1995 · Vol 23 (4) · pp. 287-300 · Author(s): Shui-Shun Lin, Chun (Chuck) Zhang, Hsu-Pin (Ben) Wang

1999 · Vol 9 (3) · pp. 755-778 · Author(s): Paul T. Boggs, Anthony J. Kearsley, Jon W. Tolle

2021 · Vol 0 (0) · Author(s): Darina Dvinskikh, Alexander Gasnikov

Abstract We introduce primal and dual stochastic gradient oracle methods for decentralized convex optimization problems. For both the primal and the dual oracle, the proposed methods are optimal in the number of communication steps. However, for all classes of objectives considered, optimality in the number of oracle calls per node holds only up to a logarithmic factor and up to the notion of smoothness. Using a mini-batching technique, we show that the proposed methods with a stochastic oracle can additionally be parallelized at each node. The considered algorithms can be applied to many data-science problems and inverse problems.
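To illustrate the setting the abstract describes, here is a minimal sketch (not the authors' algorithm) of decentralized stochastic optimization with mini-batching: each node averages its iterate with its neighbors through a doubly stochastic mixing matrix (one communication step) and then takes a mini-batch stochastic gradient step on its local objective. The ring topology, step size, and toy least-squares data are all assumptions chosen for the example.

```python
# Sketch of decentralized mini-batch SGD on a toy least-squares problem.
# Node i minimizes f_i(x) = 0.5 * ||A_i x - b_i||^2 / n_local; all nodes
# share an approximate common minimizer x_true. Values here (ring graph,
# step size, batch size) are illustrative assumptions, not from the paper.
import numpy as np

rng = np.random.default_rng(0)
n_nodes, dim, n_local = 4, 5, 100

# Local data held by each node
A = [rng.standard_normal((n_local, dim)) for _ in range(n_nodes)]
x_true = rng.standard_normal(dim)
b = [Ai @ x_true + 0.01 * rng.standard_normal(n_local) for Ai in A]

# Ring topology: doubly stochastic mixing matrix W (gossip weights)
W = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    W[i, i] = 0.5
    W[i, (i - 1) % n_nodes] = 0.25
    W[i, (i + 1) % n_nodes] = 0.25

def minibatch_grad(i, x, batch=10):
    """Unbiased mini-batch estimate of the local gradient at node i."""
    idx = rng.choice(n_local, size=batch, replace=False)
    return A[i][idx].T @ (A[i][idx] @ x - b[i][idx]) / batch

X = np.zeros((n_nodes, dim))  # one iterate per node
step = 0.02
for _ in range(1000):
    X = W @ X                  # communication step: gossip averaging
    G = np.stack([minibatch_grad(i, X[i]) for i in range(n_nodes)])
    X -= step * G              # local mini-batch gradient step

x_bar = X.mean(axis=0)
err = np.linalg.norm(x_bar - x_true)
```

The mini-batch gradients at the different nodes are independent, which is what makes the per-node computation parallelizable, as the abstract notes.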

