On the convergence of inexact block coordinate descent methods for constrained optimization

2013 ◽ Vol 231 (2) ◽ pp. 274-281 ◽ Author(s): A. Cassioli, D. Di Lorenzo, M. Sciandrone

IEEE Access ◽ 2021 ◽ pp. 1-1 ◽ Author(s): Fanhua Shang, Zhihui Zhang, Yuanyuan Liu, Hongying Liu, Jing Xu

2018 ◽ Vol 16 (05) ◽ pp. 741-755 ◽ Author(s): Qin Fang, Min Xu, Yiming Ying

The problem of minimizing a separable convex function under linearly coupled constraints arises in various application domains such as economic systems, distributed control, and network flow. The main challenge in solving this problem is that the data size is very large, which makes the usual gradient-based methods infeasible. Recently, Necoara, Nesterov and Glineur [Random block coordinate descent methods for linearly constrained optimization over networks, J. Optim. Theory Appl. 173(1) (2017) 227–254] proposed an efficient randomized coordinate descent method for this type of optimization problem and presented an appealing convergence analysis. In this paper, we develop new techniques for analyzing the convergence of such algorithms, which substantially improve the results of the aforementioned work. This refined result is obtained by extending Nesterov's second technique [Efficiency of coordinate descent methods on huge-scale optimization problems, SIAM J. Optim. 22 (2012) 341–362] to general optimization problems with linearly coupled constraints. A novel ingredient of our analysis is the construction of basis vectors for the subspace determined by the linear constraints.
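To make the setting concrete, the following minimal Python sketch illustrates randomized 2-block coordinate descent for a separable objective under a single linearly coupled constraint, the simplest instance of the problem class above. The quadratic objectives, the constraint sum(x) = b, and the exact line-search step are illustrative assumptions for this sketch, not the authors' exact method: the key idea shown is that moving a random pair of coordinates along e_i - e_j keeps the coupling constraint satisfied at every iterate.

```python
import numpy as np

# Sketch (illustrative, not the paper's exact algorithm) of randomized
# 2-block coordinate descent for  min sum_i f_i(x_i)  s.t.  sum_i x_i = b,
# with toy quadratics  f_i(x) = 0.5 * a[i] * (x - c[i])**2.
rng = np.random.default_rng(0)
n = 100
a = rng.uniform(1.0, 5.0, n)   # curvatures (coordinate Lipschitz constants)
c = rng.standard_normal(n)     # per-coordinate unconstrained minimizers
b = 1.0                        # coupling constraint: sum(x) == b

x = np.full(n, b / n)          # feasible starting point

def grad_i(i, x):
    """Gradient of the toy quadratic f_i at x[i]."""
    return a[i] * (x[i] - c[i])

for _ in range(20000):
    # Pick a random pair (i, j); any step along e_i - e_j leaves
    # sum(x) unchanged, so feasibility is preserved without projection.
    i, j = rng.choice(n, size=2, replace=False)
    g = grad_i(i, x) - grad_i(j, x)  # directional derivative along e_i - e_j
    L = a[i] + a[j]                  # curvature along that direction
    t = g / L                        # exact minimizer along the direction
    x[i] -= t
    x[j] += t

print("constraint residual:", abs(x.sum() - b))  # ~0 up to round-off
```

Restricting updates to directions lying in the subspace of the linear constraints is what lets such methods avoid projection steps entirely; the basis vectors e_i - e_j used here are the simplest example of the subspace viewpoint mentioned in the abstract.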


2016 ◽ Vol 163 (1-2) ◽ pp. 85-114 ◽ Author(s): Mingyi Hong, Xiangfeng Wang, Meisam Razaviyayn, Zhi-Quan Luo

2019 ◽ Vol 41 (1) ◽ pp. C1-C27 ◽ Author(s): Aditya Devarakonda, Kimon Fountoulakis, James Demmel, Michael W. Mahoney
