A Block Successive Upper-Bound Minimization Method of Multipliers for Linearly Constrained Convex Optimization

2020 ◽  
Vol 45 (3) ◽  
pp. 833-861
Author(s):  
Mingyi Hong ◽  
Tsung-Hui Chang ◽  
Xiangfeng Wang ◽  
Meisam Razaviyayn ◽  
Shiqian Ma ◽  
...  

Consider the problem of minimizing the sum of a smooth convex function and a separable nonsmooth convex function subject to linear coupling constraints. Problems of this form arise in many contemporary applications, including signal processing, wireless networking, and smart grid provisioning. Motivated by the huge size of these applications, we propose a new class of first-order primal–dual algorithms called the block successive upper-bound minimization method of multipliers (BSUM-M) to solve this family of problems. The BSUM-M updates the primal variable blocks successively by minimizing locally tight upper bounds of the augmented Lagrangian of the original problem, followed by a gradient-type update for the dual variable in closed form. We show that under certain regularity conditions, and when the primal block variables are updated in either a deterministic or a random fashion, the BSUM-M converges to a point in the set of optimal solutions. Moreover, in the absence of linear constraints and under similar conditions as in the previous result, we show that the randomized BSUM-M (which reduces to the randomized block successive upper-bound minimization method) converges at an asymptotically linear rate without relying on strong convexity.
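The flavor of the BSUM-M iteration can be illustrated on a toy problem. The sketch below is not the authors' implementation: it applies successive block updates that exactly minimize the augmented Lagrangian in each block (exact minimization being the simplest locally tight upper bound), followed by the closed-form dual gradient step, to the problem of minimizing 0.5*||x - c||^2 subject to sum(x) = b, with each scalar coordinate treated as one block.

```python
import numpy as np

def bsum_m(c, b, rho=1.0, iters=500):
    """Toy BSUM-M-style sketch: minimize 0.5*||x - c||^2  s.t.  sum(x) = b.

    Each coordinate is one "block". The block step exactly minimizes the
    augmented Lagrangian in that coordinate (a locally tight upper bound),
    and the dual variable then takes a gradient ascent step in closed form.
    """
    n = len(c)
    x = np.zeros(n)
    y = 0.0  # dual variable for the single coupling constraint
    for _ in range(iters):
        for i in range(n):  # deterministic successive block updates
            s_other = x.sum() - x[i]
            # stationarity of the augmented Lagrangian in x_i:
            # (x_i - c_i) + y + rho*(x_i + s_other - b) = 0
            x[i] = (c[i] - y - rho * (s_other - b)) / (1.0 + rho)
        y += rho * (x.sum() - b)  # dual gradient step

    return x, y

c = np.array([3.0, -1.0, 2.0])
x, y = bsum_m(c, b=1.0)
# closed-form solution of this projection problem: x* = c + lam,
# with lam = (b - sum(c)) / n
lam = (1.0 - c.sum()) / 3.0
```

For this strongly convex quadratic the iterates converge to the projection of c onto the constraint hyperplane; the general convergence theory of the article covers far broader (nonsmooth, randomized) settings.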

2021 ◽  
Author(s):  
Oskar Weser ◽  
Björn Hein Hanke ◽  
Ricardo Mata

In this work, we present a fully automated method for the construction of chemically meaningful sets of non-redundant internal coordinates (also commonly denoted as Z-matrices) from the Cartesian coordinates of a molecular system. Particular focus is placed on avoiding ill-definitions of angles and dihedrals due to linear arrangements of atoms, to consistently guarantee a well-defined transformation to Cartesian coordinates, even after structural changes. The representations thus obtained are particularly well suited for pathway construction in double-ended methods for transition state search and for optimisations with non-linear constraints. Analytical gradients for the transformation between the coordinate systems were derived for the first time, which allows analytical geometry optimisations purely in Z-matrix coordinates. The geometry optimisation was coupled with a symbolic algebra package to support arbitrary non-linear constraints in Z-matrix coordinates, while retaining analytical energy gradient conversion. Sample applications are provided for a number of common chemical reactions and illustrative examples where these new algorithms can be used to automatically produce chemically reasonable structure interpolations, or to perform non-linearly constrained optimisations of molecules.
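The forward direction of the transformation the article builds on (measuring bond lengths, angles, and dihedrals from Cartesian coordinates) is standard and can be sketched briefly. This is only the elementary measurement step, not the article's automated Z-matrix construction or its analytical gradients.

```python
import numpy as np

def bond_length(a, b):
    """Distance between atoms a and b (Cartesian coordinates)."""
    return np.linalg.norm(b - a)

def bond_angle(a, b, c):
    """Angle a-b-c at vertex b, in degrees."""
    u, v = a - b, c - b
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    # clip guards against round-off slightly outside [-1, 1]
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

def dihedral(a, b, c, d):
    """Signed dihedral a-b-c-d in degrees, via the atan2 formulation,
    which stays well-conditioned near 0 and 180 degrees."""
    b1, b2, b3 = b - a, c - b, d - c
    n1, n2 = np.cross(b1, b2), np.cross(b2, b3)
    m = np.cross(n1, b2 / np.linalg.norm(b2))
    return np.degrees(np.arctan2(np.dot(m, n2), np.dot(n1, n2)))
```

Note that `bond_angle` and `dihedral` become ill-defined for collinear atom chains (the cross products vanish); avoiding such choices of reference atoms automatically is precisely the concern addressed in the article.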


2021 ◽  
pp. 1-28
Author(s):  
Yuan Shen ◽  
Yannian Zuo ◽  
Liming Sun ◽  
Xiayang Zhang

We consider the linearly constrained separable convex optimization problem whose objective function is separable with respect to multiple blocks of variables. A number of methods have been proposed and extensively studied in the past decade. Specifically, a modified strictly contractive Peaceman–Rachford splitting method (SC-PRCM) [S. H. Jiang and M. Li, A modified strictly contractive Peaceman–Rachford splitting method for multi-block separable convex programming, J. Ind. Manag. Optim. 14(1) (2018) 397-412] has been well studied in the literature for a special case with a small number of blocks. Based on the modified SC-PRCM, we present modified proximal symmetric ADMMs (MPSADMMs) to solve the multi-block problem. In MPSADMMs, all subproblems but the first are attached with a simple proximal term, and the multipliers are updated twice per iteration. At the end of each iteration, the output is corrected via a simple correction step. Without stringent assumptions, we establish the global convergence result and a sublinear convergence rate in the ergodic sense for the new algorithms. Preliminary numerical results show that the proposed algorithms are effective for solving linearly constrained quadratic programming and robust principal component analysis problems.
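The defining feature of the symmetric (Peaceman–Rachford-type) schemes above, updating the multiplier twice per iteration, can be shown on a two-block toy problem. This sketch is not the MPSADMM of the article (in particular it omits the proximal terms and the correction step); it is a plain strictly contractive symmetric ADMM for minimizing 0.5*(x - a)^2 + |z| subject to x - z = 0.

```python
def soft(v, t):
    """Soft-threshold: proximal map of t * |.| (scalar version)."""
    return (1.0 if v > 0 else -1.0) * max(abs(v) - t, 0.0)

def symmetric_admm(a, rho=1.0, s=0.9, iters=200):
    """Strictly contractive symmetric ADMM sketch with the multiplier
    updated twice per iteration (relaxation factor s in (0, 1)):
        minimize 0.5*(x - a)^2 + |z|   s.t.   x - z = 0.
    """
    x = z = y = 0.0
    for _ in range(iters):
        x = (a - y + rho * z) / (1.0 + rho)  # first block: exact minimization
        y += s * rho * (x - z)               # first multiplier update
        z = soft(x + y / rho, 1.0 / rho)     # second block: proximal step
        y += s * rho * (x - z)               # second multiplier update
    return x, z, y

x, z, y = symmetric_admm(3.0)
# the minimizer of 0.5*(x - 3)^2 + |x| is soft(3, 1) = 2, with multiplier 1
```

The strict contraction factor s < 1 is what guarantees convergence here without extra assumptions; with s = 1 (the pure Peaceman–Rachford update) convergence can fail for general convex problems.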


Author(s):  
Ion Necoara ◽  
Martin Takáč

Abstract In this paper we consider large-scale smooth optimization problems with multiple linear coupled constraints. Due to the non-separability of the constraints, arbitrary random sketching is not guaranteed to work. Thus, we first investigate necessary and sufficient conditions on the sketch sampling for the algorithms to be well defined. Based on these sampling conditions we develop new sketch descent methods for solving general smooth linearly constrained problems, in particular, random sketch descent (RSD) and accelerated random sketch descent (A-RSD) methods. To our knowledge, this is the first convergence analysis of RSD algorithms for optimization problems with multiple non-separable linear constraints. For the general case, when the objective function is smooth and non-convex, we prove a sublinear rate in expectation for the non-accelerated variant, with respect to an appropriate optimality measure. In the smooth convex case, we derive for both algorithms, non-accelerated and A-RSD, sublinear convergence rates in the expected values of the objective function. Additionally, if the objective function satisfies a strong convexity type condition, both algorithms converge linearly in expectation. In special cases, where complexity bounds are known for some particular sketching algorithms, such as coordinate descent methods for optimization problems with a single linear coupled constraint, our theory recovers the best known bounds. Finally, we present several numerical examples to illustrate the performance of our new algorithms.
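The special case the theory recovers, coordinate descent under a single linear coupled constraint, has a classical concrete form: pick a random pair of coordinates and move along the feasible direction e_i - e_j, which leaves the constraint exactly satisfied. The sketch below (an illustration under these assumptions, not the authors' RSD code) applies this to projecting a point onto a sum constraint.

```python
import numpy as np

rng = np.random.default_rng(0)

def rsd_pair(c, b, iters=2000):
    """Pairwise random coordinate descent for
        minimize 0.5*||x - c||^2   s.t.   sum(x) = b.
    The "sketch" is restricted to directions e_i - e_j, which span the
    feasible subspace of the single coupled constraint, so every iterate
    stays exactly feasible."""
    n = len(c)
    x = np.full(n, b / n)  # feasible starting point
    for _ in range(iters):
        i, j = rng.choice(n, size=2, replace=False)
        # exact line minimization along e_i - e_j
        delta = ((c[i] - x[i]) - (c[j] - x[j])) / 2.0
        x[i] += delta
        x[j] -= delta
    return x

x = rsd_pair(np.array([3.0, -1.0, 2.0]), b=1.0)
# the solution is the projection of c onto {sum(x) = 1}: c + (1 - sum(c))/n
```

In residual terms each pair update simply averages two components of x - x*, so feasibility is preserved exactly while the error contracts geometrically; the article's sampling conditions generalize exactly this "directions must span the feasible subspace" requirement to multiple non-separable constraints.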


2007 ◽  
Vol 21 (4) ◽  
pp. 611-621 ◽  
Author(s):  
Karthik Natarajan ◽  
Zhou Linyi

In this article, we derive a tight closed-form upper bound on the expected value of a three-piece linear convex function E[max(0, X, mX − z)] given the mean μ and the variance σ2 of the random variable X. The bound is an extension of the well-known mean–variance bound for E[max(0, X)]. An application of the bound to price the strangle option in finance is provided.
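The classical two-piece bound that the article extends is easy to state and check numerically: over all distributions with mean mu and standard deviation sigma, E[max(0, X)] <= (mu + sqrt(mu^2 + sigma^2)) / 2, and the bound is tight. The snippet below computes this classical bound only (the article's three-piece closed form is not reproduced here) and compares it against a Monte Carlo estimate for a normal distribution with the same moments.

```python
import numpy as np

def mv_bound(mu, sigma):
    """Tight upper bound on E[max(0, X)] over all distributions with
    mean mu and standard deviation sigma (the classical mean-variance
    bound that the article extends to the three-piece case)."""
    return 0.5 * (mu + np.hypot(mu, sigma))

rng = np.random.default_rng(1)
mu, sigma = 0.5, 2.0
samples = rng.normal(mu, sigma, size=200_000)
mc = np.maximum(samples, 0.0).mean()  # Monte Carlo E[(X)^+] for one member
bound = mv_bound(mu, sigma)           # must dominate every member of the family
```

Since E[max(0, X - k)] is the payoff expectation of a call struck at k, bounds of this type translate directly into distribution-free option price bounds, which is how the article applies its three-piece extension to the strangle.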


2020 ◽  
Vol 2020 ◽  
pp. 1-13
Author(s):  
Sha Lu ◽  
Zengxin Wei

The proximal point algorithm is a class of methods widely used in recent years for solving optimization problems and practical problems such as machine learning. In this paper, a framework of accelerated proximal point algorithms is presented for convex minimization with linear constraints. The algorithm can be seen as an extension of Güler's methods for unconstrained optimization and linear programming problems. We prove that the sequence generated by the algorithm converges to a KKT solution of the original problem under appropriate conditions, with a convergence rate of O(1/k²).
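The unconstrained Güler-style acceleration underlying this framework pairs an exact proximal step with Nesterov-type momentum. The sketch below illustrates that pattern on the toy function f(x) = |x|, whose proximal map is the soft-threshold operator; it is an unconstrained illustration only, not the paper's constrained KKT variant.

```python
import numpy as np

def soft(v, t):
    """Proximal map of t * |.| (soft-threshold)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def accelerated_ppa(x0, lam=0.5, iters=100):
    """Guler-style accelerated proximal point iteration on f(x) = |x|:
    an exact proximal step followed by a Nesterov momentum extrapolation,
    giving the O(1/k^2) rate in function value."""
    x_prev = x0.copy()
    y = x0.copy()
    t = 1.0
    for _ in range(iters):
        x = soft(y, lam)  # exact proximal (implicit) step
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x + ((t - 1.0) / t_next) * (x - x_prev)  # momentum extrapolation
        x_prev, t = x, t_next
    return x_prev

x = accelerated_ppa(np.array([5.0]))  # minimizer of |x| is 0
```

Replacing the exact proximal step with an approximate one, and handling the linear constraints through the dual, is where the paper's analysis goes beyond this toy picture.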


Energy ◽  
2020 ◽  
Vol 208 ◽  
pp. 118306 ◽  
Author(s):  
Mohamed A. Mohamed ◽  
Tao Jin ◽  
Wencong Su
