Distributed Computational Framework for Large-Scale Stochastic Convex Optimization

Energies ◽  
2020 ◽  
Vol 14 (1) ◽  
pp. 23
Author(s):  
Vahab Rostampour ◽  
Tamás Keviczky

This paper presents a distributed computational framework for stochastic convex optimization problems using the so-called scenario approach. Such problems arise, for example, in large-scale networks of interconnected linear systems with local and common uncertainties. Due to the large number of scenarios required to approximate the stochasticity of these problems, the scenario approach leads to a large-scale scenario program, which is in general computationally demanding. We present two novel ideas in this paper to address this issue. We first develop a technique to decompose the large-scale scenario program into distributed scenario programs that exchange a certain number of scenarios with each other to compute local decisions using the alternating direction method of multipliers (ADMM). We show the exactness of the decomposition with a priori probabilistic guarantees for the desired level of constraint fulfillment for both local and common uncertainty sources. As our second contribution, we develop a so-called soft communication scheme based on a set parametrization technique together with the notion of probabilistically reliable sets to reduce the required communication between the subproblems. We show how to incorporate the probabilistic reliability notion into existing results and provide new guarantees for the desired level of constraint violation. Two simulation studies on two types of interconnected networks, namely networks with dynamical coupling and networks with coupling constraints, are presented to illustrate the advantages of the proposed distributed framework.
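The decomposition above hinges on consensus-style ADMM updates between the distributed scenario programs. The following minimal sketch (not the paper's exact formulation) illustrates that structure for a toy setting: a few agents, each holding its own randomly sampled scenario constraints, reach agreement on a shared decision through consensus ADMM. The problem data, dimensions, and penalty parameter rho are all hypothetical.

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n_agents, dim, n_scenarios, rho = 3, 2, 50, 1.0

# Hypothetical per-agent scenario data: sampled constraints a_s^T x <= b_s and a local linear cost.
A = [rng.normal(size=(n_scenarios, dim)) for _ in range(n_agents)]
b = [1.0 + 0.1 * rng.random(n_scenarios) for _ in range(n_agents)]
c = [rng.normal(size=dim) for _ in range(n_agents)]

x = [np.zeros(dim) for _ in range(n_agents)]   # local copies of the decision
u = [np.zeros(dim) for _ in range(n_agents)]   # scaled dual variables
z = np.zeros(dim)                              # consensus (shared) decision

for it in range(50):
    # x-update: each agent solves its own scenario program (in parallel in a real implementation).
    for i in range(n_agents):
        xi = cp.Variable(dim)
        objective = c[i] @ xi + (rho / 2) * cp.sum_squares(xi - z + u[i])
        cp.Problem(cp.Minimize(objective), [A[i] @ xi <= b[i]]).solve()
        x[i] = xi.value
    # z-update and dual update: enforce agreement between the local copies.
    z = np.mean([x[i] + u[i] for i in range(n_agents)], axis=0)
    for i in range(n_agents):
        u[i] += x[i] - z

print("consensus decision:", np.round(z, 3))
```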

Author(s):  
Alexander Murray ◽  
Timm Faulwasser ◽  
Veit Hagenmeyer ◽  
Mario E. Villanueva ◽  
Boris Houska

This paper presents a novel partially distributed outer approximation algorithm, named PaDOA, for solving a class of structured mixed-integer convex programming problems to global optimality. The proposed scheme uses an iterative outer approximation method for coupled mixed-integer optimization problems with separable convex objective functions, affine coupling constraints, and compact domain. PaDOA proceeds by alternating between solving large-scale structured mixed-integer linear programming problems and partially decoupled mixed-integer nonlinear programming subproblems that comprise far fewer integer variables. We establish conditions under which PaDOA converges to global minimizers after a finite number of iterations and verify these properties with applications to thermostatically controlled loads and to mixed-integer regression.
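To make the alternation between master problems and subproblems concrete, here is a toy outer-approximation loop in the same spirit (generic OA, not PaDOA itself): an NLP subproblem is solved with the integer variable fixed, gradient cuts tighten a lower bound, and the single integer variable is enumerated in place of a true MILP master. All problem data are made up for illustration.

```python
import numpy as np
from scipy.optimize import linprog

def f(x, y):
    return (x - 3.3) ** 2 + (y - 2.6) ** 2

def grad_f(x, y):
    return np.array([2 * (x - 3.3), 2 * (y - 2.6)])

# Toy MINLP: min f(x, y)  s.t.  x >= y,  x in [0, 10],  y in {0, ..., 10}.
cuts = []                                   # gradient cuts: eta >= f_k + g_k . ((x, y) - (x_k, y_k))
y_cur, y_best, upper, lower = 0, 0, np.inf, -np.inf

for it in range(20):
    # NLP subproblem: fix the integer y, minimize over x (closed form for this separable toy).
    x_cur = max(3.3, float(y_cur))          # unconstrained minimizer projected onto x >= y
    val = f(x_cur, y_cur)
    if val < upper:
        upper, y_best = val, y_cur
    cuts.append((x_cur, y_cur, val, grad_f(x_cur, y_cur)))

    # Master problem: minimize eta subject to all cuts; the single integer variable is
    # enumerated here for clarity instead of calling a MILP solver.
    best = (np.inf, 0)
    for y in range(11):
        A_ub, b_ub = [], []
        for (xk, yk, fk, gk) in cuts:       # fk + gk[0](x - xk) + gk[1](y - yk) <= eta
            A_ub.append([gk[0], -1.0])
            b_ub.append(-fk + gk[0] * xk + gk[1] * (yk - y))
        res = linprog(c=[0.0, 1.0], A_ub=A_ub, b_ub=b_ub,
                      bounds=[(max(0.0, y), 10.0), (None, None)], method="highs")
        if res.success and res.fun < best[0]:
            best = (res.fun, y)
    lower, y_cur = best
    if upper - lower < 1e-6:                # finite termination once the OA gap closes
        break

print("incumbent y:", y_best, "objective:", round(upper, 4))
```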


2021 ◽  
Author(s):  
Vishal Gupta ◽  
Nathan Kallus

Managing large-scale systems often involves simultaneously solving thousands of unrelated stochastic optimization problems, each with limited data. Intuition suggests that one can decouple these unrelated problems and solve them separately without loss of generality. We propose a novel data-pooling algorithm called Shrunken-SAA that disproves this intuition. In particular, we prove that combining data across problems can outperform decoupling, even when there is no a priori structure linking the problems and data are drawn independently. Our approach does not require strong distributional assumptions and applies to constrained, possibly nonconvex, nonsmooth optimization problems such as vehicle routing, economic lot-sizing, or facility location. We compare and contrast our results to a similar phenomenon in statistics (Stein’s phenomenon), highlighting unique features that arise in the optimization setting that are not present in estimation. We further prove that, as the number of problems grows large, Shrunken-SAA learns if pooling can improve upon decoupling and the optimal amount to pool, even if the average amount of data per problem is fixed and bounded. Importantly, we present a simple intuition based on stability that explains when and why data pooling offers a benefit, elucidating this perhaps surprising phenomenon. This intuition further suggests that data pooling offers the most benefits when there are many problems, each of which has a small amount of relevant data. Finally, we demonstrate the practical benefits of data pooling using real data from a chain of retail drug stores in the context of inventory management. This paper was accepted by Chung Piaw Teo, Special Issue on Data-Driven Prescriptive Analytics.
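A stylized sketch of the data-pooling idea follows, under simplifying assumptions that differ from the paper's estimator: each of many small newsvendor problems shrinks its empirical demand distribution toward the pooled distribution by an amount alpha, and alpha is chosen by a crude leave-one-out score averaged across problems (alpha = 0 recovers fully decoupled SAA). The costs, demand model, and alpha grid are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
K, N = 50, 8                                   # many problems, little data each
cu, co = 4.0, 1.0                              # underage / overage costs
crit = cu / (cu + co)                          # newsvendor critical ratio

true_means = rng.uniform(5, 15, size=K)
data = [rng.poisson(mu, size=N) for mu in true_means]   # hypothetical demand samples
pooled = np.concatenate(data)

def cost(q, d):
    return cu * np.maximum(d - q, 0) + co * np.maximum(q - d, 0)

def decision(samples, alpha):
    # Shrunken empirical distribution: own samples keep weight n/(n+alpha), pooled gets alpha/(n+alpha).
    n = len(samples)
    pts = np.concatenate([samples, pooled])
    w = np.concatenate([np.full(n, 1.0 / (n + alpha)),
                        np.full(len(pooled), (alpha / (n + alpha)) / len(pooled))])
    order = np.argsort(pts)
    cdf = np.cumsum(w[order])
    return pts[order][np.searchsorted(cdf, crit)]        # weighted quantile = SAA decision

# Choose alpha by a crude leave-one-out score averaged across all problems
# (for brevity the held-out point is not removed from the pooled set).
alphas = [0.0, 0.5, 1.0, 2.0, 4.0, 8.0, 16.0]
scores = []
for alpha in alphas:
    total = 0.0
    for d in data:
        for j in range(N):
            total += cost(decision(np.delete(d, j), alpha), d[j])
    scores.append(total / (K * N))

print("chosen pooling amount alpha:", alphas[int(np.argmin(scores))])
```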


2020 ◽  
Vol 85 (2) ◽  
Author(s):  
Radu Ioan Boţ ◽  
Axel Böhm

We aim to solve a structured convex optimization problem in which a nonsmooth function is composed with a linear operator. When opting for full splitting schemes, primal-dual type methods are usually employed, as they are effective and well studied. However, under the additional assumption of Lipschitz continuity of the nonsmooth function that is composed with the linear operator, we can derive novel algorithms through regularization via the Moreau envelope. Furthermore, we tackle large-scale problems by means of stochastic oracle calls, very similar to stochastic gradient techniques. Applications to total variation denoising and deblurring, and to matrix factorization, are provided.
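The regularization idea can be illustrated on a small total variation denoising instance: the nonsmooth l1 term composed with a difference operator is replaced by its Moreau envelope, whose gradient is available through the proximal map, and plain (full rather than stochastic) gradient descent is applied. The signal, noise level, and smoothing parameter below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)
n, lam, mu = 200, 0.5, 0.05
step = 1.0 / (1.0 + 4.0 / mu)                            # 1 / (Lipschitz bound of the smoothed gradient)

signal = np.repeat([1.0, 3.0, 2.0, 0.0], n // 4)         # piecewise-constant ground truth
y = signal + 0.3 * rng.normal(size=n)                    # noisy observation

D = np.eye(n, k=1)[:-1] - np.eye(n)[:-1]                 # forward-difference operator

def prox_l1(v, t):
    # Proximal map of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

# min_x 0.5*||x - y||^2 + lam*||D x||_1, with the l1 term replaced by its Moreau envelope.
x = y.copy()
for _ in range(1500):
    Dx = D @ x
    grad_env = (Dx - prox_l1(Dx, lam * mu)) / mu         # gradient of the envelope, evaluated at Dx
    x -= step * ((x - y) + D.T @ grad_env)

print("RMS error vs. ground truth:", round(float(np.linalg.norm(x - signal)) / np.sqrt(n), 3))
```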


2020 ◽  
Vol 34 (04) ◽  
pp. 6981-6988
Author(s):  
Zhou Zhai ◽  
Bin Gu ◽  
Xiang Li ◽  
Heng Huang

The robust support vector machine (RSVM) has been shown to remarkably improve the generalization performance of the support vector machine in noisy environments. Unfortunately, in order to handle the non-convexity induced by the ramp loss in RSVM, existing RSVM solvers often adopt the DC programming framework, which is computationally inefficient because it runs multiple outer loops. This hinders the application of RSVM to large-scale problems. Safe sample screening, which allows training samples to be excluded before or early in the training process, is an effective method to greatly reduce computational time. However, existing safe sample screening algorithms are limited to convex optimization problems, while RSVM is a non-convex problem. To address this challenge, in this paper we propose two safe sample screening rules for RSVM based on the framework of the concave-convex procedure (CCCP). Specifically, we provide a screening rule for the inner solver of CCCP and another rule for propagating screened samples between two successive solvers of CCCP. To the best of our knowledge, this is the first work to apply safe sample screening to a non-convex optimization problem. More importantly, we provide safety guarantees for our sample screening rules for RSVM. Experimental results on a variety of benchmark datasets verify that our safe sample screening rules can significantly reduce the computational time.
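For context, the sketch below shows how CCCP handles the ramp loss for a linear RSVM: each outer iteration flags the samples on which the concave part is active and then solves a convex subproblem (here by plain subgradient descent). The paper's safe screening rules, which prune samples before and between these subproblems, are not reproduced; all data and hyperparameters are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)
n, d, C, s = 200, 5, 0.05, -1.0                      # s < 1 is the ramp loss's lower knee

X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = np.sign(X @ w_true + 0.1 * rng.normal(size=n))
y[rng.choice(n, size=20, replace=False)] *= -1       # label noise: the motivation for the ramp loss

w = np.zeros(d)
for outer in range(5):                               # CCCP outer loop
    flagged = (y * (X @ w)) < s                      # samples where the concave part is active
    # Convex subproblem: 0.5||w||^2 + C*sum_clean hinge(z_i) + C*sum_flagged max(z_i, 1),
    # so flagged "outliers" exert no pull while their margin stays below 1.
    # (The paper's screening rules would prune samples here; omitted in this sketch.)
    for it in range(300):                            # plain subgradient descent for the sketch
        z = y * (X @ w)
        g = w.copy()
        act_hinge = (~flagged) & (z < 1)
        g -= C * (y[act_hinge, None] * X[act_hinge]).sum(axis=0)
        act_flag = flagged & (z > 1)
        g += C * (y[act_flag, None] * X[act_flag]).sum(axis=0)
        w -= (1.0 / np.sqrt(1 + it)) * g

print("training accuracy under label noise:", round(float(np.mean(np.sign(X @ w) == y)), 3))
```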


Author(s):  
Mohammad Kiani-Moghaddam ◽  
Mojtaba Shivaie

In this book chapter, the authors present an innovative strategy to enhance the performance of music-inspired algorithms. In this strategy, by using multiple inhomogeneous music players and three different well-organized improvisation stages, an innovative symphony orchestra search algorithm (SOSA) is proposed to solve large-scale non-linear, non-convex optimization problems. Using multiple inhomogeneous music players with different tastes, ideas, and experiences can guide the players to choose better pitches and increase the probability of playing a better melody. The superiority of the newly proposed algorithm over other music-inspired algorithms becomes more pronounced as the feasible region of the solution space and/or the dimensionality of the optimization problem increases. The network expansion planning (NEP) problem is employed to evaluate the performance of the newly proposed SOSA against other existing optimization algorithms. The NEP problem is a large-scale non-convex optimization problem with a non-linear, mixed-integer nature.
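Since the chapter's pseudocode is not reproduced here, the following is only a generic harmony-search-style sketch of the multiple-inhomogeneous-players idea: several "players", each with its own memory and its own acceptance/adjustment rates, improvise candidate solutions and occasionally exchange their best melody. The benchmark function, rates, and exchange schedule are illustrative assumptions, not SOSA itself.

```python
import numpy as np

rng = np.random.default_rng(4)

def rastrigin(x):
    # A standard non-convex benchmark, standing in here for the NEP problem of the chapter.
    return float(10 * x.size + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x)))

dim, n_players, mem_size, iters = 10, 3, 15, 3000
lo, hi = -5.12, 5.12
# Each "player" keeps its own harmony memory and its own taste (memory-consider / pitch-adjust rates).
memories = [rng.uniform(lo, hi, size=(mem_size, dim)) for _ in range(n_players)]
tastes = [(0.85, 0.30), (0.95, 0.10), (0.70, 0.50)]

def improvise(memory, hmcr, par):
    new = np.empty(dim)
    for j in range(dim):
        if rng.random() < hmcr:                        # reuse a pitch from memory ...
            new[j] = memory[rng.integers(mem_size), j]
            if rng.random() < par:                     # ... possibly slightly adjusted
                new[j] += rng.normal(scale=0.05)
        else:                                          # ... or play a brand-new pitch
            new[j] = rng.uniform(lo, hi)
    return np.clip(new, lo, hi)

for it in range(iters):
    for p in range(n_players):
        cand = improvise(memories[p], *tastes[p])
        worst = int(np.argmax([rastrigin(x) for x in memories[p]]))
        if rastrigin(cand) < rastrigin(memories[p][worst]):
            memories[p][worst] = cand
    if it % 200 == 0:                                  # occasionally share the best melody found
        best_all = min((x for m in memories for x in m), key=rastrigin)
        for p in range(n_players):
            memories[p][rng.integers(mem_size)] = best_all.copy()

best = min((x for m in memories for x in m), key=rastrigin)
print("best objective found:", round(rastrigin(best), 4))
```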


1999 ◽  
Vol 121 (3) ◽  
pp. 443-447 ◽  
Author(s):  
Sungyung Lim ◽  
Homer D. Stevens ◽  
Jonathan P. How

This paper investigates a new design technique for input shaping filters for multi-input flexible systems using convex optimization synthesis techniques for finite impulse response (FIR) filters. The objective of the input shaping filter design is to find the minimum length and the minimum number of nonzero impulses of the FIR filter that forces the system to track the reference command without any residual vibration, while satisfying additional performance and control constraints. This multi-objective optimization is solved using a two-step algorithm that sequentially solves two quasi-convex optimization problems. Compared with previously published nonlinear optimization approaches, this new approach does not require a priori knowledge of the form of the input shaping filter and enables much greater flexibility for including additional performance and robustness objectives. Furthermore, this convex optimization based approach can be applied to multi-input systems. The multiple input shaping filter has been experimentally verified on the Stanford University Two-Link Flexible Manipulator.
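To illustrate why the FIR view makes the design convex, the sketch below poses zero residual vibration at a single flexible mode as two linear equality constraints on a nonnegative impulse sequence and solves a small convex program for a fixed filter length; bisection over the length would supply the quasi-convex outer step, and minimizing the impulse count is left to the paper's two-step algorithm. The mode frequency, damping, sampling rate, and objective are hypothetical.

```python
import numpy as np
import cvxpy as cp

wn, zeta, fs = 2 * np.pi * 1.0, 0.02, 50.0           # hypothetical 1 Hz flexible mode, light damping
wd = wn * np.sqrt(1 - zeta ** 2)                      # damped natural frequency
N = 60                                                # candidate FIR filter length (samples)
t = np.arange(N) / fs

# Residual vibration of an impulse sequence h is proportional to these two weighted sums.
c_vec = np.exp(zeta * wn * t) * np.cos(wd * t)
s_vec = np.exp(zeta * wn * t) * np.sin(wd * t)

h = cp.Variable(N, nonneg=True)                       # nonnegative impulses avoid command overshoot
constraints = [cp.sum(h) == 1,                        # unity gain: track the reference command
               c_vec @ h == 0,                        # zero residual vibration (cosine component)
               s_vec @ h == 0]                        # zero residual vibration (sine component)
# Illustrative convex objective; the paper instead minimizes filter length and impulse count
# through its two-step quasi-convex scheme.
prob = cp.Problem(cp.Minimize(cp.max(h)), constraints)
prob.solve()

print("status:", prob.status, "| impulses above 1e-6:", int(np.sum(h.value > 1e-6)))
```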


Author(s):  
Paul Cronin ◽  
Harry Woerde ◽  
Rob Vasbinder
