Large-Scale Stochastic Linear Programs: Importance Sampling and Benders Decomposition

Author(s): George B. Dantzig, Gerd Infanger
2009, Vol 127 (2), pp. 371-397

Author(s): Marco Colombo, Jacek Gondzio, Andreas Grothey

Author(s): Harsha Gangammanavar, Yifan Liu, Suvrajeet Sen

Stochastic decomposition (SD) has been a computationally effective approach for solving large-scale stochastic programming (SP) problems arising in practical applications. By using incremental sampling, this approach is designed to discover an appropriate sample size for a given SP instance, thus precluding the need for either scenario reduction or arbitrary sample sizes to create sample average approximations (SAA). Compared with solutions obtained using the SAA procedure, SD provides solutions of similar quality in far less time on ordinarily available computing resources. However, previous versions of SD were not applicable to problems with randomness in the second-stage cost coefficients. In this paper, we extend SD's capabilities by relaxing this assumption on the second-stage cost coefficients. In addition to the algorithmic enhancements necessary to achieve this, we present the details of implementing these extensions in a way that preserves the computational edge of SD. Finally, we report computational results from the latest implementation of SD on a variety of test instances generated for problems from the literature, and compare them with results obtained from the regularized L-shaped method applied to the SAA function of these problems with different sample sizes.
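To make the SAA idea referenced above concrete, the following is a minimal illustrative sketch (not the authors' method or code): a toy two-stage problem in newsvendor form, where the expected second-stage recourse cost is replaced by an average over N sampled demand scenarios and the first stage is minimized over a candidate grid. All names, costs, and the demand distribution here are hypothetical choices for illustration only.

```python
import random

def second_stage_cost(x, d, p=4.0, s=1.0):
    # Recourse cost Q(x, d) for demand scenario d: shortage is penalized
    # at rate p; leftover stock is salvaged at rate s (a cost reduction).
    return p * max(d - x, 0.0) - s * max(x - d, 0.0)

def saa_objective(x, scenarios, c=2.0):
    # Sample average approximation of c*x + E[Q(x, D)]:
    # the expectation is replaced by the mean over the sampled scenarios.
    return c * x + sum(second_stage_cost(x, d) for d in scenarios) / len(scenarios)

def solve_saa(scenarios, grid):
    # Minimize the SAA objective over a candidate grid of first-stage decisions.
    # (A real solver would use an LP method such as the L-shaped algorithm.)
    return min(grid, key=lambda x: saa_objective(x, scenarios))

random.seed(0)
demands = [random.uniform(50.0, 150.0) for _ in range(1000)]  # N = 1000 scenarios
grid = [float(x) for x in range(50, 151)]
x_star = solve_saa(demands, grid)
```

In the SAA procedure critiqued in the abstract, the sample size N must be fixed in advance; SD instead grows the sample incrementally, adding scenarios until the solution quality stabilizes, which is what removes the need to pick N (or a scenario-reduction scheme) up front.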

