Fast Best Subset Selection: Coordinate Descent and Local Combinatorial Optimization Algorithms

2020 ◽  
Vol 68 (5) ◽  
pp. 1517-1537 ◽  
Author(s):  
Hussein Hazimeh ◽  
Rahul Mazumder

In several scientific and industrial applications, it is desirable to build compact, interpretable learning models where the output depends on a small number of input features. Recent work has shown that such best-subset selection-type problems can be solved with modern mixed integer optimization solvers. Despite their promise, such solvers often come at a steep computational price when compared with open-source, efficient specialized solvers based on convex optimization and greedy heuristics. In "Fast Best-Subset Selection: Coordinate Descent and Local Combinatorial Optimization Algorithms," Hussein Hazimeh and Rahul Mazumder push the frontiers of computation for best-subset-type problems. Their algorithms deliver near-optimal solutions for problems with up to a million features, in times comparable with those of fast convex solvers. Their work suggests that principled optimization methods play a key role in devising tools central to interpretable machine learning, which can in turn help in gaining a deeper understanding of the statistical properties of these tools.
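To make the flavor of these methods concrete, the following is a minimal sketch of cyclic coordinate descent with hard thresholding for the L0-penalized least-squares problem. It is a toy illustration of the core coordinate update only, not the authors' L0Learn implementation, which adds local combinatorial search, active sets, and continuation over the regularization path.

```python
import numpy as np

def l0_coordinate_descent(X, y, lam, n_iter=100):
    """Cyclic coordinate descent for  1/2 ||y - X b||^2 + lam * ||b||_0.

    Assumes the columns of X have unit l2 norm; each coordinate update
    then reduces to a hard-thresholding step.  Toy sketch only.
    """
    n, p = X.shape
    b = np.zeros(p)
    r = y - X @ b                       # current residual
    thresh = np.sqrt(2.0 * lam)         # hard-threshold level
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * b[j]         # remove coordinate j from the fit
            bj = X[:, j] @ r            # unpenalized coordinate minimizer
            b[j] = bj if abs(bj) > thresh else 0.0   # hard threshold
            r -= X[:, j] * b[j]         # add coordinate j back
    return b

# Tiny usage example on synthetic data with a 3-variable true support.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))
X /= np.linalg.norm(X, axis=0)          # unit-norm columns
beta_true = np.zeros(50)
beta_true[:3] = [4.0, -3.0, 2.0]
y = X @ beta_true + 0.1 * rng.standard_normal(200)
print(np.nonzero(l0_coordinate_descent(X, y, lam=0.05))[0])
```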

Author(s):  
Josef Jablonský

Linear programming (LP) and mixed integer linear programming (MILP) problems form a very important class of problems with applications in many managerial contexts. The aim of the paper is to discuss the computational performance of current optimization packages for solving large-scale LP and MILP optimization problems. The current market for LP and MILP solvers is quite extensive. GUROBI 6.0, IBM ILOG CPLEX 12.6.1, and XPRESS Optimizer 27.01 are probably among the most powerful solvers. Their attractiveness for academic research stems not only from their computational performance but also from their free availability for academic purposes. The solvers are tested on a set of problems selected from the MIPLIB 2010 library, which contains 361 test instances of varying hardness (easy, hard, and not solved).
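As an illustration of the kind of workflow used to time a solver run (not the paper's actual MIPLIB 2010 benchmarks, which use the commercial solvers named above), the sketch below builds a small 0-1 knapsack MILP with the open-source PuLP interface and its bundled CBC solver, then measures the solve time.

```python
import time
from pulp import LpProblem, LpMaximize, LpVariable, lpSum, LpStatus, value

# Small 0-1 knapsack MILP: maximize profit subject to one capacity constraint.
profits = [10, 13, 18, 31, 7, 15]
weights = [2, 3, 4, 6, 1, 5]
capacity = 10

prob = LpProblem("knapsack", LpMaximize)
x = [LpVariable(f"x{i}", cat="Binary") for i in range(len(profits))]
prob += lpSum(p * xi for p, xi in zip(profits, x))             # objective
prob += lpSum(w * xi for w, xi in zip(weights, x)) <= capacity  # capacity

t0 = time.perf_counter()
prob.solve()                      # default bundled CBC solver
elapsed = time.perf_counter() - t0

print(LpStatus[prob.status], value(prob.objective), f"{elapsed:.3f}s")
print([int(value(xi)) for xi in x])
```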


2019 ◽  
Vol 14 (4) ◽  
pp. 889-924 ◽  
Author(s):  
Michael J. Risbeck ◽  
Christos T. Maravelias ◽  
James B. Rawlings ◽  
Robert D. Turney

Author(s):  
Cheng Seong Khor

The chapter focuses on recent advancements in commercial integer optimization solvers as exemplified by the CPLEX software package, with particular but not exclusive attention to mixed-integer linear programming (MILP) models applied to business intelligence applications. We provide background on the main underlying algorithmic method of branch-and-cut, which is based on the established optimization solution methods of branch-and-bound and cutting planes. The chapter also covers heuristic-based algorithms, which include preprocessing and probing strategies as well as the more advanced methods of local or neighborhood search for polishing solutions toward enhanced use in practical settings. Emphasis is given to both the theory and the implementation of the methods available. Other considerations are offered on parallelization, solution pools, and tuning tools, culminating in some concluding remarks on computational performance vis-à-vis business intelligence applications, with a view toward future work in this area.
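The bound-and-prune skeleton underlying branch-and-cut can be sketched on a toy 0-1 knapsack problem. The code below is an illustrative best-first branch and bound with a greedy LP-relaxation bound; it omits the cutting planes, presolve, and heuristics that a solver such as CPLEX layers on top of this basic scheme.

```python
import heapq

def knapsack_branch_and_bound(profits, weights, capacity):
    """Toy best-first branch and bound for the 0-1 knapsack problem."""
    n = len(profits)
    # Sort items by profit/weight ratio so the greedy relaxation is a valid bound.
    order = sorted(range(n), key=lambda i: profits[i] / weights[i], reverse=True)
    p = [profits[i] for i in order]
    w = [weights[i] for i in order]

    def lp_bound(level, prof, cap):
        # Greedy fractional relaxation: fill remaining capacity in ratio order.
        bound = prof
        for i in range(level, n):
            if w[i] <= cap:
                cap -= w[i]
                bound += p[i]
            else:
                return bound + p[i] * cap / w[i]   # fractional last item
        return bound

    best = 0
    # Max-heap on the bound (negated because heapq is a min-heap).
    heap = [(-lp_bound(0, 0, capacity), 0, 0, capacity)]
    while heap:
        neg_bound, level, prof, cap = heapq.heappop(heap)
        if -neg_bound <= best or level == n:
            continue                               # prune: cannot beat incumbent
        if w[level] <= cap:                        # branch 1: take the item
            new_prof = prof + p[level]
            best = max(best, new_prof)
            heapq.heappush(heap, (-lp_bound(level + 1, new_prof, cap - w[level]),
                                  level + 1, new_prof, cap - w[level]))
        heapq.heappush(heap, (-lp_bound(level + 1, prof, cap),   # branch 2: skip it
                              level + 1, prof, cap))
    return best

print(knapsack_branch_and_bound([10, 13, 18, 31, 7, 15], [2, 3, 4, 6, 1, 5], 10))
```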


2021 ◽  
Author(s):  
Diana Spieler ◽  
Niels Schütze

Recent investigations have shown it is possible to simultaneously calibrate model structures and model parameters to identify appropriate models for a given task (Spieler et al., 2020). However, this is computationally challenging, as different model structures may use a different number of parameters. While some parameters may be shared between model structures, others might be relevant for only a few structures, which theoretically requires the calibration of conditionally active parameters. Additionally, shared model parameters might cause different effects in different model structures, causing their optimal values to differ across structures. In this study, we test how two current "off the shelf" mixed-integer optimization algorithms perform when handling these peculiarities during the automatic model structure identification (AMSI) process recently introduced by Spieler et al. (2020).

To validate the current performance of the AMSI approach, we conduct a benchmark experiment with a model space consisting of 6912 different model structures. First, all model structures are independently calibrated and validated for three hydro-climatically differing catchments using the CMA-ES algorithm and KGE as the objective function; this is referred to as the standard calibration procedure. We identify the best performing model structure(s) based on validation performance and analyze the range of performance as well as the number of structures performing in a similar range. Second, we run AMSI on all three catchments to automatically identify the most feasible model structure based on KGE performance, using two different mixed-integer optimization algorithms, namely DDS and CMA-ES. Afterwards, we compare the results to the best performing models from the standard calibration of all 6912 model structures.

Within this experimental setup, we analyze whether the best performing model structure(s) identified by AMSI are identical to the best performing structures of the standard calibration, and whether performance differs when different optimization algorithms are used for AMSI. We also assess whether AMSI, using "off the shelf" mixed-integer optimization algorithms, can identify the best performing model structures for a catchment at a fraction of the computational cost required by the standard calibration procedure.

Spieler, D., Mai, J., Craig, J. R., Tolson, B. A., & Schütze, N. (2020). Automatic Model Structure Identification for Conceptual Hydrologic Models. Water Resources Research, 56(9). https://doi.org/10.1029/2019WR027009
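As a hedged illustration of the objective such a calibration optimizes, the sketch below computes the Kling-Gupta efficiency and wraps a hypothetical model runner so that structural choices are encoded as integer decision variables and the remaining entries as continuous parameters. The `run_model` function and the variable layout are assumptions for illustration only, not the AMSI implementation.

```python
import numpy as np

def kge(sim, obs):
    """Kling-Gupta efficiency (Gupta et al., 2009); 1 indicates a perfect fit."""
    r = np.corrcoef(sim, obs)[0, 1]              # linear correlation
    alpha = np.std(sim) / np.std(obs)            # variability ratio
    beta = np.mean(sim) / np.mean(obs)           # bias ratio
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

def mixed_integer_objective(x, run_model, obs, n_structure_vars):
    """Illustrative mixed-integer objective: the first n_structure_vars entries
    of x encode structural choices and are rounded to integers before the model
    run; the remaining entries are continuous parameters.  `run_model` is a
    hypothetical simulator returning a streamflow series."""
    structure = np.round(x[:n_structure_vars]).astype(int)
    params = x[n_structure_vars:]
    sim = run_model(structure, params)
    return -kge(sim, obs)        # minimize negative KGE
```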


2005 ◽  
Vol 10 (3) ◽  
pp. 217-236 ◽  
Author(s):  
M. Baravykaite ◽  
R. Čiegis ◽  
J. Žilinskas

In this work we consider a template for the implementation of parallel branch and bound algorithms. The main aim of this package is to ease the implementation of covering and combinatorial optimization methods for global optimization. The problem-independent parts of global optimization algorithms are implemented in the package, and only the method-specific rules need to be implemented by the user. Several parallel algorithms, based on partitioning the search domain among processors, are provided in the template. The parallelization part of the tool is described in detail. Results of computational experiments are presented and discussed.
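The template idea, where the generic search loop lives in the package and only problem-specific rules are supplied by the user, can be sketched as follows. This is an illustrative Python skeleton under that assumption, not the authors' package, and it omits the parallel distribution of the search domain among processors described above.

```python
from abc import ABC, abstractmethod

class BranchAndBoundTemplate(ABC):
    """Generic search loop; users subclass and implement the three hooks."""

    @abstractmethod
    def branch(self, node):
        """Split a subproblem into child subproblems."""

    @abstractmethod
    def bound(self, node):
        """Upper bound (for maximization) on the best solution within `node`."""

    @abstractmethod
    def evaluate(self, node):
        """Return (value, solution) if `node` is a complete solution, else None."""

    def solve(self, root):
        best_value, best_solution = float("-inf"), None
        stack = [root]                       # depth-first search order
        while stack:
            node = stack.pop()
            if self.bound(node) <= best_value:
                continue                     # prune dominated subproblems
            leaf = self.evaluate(node)
            if leaf is not None:
                value, solution = leaf
                if value > best_value:
                    best_value, best_solution = value, solution
            else:
                stack.extend(self.branch(node))
        return best_value, best_solution
```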

