Automated algorithm configuration and design

Author(s):  
Thomas Stützle ◽  
Manuel López-Ibáñez


2019 ◽  
Vol 27 (1) ◽  
pp. 173-193 ◽  
Author(s):  
Jerry Swan ◽  
Steven Adriaensen ◽  
Adam D. Barwell ◽  
Kevin Hammond ◽  
David R. White

Metaheuristics are an effective and diverse class of optimization algorithms: a means of obtaining solutions of acceptable quality for otherwise intractable problems. The selection, construction, and configuration of a metaheuristic for a given problem has historically been a manually intensive process based on experience, experimentation, and reasoning by metaphor. More recently, there has been interest in automating the process of algorithm configuration. In this article, we identify shared state as an inhibitor of progress for such automation. To solve this problem, we introduce the Automated Open-Closed Principle (AOCP), which stipulates design requirements for unintrusive reuse of algorithm frameworks and automated assembly of algorithms from an extensible palette of components. We demonstrate how the AOCP enables a greater degree of automation than previously possible via an example implementation.
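
To make the idea of assembling an algorithm from an extensible palette of components without shared state concrete, here is a minimal Python sketch. It is purely illustrative and assumes nothing about the paper's actual framework: all names (flip_one, accept_improving, assemble) are invented, and each component keeps its state local so that compositions can be enumerated mechanically.

```python
# Illustrative sketch only: component-based assembly with no shared state.
# All names (flip_one, accept_improving, assemble, ...) are invented for this
# example and do not come from the paper.
import random
from itertools import product
from typing import Callable, List, Tuple

Solution = List[int]
Perturb = Callable[[Solution, random.Random], Solution]
Accept = Callable[[float, float], bool]

def flip_one(s: Solution, rng: random.Random) -> Solution:
    """Return a copy of s with one randomly chosen bit flipped."""
    t = list(s)
    i = rng.randrange(len(t))
    t[i] ^= 1
    return t

def flip_two(s: Solution, rng: random.Random) -> Solution:
    return flip_one(flip_one(s, rng), rng)

def accept_improving(new_cost: float, old_cost: float) -> bool:
    return new_cost < old_cost

def accept_always(new_cost: float, old_cost: float) -> bool:
    return True

def assemble(perturb: Perturb, accept: Accept,
             cost: Callable[[Solution], float]):
    """Compose components into a runnable search; all state stays local."""
    def run(start: Solution, iters: int, seed: int = 0) -> Tuple[Solution, float]:
        rng = random.Random(seed)
        cur, cur_c = start, cost(start)
        best, best_c = cur, cur_c
        for _ in range(iters):
            cand = perturb(cur, rng)
            cand_c = cost(cand)
            if accept(cand_c, cur_c):
                cur, cur_c = cand, cand_c
            if cand_c < best_c:
                best, best_c = cand, cand_c
        return best, best_c
    return run

if __name__ == "__main__":
    cost = lambda s: -sum(s)  # toy objective: maximise the number of ones
    palette = {"perturb": [flip_one, flip_two],
               "accept": [accept_improving, accept_always]}
    start = [0] * 20
    # Automated assembly: enumerate the palette and report each composition.
    for p, a in product(palette["perturb"], palette["accept"]):
        _, c = assemble(p, a, cost)(start, iters=500)
        print(p.__name__, a.__name__, c)
```

Because the components communicate only through arguments and return values, adding or swapping palette entries never requires modifying the framework code, which reflects the unintrusive-reuse requirement described in the abstract.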


2009 ◽  
Vol 36 ◽  
pp. 267-306 ◽  
Author(s):  
F. Hutter ◽  
H. H. Hoos ◽  
K. Leyton-Brown ◽  
T. Stuetzle

The identification of performance-optimizing parameter settings is an important part of the development and application of algorithms. We describe an automatic framework for this algorithm configuration problem. More formally, we provide methods for optimizing a target algorithm’s performance on a given class of problem instances by varying a set of ordinal and/or categorical parameters. We review a family of local-search-based algorithm configuration procedures and present novel techniques for accelerating them by adaptively limiting the time spent for evaluating individual configurations. We describe the results of a comprehensive experimental evaluation of our methods, based on the configuration of prominent complete and incomplete algorithms for SAT. We also present what is, to our knowledge, the first published work on automatically configuring the CPLEX mixed integer programming solver. All the algorithms we considered had default parameter settings that were manually identified with considerable effort. Nevertheless, using our automated algorithm configuration procedures, we achieved substantial and consistent performance improvements.
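
One idea the abstract highlights, adaptively limiting the time spent evaluating individual configurations, can be sketched with a toy local search over a small discrete parameter space. This is not the ParamILS implementation: the target algorithm is simulated, the parameter names are invented, and capping is reduced to aborting a challenger once its accumulated runtime reaches the incumbent's.

```python
# Illustrative sketch only: first-improvement local search over a tiny discrete
# configuration space with adaptive capping. The target algorithm is simulated;
# none of this is ParamILS code.
import random

SPACE = {"restarts": [1, 10, 100], "noise": [0.1, 0.3, 0.5]}

def run_target(config, instance, cap):
    """Toy surrogate for one solver run; returns the (possibly capped) runtime."""
    true_rt = (1.0 + instance % 5) * config["noise"] * 10 / config["restarts"]
    return min(true_rt, cap)

def cost(config, instances, bound=float("inf")):
    """Total runtime over instances, aborted early once it exceeds `bound`."""
    total = 0.0
    for inst in instances:
        total += run_target(config, inst, cap=bound - total)
        if total >= bound:       # adaptive capping: challenger cannot win any more
            return float("inf")
    return total

def neighbours(config):
    for param, values in SPACE.items():
        for v in values:
            if v != config[param]:
                yield {**config, param: v}

def local_search(instances, seed=0):
    rng = random.Random(seed)
    incumbent = {p: rng.choice(v) for p, v in SPACE.items()}
    inc_cost = cost(incumbent, instances)
    improved = True
    while improved:
        improved = False
        for cand in neighbours(incumbent):
            c = cost(cand, instances, bound=inc_cost)   # cap at incumbent's cost
            if c < inc_cost:
                incumbent, inc_cost, improved = cand, c, True
                break
    return incumbent, inc_cost

if __name__ == "__main__":
    print(local_search(instances=range(10)))
```

The key detail is the `bound=inc_cost` argument: a challenger that can no longer beat the incumbent is abandoned early, so the evaluation budget is concentrated on promising configurations.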


2019 ◽  
Vol 27 (1) ◽  
pp. 3-45 ◽  
Author(s):  
Pascal Kerschke ◽  
Holger H. Hoos ◽  
Frank Neumann ◽  
Heike Trautmann

It has long been observed that for practically any computational problem that has been intensely studied, different instances are best solved using different algorithms. This is particularly pronounced for computationally hard problems, where in most cases, no single algorithm defines the state of the art; instead, there is a set of algorithms with complementary strengths. This performance complementarity can be exploited in various ways, one of which is based on the idea of selecting, from a set of given algorithms, for each problem instance to be solved the one expected to perform best. The task of automatically selecting an algorithm from a given set is known as the per-instance algorithm selection problem and has been intensely studied over the past 15 years, leading to major improvements in the state of the art in solving a growing number of discrete combinatorial problems, including propositional satisfiability and AI planning. Per-instance algorithm selection also shows much promise for boosting performance in solving continuous and mixed discrete/continuous optimisation problems. This survey provides an overview of research in automated algorithm selection, ranging from early and seminal works to recent and promising application areas. Different from earlier work, it covers applications to discrete and continuous problems, and discusses algorithm selection in context with conceptually related approaches, such as algorithm configuration, scheduling, or portfolio selection. Since informative and cheaply computable problem instance features provide the basis for effective per-instance algorithm selection systems, we also provide an overview of such features for discrete and continuous problems. Finally, we provide perspectives on future work in the area and discuss a number of open research challenges.
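
The core mechanism of per-instance algorithm selection can be sketched in a few lines: map cheap instance features to the portfolio member expected to perform best. The feature values, runtimes and solver names below are made up, and the 1-nearest-neighbour rule stands in for the learned regression, classification or ranking models used by real selectors.

```python
# Illustrative sketch only: per-instance algorithm selection via nearest
# neighbour in feature space. Features, runtimes and solver names are made up;
# real selectors use learned models rather than 1-NN.
import math

# (instance features, per-algorithm runtimes in seconds) for training instances
TRAIN = [
    ((0.9, 120.0), {"solver_A": 3.0,  "solver_B": 40.0}),
    ((0.2,  15.0), {"solver_A": 55.0, "solver_B": 2.5}),
    ((0.8,  90.0), {"solver_A": 4.5,  "solver_B": 30.0}),
]

def select(features):
    """Pick the algorithm that was fastest on the most similar training instance."""
    _, runtimes = min(TRAIN, key=lambda rec: math.dist(rec[0], features))
    return min(runtimes, key=runtimes.get)

if __name__ == "__main__":
    print(select((0.85, 100.0)))   # -> solver_A
    print(select((0.25, 20.0)))    # -> solver_B
```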


2015 ◽  
Vol 53 ◽  
pp. 745-778 ◽  
Author(s):  
Marius Lindauer ◽  
Holger H. Hoos ◽  
Frank Hutter ◽  
Torsten Schaub

Algorithm selection (AS) techniques -- which involve choosing from a set of algorithms the one expected to solve a given problem instance most efficiently -- have substantially improved the state of the art in solving many prominent AI problems, such as SAT, CSP, ASP, MAXSAT and QBF. Although several AS procedures have been introduced, not too surprisingly, none of them dominates all others across all AS scenarios. Furthermore, these procedures have parameters whose optimal values vary across AS scenarios. This holds specifically for the machine learning techniques that form the core of current AS procedures, and for their hyperparameters. Therefore, to successfully apply AS to new problems, algorithms and benchmark sets, two questions need to be answered: (i) how to select an AS approach and (ii) how to set its parameters effectively. We address both of these problems simultaneously by using automated algorithm configuration. Specifically, we demonstrate that we can automatically configure claspfolio 2, which implements a large variety of different AS approaches and their respective parameters in a single, highly-parameterized algorithm framework. Our approach, dubbed AutoFolio, allows researchers and practitioners across a broad range of applications to exploit the combined power of many different AS methods. We demonstrate that AutoFolio can significantly improve the performance of claspfolio 2 on 8 out of the 13 scenarios from the Algorithm Selection Library, leads to new state-of-the-art algorithm selectors for 7 of these scenarios, and matches state-of-the-art performance (statistically) on all other scenarios. Compared to the best single algorithm for each AS scenario, AutoFolio achieves average speedup factors between 1.3 and 15.4.
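
The underlying idea, exposing the choice of selection approach and its hyperparameters as a single configuration space and searching it automatically, might be sketched roughly as follows. The parameter space, the scoring function and the random search are placeholders invented for illustration; AutoFolio itself configures claspfolio 2 with a full algorithm configurator rather than random sampling.

```python
# Illustrative sketch only: treating the selector's own design choices as a
# configuration space. The parameters and scoring function are invented, and
# plain random search stands in for a full algorithm configurator.
import random

SELECTOR_SPACE = {
    "approach": ["pairwise_classification", "performance_regression", "knn"],
    "knn.k": [1, 3, 5, 10],          # only meaningful when approach == "knn"
    "preprocessing": ["none", "z_score"],
}

def sample(rng):
    return {p: rng.choice(values) for p, values in SELECTOR_SPACE.items()}

def cross_val_score(config, scenario):
    """Placeholder for evaluating a selector configuration on an AS scenario."""
    rng = random.Random(hash((tuple(sorted(config.items())), scenario)))
    return rng.uniform(0.5, 1.0)     # pretend score; higher is better

def configure(scenario, budget=50, seed=0):
    rng = random.Random(seed)
    best, best_score = None, -1.0
    for _ in range(budget):
        cand = sample(rng)
        score = cross_val_score(cand, scenario)
        if score > best_score:
            best, best_score = cand, score
    return best, best_score

if __name__ == "__main__":
    print(configure("SAT12-ALL"))
```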


Author(s):  
Holger H. Hoos ◽  
Frank Hutter ◽  
Kevin Leyton-Brown

This chapter provides an introduction to the automated configuration and selection of SAT algorithms and gives an overview of the most prominent approaches. Since the early 2000s, these so-called meta-algorithmic approaches have played a major role in advancing the state of the art in SAT solving, giving rise to new ways of using and evaluating SAT solvers. At the same time, SAT has proven to be particularly fertile ground for research and development in the area of automated configuration and selection, and methods developed there have meanwhile achieved impact far beyond SAT, across a broad range of computationally challenging problems. Conceptually more complex approaches that go beyond “pure” algorithm configuration and selection are also discussed, along with some open challenges related to meta-algorithmic approaches, such as automated algorithm configuration and selection, to the tools based on these approaches, and to their effective application.


10.29007/28ww ◽  
2018 ◽  
Author(s):  
Adrian Balint ◽  
Norbert Manthey

Preprocessing techniques are crucial for SAT solvers when it comes to reaching state-of-the-art performance as it was shown by the results of the last SAT Competitions. The usefulness of a preprocessing technique depends highly on its own parameters, on the instances on which it is applied and on the used solver. In this paper we first give an extended analysis of the performance gain reached by using different preprocessing techniques individually in combination with CDCL solvers on application instances and SLS solvers on crafted instances. Further, we provide an analysis of combinations of preprocessing techniques by means of automated algorithm configuration, where we search for optimal preprocessor configurations for different scenarios. Our results show that the performance of CDCL and especially of SLS solvers can be further improved when using appropriate preprocessor configurations. The solvers augmented with the best found preprocessing configurations outperform the original solvers on the instances from the SAT Challenge 2012, achieving new state-of-the-art results.
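
The combination search described above can be viewed as a configuration problem over which preprocessing techniques to enable before the solver runs. The sketch below is a toy version: the technique names and the runtime model are made up, and the exhaustive enumeration stands in for the automated configurator used in the paper, which also tunes each technique's own parameters.

```python
# Illustrative sketch only: treating which preprocessing techniques to enable
# as a configuration problem. The technique names and the runtime model are
# made up; the paper tunes real SAT preprocessors with an automated configurator.
import itertools
import random

TECHNIQUES = ["unit_propagation", "subsumption",
              "variable_elimination", "blocked_clause_elimination"]

def evaluate(pipeline, rng):
    """Placeholder for running preprocessor pipeline + solver; returns a runtime."""
    base = 10.0
    saving = 1.5 * len(pipeline)            # each technique helps a bit ...
    overhead = 0.4 * len(pipeline) ** 2     # ... but stacking them costs time
    return max(0.5, base - saving + overhead + rng.uniform(-0.5, 0.5))

def best_pipeline(seed=0):
    rng = random.Random(seed)
    best, best_rt = (), float("inf")
    # Exhaustive over subsets here; a real configurator samples a far larger
    # space that also includes each technique's own parameters.
    for r in range(len(TECHNIQUES) + 1):
        for combo in itertools.combinations(TECHNIQUES, r):
            rt = evaluate(combo, rng)
            if rt < best_rt:
                best, best_rt = combo, rt
    return best, best_rt

if __name__ == "__main__":
    print(best_pipeline())
```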


2019 ◽  
Vol 64 ◽  
pp. 861-893 ◽  
Author(s):  
Katharina Eggensperger ◽  
Marius Lindauer ◽  
Frank Hutter

Good parameter settings are crucial to achieve high performance in many areas of artificial intelligence (AI), such as propositional satisfiability solving, AI planning, scheduling, and machine learning (in particular deep learning). Automated algorithm configuration methods have recently received much attention in the AI community since they replace tedious, irreproducible and error-prone manual parameter tuning and can lead to new state-of-the-art performance. However, practical applications of algorithm configuration are prone to several (often subtle) pitfalls in the experimental design that can render the procedure ineffective. We identify several common issues and propose best practices for avoiding them. As one possibility for automatically handling as many of these as possible, we also propose a tool called GenericWrapper4AC.
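
One family of pitfalls concerns how the target algorithm is launched and measured during configuration. The sketch below shows the kind of wrapper logic involved: run the solver in a separate process, enforce the cutoff, and report an unambiguous status together with the measured runtime. It is an illustrative stand-in and does not reproduce the GenericWrapper4AC interface; a production wrapper would also enforce memory limits and measure CPU rather than wall-clock time.

```python
# Illustrative sketch only: the kind of measurement logic a target-algorithm
# wrapper needs. This is not the GenericWrapper4AC interface.
import subprocess
import time

def run_target(cmd, cutoff_s):
    """Run `cmd`, enforce a cutoff of `cutoff_s` seconds, return (status, runtime)."""
    start = time.perf_counter()
    try:
        proc = subprocess.run(cmd, capture_output=True, timeout=cutoff_s)
        runtime = time.perf_counter() - start
        status = "SUCCESS" if proc.returncode == 0 else "CRASHED"
    except subprocess.TimeoutExpired:
        runtime = cutoff_s        # report the cutoff, never a misleading runtime
        status = "TIMEOUT"
    return status, runtime

if __name__ == "__main__":
    # POSIX 'sleep' stands in for a real solver binary.
    print(run_target(["sleep", "1"], cutoff_s=5))    # ('SUCCESS', ~1.0)
    print(run_target(["sleep", "10"], cutoff_s=2))   # ('TIMEOUT', 2)
```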


Author(s):  
Marie Anastacio

The performance of state-of-the-art algorithms is highly dependent on their parameter values, and choosing the right configuration can make the difference between solving a problem in a few minutes or in hours. Automated algorithm configurators have shown their efficiency on a wide range of applications. However, they still encounter limitations when confronted with a large number of parameters to tune or long algorithm running times. We believe that there is untapped knowledge that can be gathered from the elements of the configuration problem, such as the default value in the configuration space, the source code of the algorithm, and the distribution of the problem instances at hand. We aim at utilising this knowledge to improve algorithm configurators.

