Estimating the Density of States of Boolean Satisfiability Problems on Classical and Quantum Computing Platforms

2020 ◽  
Vol 34 (02) ◽  
pp. 1627-1635 ◽  
Author(s):  
Tuhin Sahai ◽  
Anurag Mishra ◽  
Jose Miguel Pasini ◽  
Susmit Jha

Given a Boolean formula ϕ(x) in conjunctive normal form (CNF), the density of states counts the number of variable assignments that violate exactly e clauses, for all values of e. Thus, the density of states is a histogram of the number of unsatisfied clauses over all possible assignments. This computation generalizes both maximum-satisfiability (MAX-SAT) and model counting problems and not only provides insight into the entire solution space, but also yields a measure for the hardness of the problem instance. Consequently, in real-world scenarios, this problem is typically infeasible even when using state-of-the-art algorithms. While finding an exact answer to this problem is a computationally intensive task, we propose a novel approach for estimating the density of states based on concentration of measure inequalities. The methodology results in a quadratic unconstrained binary optimization (QUBO), which is particularly amenable to quantum annealing-based solutions. We present the overall approach and compare results from the D-Wave quantum annealer against the best-known classical algorithms such as the Hamze-de Freitas-Selby (HFS) algorithm and satisfiability modulo theory (SMT) solvers.
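
As a concrete illustration of the definition above, the density of states of a small CNF formula can be computed by brute-force enumeration. This is feasible only for a handful of variables; the paper's contribution is precisely to avoid this exponential enumeration via concentration-of-measure bounds and a QUBO formulation. The DIMACS-style integer literal encoding below is an assumed convention for the sketch:

```python
from itertools import product

def density_of_states(clauses, n_vars):
    """Histogram of assignments by number of violated clauses.

    Each clause is a tuple of non-zero ints (DIMACS-style): literal i
    means variable i appears positively, -i means it is negated.
    Brute force: enumerates all 2**n_vars assignments.
    """
    hist = [0] * (len(clauses) + 1)
    for bits in product([False, True], repeat=n_vars):
        violated = sum(
            not any(bits[abs(lit) - 1] == (lit > 0) for lit in clause)
            for clause in clauses
        )
        hist[violated] += 1
    return hist

# phi = (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
phi = [(1, 2), (-1, 3), (-2, -3)]
print(density_of_states(phi, 3))  # → [2, 6, 0, 0]
```

Here two of the eight assignments satisfy all clauses and the remaining six each violate exactly one clause, so the histogram simultaneously answers the model counting question (entry e = 0) and the MAX-SAT question (smallest e with a non-zero entry).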

2008 ◽  
Vol 105 (40) ◽  
pp. 15253-15257 ◽  
Author(s):  
Mikko Alava ◽  
John Ardelius ◽  
Erik Aurell ◽  
Petteri Kaski ◽  
Supriya Krishnamurthy ◽  
...  

We study the performance of stochastic local search algorithms for random instances of the K-satisfiability (K-SAT) problem. We present a stochastic local search algorithm, ChainSAT, which moves in the energy landscape of a problem instance by never going upwards in energy. ChainSAT is a focused algorithm in the sense that it focuses on variables occurring in unsatisfied clauses. We show by extensive numerical investigations that ChainSAT and other focused algorithms solve large K-SAT instances almost surely in linear time, up to high clause-to-variable ratios α; for example, for K = 4 we observe linear-time performance well beyond the recently postulated clustering and condensation transitions in the solution space. The performance of ChainSAT is surprising given that by design the algorithm gets trapped in the first local energy minimum it encounters, yet no such minima are encountered. We also study the geometry of the solution space as accessed by stochastic local search algorithms.
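
The "focused, never upwards in energy" search described above can be sketched as follows. This is a simplified illustration in that spirit, not the published ChainSAT algorithm (which additionally builds chains of pending flips to escape plateaus); the instance `phi` is a made-up toy example:

```python
import random

def unsat_clauses(clauses, assign):
    """Clauses with no true literal under the current assignment."""
    return [c for c in clauses
            if not any(assign[abs(lit) - 1] == (lit > 0) for lit in c)]

def focused_descent(clauses, n_vars, max_steps=10_000, seed=0):
    """Focused local search that never moves upwards in energy
    (energy = number of unsatisfied clauses)."""
    rng = random.Random(seed)
    assign = [rng.random() < 0.5 for _ in range(n_vars)]
    for _ in range(max_steps):
        unsat = unsat_clauses(clauses, assign)
        if not unsat:
            return assign                      # satisfying assignment found
        energy = len(unsat)
        clause = rng.choice(unsat)             # focus on an unsatisfied clause
        variables = [abs(lit) - 1 for lit in clause]
        rng.shuffle(variables)
        for v in variables:
            assign[v] = not assign[v]
            if len(unsat_clauses(clauses, assign)) <= energy:
                break                          # accept a non-increasing flip
            assign[v] = not assign[v]          # undo an increasing flip
    return None                                # step budget exhausted

phi = [(1, 2), (-1, 3), (-2, -3), (2, 3)]      # toy satisfiable instance
sol = focused_descent(phi, 3)
```

Flipping a variable from an unsatisfied clause always satisfies that clause, so downwards and sideways moves are usually available; the paper's empirical finding is that on random K-SAT such dynamics rarely meet a genuine trap.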


2021 ◽  
Vol 71 (2) ◽  
pp. 111-123
Author(s):  
Sveinung Nesheim ◽  
Kjell Arne Malo ◽  
Nathalie Labonnote

Abstract As long-spanning timber floor elements attempt to achieve a meaningful market share, proof of serviceability remains a demanding task, as international consensus is still unsettled. Initiatives to improve vibration levels are achievable, but a lack of confidence in the market is driving up margins for both manufacturers and contractors. State-of-the-art concrete alternatives are offered at less than half the price, and even though timber floors offer reduced completion costs and low carbon emissions, the market remains reserved. Cost reductions for timber floor elements to competitive levels must be pursued throughout the product details and in the stages of manufacturing. As new wood products are introduced to the market, the solution space grows to levels that demand computerized optimization models, which in turn require accurate expenditure predictions. To meet this challenge, a method called item-driven activity-based consumption (IDABC) has been developed and is presented in this study. The method establishes an accurate relationship between product specifications and the overall resource consumption linked to finished manufactured products. In addition to production time, method outcomes include cost distributions, including labor costs, and carbon emissions for both accrued materials and production-line activities. A novel approach to resource estimation linked to assembly friendliness is also presented. IDABC has been applied to a timber component and assembly line operated by a major manufacturer in Norway and demonstrates good agreement with empirical data.


2019 ◽  
Vol 7 ◽  
pp. 643-659
Author(s):  
Amichay Doitch ◽  
Ram Yazdi ◽  
Tamir Hazan ◽  
Roi Reichart

The best solution of structured prediction models in NLP is often inaccurate because of the limited expressive power of the model or non-exact parameter estimation. One way to mitigate this problem is sampling candidate solutions from the model’s solution space, reasoning that effective exploration of this space should yield high-quality solutions. Unfortunately, sampling is often computationally hard, and many works hence back off to sub-optimal strategies, such as extraction of the best scoring solutions of the model, which are not as diverse as sampled solutions. In this paper we propose a perturbation-based approach where sampling from a probabilistic model is computationally efficient. We present a learning algorithm for the variance of the perturbations, and empirically demonstrate its importance. Moreover, while finding the argmax in our model is intractable, we propose an efficient and effective approximation. We apply our framework to cross-lingual dependency parsing across 72 corpora from 42 languages and to lightly supervised dependency parsing across 13 corpora from 12 languages, and demonstrate strong results in terms of both the quality of the entire solution list and of the final solution.
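
The classical Gumbel-max trick, which underlies perturbation-based sampling models of this kind, can be shown in an unstructured toy setting: adding i.i.d. Gumbel noise to the scores and taking the argmax samples exactly from the softmax distribution. The paper's contribution concerns structured solution spaces and learned perturbation variances, which this sketch does not cover:

```python
import math
import random
from collections import Counter

def gumbel_argmax_sample(scores, rng):
    """Sample index i with probability exp(scores[i]) / sum_j exp(scores[j])
    by perturbing each score with i.i.d. Gumbel(0, 1) noise and taking
    the argmax of the perturbed scores (the Gumbel-max trick)."""
    perturbed = [s - math.log(-math.log(rng.random())) for s in scores]
    return max(range(len(scores)), key=perturbed.__getitem__)

rng = random.Random(0)
scores = [1.0, 0.0, 2.0]  # softmax probabilities ≈ 0.245, 0.090, 0.665
counts = Counter(gumbel_argmax_sample(scores, rng) for _ in range(20_000))
```

For structured models the argmax is over exponentially many solutions, which is why sampling by exact perturbation of every solution's score is intractable and low-dimensional perturbations with learned variance are used instead.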


Buildings ◽  
2020 ◽  
Vol 10 (11) ◽  
pp. 201
Author(s):  
Jani Mukkavaara ◽  
Marcus Sandberg

The use of generative design has been suggested as a novel approach that allows designers to take advantage of computers’ computational capabilities in the exploration of design alternatives. However, the field is still sparsely explored. Therefore, this study aimed to investigate the potential use of generative design in an architectural design context. A framework was iteratively developed alongside a prototype, which was eventually demonstrated in a case study to evaluate its applicability. The development of a residential block in the northern parts of Sweden served as the case. The findings of this study further highlight the potential of generative design and its promise in an architectural context. Compared to previous studies, the presented framework is open to generative algorithms other than genetic algorithms and to evaluation models other than, for instance, energy performance models. The paper also presents a general technical view of the functionality of the generative design system, and elaborates on how to explore the solution space in a top-down fashion. This paper moves the field of generative design further by presenting a generic framework for architectural design exploration. Future research needs to focus on detailing how generative design should be applied and when in the design process.


2009 ◽  
Vol 18 (05) ◽  
pp. 783-799
Author(s):  
RICHARD OSTROWSKI ◽  
LIONEL PARIS

Given a Boolean formula in conjunctive normal form (CNF), the Exact Satisfiability problem (XSAT), a variant of the Satisfiability problem (SAT), consists of finding an assignment to the variables such that each clause contains exactly one satisfied literal. The best algorithms to solve this problem run in [Formula: see text] ([Formula: see text] for X3SAT). Another possibility is to transform each clause into a set of equivalent clauses for the Satisfiability problem and to use modern and powerful solvers (zChaff, Berkmin, MiniSat, RSat, etc.) to find such a truth assignment. In this paper we introduce three new encodings from XSAT instances to SAT instances that expose a lot of structural information (equivalency gates and AND gates) which is naturally hidden in the pairwise transformation. Some solvers (lsat, march_dl, eqsatz) can take this kind of structural information into account to make simplifications as a pretreatment and speed up the resolution. We then show the interest of dealing with the XSAT formalism by introducing an encoding of binary CSPs and the graph coloring problem into XSAT instances. Preliminary results on real-world binary CSP and graph coloring problems show the importance of exhibiting equivalencies for the XSAT problem.
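
The pairwise transformation referred to above is the standard reduction: an XSAT clause holds exactly when at least one of its literals is true and no two are true simultaneously. A minimal sketch of that encoding (DIMACS-style integer literals assumed):

```python
from itertools import combinations

def xsat_to_sat_pairwise(xsat_clauses):
    """Encode XSAT as plain SAT via the pairwise transformation.

    Each XSAT clause contributes its at-least-one clause plus, for
    every pair of its literals, a binary clause forbidding both from
    being true at once.  Literals are non-zero ints (i positive,
    -i negated).
    """
    sat = []
    for clause in xsat_clauses:
        sat.append(tuple(clause))             # at least one literal true
        for a, b in combinations(clause, 2):
            sat.append((-a, -b))              # at most one literal true
    return sat

print(xsat_to_sat_pairwise([(1, 2, 3)]))
# → [(1, 2, 3), (-1, -2), (-1, -3), (-2, -3)]
```

A clause of size k becomes 1 + k(k-1)/2 SAT clauses; the binary at-most-one clauses carry no explicit gate structure, which is the "naturally hidden" information the paper's three encodings aim to expose.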


2014 ◽  
Vol 2014 ◽  
pp. 1-11 ◽  
Author(s):  
Noureddine Bouhmala

The simplicity of the maximum satisfiability problem (MAX-SAT) combined with its applicability in many areas of artificial intelligence and computing science made it one of the fundamental optimization problems. This NP-complete problem refers to the task of finding a variable assignment that satisfies the maximum number of clauses (or the sum of weights of satisfied clauses) in a Boolean formula. The Walksat algorithm is considered to be the main skeleton underlying almost all local search algorithms for MAX-SAT. Most local search algorithms, including Walksat, rely on the 1-flip neighborhood structure. This paper introduces a variable neighborhood Walksat-based algorithm. The neighborhood structure can be combined easily with any local search algorithm. Its effectiveness is compared with existing algorithms using the 1-flip neighborhood structure and solvers such as CCLS and Optimax from the eighth MAX-SAT evaluation.
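
The Walksat skeleton with the 1-flip neighborhood can be sketched as follows. This is the generic textbook procedure the abstract refers to, not the paper's variable-neighborhood extension, and the instance `phi` is a toy example:

```python
import random

def walksat(clauses, n_vars, p=0.5, max_flips=100_000, seed=0):
    """Textbook WalkSAT with the 1-flip neighborhood: pick a random
    unsatisfied clause; with probability p flip a random variable from
    it, otherwise flip the variable whose flip leaves the fewest
    unsatisfied clauses."""
    rng = random.Random(seed)
    assign = [rng.random() < 0.5 for _ in range(n_vars)]

    def is_sat(clause):
        return any(assign[abs(lit) - 1] == (lit > 0) for lit in clause)

    def n_unsat():
        return sum(not is_sat(c) for c in clauses)

    for _ in range(max_flips):
        unsat = [c for c in clauses if not is_sat(c)]
        if not unsat:
            return assign                     # all clauses satisfied
        clause = rng.choice(unsat)
        variables = [abs(lit) - 1 for lit in clause]
        if rng.random() < p:
            v = rng.choice(variables)         # noise step: random walk
        else:
            def cost(v):                      # unsat count after flipping v
                assign[v] = not assign[v]
                c = n_unsat()
                assign[v] = not assign[v]
                return c
            v = min(variables, key=cost)      # greedy step
        assign[v] = not assign[v]             # the 1-flip move
    return None

phi = [(1, 2), (-1, 3), (-2, -3), (2, 3)]     # toy satisfiable instance
sol = walksat(phi, 3)
```

A variable-neighborhood variant generalizes the single `assign[v] = not assign[v]` move to flips of larger variable subsets, which is the axis the paper explores.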


2019 ◽  
Vol 22 (64) ◽  
pp. 123-134
Author(s):  
Mohamed Amine Nemmich ◽  
Fatima Debbat ◽  
Mohamed Slimane

In this paper, we propose a novel efficient model based on the Bees Algorithm (BA) for the Resource-Constrained Project Scheduling Problem (RCPSP). The studied RCPSP is an NP-hard combinatorial optimization problem which involves resource, precedence, and temporal constraints, and it arises in many applications. The main objective is to minimize the expected makespan of the project. The proposed model, named Enhanced Discrete Bees Algorithm (EDBA), iteratively solves the RCPSP by utilizing the intelligent foraging behaviors of honey bees. A potential solution is represented by a multidimensional bee, using the activity list representation (AL). This representation uses the Serial Schedule Generation Scheme (SSGS) as a decoding procedure to construct active schedules. In addition, the conventional local search of the basic BA is replaced by a neighboring technique based on the swap operator, which takes into account the specificity of the solution space of project scheduling problems and reduces the number of parameters to be tuned. The proposed EDBA is tested on well-known benchmark instance sets from the Project Scheduling Problem Library (PSPLIB) and compared with other approaches from the literature. The promising computational results reveal the effectiveness of the proposed approach for solving RCPSP instances of various scales.
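
The swap-based neighborhood over activity lists can be illustrated in isolation: adjacent activities may be exchanged as long as precedence is preserved. This is a generic sketch, assuming the input list is already precedence-feasible; the full EDBA (bee colony search, SSGS decoding) is not reproduced here:

```python
def swap_neighbors(activity_list, preds):
    """Yield precedence-feasible neighbors of an activity list obtained
    by swapping adjacent activities.

    `preds` maps an activity to the set of its predecessors.  Given a
    feasible input list, swapping adjacent (a, b) stays feasible exactly
    when a is not a predecessor of b.
    """
    for i in range(len(activity_list) - 1):
        a, b = activity_list[i], activity_list[i + 1]
        if a not in preds.get(b, set()):       # swap keeps precedence valid
            yield activity_list[:i] + [b, a] + activity_list[i + 2:]

preds = {3: {1}}                               # activity 3 requires activity 1
print(list(swap_neighbors([1, 2, 3], preds)))  # → [[2, 1, 3], [1, 3, 2]]
```

Each neighbor would then be decoded by a schedule generation scheme and scored by its makespan; restricting moves to feasible swaps keeps the search inside the space of valid activity lists without extra repair parameters.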


Author(s):  
John Ziadat ◽  
Marius D. Ellingsen ◽  
Karim H. Muci-Küchler ◽  
Shaobo Huang ◽  
Cassandra M. Degen

Most undergraduate mechanical engineering curricula contain one or more courses that provide an introduction to the product design and development process. These courses include some topics that, without the proper motivation, may be perceived by students as being of low relevance. They also cover topics that may seem to be somewhat abstract and difficult to apply unless they are preceded by examples that clearly illustrate their practical value. The tasks of identifying customer needs and setting target specifications are typical examples of the first scenario described above. In general, engineering students have the notion that the activities of the detailed design phase are the ones that really matter and that those activities are the ones that determine the ultimate success of a product. They are so concerned with designing the physical components of the product correctly that they spend little time and effort in other steps that are necessary to make sure that they are designing the right product. The tasks of concept generation and defining the architecture of a product are good examples of the second scenario mentioned in the first paragraph. Most students quickly proceed to pick a concept that they think is viable without carefully exploring the entire solution space. In addition, when considering relatively complex products, many students do not spend enough time considering aspects such as defining the interfaces between different components. As a result, student teams end up with a collection of components that are individually well-designed but integrate poorly, and the end product suffers accordingly. Short introductory examples demonstrating the importance of tasks like the ones mentioned above were created in order to get the attention of students and spark their interest in learning about such topics.
These presentations were also created with the intent that they would motivate students to apply what they had learned when designing their own product or system. Through the examples, which corresponded to real-world product development efforts, students were exposed to not just well-designed and well-made products or systems that turned out to be successful, but also to products or systems that failed in the marketplace or experienced significant problems because the designers failed to adequately perform a task such as identifying customer requirements. The latter clearly showcased the importance of such tasks and conveyed the fact that good technical design work can be rendered moot by failing to put the required effort into the early stages of the development of a product or system. This paper presents the general criteria used and the approach followed to select and develop short introductory examples for the topics of identifying customer needs, setting target specifications, concept generation, and systems architecture. It briefly describes the examples selected and presents the results of a pilot assessment that was conducted to evaluate the effectiveness of one of those examples.


2020 ◽  
Vol 1 ◽  
pp. 937-946
Author(s):  
D. Horber ◽  
B. Schleich ◽  
S. Wartzack

Abstract Requirements act as a limitation of the solution space, representing the stakeholders’ needs and guiding the whole product development process. Forgotten requirements can therefore lead to wrong decisions when they are used as a basis for decision-making. This contribution introduces a novel approach that links the requirement and evaluation criteria models to address this problem. To set up those criteria consistently, the requirements are classified using natural language processing and derived via a ruleset based on a developed mapping between requirement classes and criteria types.


2020 ◽  
Vol 17 (6) ◽  
pp. 885-894
Author(s):  
Mohan Allam ◽  
Nandhini Malaiyappan

The performance of machine learning models mainly relies on the key features available in the training dataset. Feature selection is a significant job in pattern recognition: finding an important group of features to build classification models with a minimum number of features. Feature selection with optimization algorithms improves the prediction rate of classification models, but tuning the controlling parameters of the optimization algorithms is a challenging task. In this paper, we present a wrapper-based model called Feature Selection with Integrative Teaching Learning Based Optimization (FS-ITLBO), which uses multiple teachers to select the optimal set of features from the feature space. The goal of the proposed algorithm is to search the entire solution space without getting stuck in local optima. Moreover, the proposed method utilizes only the teacher count parameter, along with the population size and the number of iterations. Various classification models have been used for finding the fitness of instances in the population and to estimate the effectiveness of the proposed model. The robustness of the proposed algorithm has been assessed on the Wisconsin Diagnostic Breast Cancer (WDBC) and Parkinson’s Disease datasets and compared with different wrapper-based feature selection techniques, including a genetic algorithm and Binary Teaching Learning Based Optimization (BTLBO). The outcomes confirm that the FS-ITLBO model produced the best accuracy with the optimal subset of features.
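
A single teacher phase of a binary TLBO-style update can be sketched as follows. This illustrates the generic mechanism only, not the multi-teacher FS-ITLBO variant, and the sum-based fitness below is a stand-in for a real wrapper fitness (classifier accuracy on the selected features):

```python
import random

def teacher_phase(population, fitness, rng):
    """One teacher phase of a binary TLBO-style update.

    Each learner (a 0/1 feature mask) moves toward the best solution
    (the teacher) relative to the population mean, is thresholded back
    to a binary mask, and is kept only if its fitness does not drop.
    """
    n = len(population[0])
    teacher = max(population, key=fitness)
    mean = [sum(x[j] for x in population) / len(population) for j in range(n)]
    new_population = []
    for x in population:
        tf = rng.choice((1, 2))               # teaching factor
        moved = [x[j] + rng.random() * (teacher[j] - tf * mean[j])
                 for j in range(n)]
        candidate = [1 if v > 0.5 else 0 for v in moved]
        new_population.append(candidate if fitness(candidate) >= fitness(x)
                              else x)         # greedy acceptance
    return new_population

def fitness(mask):                            # toy stand-in fitness
    return sum(mask)

rng = random.Random(0)
pop = [[0, 0, 1], [1, 0, 0], [1, 1, 0]]
new_pop = teacher_phase(pop, fitness, rng)
```

The greedy acceptance step guarantees the population's fitness never decreases across phases; a wrapper method plugs a trained classifier's validation accuracy in as `fitness`.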

