Chapter 15. Proofs of Unsatisfiability

Author(s):  
Marijn J.H. Heule

Satisfiability (SAT) solvers have become complex tools, which raises the question of whether we can trust their results. This question is particularly important when the solvers are used to determine the correctness of hardware and software and when they are used to produce mathematical results. To deal with this issue, solvers can provide proofs of unsatisfiability to certify the correctness of their answers. This chapter presents the history and state of the art of producing and validating proofs of unsatisfiability. The chapter covers the most popular proof formats with and without hints to speed up certification. Hints in proofs make validation easy, which has resulted in several efficient, formally verified checkers. Various proof systems are discussed, ranging from resolution to the recent propagation redundancy system. The chapter also describes techniques to compress and optimize proofs.
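
As a minimal illustration of how such certification works, the sketch below (in Python, with helper names of our choosing) checks a clause by reverse unit propagation (RUP), the core inference behind DRAT-style proof checking: a clause is accepted if unit propagation on the formula together with the clause's negated literals yields a conflict.

```python
# Minimal sketch of reverse unit propagation (RUP), the core check in
# DRAT-style proof validation. Clauses are lists of nonzero ints (DIMACS style).
# Helper names here are illustrative, not taken from any particular checker.

def unit_propagate(clauses, assignment):
    """Propagate unit clauses; return None on conflict, else the extended assignment."""
    assignment = set(assignment)
    changed = True
    while changed:
        changed = False
        for clause in clauses:
            unassigned = [lit for lit in clause if -lit not in assignment]
            if not unassigned:
                return None  # every literal falsified: conflict
            if len(unassigned) == 1 and unassigned[0] not in assignment:
                assignment.add(unassigned[0])  # unit clause forces this literal
                changed = True
    return assignment

def has_rup(clauses, clause):
    """A clause is a RUP consequence if asserting its negation yields a conflict."""
    return unit_propagate(clauses, {-lit for lit in clause}) is None

# Example: from (x1 or x2) and (-x1 or x2), the clause (x2) follows by RUP.
formula = [[1, 2], [-1, 2]]
assert has_rup(formula, [2])
```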

Author(s):  
Fahiem Bacchus ◽  
Antti Hyttinen ◽  
Matti Järvisalo ◽  
Paul Saikko

Maximum satisfiability (MaxSAT) offers a competitive approach to solving NP-hard real-world optimization problems. While state-of-the-art MaxSAT solvers rely heavily on Boolean satisfiability (SAT) solvers, a recent trend, brought on by MaxSAT solvers implementing the so-called implicit hitting set (IHS) approach, is to integrate techniques from the realm of integer programming (IP) into the solving process. This allows for making use of additional IP solving techniques to further speed up MaxSAT solving. In this line of work, we investigate the integration of the technique of reduced cost fixing from the IP realm into IHS solvers, and empirically show that reduced cost fixing considerably speeds up a state-of-the-art MaxSAT solver implementing the IHS approach.
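
A sketch of the underlying idea, not the authors' implementation: after solving the LP relaxation of the hitting-set IP, any variable at zero whose reduced cost exceeds the gap between the incumbent and the LP bound can be fixed to zero. The names below (reduced_costs, lp_bound, incumbent) are illustrative.

```python
# A minimal sketch of reduced cost fixing for a minimization IP, assuming we
# already solved the LP relaxation and know an incumbent upper bound.

def reduced_cost_fixing(reduced_costs, lp_bound, incumbent):
    """Return variables that can be fixed to 0.

    If a variable is at 0 in the LP optimum with reduced cost r_j, forcing it
    to 1 raises the LP bound to at least lp_bound + r_j; if that exceeds the
    incumbent, no improving solution sets the variable to 1.
    """
    gap = incumbent - lp_bound
    return {j for j, r in reduced_costs.items() if r > gap}

# Example: LP bound 10, incumbent 13 -> any variable at zero with reduced
# cost above 3 can never take value 1 in an improving solution.
fixed = reduced_cost_fixing({"b1": 5.0, "b2": 2.5}, lp_bound=10.0, incumbent=13.0)
print(fixed)  # {'b1'}
```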


2021 ◽  
Vol 297 ◽  
pp. 126645
Author(s):  
Gajanan Sampatrao Ghodake ◽  
Surendra Krushna Shinde ◽  
Avinash Ashok Kadam ◽  
Rijuta Ganesh Saratale ◽  
Ganesh Dattatraya Saratale ◽  
...  

2020 ◽  
Vol 14 (4) ◽  
pp. 653-667
Author(s):  
Laxman Dhulipala ◽  
Changwan Hong ◽  
Julian Shun

Connected components is a fundamental kernel in graph applications. The fastest existing multicore algorithms for solving graph connectivity are based on some form of edge sampling and/or linking and compressing trees. However, many combinations of these design choices have been left unexplored. In this paper, we design the ConnectIt framework, which provides different sampling strategies as well as various tree linking and compression schemes. ConnectIt enables us to obtain several hundred new variants of connectivity algorithms, most of which extend to computing spanning forest. In addition to static graphs, we also extend ConnectIt to support mixes of insertions and connectivity queries in the concurrent setting. We present an experimental evaluation of ConnectIt on a 72-core machine, which we believe is the most comprehensive evaluation of parallel connectivity algorithms to date. Compared to a collection of state-of-the-art static multicore algorithms, we obtain an average speedup of 12.4x (2.36x average speedup over the fastest existing implementation for each graph). Using ConnectIt, we are able to compute connectivity on the largest publicly available graph (with over 3.5 billion vertices and 128 billion edges) in under 10 seconds using a 72-core machine, providing a 3.1x speedup over the fastest existing connectivity result for this graph, in any computational setting. For our incremental algorithms, we show that they can ingest graph updates at up to several billion edges per second. To guide the user in selecting the best variants in ConnectIt for different situations, we provide a detailed analysis of the different strategies. Finally, we show how the techniques in ConnectIt can be used to speed up two important graph applications: approximate minimum spanning forest and SCAN clustering.
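
To make the design space concrete, here is a small sequential sketch (ours; ConnectIt itself is a parallel C++ framework) of one point in that space: union-by-index linking combined with path-halving compression, preceded by a toy sampling phase.

```python
# Sequential sketch of the find/union core that connectivity variants build on.
# This shows one combination of design choices: "union by index" linking with
# path-halving compression, plus a toy version of an edge-sampling phase.

import random

def find(parent, u):
    """Find the root of u, halving the path as we go (one compression scheme)."""
    while parent[u] != u:
        parent[u] = parent[parent[u]]  # shortcut: point u at its grandparent
        u = parent[u]
    return u

def union(parent, u, v):
    """Link the larger root id under the smaller one (one linking scheme)."""
    ru, rv = find(parent, u), find(parent, v)
    if ru != rv:
        if ru < rv:
            parent[rv] = ru
        else:
            parent[ru] = rv

def connected_components(n, edges, sample_size=0):
    parent = list(range(n))
    # Optional sampling phase: process a random subset of edges first so the
    # bulk of later unions hit an already-large component cheaply.
    for u, v in random.sample(edges, min(sample_size, len(edges))):
        union(parent, u, v)
    for u, v in edges:
        union(parent, u, v)
    return [find(parent, u) for u in range(n)]

print(connected_components(5, [(0, 1), (1, 2), (3, 4)]))  # [0, 0, 0, 3, 3]
```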


Sensors ◽  
2021 ◽  
Vol 21 (19) ◽  
pp. 6523
Author(s):  
Pieter Van Molle ◽  
Cedric De Boom ◽  
Tim Verbelen ◽  
Bert Vankeirsbilck ◽  
Jonas De Vylder ◽  
...  

Deep neural networks have achieved state-of-the-art performance in image classification. Due to this success, deep learning is now also being applied to other data modalities such as multispectral images, lidar and radar data. However, successfully training a deep neural network requires a large dataset. Therefore, transitioning to a new sensor modality (e.g., from regular camera images to multispectral camera images) might result in a drop in performance, due to the limited availability of data in the new modality. This might hinder the adoption rate and time to market for new sensor technologies. In this paper, we present an approach to leverage the knowledge of a teacher network that was trained using the original data modality to improve the performance of a student network on a new data modality: a technique known in the literature as knowledge distillation. By applying knowledge distillation to the problem of sensor transition, we can greatly speed up this process. We validate this approach using a multimodal version of the MNIST dataset. Especially when little data is available in the new modality (i.e., 10 images), training with additional teacher supervision results in increased performance, with the student network scoring a test set accuracy of 0.77, compared to an accuracy of 0.37 for the baseline. We also explore two extensions to the default method of knowledge distillation, which we evaluate on a multimodal version of the CIFAR-10 dataset: an annealing scheme for the hyperparameter α and selective knowledge distillation. Of these two, the first yields the best results. Choosing the optimal annealing scheme results in an increase in test set accuracy of 6%. Finally, we apply our method to the real-world use case of skin lesion classification.
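
The following sketch shows the standard distillation loss with temperature T and mixing weight alpha, together with one possible linear annealing schedule; the paper's exact schedule and architectures are not reproduced here.

```python
# Minimal PyTorch-style sketch of knowledge distillation with an annealed
# alpha, as a generic illustration of the technique.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Blend hard-label cross-entropy with soft-target KL at temperature T."""
    hard = F.cross_entropy(student_logits, labels)
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # standard T^2 scaling keeps gradient magnitudes comparable
    return alpha * soft + (1.0 - alpha) * hard

def annealed_alpha(epoch, num_epochs, start=1.0, end=0.0):
    """Linearly fade teacher supervision out over training (one possible scheme)."""
    return start + (end - start) * epoch / max(1, num_epochs - 1)

# Example with random tensors standing in for a batch of 8 over 10 classes.
s = torch.randn(8, 10)
t = torch.randn(8, 10)
y = torch.randint(0, 10, (8,))
print(distillation_loss(s, t, y, alpha=annealed_alpha(0, 30)))
```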


10.29007/hvqt ◽  
2018 ◽  
Author(s):  
Gilles Audemard ◽  
Benoît Hoessen ◽  
Saïd Jabbour ◽  
Cédric Piette

Over the years, parallel SAT solving has become more and more important. However, most state-of-the-art parallel SAT solvers are portfolio-based: they run the same solver several times with different parameters. In this paper, we propose a tool called Dolius, based mainly on the divide-and-conquer paradigm. In contrast to most current efficient parallel engines, Dolius does not need shared memory, can be distributed, and scales well when a large number of computing units is available. Furthermore, our tool provides an API that allows any SAT solver to be plugged in in a simple way.
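
A minimal sketch of the divide-and-conquer scheme (ours, not Dolius's actual API): the search space is partitioned into "guiding paths" over a few splitting variables, and each subproblem is solved independently, so SAT on any path settles the instance. The brute-force worker below stands in for whatever real solver is plugged in.

```python
# Divide-and-conquer SAT sketch: split on k variables, solve 2^k subspaces
# in parallel. Clauses are lists of nonzero ints (DIMACS style).

from itertools import product
from multiprocessing import Pool

def guiding_paths(split_vars):
    """All 2^k assignments to the splitting variables, as tuples of literals."""
    return [tuple(v if bit else -v for v, bit in zip(split_vars, bits))
            for bits in product([False, True], repeat=len(split_vars))]

def brute_force_sat(args):
    """Worker: exhaustively check the subspace fixed by the cube."""
    clauses, variables, cube = args
    fixed = {abs(l): (l > 0) for l in cube}
    free = [v for v in variables if v not in fixed]
    for bits in product([False, True], repeat=len(free)):
        model = {**fixed, **dict(zip(free, bits))}
        if all(any(model[abs(l)] == (l > 0) for l in c) for c in clauses):
            return True
    return False

def solve_divide_and_conquer(clauses, variables, split_vars, workers=4):
    jobs = [(clauses, variables, cube) for cube in guiding_paths(split_vars)]
    with Pool(workers) as pool:
        # Any satisfiable subspace means SAT; all refuted means UNSAT.
        return any(pool.imap_unordered(brute_force_sat, jobs))

if __name__ == "__main__":
    clauses = [[1, 2], [-1, 2], [-2, 3]]   # satisfiable, e.g. x2 = x3 = true
    print(solve_divide_and_conquer(clauses, [1, 2, 3], split_vars=[1]))
```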


10.29007/tc7q ◽  
2018 ◽  
Author(s):  
Adrián Rebola-Pardo ◽  
Martin Suda

We study the semantics of propositional interference-based proof systems such as DRAT and DPR. These are characterized by modifying a CNF formula in ways that preserve satisfiability but not necessarily logical truth. We propose an extension of propositional logic called overwrite logic, with a new construct that captures the meta-level reasoning behind interferences. We analyze this new logic from the point of view of expressivity and complexity, showing that while greater expressivity is achieved, the satisfiability problem for overwrite logic is essentially as hard as SAT, and can be reduced in a way that is well-behaved for modern SAT solvers. We also show that DRAT and DPR proofs can be seen as overwrite logic proofs which preserve logical truth. This invariant is much stronger than the mere satisfiability preservation maintained by the traditional view, and it gives us a better understanding of these practically important proof systems. Finally, we showcase this better understanding by finding intrinsic limitations in interference-based proof systems.
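
As a concrete illustration of interference (our example, not taken from the paper), consider a clause addition that preserves satisfiability but not the set of models: the clause ¬x below is addable in the PR system (with witness ω = {¬x, y}), yet it is not a logical consequence of F.

```latex
F = (x \lor y), \qquad
F \land \lnot x \ \text{is satisfiable}\ (x = 0,\ y = 1), \qquad
F \not\models \lnot x \ \text{since}\ (x = 1,\ y = 0)\ \text{satisfies}\ F.
```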


Author(s):  
Adnan Darwiche ◽  
Knot Pipatsrisawat

Complete SAT algorithms form an important part of the SAT literature. From a theoretical perspective, complete algorithms can be used as tools for studying the complexities of different proof systems. From a practical point of view, these algorithms form the basis for tackling SAT problems arising from real-world applications. The practicality of modern, complete SAT solvers undoubtedly contributes to the growing interest in the class of complete SAT algorithms. We review these algorithms in this chapter, including Davis-Putnam resolution, Stålmarck’s algorithm, symbolic SAT solving, the DPLL algorithm, and modern clause-learning SAT solvers. We also discuss the issue of certifying the answers of modern complete SAT solvers.
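
As a reference point for the algorithms reviewed here, a compact DPLL sketch (ours): unit propagation plus splitting, without the watched literals, clause learning, or heuristics of modern solvers.

```python
# A compact DPLL sketch (unit propagation + splitting), illustrating the
# classical core that clause-learning solvers extend. Clauses are lists of
# nonzero ints (DIMACS style).

def dpll(clauses):
    clauses = [list(c) for c in clauses]
    # Unit propagation: repeatedly satisfy forced literals.
    while True:
        units = [c[0] for c in clauses if len(c) == 1]
        if not units:
            break
        lit = units[0]
        new = []
        for c in clauses:
            if lit in c:
                continue            # clause satisfied, drop it
            reduced = [l for l in c if l != -lit]
            if not reduced:
                return False        # clause falsified: conflict
            new.append(reduced)
        clauses = new
    if not clauses:
        return True                 # all clauses satisfied
    # Split on the first literal of the first clause.
    lit = clauses[0][0]
    return dpll(clauses + [[lit]]) or dpll(clauses + [[-lit]])

print(dpll([[1, 2], [-1, 2], [-2]]))   # False: -2 is forced, then conflict
print(dpll([[1, 2], [-1, 2]]))         # True
```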


Author(s):  
Karem A. Sakallah

Symmetry is at once a familiar concept (we recognize it when we see it!) and a profoundly deep mathematical subject. At its most basic, a symmetry is some transformation of an object that leaves the object (or some aspect of the object) unchanged. For example, a square can be transformed in eight different ways that leave it looking exactly the same: the identity “do-nothing” transformation, 3 rotations, and 4 mirror images (or reflections). In the context of decision problems, the presence of symmetries in a problem’s search space can frustrate the hunt for a solution by forcing a search algorithm to fruitlessly explore symmetric subspaces that do not contain solutions. Recognizing that such symmetries exist, we can direct a search algorithm to look for solutions only in non-symmetric parts of the search space. In many cases, this can lead to significant pruning of the search space and yield solutions to problems which are otherwise intractable. This chapter explores the symmetries of Boolean functions, particularly the symmetries of their conjunctive normal form (CNF) representations. Specifically, it examines what those symmetries are, how to model them using the mathematical language of group theory, how to derive them from a CNF formula, and how to utilize them to speed up CNF SAT solvers.
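
The sketch below makes the definition concrete: a symmetry of a CNF formula is a permutation of literals that maps the clause set onto itself. Real solvers detect such permutations with graph-automorphism tools; here we only verify a given one and note the kind of symmetry-breaking clause it licenses. The formula and helper names are our own.

```python
# Verifying a symmetry of a CNF formula: a literal permutation is a symmetry
# iff it maps the clause set onto itself (as a set of sets). Real tools find
# these automatically via graph automorphism; this sketch only checks one.

def apply_perm(perm, clauses):
    """Apply a literal permutation (given on positive literals) to every clause."""
    def map_lit(l):
        return perm[l] if l in perm else -perm[-l]
    return {frozenset(map_lit(l) for l in c) for c in clauses}

def is_symmetry(perm, clauses):
    """True if the permuted clause set equals the original."""
    return apply_perm(perm, clauses) == {frozenset(c) for c in clauses}

# F = (x1 or x2) and (-x1 or x3) and (-x2 or x3) is invariant under swapping
# x1 and x2, so a solver may add a symmetry-breaking clause such as
# (-x1 or x2), confining the search to one representative per symmetric pair.
F = [[1, 2], [-1, 3], [-2, 3]]
swap = {1: 2, 2: 1, 3: 3}
print(is_symmetry(swap, F))  # True
```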


2015 ◽  
Vol 7 (2) ◽  
pp. 113
Author(s):  
Markus Petri ◽  
Marcus Ehrig ◽  
Markus Günther

To deal with the enormous increase of mobile data traffic, new cellular network topologies are necessary. The small cell approach, which reduces cell area and uses lightweight base stations serving only a handful of users, seems to be a suitable solution addressing changes in user expectations and usage scenarios. This paper is an extended version of [1], where current challenges of small cell deployments were presented from a backhaul perspective. A mesh-type backhaul network topology based on beam-steering millimeter-wave systems was proposed as a future-proof solution. In this paper, we focus on a link initialization protocol for beam steering with highly directive antennas. Special requirements and problems for link setup are analyzed. Based on this analysis, a fast protocol for link initialization is presented and evaluated in terms of the resulting initialization speed-up compared to state-of-the-art solutions. Furthermore, a potential approach for extending the fast link initialization protocol to support point-to-multipoint connections is given.


2019 ◽  
Vol 3 (3) ◽  
pp. 62 ◽  
Author(s):  
Maaike H. T. de Boer ◽  
Babette J. Bakker ◽  
Erik Boertjes ◽  
Mike Wilmer ◽  
Stephan Raaijmakers ◽  
...  

The number of cyberattacks on organizations is growing. To increase cyber resilience, organizations need to obtain foresight to anticipate cybersecurity vulnerabilities, developments, and potential threats. This paper describes a tool that combines state-of-the-art text mining and information retrieval techniques to explore the opportunities of using these techniques in the cybersecurity domain. Our tool, the Horizon Scanner, can scrape and store data from websites, blogs, and PDF articles; search the database based on a user query; show textual entities in a graph; and detect and visualize potential trends. The aim of the Horizon Scanner is to help experts explore relevant data sources for potential threats and trends and to speed up the process of foresight. In a requirements session and user evaluation of the tool with cyber experts from the Dutch Defense Cyber Command, we explored whether the Horizon Scanner tool has the potential to fulfill its aim in the cybersecurity domain. Although the overall evaluation of the tool was not as good as expected, some aspects of the tool were found to have added value, providing us with valuable insights into how to design decision support for forecasting analysts.
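
As a toy illustration of the scrape-and-trend pipeline described above (the Horizon Scanner's internals are not shown here, so the library choice and the naive keyword counting are our own stand-ins):

```python
# Toy sketch of a scrape-and-trend pipeline: fetch pages, strip them to plain
# text, then count keyword mentions per time batch as a crude trend signal.

import re
from collections import Counter

import requests

def scrape(url):
    """Fetch a page and strip tags to crude plain text."""
    html = requests.get(url, timeout=10).text
    return re.sub(r"<[^>]+>", " ", html)

def keyword_trend(documents, keywords):
    """Count keyword mentions per document batch; rising counts hint at a trend."""
    trend = []
    for batch in documents:                       # e.g., one batch per week
        counts = Counter()
        for text in batch:
            low = text.lower()
            for kw in keywords:
                counts[kw] += low.count(kw)
        trend.append(counts)
    return trend

# Example with inline documents standing in for scraped articles.
weeks = [["A new ransomware strain appeared."],
         ["Ransomware hits again.", "Another ransomware campaign; also phishing."]]
print(keyword_trend(weeks, ["ransomware", "phishing"]))
```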

