Interpolating bit-vector formulas using uninterpreted predicates and Presburger arithmetic

Author(s):  
Peter Backeman ◽  
Philipp Rümmer ◽  
Aleksandar Zeljić

The inference of program invariants over machine arithmetic, commonly called bit-vector arithmetic, is an important problem in verification. Techniques that have been successful for unbounded arithmetic, in particular Craig interpolation, have turned out to be difficult to generalise to machine arithmetic: existing bit-vector interpolation approaches are based either on eager translation from bit-vectors to unbounded arithmetic, resulting in complicated constraints that are hard to solve and interpolate, or on bit-blasting to propositional logic, in the process losing all arithmetic structure. We present a new approach to bit-vector interpolation, as well as bit-vector quantifier elimination (QE), that works by lazy translation of bit-vector constraints to unbounded arithmetic. Laziness enables us to fully utilise the information available during proof search (implied by decisions and propagation) in the encoding, thereby producing constraints that can be handled relatively easily by existing interpolation and QE procedures for Presburger arithmetic. The lazy encoding is complemented with a set of native proof rules for bit-vector equations and non-linear (polynomial) constraints, thereby minimising the number of cases a solver has to consider. We also incorporate a method for handling concatenations and extractions of bit-vectors efficiently.
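To make the lazy-translation idea concrete, the following is a minimal sketch, using the z3-solver Python package rather than the authors' tool, of how a single 8-bit addition can be rendered in Presburger arithmetic with an explicit wrap-around witness. The variable names and the up-front domain bounds are illustrative assumptions; in the paper's calculus such constraints are introduced lazily, driven by decisions and propagation during proof search.

```python
# Sketch: one bit-vector constraint as Presburger (linear integer) arithmetic.
from z3 import Int, Or, Solver

BITS = 8
MOD = 1 << BITS

x, y, z, ov = Int('x'), Int('y'), Int('z'), Int('ov')

s = Solver()
for v in (x, y, z):                 # bit-vector domains as integer bounds
    s.add(0 <= v, v < MOD)
s.add(Or(ov == 0, ov == 1))         # at most one wrap-around is possible
s.add(x + y == z + ov * MOD)        # 8-bit "x + y = z" with overflow witness

s.add(ov == 1)                      # ask for a wrapping instance
print(s.check())                    # sat
print(s.model())                    # e.g. x = 255, y = 1, z = 0
```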

2022 ◽  
Vol 44 (1) ◽  
pp. 1-50
Author(s):  
Omar Inverso ◽  
Ermenegildo Tomasco ◽  
Bernd Fischer ◽  
Salvatore La Torre ◽  
Gennaro Parlato

Bounded verification techniques such as bounded model checking (BMC) have successfully been used for many practical program analysis problems, but concurrency still poses a challenge. Here, we describe a new approach to BMC of sequentially consistent imperative programs that use POSIX threads. We first translate the multi-threaded program into a nondeterministic sequential program that preserves reachability for all round-robin schedules with a given bound on the number of rounds. We then reuse existing high-performance BMC tools as backends for the sequential verification problem. Our translation is carefully designed to introduce very small memory overheads and very few sources of nondeterminism, so it produces tight SAT/SMT formulae and is thus very effective in practice: our Lazy-CSeq tool implementing this translation for the C programming language won several gold and silver medals in the concurrency category of the Competition on Software Verification (SV-COMP) 2014–2021 and was able to find errors in programs where all other techniques (including testing) failed. In this article, we give a detailed description of our translation and prove its correctness, sketch its implementation using the CSeq framework, and report on a detailed evaluation and comparison of our approach.
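The following toy model, in ordinary Python rather than the C-to-C translation Lazy-CSeq performs, illustrates the underlying idea of bounded round-robin sequentialization: the sequential program guesses one context-switch point per thread per round, and reachability is preserved up to the round bound. Here the nondeterministic guesses are simulated by brute-force enumeration; the thread programs and state layout are illustrative assumptions.

```python
# Schematic bounded round-robin sequentialization (not the actual translation).
from itertools import product

def run_round_robin(threads, rounds, init):
    """Enumerate all round-robin schedules with the given round bound and
    report whether any reachable state sets the assert_failed flag."""
    n = len(threads)
    lens = [len(t) for t in threads]
    for cuts in product(*[range(l + 1) for l in lens * rounds]):
        # cuts[r*n + t]: index up to which thread t has run by end of round r.
        per_round = [cuts[r * n:(r + 1) * n] for r in range(rounds)]
        if any(per_round[r][t] > per_round[r + 1][t]
               for r in range(rounds - 1) for t in range(n)):
            continue  # program counters never move backwards
        state = dict(init)
        pc = [0] * n
        for r in range(rounds):
            for t in range(n):
                while pc[t] < per_round[r][t]:
                    threads[t][pc[t]](state)
                    pc[t] += 1
        if state.get('assert_failed'):
            return True
    return False

# Two threads racing on a shared counter, each with a thread-local temporary.
def read0(s):  s['t0'] = s['c']
def write0(s): s['c'] = s['t0'] + 1
def read1(s):  s['t1'] = s['c']
def write1(s): s['c'] = s['t1'] + 1
def check(s):
    if s['c'] != 2:
        s['assert_failed'] = True

t0 = [read0, write0]
t1 = [read1, write1, check]
# With 2 rounds, a schedule violating the final assertion is reachable.
print(run_round_robin([t0, t1], rounds=2, init={'c': 0}))  # True
```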


Author(s):  
Radu-Dinel Miruta ◽  
Cosmin Stanuica ◽  
Eugen Borcoci

Content-aware (CA) packet classification and processing at the network level is a new approach that can significantly increase the delivery quality of multimedia traffic on the Internet. This paper presents a solution for a new multi-dimensional packet classifier of an edge router, based on new content-related fields embedded in the data packets. The technique is applicable to content-aware networks. The classification algorithm uses three new packet fields, named Virtual Content Aware Network (VCAN), Service Type (STYPE), and U (unicast/multicast), which are part of the Content Awareness Transport Information (CATI) header. A CATI header is inserted into the transmitted data packets at the Service/Content Provider server side, in accordance with the media service definition, and enables the content awareness features at a new overlay Content Aware Network layer. The functionality of the CATI header within the classification process is then analyzed. Two possibilities are considered: adapting the Lucent bit-vector algorithm and, respectively, the tuple space search to the proposed multi-field classifier. The results are very promising: they show that the theoretical model of inserting new packet fields for content-aware classification can be implemented and can work in a real-time classifier.
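The Lucent bit-vector scheme the authors adapt precomputes, for each field value, a bitmap of the rules matching that field; classifying a packet then reduces to AND-ing one bitmap per field and taking the lowest set bit, i.e. the highest-priority matching rule. The sketch below illustrates this on three toy rules over the VCAN/STYPE/U fields; the rule set and the wildcard handling are illustrative assumptions, not the paper's implementation.

```python
# Toy Lucent bit-vector classifier: ints serve as rule bitmaps.
rules = [
    # (VCAN, STYPE, U)  -- '*' is a wildcard; lower index = higher priority
    ('vcan1', 'video', '*'),   # rule 0
    ('vcan1', '*',     'u'),   # rule 1
    ('*',     'audio', 'm'),   # rule 2
]

def build_bitmaps(rules, field):
    """Map each concrete value of `field` to a bitmap of matching rules."""
    bitmaps = {}
    for v in {r[field] for r in rules if r[field] != '*'}:
        bm = 0
        for i, r in enumerate(rules):
            if r[field] in (v, '*'):
                bm |= 1 << i
        bitmaps[v] = bm
    # Values seen in no rule match only the wildcard rules.
    bitmaps[None] = sum(1 << i for i, r in enumerate(rules) if r[field] == '*')
    return bitmaps

tables = [build_bitmaps(rules, f) for f in range(3)]

def classify(packet):
    bm = ~0                                   # all rules are candidates
    for f, table in enumerate(tables):
        bm &= table.get(packet[f], table[None])
    if bm == 0:
        return None
    return (bm & -bm).bit_length() - 1        # lowest set bit = best rule

print(classify(('vcan1', 'video', 'm')))      # 0
print(classify(('vcan2', 'audio', 'm')))      # 2
```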


Author(s):  
Camillo Fiorentini

We present an efficient proof search procedure for Intuitionistic Propositional Logic which involves the use of an incremental SAT-solver. Basically, it is obtained by adding a restart operation to the system by Claessen and Rosén; thus we call our implementation intuitR. We gain some remarkable advantages: derivations have a simple structure; countermodels are in general small; and, on a standard benchmark suite, we outperform intuit and other state-of-the-art provers.
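The key ingredient, an incremental SAT solver extended with a restart operation, can be pictured in isolation as follows. This sketch (using the python-sat package) shows only that mechanism: clauses learned by the outer proof search survive a restart and are replayed into a fresh solver, which is what keeps restarts cheap. It is not Fiorentini's procedure itself, and the example clauses are arbitrary.

```python
# Minimal restartable incremental SAT wrapper (python-sat package).
from pysat.solvers import Glucose3

class RestartableSolver:
    def __init__(self, base_clauses):
        self.base = [list(c) for c in base_clauses]
        self.learned = []
        self._boot()

    def _boot(self):
        self.solver = Glucose3()
        for c in self.base + self.learned:
            self.solver.add_clause(c)

    def learn(self, clause):
        # Clauses derived by the outer search persist across restarts.
        self.learned.append(list(clause))
        self.solver.add_clause(clause)

    def restart(self):
        # Discard the solver's internal state, keep everything learned.
        self.solver.delete()
        self._boot()

    def solve(self, assumptions=()):
        return self.solver.solve(assumptions=list(assumptions))

s = RestartableSolver([[1, 2], [-1, 3]])
print(s.solve(assumptions=[-3]))   # True: satisfiable with 1 false, 2 true
s.learn([-2])                      # suppose the outer proof search derived -2
s.restart()                        # fresh solver, learned clause replayed
print(s.solve(assumptions=[-3]))   # False: -3 forces -1, but [1, 2] needs 2
```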


10.29007/33k5 ◽  
2018 ◽  
Author(s):  
Conor McBride

Dyckhoff's algorithm for contraction-free proof search in intuitionistic propositional logic (popularized by Augustsson as the type-directed program synthesis tool, Djinn) is a simple program with a rather tricky termination proof. In this talk, I describe my efforts to reduce this program to a steady structural descent. On the way, I shall present an attempt at a compositional approach to explaining termination, via a uniform presentation of memoization.
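For readers who have not seen it, proof search in Dyckhoff's contraction-free calculus (the algorithm behind Djinn) fits in a few dozen lines. The sketch below is in Python rather than Djinn's Haskell, and the tuple encoding of formulas is an illustrative choice; the invertible left rules each decrease the measure underlying the tricky termination proof the talk refers to.

```python
# Sketch of Dyckhoff-style contraction-free proof search.  Formulas are
# tuples: ('atom', p), ('bot',), ('and', A, B), ('or', A, B), ('imp', A, B).

def prove(gamma, goal):
    """Decide the sequent gamma |- goal in intuitionistic propositional logic."""
    # Invertible right rules.
    if goal[0] == 'and':
        return prove(gamma, goal[1]) and prove(gamma, goal[2])
    if goal[0] == 'imp':
        return prove(gamma + [goal[1]], goal[2])
    # Invertible left rules; each strictly shrinks Dyckhoff's measure.
    for i, f in enumerate(gamma):
        rest = gamma[:i] + gamma[i + 1:]
        if f[0] == 'bot':
            return True
        if f[0] == 'and':
            return prove(rest + [f[1], f[2]], goal)
        if f[0] == 'or':
            return prove(rest + [f[1]], goal) and prove(rest + [f[2]], goal)
        if f[0] == 'imp':
            a, b = f[1], f[2]
            if a[0] == 'atom' and a in rest:        # a, a->b  ==>  a, b
                return prove(rest + [b], goal)
            if a[0] == 'bot':                       # bot->b is vacuous
                return prove(rest, goal)
            if a[0] == 'and':                       # (c&d)->b ==> c->(d->b)
                return prove(rest + [('imp', a[1], ('imp', a[2], b))], goal)
            if a[0] == 'or':                        # (c|d)->b ==> c->b, d->b
                return prove(rest + [('imp', a[1], b), ('imp', a[2], b)], goal)
    # Axiom: atomic goal in the saturated context.
    if goal in gamma:
        return True
    # Non-invertible choices: right disjunction, then nested implications.
    if goal[0] == 'or' and (prove(gamma, goal[1]) or prove(gamma, goal[2])):
        return True
    for i, f in enumerate(gamma):
        if f[0] == 'imp' and f[1][0] == 'imp':      # ((a->b)->c) on the left
            a, b, c = f[1][1], f[1][2], f[2]
            rest = gamma[:i] + gamma[i + 1:]
            if prove(rest + [a, ('imp', b, c)], b) and prove(rest + [c], goal):
                return True
    return False

A, B = ('atom', 'a'), ('atom', 'b')
print(prove([], ('imp', A, A)))                          # True
print(prove([], ('imp', ('imp', ('imp', A, B), A), A)))  # False: Peirce's law
```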


Author(s):  
KENNETH M. DAWSON-HOWE ◽  
DAVID VERNON

A new approach to object recognition is presented, in which secondary representations of 3-D models are synthesized/derived (in various forms) and subsequently compared in order to invoke views of models, tune model pose and verify recognition hypotheses. The use of these secondary representations allows complex models (e.g. surface-based or volumetric models) to be compared implicitly (rather than explicitly comparing the component primitives of the models). This in turn overcomes the problem of the stability of the model primitives, and provides independence between the complex 3-D representations and the recognition strategy (i.e. the invocation, matching and verification techniques). The secondary representations employed are Extended Gaussian Images, directional histograms, needle diagrams, depth maps and boundary curvature signatures. The technique is demonstrated using models of reasonably complex objects derived from actively sensed range data.
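As an illustration of one of these secondary representations, the sketch below computes an Extended Gaussian Image, an area-weighted histogram of surface normals, for a triangle mesh using NumPy. The latitude/azimuth binning is an illustrative choice; the paper's representations and comparison strategy are richer than this.

```python
# Sketch: Extended Gaussian Image of a triangle mesh.
import numpy as np

def extended_gaussian_image(vertices, faces, n_bins=(4, 8)):
    """Histogram triangle normals on the unit sphere, weighted by facet area."""
    v = np.asarray(vertices, dtype=float)
    tris = v[np.asarray(faces)]                       # (F, 3, 3)
    cross = np.cross(tris[:, 1] - tris[:, 0], tris[:, 2] - tris[:, 0])
    area = 0.5 * np.linalg.norm(cross, axis=1)
    normals = cross / np.linalg.norm(cross, axis=1, keepdims=True)
    # Bin by z-coordinate (latitude) and azimuth (longitude).
    z_bin = np.clip(((normals[:, 2] + 1) / 2 * n_bins[0]).astype(int),
                    0, n_bins[0] - 1)
    az = np.arctan2(normals[:, 1], normals[:, 0])     # in [-pi, pi]
    az_bin = np.clip(((az + np.pi) / (2 * np.pi) * n_bins[1]).astype(int),
                     0, n_bins[1] - 1)
    egi = np.zeros(n_bins)
    np.add.at(egi, (z_bin, az_bin), area)
    return egi / egi.sum()                            # normalize for comparison

# Unit tetrahedron: four faces, four distinct normal directions.
verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
faces = [(0, 2, 1), (0, 1, 3), (0, 3, 2), (1, 2, 3)]
print(extended_gaussian_image(verts, faces))
```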

