Quantifier Elimination
Recently Published Documents

TOTAL DOCUMENTS: 384 (FIVE YEARS: 45)
H-INDEX: 28 (FIVE YEARS: 2)

2022 ◽  
Volume 18, Issue 1 ◽  
Author(s):  
Ankush Das ◽  
Frank Pfenning

Traditional session types prescribe bidirectional communication protocols for concurrent computations, where well-typed programs are guaranteed to adhere to the protocols. However, simple session types cannot capture properties beyond the basic type of the exchanged messages. In response, recent work has extended session types with refinements from linear arithmetic, capturing intrinsic attributes of processes and data. These refinements then play a central role in describing sequential and parallel complexity bounds on session-typed programs. The Rast language provides an open-source implementation of session-typed concurrent programs extended with arithmetic refinements as well as ergometric and temporal types to capture work and span of program execution. To further support generic programming, Rast also enhances arithmetically refined session types with recently developed nested parametric polymorphism. Type checking relies on Cooper's algorithm for quantifier elimination in Presburger arithmetic with a few significant optimizations, and a heuristic extension to nonlinear constraints. Rast furthermore includes a reconstruction engine so that most program constructs pertaining to the layers of refinements and resources are inserted automatically. We provide a variety of examples to demonstrate the expressivity of the language.
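
The Presburger fragment handled by Cooper's algorithm admits small hand-worked instances. The following sketch is an illustration only, not code from Rast; it assumes Z3's Python bindings and Z3's standard 'qe' tactic, with a formula and variable names chosen purely for exposition:

    from z3 import Ints, Exists, And, Tactic

    x, n = Ints('x n')

    # Eliminate x from: exists x. 3 <= x and 2*x <= n.
    # The output is a quantifier-free formula equivalent to n >= 6.
    formula = Exists([x], And(3 <= x, 2 * x <= n))
    print(Tactic('qe')(formula).as_expr())

In Rast, eliminations of this kind arise during type checking of the arithmetic refinements attached to session types.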


Information ◽  
2021 ◽  
Vol 12 (8) ◽  
pp. 309
Author(s):  
Peng Wu ◽  
Ning Xiong ◽  
Juxia Xiong ◽  
Jinzhao Wu

Error coefficients are ubiquitous in systems. In particular, errors must be taken into account in reasoning verification for safety-critical systems. We present a reasoning method that can be applied to systems described by polynomial error assertions (PEAs). The implication relationship between PEAs can be converted into an inclusion relationship between their zero sets; the PEAs are then transformed into first-order polynomial logic. Combined with quantifier elimination based on cylindrical algebraic decomposition, judging the inclusion relationship between the zero sets of PEAs reduces to constraints on the error parameters and specific error coefficients, which can be obtained by the quantifier elimination method. The proposed reasoning method is validated by proving the related theorems. An example of intercepting target objects is provided, and the correctness of our method is tested on large-scale random cases. Compared with reasoning methods without error semantics, our method has the advantage of being able to deal with error parameters.
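
A textbook instance of quantifier elimination by cylindrical algebraic decomposition (not taken from the paper) shows the shape of the transformation:

$\exists x \, (x^2 + b x + c = 0) \;\Longleftrightarrow\; b^2 - 4c \ge 0.$

The quantified nonlinear condition is replaced by a quantifier-free constraint on the parameters $b$ and $c$; in the setting above, the quantifier-free output instead constrains the error parameters and error coefficients of the PEAs.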


Author(s):  
Diego Calvanese ◽  
Silvio Ghilardi ◽  
Alessandro Gianola ◽  
Marco Montali ◽  
Andrey Rivkin

Uniform interpolants have been widely studied in non-classical propositional logics since the nineties; a subsequent research line within the automated reasoning community investigated uniform quantifier-free interpolants (sometimes referred to as “covers”) in first-order theories. This further research line is motivated by the fact that uniform interpolants offer an effective way to tackle quantifier elimination and symbol elimination problems, which are central in model checking infinite-state systems. This was first pointed out at ESOP 2008 by Gulwani and Musuvathi, and then by the authors of the present contribution in the context of recent applications to the verification of data-aware processes. In this paper, we show how covers are strictly related to model completions, a well-known topic in model theory. We also investigate the computation of covers within the Superposition Calculus, by adopting a constrained version of the calculus and by defining appropriate settings and reduction strategies. In addition, we show that computing covers is computationally tractable for the fragment of the language used when tackling the verification of data-aware processes. This observation is confirmed by analyzing the preliminary results obtained using the mcmt tool to verify relevant examples of data-aware processes. These examples can be found in the latest version of the tool distribution.
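
A small hand-worked example (not drawn from the paper) illustrates what a cover is: in the theory of equality with uninterpreted function symbols, the cover of

$\exists e \, \big( f(e) = a \;\wedge\; g(f(e)) = c \big)$

with respect to eliminating $e$ is the quantifier-free formula $g(a) = c$. The implication from the existential formula is immediate, and conversely any model satisfying $g(a) = c$ can be extended with a fresh witness $e$ that $f$ maps to $a$, which reflects the relationship between covers and model completions mentioned above.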


Author(s):  
Peter Backeman ◽  
Philipp Rümmer ◽  
Aleksandar Zeljić

The inference of program invariants over machine arithmetic, commonly called bit-vector arithmetic, is an important problem in verification. Techniques that have been successful for unbounded arithmetic, in particular Craig interpolation, have turned out to be difficult to generalise to machine arithmetic: existing bit-vector interpolation approaches are based either on eager translation from bit-vectors to unbounded arithmetic, resulting in complicated constraints that are hard to solve and interpolate, or on bit-blasting to propositional logic, in the process losing all arithmetic structure. We present a new approach to bit-vector interpolation, as well as bit-vector quantifier elimination (QE), that works by lazy translation of bit-vector constraints to unbounded arithmetic. Laziness enables us to fully utilise the information available during proof search (implied by decisions and propagation) in the encoding, and in this way produce constraints that can be handled relatively easily by existing interpolation and QE procedures for Presburger arithmetic. The lazy encoding is complemented with a set of native proof rules for bit-vector equations and non-linear (polynomial) constraints, thereby minimising the number of cases a solver has to consider. We also incorporate a method for handling bit-vector concatenations and extractions efficiently.
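
As an illustration of the general idea (not necessarily the paper's exact encoding), translating fixed-width operations into unbounded arithmetic typically makes the wrap-around explicit; for $w$-bit addition one may write

$x +_{w} y \;=\; x + y - 2^{w}\sigma, \qquad \sigma \in \{0,1\}, \quad 0 \le x, y < 2^{w},$

so that the resulting constraints stay within Presburger arithmetic and can be passed to existing interpolation and QE procedures. The lazy scheme described above introduces such constraints only as proof search demands them.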


2021 ◽  
Vol 359 (3) ◽  
pp. 291-295
Author(s):  
Mickaël Matusinski ◽  
Simon Müller

2021 ◽  
Vol 20 (3) ◽  
Author(s):  
Grzegorz Pastuszak ◽  
Adam Skowyrski ◽  
Andrzej Jamiołkowski

Author(s):  
Gennadiy Averkov ◽  
Matthias Schymura

For a set $X$ of integer points in a polyhedron, the smallest number of facets of any polyhedron whose set of integer points coincides with $X$ is called the relaxation complexity $\mathrm{rc}(X)$. This parameter, introduced by Kaibel & Weltge (2015), captures the complexity of linear descriptions of $X$ without using auxiliary variables. Using tools from combinatorics, geometry of numbers, and quantifier elimination, we make progress on several open questions regarding $\mathrm{rc}(X)$ and its variant $\mathrm{rc}_{\mathbb{Q}}(X)$, which restricts the descriptions of $X$ to rational polyhedra. As our main results we show that $\mathrm{rc}(X) = \mathrm{rc}_{\mathbb{Q}}(X)$ when: (a) $X$ is at most four-dimensional, (b) $X$ represents every residue class in $(\mathbb{Z}/2\mathbb{Z})^d$, (c) the convex hull of $X$ contains an interior integer point, or (d) the lattice-width of $X$ is above a certain threshold. Additionally, $\mathrm{rc}(X)$ can be computed algorithmically when $X$ is at most three-dimensional, or when $X$ satisfies one of the conditions (b), (c), or (d) above. Moreover, we obtain an improved lower bound on $\mathrm{rc}(X)$ in terms of the dimension of $X$.
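
As a one-dimensional illustration of the definition (not an example from the paper), take $X = \{0,1\} \subseteq \mathbb{Z}$. A single inequality $\alpha x \le \beta$ has either no integer solutions or infinitely many, so its integer points can never be exactly $X$; the two inequalities $0 \le x \le 1$ describe $X$ exactly, hence $\mathrm{rc}(X) = \mathrm{rc}_{\mathbb{Q}}(X) = 2$, in line with case (a) above.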


2021 ◽  
Vol 2 (2) ◽  
Author(s):  
Maximilian Gerwien ◽  
Rick Voßwinkel ◽  
Hendrik Richter

This paper adds to the discussion of theoretical aspects of particle swarm stability by proposing to employ stochastic Lyapunov functions and to determine the convergence set by quantifier elimination. We present a computational procedure and show that this approach leads to a reevaluation and extension of previously known stability regions for particle swarm optimization (PSO) derived with a Lyapunov approach under stagnation assumptions.
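
For context, a classical deterministic illustration (not the paper's stochastic analysis) starts from the particle dynamics under stagnation,

$v_{t+1} = w\, v_t + \varphi\,(p - x_t), \qquad x_{t+1} = x_t + v_{t+1},$

where quantifier elimination applied to the Schur stability conditions of the associated $2 \times 2$ iteration matrix yields the well-known region $|w| < 1$, $0 < \varphi < 2(1+w)$. The paper revisits this kind of derivation with stochastic Lyapunov functions, leading to the reevaluation and extension of the stability regions mentioned above.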

