Propositional logics of partial predicates with composition of predicate complement

2019 ◽ pp. 003-013
Author(s): M.S. Nikitchenko, O.S. Shkilniak, S.S. Shkilniak, T.A. Mamedov


2012 ◽ Vol 2 (3)
Author(s): Mykola Nikitchenko, Valentyn Tymofieiev

Abstract: Composition-nominative logics are algebra-based logics of partial predicates constructed in a semantic-syntactic style on a methodological basis common with programming. They can be considered generalizations of traditional logics to classes of partial predicates that do not have fixed arity. In this paper we present and investigate algorithms for solving the satisfiability problem in various classes of composition-nominative logics. We consider the satisfiability problem for logics of the propositional, renominative, and quantifier levels and prove the reduction of the problem to the satisfiability problem for classical logics. The method developed in the paper enables us to leverage existing state-of-the-art satisfiability checking procedures for solving the satisfiability problem in composition-nominative logics, which could be crucial for handling industrial instances coming from domains such as program analysis and verification. The reduction proposed in the paper requires extending the logic language and logic models with an infinite number of unessential variables and with a predicate of equality to a constant.
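The paper's actual reduction extends the language with unessential variables and an equality-to-constant predicate. As a rough, generic illustration of the underlying idea of reducing satisfiability over partial predicates to classical satisfiability (not the authors' construction), each partial predicate can be encoded by two classical Boolean variables:

```python
from itertools import product

# Generic illustration (not the paper's construction): encode each
# partial predicate p by two classical Boolean variables, pt ("p is
# true") and pf ("p is false"), with the consistency constraint that
# they are never both set; p is undefined when neither is set.
# A formula is satisfiable iff its "truth" encoding is classically
# satisfiable under the consistency constraints.

def satisfiable(truth, n):
    """Brute-force classical SAT over n encoded partial predicates."""
    for bits in product([False, True], repeat=2 * n):
        pt, pf = bits[:n], bits[n:]
        if any(t and f for t, f in zip(pt, pf)):
            continue  # pt and pf both set: inconsistent encoding
        if truth(pt, pf):
            return True
    return False

# "p is true" is satisfiable, while "p and not-p is true" is not:
print(satisfiable(lambda pt, pf: pt[0], 1))            # True
print(satisfiable(lambda pt, pf: pt[0] and pf[0], 1))  # False
```

Note that under this encoding the classical tautology "p or not-p" is no longer valid, since p may be undefined, which matches the behavior of partial predicates.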


2018 ◽ Vol 26 (1) ◽ pp. 11-20
Author(s): Artur Korniłowicz, Ievgen Ivanov, Mykola Nikitchenko

Summary: We show that the set of all partial predicates over a set D, together with the disjunction, conjunction, and negation operations defined in accordance with the truth tables of S.C. Kleene’s strong logic of indeterminacy [17], forms a Kleene algebra. A Kleene algebra is a De Morgan algebra [3] (also called a quasi-Boolean algebra) which satisfies the condition x ∧ ¬x ⩽ y ∨ ¬y (sometimes called the normality axiom). We use the formalization of De Morgan algebras from [8]. The term “Kleene algebra” was introduced by A. Monteiro and D. Brignole in [3]. A similar notion of a “normal i-lattice” had been previously studied by J.A. Kalman [16]. More details about the origin of this notion and its relation to other notions can be found in [24, 4, 1, 2]. It should be noted that there is a different widely known class of algebras, also called Kleene algebras [22, 6], which generalize the algebra of regular expressions; the term “Kleene algebra” used in this paper does not refer to them. Algebras of partial predicates naturally arise in computability theory in the study of partial recursive predicates. They were studied in connection with non-classical logics [17, 5, 18, 32, 29, 30]. A partial predicate also corresponds to the notion of a partial set [26] on a given domain, which represents a (partial) property which for any given element of this domain may hold, not hold, or neither hold nor not hold. The field of all partial sets on a given domain is an algebra with generalized operations of union, intersection, complement, and three constants (0, 1, and n, the fixed point of complement), which can be generalized to an equational class of algebras called DMF-algebras (De Morgan algebras with a single fixed point of involution) [25]. In [27] partial sets and DMF-algebras were considered as a basis for unification of set-theoretic and linguistic approaches to probability.
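The strong Kleene connectives and the normality axiom can be checked mechanically. A minimal sketch in Python, with None standing for the undefined truth value (the function names and encoding are ours):

```python
# Strong Kleene connectives over {False, None, True}, with None for
# "undefined" (the partial predicate neither holds nor fails).
def k_not(x):
    return None if x is None else not x

def k_and(x, y):
    if x is False or y is False:
        return False
    if x is None or y is None:
        return None
    return True

def k_or(x, y):
    # disjunction via De Morgan duality
    return k_not(k_and(k_not(x), k_not(y)))

# Truth order used in the normality axiom: False < None < True.
RANK = {False: 0, None: 1, True: 2}
def leq(x, y):
    return RANK[x] <= RANK[y]

# Check the normality axiom x ∧ ¬x ⩽ y ∨ ¬y over all nine pairs.
vals = [False, None, True]
assert all(leq(k_and(x, k_not(x)), k_or(y, k_not(y)))
           for x in vals for y in vals)
```

The check passes because x ∧ ¬x is never more true than None, while y ∨ ¬y is never less true than None.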
Partial predicates over classes of mathematical models of data were used for formalizing the semantics of computer programs in the composition-nominative approach to program formalization [31, 28, 33, 15], for formalizing extensions of Floyd-Hoare logic [7, 9] which allow reasoning about properties of programs in the case of partial pre- and postconditions [23, 20, 19, 21], and for formalizing dynamical models with partial behaviors in the context of mathematical systems theory [11, 13, 14, 12, 10].


1991 ◽ Vol 14 (4) ◽ pp. 387-410
Author(s): Andrzej Blikle

Partial functions, hence also partial predicates, cannot be avoided in algorithms. However, despite the fact that partial functions were formally introduced into the theory of software very early, partial predicates are still not commonly recognized. In many programming and software-specification languages, partial Boolean expressions are treated in a rather simplistic way: the evaluation of a Boolean sub-expression to an error leads to the evaluation of the hosting Boolean expression to an error and, in consequence, to the abortion of the whole program. This technique is known as eager evaluation of expressions. A more practical approach to the evaluation of expressions – gaining more interest today among both theoreticians and programming-language designers – is lazy evaluation. Lazily evaluated Boolean expressions correspond to (non-strict) three-valued predicates where the third value represents both error and undefinedness. On the semantic ground this leads to a three-valued propositional calculus, three-valued quantifiers, and an appropriate logic. This paper is a survey-essay devoted to the discussion and comparison of a few three-valued propositional and predicate calculi, and to the discussion of the author’s claim that a two-valued logic, rather than a three-valued logic, is suitable for the treatment of programs with three-valued Boolean expressions. The paper is written in a formal but not in a formalized style. All discussion is carried out on semantic ground. We talk about predicates (functions) and a semantic consequence relation rather than about expressions and inference rules. However, the paper is followed by more formalized works which carry our discussion further on formalized ground, and where the corresponding formal logics are constructed and discussed.
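The contrast between eager and lazy evaluation of Boolean expressions can be made concrete. A minimal Python sketch (the predicate names are ours), where a runtime error plays the role of the third value:

```python
def eager_and(p, q, x):
    # eager: evaluate both operands first; an error in either
    # propagates and would abort the whole program
    a = p(x)
    b = q(x)
    return a and b

def lazy_and(p, q, x):
    # lazy (short-circuit): q is evaluated only when p(x) is True
    return p(x) and q(x)

nonzero = lambda x: x != 0
inv_pos = lambda x: 1 / x > 0   # errors at x == 0

print(lazy_and(nonzero, inv_pos, 0))   # False: inv_pos never runs
try:
    eager_and(nonzero, inv_pos, 0)
except ZeroDivisionError:
    print("eager evaluation aborts")   # the error propagates
```

Under lazy evaluation the guarded expression is a total predicate even though its second operand is partial; under eager evaluation the error in the operand dominates.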


2007 ◽ Vol 7 (1-2) ◽ pp. 153-182
Author(s): Julio Mariño, Ángel Herranz, Juan José Moreno-Navarro

Abstract: To alleviate the inefficiencies caused by the interaction of the logic and functional sides, integrated languages may take advantage of demand information, i.e. knowing in advance which computations are needed, and to what extent, in a particular context. This work studies demand analysis – which is closely related to backwards strictness analysis – in a semantic framework of partial predicates, which in turn are constructive realizations of ideals in a domain. This allows us to give a concise, unified presentation of demand analysis, to relate it to other analyses based on abstract interpretation or strictness logics, to give some hints for the implementation, and, more importantly, to prove the soundness of our analysis based on demand equations. There are also some innovative results. One of them is that a set constraint-based analysis has been derived in a stepwise manner using ideas taken from the area of program transformation. The other is the possibility of using program transformation itself to perform the analysis, especially in those domains of properties where algorithms based on constraint solving are too weak.


ARHE ◽ 2021 ◽ Vol 27 (34) ◽ pp. 85-102
Author(s): Jovana Kostić

In this paper, we follow Gödel’s remarks on an envisioned theory of concepts to determine which properties a logical basis of such a theory should have. The discussion is organized around the question of the suitability of the classical predicate calculus for this role. Some reasons to think that classical logic is not an appropriate basis for the theory of concepts will be presented. Based on these reasons, we consider which alternative logical system could fare better as a logical foundation of what is, in Gödel’s opinion, the most important theory in logic yet to be developed. This paper should, in particular, motivate the study of partial predicates in a certain system of three-valued logic as a promising starting point for the foundation of the theory of concepts.


2019 ◽ Vol 29 (04) ◽ pp. 743-759
Author(s): Khí-Uí Soo, Tim Stokes

This paper establishes a finite axiomatization of possibly non-halting computer programs and tests, with the if-then-else operation. The model is a two-sorted algebra, one sort being the programs and the other the tests. The main operation on programs is composition, and 1 and 0 represent the programs skip and loop (i.e. the program that never halts), respectively. Programs are modeled as partial functions on some state space X, with tests modeled as partial predicates on X. The operations on the tests are the usual logical connectives ∧, ∨, ¬ together with constants for truth and falsity. In addition, there is the hybrid operation of if-then-else, and a test-valued halting operation on programs which is true when a program halts, and undefined otherwise. The halting operation implies that the operations of domain and domain join ∨ may also be expressed. When tests are assumed to be possibly non-halting, the evaluation strategy of the logical connectives affects the result. Here we model parallel evaluation, as opposed to the common sequential (or short-circuit) evaluation strategy. For example, we view the conjunction s ∧ t as false if either s or t is false, even if the other does not halt.
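The difference between parallel and sequential (short-circuit) evaluation over possibly non-halting tests can be sketched as follows, with None modeling a test that does not halt (the function names are ours, not the paper's notation):

```python
def seq_and(s, t):
    # sequential (short-circuit): if s does not halt,
    # neither does s ∧ t, regardless of t
    if s is None:
        return None
    if s is False:
        return False
    return t

def par_and(s, t):
    # parallel: false as soon as either operand is false,
    # even if the other does not halt
    if s is False or t is False:
        return False
    if s is None or t is None:
        return None
    return True

print(par_and(None, False))  # False: parallel evaluation recovers
print(seq_and(None, False))  # None: short-circuit never reaches t
```

The two strategies agree on halting tests; they differ exactly when a false operand appears alongside a non-halting one.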

