Proof Compression and NP Versus PSPACE II: Addendum

Author(s):  
Lew Gordeev ◽  
Edward Hermann Haeusler

In [3] we proved the conjecture NP = PSPACE by advanced proof-theoretic methods that combined Hudelmaier's cut-free sequent calculus for minimal logic (HSC) [5] with horizontal compression in the corresponding minimal Prawitz-style natural deduction (ND) [6]. In this Addendum we show how to prove the weaker result NP = coNP without referring to HSC. The underlying idea (due to the second author) is to omit full minimal logic and compress only "naive" normal tree-like ND refutations of the existence of Hamiltonian cycles in given non-Hamiltonian graphs, since the Hamiltonian graph problem is NP-complete. Thus, loosely speaking, the proof of NP = coNP can be obtained by HSC-elimination from our proof of NP = PSPACE [3].

1987 ◽  
Vol 52 (3) ◽  
pp. 665-680 ◽  
Author(s):  
Neil Tennant

Relevance logic began in an attempt to avoid the so-called fallacies of relevance. These fallacies can be in implicational form or in deductive form. For example, Lewis's first paradox can beset a system in implicational form, in that the system contains as a theorem the formula (A & ∼A) → B; or it can beset it in deductive form, in that the system allows one to deduce B from the premisses A, ∼A.

Relevance logic in the tradition of Anderson and Belnap has been almost exclusively concerned with characterizing a relevant conditional. Thus it has attacked the problem of relevance in its implicational form. Accordingly, for a relevant conditional → one would not have as a theorem the formula (A & ∼A) → B. Other theorems even of minimal logic would also be lacking. Perhaps most important among these is the formula (A → (B → A)). It is also a well-known feature of their system R that it lacks the intuitionistically valid formula ((A ∨ B) & ∼A) → B (disjunctive syllogism).

But it is not the case that any relevance logic worth the title even has to concern itself with the conditional, and hence with the problem in its implicational form. The problem arises even for a system without the conditional primitive. It would still be an exercise in relevance logic, broadly construed, to formulate a deductive system free of the fallacies of relevance in deductive form even if this were done in a language whose only connectives were, say, &, ∨ and ∼. Solving the problem of relevance in this more basic deductive form is arguably a precondition for solving it for the conditional, if we suppose (as is reasonable) that the relevant conditional is to be governed by anything like the rule of conditional proof.
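For readers who want the deductive form of the fallacy spelled out, the following is the standard textbook derivation of an arbitrary B from the premisses A and ∼A (added here for illustration; it is not part of Tennant's abstract). Blocking exactly this derivation is why Anderson and Belnap's system R gives up disjunctive syllogism.

```latex
\[
\begin{array}{lll}
1. & A          & \text{premiss}\\
2. & \lnot A    & \text{premiss}\\
3. & A \lor B   & \text{from 1 by $\lor$-introduction}\\
4. & B          & \text{from 3 and 2 by disjunctive syllogism}
\end{array}
\]
```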


Author(s):  
Mahtab Hosseininia ◽  
Faraz Dadgostari

In this chapter, the concepts of Hamiltonian paths and Hamiltonian cycles are discussed. In the first section, the history of Hamiltonian graphs is described, and then some concepts such as Hamiltonian paths, Hamiltonian cycles, traceable graphs, and Hamiltonian graphs are defined. Some of the best-known Hamiltonian graph problems, such as the travelling salesman problem (TSP), Kirkman's cell of a bee, the Icosian game, and the knight's tour problem, are also presented. In addition, necessary and (or) sufficient conditions for the existence of a Hamiltonian cycle are investigated. Furthermore, in order to solve Hamiltonian cycle problems, some algorithms are introduced in the last section.
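As a concrete, minimal illustration of the kind of exhaustive search such problems call for (this sketch is my own and is not one of the chapter's algorithms), a backtracking Hamiltonian-cycle finder can be written in a few lines:

```python
from typing import Dict, List, Optional, Set

def hamiltonian_cycle(adj: Dict[int, Set[int]]) -> Optional[List[int]]:
    """Return the vertices of a Hamiltonian cycle, or None if none exists.

    Plain backtracking: fix a start vertex, extend a simple path one vertex
    at a time, and close it into a cycle once every vertex has been used.
    """
    vertices = list(adj)
    if not vertices:
        return None
    start = vertices[0]
    path = [start]
    used = {start}

    def extend() -> bool:
        if len(path) == len(vertices):
            return start in adj[path[-1]]      # can the path be closed?
        for nxt in adj[path[-1]]:
            if nxt not in used:
                path.append(nxt)
                used.add(nxt)
                if extend():
                    return True
                used.remove(nxt)
                path.pop()
        return False

    return path if extend() else None

# Example: the 4-cycle 0-1-2-3-0 is Hamiltonian.
square = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {2, 0}}
print(hamiltonian_cycle(square))   # e.g. [0, 1, 2, 3]
```

Backtracking of this kind takes exponential time in the worst case, consistent with the NP-completeness of the Hamiltonian cycle problem mentioned elsewhere on this page.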


1996 ◽  
Vol 5 (4) ◽  
pp. 437-442 ◽  
Author(s):  
Carsten Thomassen

We prove that a bipartite uniquely Hamiltonian graph has a vertex of degree 2 in each color class. As consequences, every bipartite Hamiltonian graph of minimum degree d has at least $2^{1-d}\,d!$ Hamiltonian cycles, and every bipartite Hamiltonian graph of minimum degree at least 4 and girth g has at least $(3/2)^{g/8}$ Hamiltonian cycles. We indicate how the existence of more than one Hamiltonian cycle may lead to a general reduction method for Hamiltonian graphs.
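As a quick sanity check of the first bound (an added illustration, not part of the abstract), take minimum degree d = 4:

```latex
\[
  2^{1-d}\, d! \;=\; 2^{-3}\cdot 4! \;=\; \frac{24}{8} \;=\; 3,
\]
```

so every bipartite Hamiltonian graph of minimum degree 4 has at least three Hamiltonian cycles.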


2021 ◽  
Vol 43 (2) ◽  
pp. 1-55
Author(s):  
Bernardo Toninho ◽  
Nobuko Yoshida

This work exploits the logical foundation of session types to determine what kind of type discipline for the λ-calculus can exactly capture, and is captured by, π-calculus behaviours. Leveraging the proof-theoretic content of the soundness and completeness of sequent calculus and natural deduction presentations of linear logic, we develop the first mutually inverse and fully abstract processes-as-functions and functions-as-processes encodings between a polymorphic session π-calculus and a linear formulation of System F. We are then able to derive results of the session calculus from the theory of the λ-calculus: (1) we obtain a characterisation of inductive and coinductive session types via their algebraic representations in System F; and (2) we extend our results to account for value and process passing, entailing strong normalisation.


Author(s):  
Lew Gordeev ◽  
Edward Hermann Haeusler

We upgrade [3] to a complete proof of the conjecture NP = PSPACE, which is known as one of the fundamental open problems in the mathematical theory of computational complexity; this proof is based on [2]. Since minimal propositional logic is known to be PSPACE-complete, while PSPACE is known to include NP, it suffices to show that every valid purely implicational formula ρ has a proof whose weight (= total number of symbols) and time complexity of the provability involved are both polynomial in the weight of ρ. As in [3], we use a proof-theoretic approach. Recall that in [3] we considered any valid ρ in question that had (by the definition of validity) a "short" tree-like proof π in the Hudelmaier-style cut-free sequent calculus for minimal logic. The "shortness" means that the height of π and the total weight of different formulas occurring in it are both polynomial in the weight of ρ. However, the size (= total number of nodes), and hence also the weight, of π could be exponential in that of ρ. To overcome this trouble we embedded π into Prawitz's proof system of natural deductions, which operate on single formulas instead of sequents. As in π, the height and the total weight of different formulas of the resulting tree-like natural deduction ∂1 were polynomial, although the size of ∂1 still could be exponential, in the weight of ρ. In our next, crucial move, ∂1 was deterministically compressed into a "small", although multipremise, dag-like deduction ∂ whose horizontal levels contained only mutually different formulas, which made the whole weight polynomial in that of ρ. However, ∂ required a more complicated verification of the underlying provability of ρ. In this paper we present a nondeterministic compression of ∂ into the desired standard dag-like deduction ∂0 that deterministically proves ρ in time and space polynomial in the weight of ρ. Together with [3] this completes the proof of NP = PSPACE. Natural deductions are essential for our proof. Tree-to-dag horizontal compression of π that merges equal sequents, instead of formulas, is possible but not sufficient, since the total number of different sequents in π might be exponential in the weight of ρ, even assuming that all formulas occurring in sequents are subformulas of ρ. On the other hand, we need Hudelmaier's cut-free sequent calculus in order to control both the height and the total weight of different formulas of the initial tree-like proof π, since standard Prawitz normalization, although providing natural deductions with the subformula property, does not preserve polynomial heights. It is not clear yet whether we can omit references to π even in the proof of the weaker result NP = coNP.
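The horizontal compression step can be pictured on ordinary labelled trees. The sketch below only illustrates the level-wise merging of equal labels into a dag; it is not the authors' deduction ∂ or its verification machinery, and all names in it are made up for the example.

```python
from collections import defaultdict
from typing import Dict, List, Tuple

# A tree-like "deduction": node id -> (formula label, list of child ids).
# Node 0 is the root; children play the role of premises.
Tree = Dict[int, Tuple[str, List[int]]]

def compress_by_levels(tree: Tree) -> Dict[Tuple[int, str], List[Tuple[int, str]]]:
    """Merge nodes carrying the same formula on the same horizontal level.

    The result is a dag whose nodes are (level, formula) pairs, so each level
    contains every formula at most once; with only polynomially many distinct
    formulas, the dag stays small even if the tree was exponential.
    """
    # 1. Compute the level (distance from the root) of every tree node.
    level = {0: 0}
    stack = [0]
    while stack:
        node = stack.pop()
        for child in tree[node][1]:
            level[child] = level[node] + 1
            stack.append(child)

    # 2. Identify nodes with equal (level, formula) and merge their edges.
    dag = defaultdict(set)
    for node, (formula, children) in tree.items():
        source = (level[node], formula)
        dag.setdefault(source, set())
        for child in children:
            dag[source].add((level[child], tree[child][0]))
    return {node: sorted(edges) for node, edges in dag.items()}

# Tiny example: two subtrees repeat the same formula "A->B" on level 1.
example: Tree = {
    0: ("B", [1, 2]),
    1: ("A->B", [3]),
    2: ("A->B", [4]),
    3: ("A", []),
    4: ("A", []),
}
print(compress_by_levels(example))
# {(0, 'B'): [(1, 'A->B')], (1, 'A->B'): [(2, 'A')], (2, 'A'): []}
```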


1977 ◽  
Vol 42 (1) ◽  
pp. 11-28 ◽  
Author(s):  
Jonathan P. Seldin

The sequent calculus formulation (L-formulation) of the theory of functionality without the rules allowing for conversion of subjects of [3, §14E6] fails because the (cut) elimination theorem (ET) fails. This can be most easily seen from the fact that two particular statements are easy to prove in the system while a third is not (as is obvious if α is an atomic type [an F-simple]). The error in the “proof” of ET in [14, §3E6], [3, §14E6], and [7, §9C] occurs in Stage 3, where it is implicitly assumed that if [x]X ≡ [x]Y then X ≡ Y. In this counterexample, we have [x]x ≡ I ≡ [x](Ix) but x ≢ Ix. Since the formulation of [2, §9F] is not really satisfactory (for reasons stated in [3, §14E]), a new sequent calculus formulation is needed for the case in which the rules for subject conversions are not present. The main part of this paper is devoted to presenting such a formulation and proving it equivalent to the natural deduction formulation (T-formulation). The paper will conclude in §6 with some remarks on the result that every ob (term) with a type (functional character) has a normal form. The conventions and definitions of [3], especially of §12D and Chapter 14, will be used throughout the paper.
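To see how the cited counterexample arises, here is a toy bracket-abstraction function (my own illustration, using the standard abstraction algorithm with the η-style clause [x](Mx) = M; it is not Seldin's formal system). Both [x]x and [x](Ix) come out as I even though x and Ix are distinct terms, so [x]X ≡ [x]Y does not force X ≡ Y.

```python
from dataclasses import dataclass
from typing import Union

# Terms of combinatory logic extended with variables.
@dataclass(frozen=True)
class Var:
    name: str

@dataclass(frozen=True)
class Const:
    name: str          # "I", "K" or "S"

@dataclass(frozen=True)
class App:
    fun: "Term"
    arg: "Term"

Term = Union[Var, Const, App]
I, K, S = Const("I"), Const("K"), Const("S")

def occurs(x: str, t: Term) -> bool:
    if isinstance(t, Var):
        return t.name == x
    if isinstance(t, App):
        return occurs(x, t.fun) or occurs(x, t.arg)
    return False

def bracket(x: str, t: Term) -> Term:
    """[x]t with the usual clauses, including the eta-style [x](M x) = M."""
    if isinstance(t, Var) and t.name == x:
        return I                                   # [x]x = I
    if not occurs(x, t):
        return App(K, t)                           # [x]M = K M, x not free in M
    if isinstance(t, App) and t.arg == Var(x) and not occurs(x, t.fun):
        return t.fun                               # [x](M x) = M
    assert isinstance(t, App)
    return App(App(S, bracket(x, t.fun)), bracket(x, t.arg))

x, Ix = Var("x"), App(I, Var("x"))
print(bracket("x", x) == bracket("x", Ix))   # True:  [x]x = I = [x](I x)
print(x == Ix)                               # False: yet x and I x differ
```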


2008 ◽  
Vol 14 (2) ◽  
pp. 240-257 ◽  
Author(s):  
Jan von Plato

Gentzen writes in the published version of his doctoral thesis Untersuchungen über das logische Schliessen (Investigations into logical reasoning) that he was able to prove the normalization theorem only for intuitionistic natural deduction, but not for classical. To cover the latter, he developed classical sequent calculus and proved a corresponding theorem, the famous cut elimination result. Its proof was organized so that a cut elimination result for an intuitionistic sequent calculus came out as a special case, namely the one in which the sequents have at most one formula in the right, succedent part. Thus, there was no need for a direct proof of normalization for intuitionistic natural deduction. The only traces of such a proof in the published thesis are some convertibilities, such as when an implication introduction is followed by an implication elimination [1934–35, II.5.13]. It remained to Dag Prawitz in 1965 to work out a proof of normalization. Another, less known proof was given also in 1965 by Andres Raggio.

We found in February 2005 an early handwritten version of Gentzen's thesis, with exactly the above title, but with rather different contents: Most remarkably, it contains a detailed proof of normalization for what became the standard system of natural deduction. The manuscript is located in the Paul Bernays collection at the ETH Zurich with the signum Hs. 974: 271. Bernays must have gotten it well before the time of his being expelled from Göttingen on the basis of the racial laws in April 1933.


10.37236/9143 ◽  
2021 ◽  
Vol 28 (1) ◽  
Author(s):  
Marién Abreu ◽  
John Baptist Gauci ◽  
Domenico Labbate ◽  
Giuseppe Mazzuoccolo ◽  
Jean Paul Zerafa

A graph admitting a perfect matching has the Perfect–Matching–Hamiltonian property (for short, the PMH–property) if each of its perfect matchings can be extended to a hamiltonian cycle. In this paper we establish some sufficient conditions for a graph $G$ in order to guarantee that its line graph $L(G)$ has the PMH–property. In particular, we prove that this happens when $G$ is (i) a Hamiltonian graph with maximum degree at most 3, (ii) a complete graph, (iii) a balanced complete bipartite graph with at least 100 vertices, or (iv) an arbitrarily traceable graph. Further related questions and open problems are proposed throughout the paper.
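The PMH definition is easy to test by brute force on very small graphs. The sketch below is only an illustration of the definition (it is my own code, not a method from the paper) and would be hopeless on anything but tiny instances.

```python
from itertools import permutations
from typing import FrozenSet, List, Set

Edge = FrozenSet[int]

def perfect_matchings(vertices: List[int], edges: Set[Edge]) -> List[Set[Edge]]:
    """All perfect matchings, by recursively matching the first free vertex."""
    if not vertices:
        return [set()]
    v, rest = vertices[0], vertices[1:]
    matchings = []
    for u in rest:
        e = frozenset({v, u})
        if e in edges:
            remaining = [w for w in rest if w != u]
            for m in perfect_matchings(remaining, edges):
                matchings.append({e} | m)
    return matchings

def hamiltonian_cycles(vertices: List[int], edges: Set[Edge]) -> List[Set[Edge]]:
    """Edge sets of all Hamiltonian cycles (brute force over vertex orders)."""
    cycles = []
    first = vertices[0]
    for order in permutations(vertices[1:]):
        tour = [first, *order, first]
        cycle_edges = {frozenset({a, b}) for a, b in zip(tour, tour[1:])}
        if cycle_edges <= edges and cycle_edges not in cycles:
            cycles.append(cycle_edges)
    return cycles

def has_pmh_property(vertices: List[int], edges: Set[Edge]) -> bool:
    """Every perfect matching extends to (is contained in) some Hamiltonian cycle."""
    cycles = hamiltonian_cycles(vertices, edges)
    return all(any(m <= c for c in cycles)
               for m in perfect_matchings(vertices, edges))

# Tiny example: the complete graph K4 has the PMH property.
V = [0, 1, 2, 3]
E = {frozenset({a, b}) for a in V for b in V if a < b}
print(has_pmh_property(V, E))   # True
```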

