Sequential method in quantum logic

1980 ◽  
Vol 45 (2) ◽  
pp. 339-352 ◽  
Author(s):  
Hirokazu Nishimura

Since Birkhoff and von Neumann [2], a new area of logical investigation has grown up under the name of quantum logic. However, it seems to me that many authors have been inclined to discuss algebraic semantics as such (mainly lattices of a certain kind) almost directly, without presenting any axiomatic system, let alone developing any proof theory of quantum logic. See, e.g., Gunson [9], Jauch [10], Varadarajan [15], Zierler [16], etc. In this sense, many works presented under the name of quantum logic are algebraic in essence rather than genuinely logical, though it would be absurd to doubt the close relationship between the algebraic and the logical study of quantum mechanics. The main purpose of this paper is to alter this situation by presenting an axiomatization of quantum logic that is as natural and elegant as possible, on which further proof-theoretical study can be based. It is true that several axiomatizations of quantum logic already exist. Several authors have investigated the so-called material implication α → β (= ¬α ∨ (α ∧ β)) very closely, with due regard to its importance. See, e.g., Finch [5], Piziak [11], etc. Indeed, material implication plays a predominant role in any Hilbert-style axiomatization of a logic. Clark [4] has presented an axiomatization of quantum logic with negation ¬ and material implication → as primitive connectives. In this paper we do not follow this approach. First of all, that approach is greatly complicated, because orthomodular lattices are only locally distributive.
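To make the connective concrete, here is a minimal sketch, entirely my own illustration rather than anything from the paper: the six-element orthomodular lattice MO2 (the "Chinese lantern"), on which the material implication ¬α ∨ (α ∧ β) can be computed by table lookup and the orthomodular law verified by brute force.

```python
# A minimal sketch (not from the paper): the orthomodular lattice MO2,
# with atoms a, a', b, b'. We compute the material implication
# x -> y := ~x v (x ^ y) and check the orthomodular law
# x <= y  =>  y = x v (~x ^ y).

ELEMS = ["0", "a", "A", "b", "B", "1"]          # A = a', B = b'
NEG   = {"0": "1", "1": "0", "a": "A", "A": "a", "b": "B", "B": "b"}

def leq(x, y):
    """Order of MO2: 0 below everything, 1 above everything, atoms incomparable."""
    return x == "0" or y == "1" or x == y

def meet(x, y):
    if leq(x, y): return x
    if leq(y, x): return y
    return "0"                                   # distinct atoms meet at 0

def join(x, y):
    if leq(x, y): return y
    if leq(y, x): return x
    return "1"                                   # distinct atoms join at 1

def impl(x, y):
    """Material implication ~x v (x ^ y)."""
    return join(NEG[x], meet(x, y))

# The orthomodular law holds throughout MO2:
for x in ELEMS:
    for y in ELEMS:
        if leq(x, y):
            assert y == join(x, meet(NEG[x], y))

# But -> is not classical: impl(a, b) = ~a v (a ^ b) = a', yet
# a ^ impl(a, b) = 0, so lattice modus ponens (a ^ (a -> b) <= b)
# holds only trivially here.
print(impl("a", "b"))              # A  (i.e. a')
print(meet("a", impl("a", "b")))   # 0
```

The printout shows a → b = a′ while a ∧ (a → b) = 0, a small instance of the merely local distributivity that complicates Hilbert-style axiomatizations.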


1994 ◽  
Vol 33 (7) ◽  
pp. 1427-1443 ◽  
Author(s):  
Hirokazu Nishimura


1995 ◽  
Vol 34 (4) ◽  
pp. 649-654 ◽  
Author(s):  
Mitio Takano


1990 ◽  
Vol 05 (18) ◽  
pp. 1441-1449 ◽  
Author(s):  
B.A. ARBUZOV ◽  
S.A. SHICHANIN ◽  
E.E. BOOS ◽  
V.I. SAVRIN

The existence of new quasi-stationary levels in the relativistic Coulomb problem is predicted, and their positions are calculated by numerically solving the quasi-potential equation. The results are used to interpret the narrow electron-positron resonances revealed in heavy-ion collisions and the diproton resonances observed in neutron-proton interactions. A close relationship between the observed states and the von Neumann-Wigner levels embedded in the continuum is indicated.



2016 ◽  
Vol 30 (2) ◽  
pp. 219-236 ◽  
Author(s):  
Ivan Moscati

Expected utility theory dominated the economic analysis of individual decision-making under risk from the early 1950s to the 1990s. Among the early supporters of the expected utility hypothesis in the von Neumann–Morgenstern version were Milton Friedman and Leonard Jimmie Savage, both based at the University of Chicago, and Jacob Marschak, a leading member of the Cowles Commission for Research in Economics. Paul Samuelson of MIT was initially a severe critic of expected utility theory. Between mid-April and early May 1950, Samuelson composed three papers in which he attacked von Neumann and Morgenstern's axiomatic system. By 1952, however, Samuelson had somewhat unexpectedly become a resolute supporter of the expected utility hypothesis. Why did Samuelson change his mind? Based on the correspondence between Samuelson, Savage, Marschak, and Friedman, this article reconstructs the joint intellectual journey that led Samuelson to accept expected utility theory and Savage to revise his motivations for supporting it.



10.29007/k8cb ◽  
2018 ◽  
Author(s):  
Yun Shang ◽  
Xian Lu ◽  
Ruqian Lu

Turing machines based on quantum logic can solve undecidable problems. In this paper we give a recursion-theoretic characterization of the computational power of this kind of quantum Turing machine. In detail, for the unsharp case, it is proved that $\Sigma^0_1 \cup \Pi^0_1 \subseteq L^T_d(\varepsilon,\Sigma)\;(L^T_w(\varepsilon,\Sigma)) \subseteq \Pi^0_2$ when the truth-value lattice is locally finite and the operation $\wedge$ is computable, where $L^T_d(\varepsilon,\Sigma)$ (respectively $L^T_w(\varepsilon,\Sigma)$) denotes the class of quantum languages accepted by these Turing machines in the depth-first model (respectively, the width-first model); for the sharp case, we obtain similar results for the usual orthomodular lattices.
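For intuition about what a lattice-valued ("quantum") language is, here is a toy sketch. The automaton, the three-element chain of truth values, and the function `degree` are all my assumptions: the paper's machines are Turing machines, and a finite automaton cannot exhibit the depth-first versus width-first distinction, which only matters when runs may fail to halt. A word's degree of acceptance is the join over runs of the meet of the transition truth values, computable whenever the lattice operations are.

```python
# A toy sketch (not the paper's machine model): a lattice-valued finite
# automaton. Truth values live in the 3-chain 0 < 1/2 < 1, which is
# locally finite and has trivially computable meet and join.

MEET = min
JOIN = max

# delta[(state, symbol)] -> list of (next_state, truth_value)
delta = {
    ("q0", "a"): [("q0", 1.0), ("q1", 0.5)],
    ("q1", "b"): [("q1", 1.0)],
}
START, FINAL = "q0", {"q1"}

def degree(word, state=START, value=1.0):
    """Membership degree: join over all runs of the meet of edge values."""
    if not word:
        return value if state in FINAL else 0.0
    best = 0.0
    for nxt, v in delta.get((state, word[0]), []):
        best = JOIN(best, degree(word[1:], nxt, MEET(value, v)))
    return best

print(degree("ab"))    # 0.5: the only accepting run uses the 0.5 edge
print(degree("aab"))   # 0.5
print(degree("b"))     # 0.0: no run from q0 on 'b'
```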



2021 ◽  
Vol 3 (4) ◽  
pp. 643-655
Author(s):  
Louis Narens

In 1933, Kolmogorov synthesized the basic concepts of probability then in general use into concepts and deductions from a simple set of axioms, which said that probability is a σ-additive function from a Boolean algebra of events into [0, 1]. In 1932, von Neumann realized that the use of probability in quantum mechanics required a different concept, which he formulated as a σ-additive function from the closed subspaces of a Hilbert space into [0, 1]. In 1936, Birkhoff and von Neumann replaced Hilbert space with an algebraic generalization. Today, a slight modification of the Birkhoff-von Neumann generalization is called "quantum logic". A central problem in the philosophy of probability is the justification of the definition of probability used in a given application. This is usually done by arguing for the rationality of that approach to the situation under consideration. A version of the Dutch book argument given by de Finetti in 1972 is often used to justify the Kolmogorov theory, especially in scientific applications. As von Neumann noted in 1955, and his criticisms still hold, there is no acceptable foundation for quantum logic. While it is not argued here that a rational approach has been carried out for quantum physics, it is argued (1) that for many important situations found in behavioral science, quantum probability theory is a reasonable choice, and (2) that it has an arguably rational foundation for certain areas of behavioral science, for example, the behavioral paradigm of between-subjects experiments.
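The contrast can be seen in a few lines of numpy; the density matrix, the projections, and the helper names below are my illustrative choices, not the article's. Probability in the von Neumann sense is μ(P) = tr(ρP) on closed subspaces, additive on orthogonal ones, but the lattice of subspaces is not a Boolean algebra.

```python
import numpy as np

# A minimal numeric sketch (mine, not the article's) of the two notions
# of probability being contrasted: Kolmogorov's sigma-additive measure
# on a Boolean algebra vs. von Neumann's mu(P) = tr(rho P) on subspaces.

rho = np.array([[0.75, 0.25],
                [0.25, 0.25]])           # a density matrix (quantum state)

def proj(v):
    """Projection onto the line spanned by v."""
    v = v / np.linalg.norm(v)
    return np.outer(v, v)

P0 = proj(np.array([1.0, 0.0]))          # the subspace span{|0>}
P1 = proj(np.array([0.0, 1.0]))          # its orthogonal complement
Q  = proj(np.array([1.0, 1.0]))          # a non-commuting "event"

def mu(P):
    return float(np.trace(rho @ P))

# Additive on orthogonal subspaces, exactly as sigma-additivity demands
# of a disjoint union: mu(P0) + mu(P1) = mu(I) = 1.
print(mu(P0) + mu(P1))                   # 1.0

# But the event lattice is not Boolean: the lines span{|0>} and
# span{|0>+|1>} meet only in {0}, so mu of their meet is 0 although
# mu(P0) = mu(Q) = 0.75. A Kolmogorov measure would instead force
# mu(A and B) >= mu(A) + mu(B) - 1 = 0.5.
print(mu(P0), mu(Q))                     # 0.75 0.75
```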



2017 ◽  
Vol 47 (2) ◽  
pp. 301-324 ◽  
Author(s):  
Norihiro Kamide


Author(s):  
Peter Forrest

The topic of quantum logic was introduced by Birkhoff and von Neumann (1936), who described the formal properties of a certain algebraic system associated with quantum theory. To avoid begging questions, it is convenient to use the term ‘logic’ broadly enough to cover any algebraic system with formal characteristics similar to the standard sentential calculus. In that sense it is uncontroversial that there is a logic of experimental questions (for example, ‘Is the particle in region R?’ or ‘Do the particles have opposite spins?’) associated with any physical system. Having introduced this logic for quantum theory, we may ask how it differs from the standard sentential calculus, the logic for the experimental questions in classical mechanics. The most notable difference is that the distributive laws fail, being replaced by a weaker law known as orthomodularity. All this can be discussed without deciding whether quantum logic is a genuine logic, in the sense of a system of deduction. Putnam argued that quantum logic was indeed a genuine logic, because taking it as such solved various problems, notably that of reconciling the wave-like character of a beam of, say, electrons, as it passes through two slits, with the thesis that the electrons in the beam go through one or other of the two slits. If Putnam’s argument succeeds this would be a remarkable case of the empirical defeat of logical intuitions. Subsequent discussion, however, seems to have undermined his claim.
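Forrest's "most notable difference" is easy to exhibit numerically. The sketch below is my example rather than his (the helpers `line`, `meet`, `join`, and the tolerances are illustrative): the events are subspaces of R^2, and three lines through the origin violate distributivity, since a ∧ (b ∨ c) is the whole line a while (a ∧ b) ∨ (a ∧ c) is the zero subspace.

```python
import numpy as np

# Distributivity fails in the lattice of subspaces of R^2 (my example).
# Subspaces are represented by their orthogonal projection matrices;
# meet = intersection, join = span of the union.

I2 = np.eye(2)

def line(theta):
    """Projection onto the line through the origin at angle theta."""
    v = np.array([np.cos(theta), np.sin(theta)])
    return np.outer(v, v)

def basis(P):
    """Orthonormal basis (as columns) of the range of projection P."""
    w, V = np.linalg.eigh(P)
    return V[:, w > 0.5]

def from_basis(B):
    """Projection onto the column span of B; empty B is the zero subspace."""
    if B.size == 0:
        return np.zeros((2, 2))
    Q, R = np.linalg.qr(B)
    keep = np.abs(np.diag(R)) > 1e-10
    return Q[:, keep] @ Q[:, keep].T

def join(P, Q):
    return from_basis(np.column_stack([basis(P), basis(Q)]))

def meet(P, Q):
    """Intersection: the kernel of (I-P) + (I-Q) is range(P) & range(Q)."""
    w, V = np.linalg.eigh((I2 - P) + (I2 - Q))
    return from_basis(V[:, w < 1e-10])

a, b, c = line(0.0), line(np.pi / 2), line(np.pi / 4)

lhs = meet(a, join(b, c))           # a ^ (b v c) = a ^ (whole plane) = a
rhs = join(meet(a, b), meet(a, c))  # (a ^ b) v (a ^ c) = 0 v 0 = 0

print(np.round(lhs, 3))             # [[1. 0.] [0. 0.]] -- the x-axis
print(np.round(rhs, 3))             # [[0. 0.] [0. 0.]] -- the zero subspace
```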



Author(s):  
Wilfried Sieg

Proof theory is a branch of mathematical logic founded by David Hilbert around 1920 to pursue Hilbert’s programme. The problems addressed by the programme had already been formulated, in some sense, at the turn of the century, for example, in Hilbert’s famous address to the Second International Congress of Mathematicians in Paris in 1900. They were closely connected to the set-theoretic foundations for analysis investigated by Cantor and Dedekind – in particular, to difficulties with the unrestricted notion of system or set; they were also related to the philosophical conflict with Kronecker on the very nature of mathematics. At that time, the central issue for Hilbert was the ‘consistency of sets’ in Cantor’s sense. Hilbert suggested that the existence of consistent sets, for example, the set of real numbers, could be secured by proving the consistency of a suitable, characterizing axiom system, but indicated only vaguely how to give such proofs model-theoretically.

Four years later, Hilbert departed radically from these indications and proposed a novel way of attacking the consistency problem for theories. This approach required, first of all, a strict formalization of mathematics together with logic; then, the syntactic configurations of the joint formalism would be considered as mathematical objects; finally, mathematical arguments would be used to show that contradictory formulas cannot be derived by the logical rules. This two-pronged approach of developing substantial parts of mathematics in formal theories (set theory, second-order arithmetic, finite type theory and still others) and of proving their consistency (or the consistency of significant sub-theories) was sharpened in lectures beginning in 1917 and then pursued systematically in the 1920s by Hilbert and a group of collaborators including Paul Bernays, Wilhelm Ackermann and John von Neumann. In particular, the formalizability of analysis in a second-order theory was verified by Hilbert in those very early lectures.

So it was possible to focus on the second prong, namely to establish the consistency of ‘arithmetic’ (second-order number theory and set theory) by elementary mathematical, ‘finitist’ means. This part of the task proved to be much more recalcitrant than expected, and only limited results were obtained. That the limitation was inevitable was explained in 1931 by Gödel’s theorems; indeed, they refuted the attempt to establish consistency on a finitist basis – as soon as it was realized that finitist considerations could be carried out in a small fragment of first-order arithmetic. This led to the formulation of a general reductive programme. Gentzen and Gödel made the first contributions to this programme by establishing the consistency of classical first-order arithmetic – Peano arithmetic (PA) – relative to intuitionistic arithmetic – Heyting arithmetic. In 1936 Gentzen proved the consistency of PA relative to a quantifier-free theory of arithmetic that included transfinite recursion up to the first epsilon number, ε0; in his 1941 Yale lectures, Gödel proved the consistency of the same theory relative to a theory of computable functionals of finite type. These two fundamental theorems turned out to be most important for subsequent proof-theoretic work. Currently it is known how to analyse, in Gentzen’s style, strong subsystems of second-order arithmetic and set theory.

The first prong of proof-theoretic investigations, the actual formal development of parts of mathematics, has also been pursued – with a surprising result: the bulk of classical analysis can be developed in theories that are conservative over (fragments of) first-order arithmetic.
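Gentzen's 1936 result hinges on the fact that ordinals below ε0 admit finite notations (Cantor normal forms) whose comparison is an elementary computation, so "transfinite recursion up to ε0" can be added to a quantifier-free theory. A hedged sketch of such notations, my illustration rather than Gentzen's own formalism:

```python
# Ordinal notations below epsilon_0 in Cantor normal form (my sketch).
# An ordinal is a finite list of (exponent, coefficient) pairs, the
# exponents themselves being such lists, in strictly decreasing order.
# Comparison is a simple terminating recursion.

def cmp_ord(a, b):
    """Return -1, 0, or 1 comparing two Cantor-normal-form ordinals."""
    for (e1, c1), (e2, c2) in zip(a, b):
        d = cmp_ord(e1, e2)          # compare leading exponents first
        if d != 0:
            return d
        if c1 != c2:                 # then their coefficients
            return -1 if c1 < c2 else 1
    return (len(a) > len(b)) - (len(a) < len(b))

ZERO  = []                           # 0
ONE   = [(ZERO, 1)]                  # omega^0
OMEGA = [(ONE, 1)]                   # omega^1
W_W   = [(OMEGA, 1)]                 # omega^omega
SMALL = [(ONE, 2), (ZERO, 3)]        # omega*2 + 3

print(cmp_ord(W_W, SMALL))           # 1: omega^omega > omega*2 + 3
print(cmp_ord(SMALL, SMALL))         # 0
```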



2004 ◽  
Vol 4 (3) ◽  
pp. 186-195
Author(s):  
T. Gao ◽  
F.-L. Yan ◽  
Z.-X. Wang

A scheme for probabilistic teleportation of an $N$-particle state of general form is proposed. As special cases, we construct efficient quantum logic networks for implementing probabilistic teleportation of a two-particle, a three-particle, and a four-particle state of general form, built from single-qubit gates, two-qubit controlled-NOT gates, von Neumann measurements, and classically controlled operations.
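For readers who want to see those primitives in action, the following numpy sketch simulates the textbook teleportation of a single unknown qubit. It illustrates the standard protocol only, not the authors' probabilistic $N$-particle networks, and every helper (`CNOT`, `on`, the register layout) is my own construction.

```python
import numpy as np

# Standard one-qubit teleportation, using only the primitives the
# abstract lists: single-qubit gates, CNOT, von Neumann measurement,
# and classically controlled corrections.

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard
X = np.array([[0, 1], [1, 0]])                 # Pauli-X
Z = np.diag([1.0, -1.0])                       # Pauli-Z
I = np.eye(2)

def CNOT(control, target, n=3):
    """Controlled-NOT on an n-qubit register (qubit 0 = most significant)."""
    U = np.zeros((2**n, 2**n))
    for i in range(2**n):
        bits = [(i >> (n - 1 - k)) & 1 for k in range(n)]
        if bits[control]:
            bits[target] ^= 1
        U[int("".join(map(str, bits)), 2), i] = 1.0
    return U

def on(qubit, gate, n=3):
    """Lift a single-qubit gate to the full register."""
    U = np.array([[1.0]])
    for k in range(n):
        U = np.kron(U, gate if k == qubit else I)
    return U

rng = np.random.default_rng(0)
psi = rng.normal(size=2) + 1j * rng.normal(size=2)
psi /= np.linalg.norm(psi)                     # the unknown state to teleport

# Qubit 0: Alice's unknown state; qubits 1, 2: a shared Bell pair.
state = np.kron(psi, np.array([1, 0, 0, 1]) / np.sqrt(2))

# Bell measurement on qubits 0 and 1: CNOT(0,1), then H on qubit 0.
state = on(0, H) @ CNOT(0, 1) @ state

# Von Neumann measurement: sampling the joint distribution and keeping
# only bits m0, m1 measures exactly qubits 0 and 1.
outcome = rng.choice(8, p=np.abs(state) ** 2)
m0, m1 = (outcome >> 2) & 1, (outcome >> 1) & 1

# Collapse Bob's qubit onto the branch consistent with (m0, m1).
kept = np.array([state[(m0 << 2) | (m1 << 1) | b] for b in (0, 1)])
bob = kept / np.linalg.norm(kept)

# Classically controlled corrections complete the protocol.
if m1:
    bob = X @ bob
if m0:
    bob = Z @ bob

print(abs(np.vdot(bob, psi)))                  # ~1.0: the state is recovered
```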


