Meaning

Author(s):  
Richard Healey

Novel quantum concepts acquire content not by representing new beables but through material-inferential relations between claims about them and other claims. Acceptance of quantum theory modifies other concepts in accordance with a pragmatist inferentialist account of how claims acquire content. Quantum theory itself introduces no new beables, but accepting it affects the content of claims about classical magnitudes and other beables unknown to classical physics: the content of a magnitude claim about a physical object is a function of its physical context in a way that eludes standard pragmatics but may be modeled by decoherence. Leggett’s proposed test of macro-realism illustrates this mutation of conceptual content. Quantum fields are not beables but assumables of a quantum theory we use to make claims about particles and non-quantum fields whose denotational content may also be certified by models of decoherence.

Author(s):  
Jeremy Butterfield

Over the centuries, the doctrine of determinism has been understood, and assessed, in different ways. Since the seventeenth century, it has been commonly understood as the doctrine that every event has a cause; or as the predictability, in principle, of the entire future. To assess the truth of determinism, so understood, philosophers have often looked to physical science; they have assumed that their current best physical theory is their best guide to the truth of determinism. It seems that most have believed that classical physics, especially Newton’s physics, is deterministic. And in this century, most have believed that quantum theory is indeterministic. Since quantum theory has superseded classical physics, philosophers have typically come to the tentative conclusion that determinism is false. In fact, these impressions are badly misleading. The above formulations of determinism are unsatisfactory. Once we use a better formulation, we see that there is a large gap between the determinism of a given physical theory, and the bolder, vague idea that motivated the traditional formulations: the idea that the world in itself is deterministic. Admittedly, one can make sense of this idea by adopting a sufficiently bold metaphysics; but it cannot be made sense of just by considering determinism for physical theories. As regards physical theories, the traditional impression is again misleading. Which theories are deterministic turns out to be a subtle and complicated matter, with many open questions. But broadly speaking, it turns out that much of classical physics, even much of Newton’s physics, is indeterministic. Furthermore, the alleged indeterminism of quantum theory is very controversial: it enters, if at all, only in quantum theory’s account of measurement processes, an account which remains the most controversial part of the theory.


Entropy, 2020, Vol 22 (11), pp. 1224
Author(s):  
Adrian Kent

Models in which causation arises from higher-level structures as well as from microdynamics may be relevant to unifying quantum theory with classical physics or general relativity. They also give a way of defining a form of panprotopsychist property dualism, in which consciousness and material physics causally affect one another. I describe probabilistic toy models based on cellular automata that illustrate possibilities and difficulties with these ideas.
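As a rough, purely illustrative sketch of the kind of object referred to here (not Kent's actual models; all names and parameters are arbitrary assumptions), the following Python toy implements a one-dimensional probabilistic cellular automaton in which a coarse-grained, higher-level field computed from blocks of cells biases the microdynamical update probabilities:

```python
# Minimal sketch (not Kent's models): a 1D probabilistic cellular automaton
# whose local update probabilities are biased by a coarse-grained,
# "higher-level" field computed from blocks of cells.
import random

def step(cells, block_size=8, bias_strength=0.2):
    """Advance the automaton by one time step."""
    n = len(cells)
    # Higher-level field: fraction of live cells in each coarse-grained block.
    blocks = [sum(cells[i:i + block_size]) / block_size
              for i in range(0, n, block_size)]
    new_cells = []
    for i, c in enumerate(cells):
        left, right = cells[(i - 1) % n], cells[(i + 1) % n]
        # Micro-level rule: probability of being alive grows with live neighbours.
        p = 0.1 + 0.4 * (left + right)
        # Top-down contribution: the block average shifts the local probability.
        p += bias_strength * (blocks[i // block_size] - 0.5)
        p = min(max(p, 0.0), 1.0)
        new_cells.append(1 if random.random() < p else 0)
    return new_cells

if __name__ == "__main__":
    random.seed(0)
    state = [random.randint(0, 1) for _ in range(64)]
    for _ in range(20):
        state = step(state)
    print("final density:", sum(state) / len(state))
```

Setting bias_strength to zero recovers a purely micro-level probabilistic rule, so this single parameter controls how much the higher-level structure feeds back into the microdynamics.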


1989, Vol 44 (4), pp. 262-268
Author(s):  
H. Stumpf

Abstract Quantum fields can be characterized by the set of transition amplitudes {⟨0|π(A)|a⟩ : |a⟩ ∈ V}, where π(A) is a representation of the field operator algebra in V. This set has to satisfy renormalized energy equations, and its elements are called wavefunctions. However, these wavefunctions are not identical with the wavefunctions of conventional quantum theory in Fock space. Thus a theoretical interpretation is needed. In the present paper, by means of some theorems, a method of normalization and construction of probability densities for these wavefunctions is given, which differs from the method of derivation of the normalization condition for Bethe-Salpeter amplitudes. The method can be applied both to nonrelativistic and relativistic fields with positive definite or indefinite state spaces, provided the renormalized energy equations possess finite solutions.
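For orientation only, the conventional Fock-space situation against which these wavefunctions are contrasted can be summarized as follows; this is the standard textbook condition, not the renormalized construction developed in the paper:

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Conventional Fock-space benchmark (not the renormalized construction of this paper):
% a wavefunction $\psi$ yields a probability density only after normalization,
\[
  \rho(x) = \frac{|\psi(x)|^{2}}{\langle\psi|\psi\rangle},
  \qquad
  \langle\psi|\psi\rangle = \int |\psi(x)|^{2}\,dx ,
\]
% which presupposes a positive definite inner product. The transition amplitudes
% $\langle 0|\pi(A)|a\rangle$ need not satisfy such a condition automatically,
% hence the need for a separate normalization and probability-density construction.
\end{document}
```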


Author(s):  
Andrew J. P. Garner
Markus P. Müller
Oscar C. O. Dahlsten

The patterns of fringes produced by an interferometer have long been important testbeds for our best contemporary theories of physics. Historically, interference has been used to contrast quantum mechanics with classical physics, but recently experiments have been performed that test quantum theory against even more exotic alternatives. A physically motivated family of theories is that in which the state space of a two-level system is given by a sphere of arbitrary dimension. This includes classical bits, and real, complex and quaternionic quantum theory. In this paper, we consider relativity of simultaneity (i.e. that observers may disagree about the order of events at different locations) as applied to a two-armed interferometer, and show that this forbids most interference phenomena more complicated than those of complex quantum theory. If interference must depend on some relational property of the setting (such as path difference), then relativity of simultaneity will limit state spaces to standard complex quantum theory, or a subspace thereof. If this relational assumption is relaxed, we find one additional theory compatible with relativity of simultaneity: quaternionic quantum theory. Our results have consequences for current laboratory interference experiments: they have to be designed carefully to avoid rendering beyond-quantum effects invisible by relativity of simultaneity.
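For concreteness, the complex-quantum member of this family is the familiar Bloch ball; the sketch below is standard material, added here only as an illustration of how the members of the family differ:

```latex
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Two-level state space in standard complex quantum theory: the Bloch ball.
\[
  \rho = \tfrac{1}{2}\bigl(I + \vec{r}\cdot\vec{\sigma}\bigr),
  \qquad \vec{r}\in\mathbb{R}^{3},\quad |\vec{r}\,| \le 1 ,
\]
% where $\vec{\sigma} = (\sigma_x,\sigma_y,\sigma_z)$ are the Pauli matrices and
% $I$ is the identity. The family of theories considered here replaces this
% 3-ball by a $d$-ball: $d=1$ gives a classical bit, $d=2$ real quantum theory,
% $d=3$ complex quantum theory, and $d=5$ quaternionic quantum theory.
\end{document}
```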


Author(s):  
J. S. Rowlinson

Einstein is remembered for his contributions to the re-ordering of the foundations of physics in the first years of the twentieth century. Much of his achievement was, however, based on the classical physics of the late nineteenth century and it was his work on statistical mechanics that underlay his first contributions to quantum theory. This essay is an account of an aspect of his achievement that is often overlooked.


Author(s):  
Jeremy Butterfield

Over the centuries, the doctrine of determinism has been understood, and assessed, in different ways. Since the seventeenth century, it has been commonly understood as the doctrine that every event has a cause; or as the predictability, in principle, of the entire future. To assess the truth of determinism, so understood, philosophers have often looked to physical science; they have assumed that their current best physical theory is their best guide to the truth of determinism. Most have believed that classical physics, especially Newton’s physics, is deterministic. And in this century, most have believed that quantum theory is indeterministic. Since quantum theory has superseded classical physics, philosophers have typically come to the tentative conclusion that determinism is false. In fact, these impressions are badly misleading, on three counts. First of all, formulations of determinism in terms of causation or predictability are unsatisfactory, since ‘event’, ‘causation’ and ‘prediction’ are vague and controversial notions, and are not used (at least not univocally) in most physical theories. So if we propose to assess determinism by considering physical theories, our formulation of determinism should be more closely tied to such theories. To do this, the key idea is that determinism is a property of a theory. Imagine a theory that ascribes properties to objects of a certain kind, and claims that the sequence through time of any such object’s properties satisfies certain regularities. Then we say that the theory is deterministic if and only if for any two such objects: if their properties match exactly at a given time, then according to the theory, they will match exactly at all future times. Second, this improved formulation reveals that there is a large gap between the determinism of a given physical theory, and the bolder, vague idea that motivated the traditional formulations: the idea that the world as a whole, independent of any single theory, is deterministic. Admittedly, one can make sense of this idea by adopting a sufficiently bold metaphysics: namely, a metaphysics that accepts the idea of a theory of the world as a whole, so that its objects are possible worlds, and determinism becomes the requirement that any two possible worlds described by the theory that match exactly at a given time also match exactly at all future times. But this idea cannot be made sense of using the more cautious strategy of considering determinism as a feature of a given physical theory. Third, according to this more cautious strategy, the traditional consensus is again misleading. Which theories are deterministic turns out to be a subtle and complicated matter, with many questions still open. But broadly speaking, it turns out that much of classical physics, even much of Newton’s physics, is indeterministic. Furthermore, the alleged indeterminism of quantum theory is very controversial: it enters, if at all, only in quantum theory’s account of measurement processes, an account which remains the most controversial part of the theory. These subtleties and controversies mean that physics does not pass to philosophers any simple verdict about determinism. But more positively, they also mean that determinism remains an exciting topic in the philosophy of science.
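The property-matching definition given above can be condensed into a single formula; the following is one standard way of writing it, offered only as a paraphrase of the definition in the text:

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Determinism as a property of a theory $T$, with $M_T$ the set of histories
% (time-indexed assignments of complete states) that $T$ allows:
\[
  T \ \text{is deterministic} \iff
  \forall\, s, s' \in M_T,\ \forall\, t_0:\
  s(t_0) = s'(t_0) \ \Rightarrow\ s(t) = s'(t) \ \text{for all } t \ge t_0 .
\]
\end{document}
```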


2004, Vol 2004 (1), pp. 75-83
Author(s):  
R. C. Bishop
A. Bohm
M. Gadella

Time asymmetry and irreversibility are signal features of our world. They are the reason for our aging and the basis for our belief that effects are preceded by causes. These features have many manifestations called arrows of time. In classical physics, some of these arrows are described by the increase of entropy or probability, and others by time-asymmetric boundary conditions of time-symmetric equations (e.g., Maxwell or Einstein). However, there is some controversy over whether probability or boundary conditions are more fundamental. For quantum systems, entropy increase is usually associated with the effects of an environment or measurement apparatus on a quantum system and is described by the von Neumann-Liouville equation. But since the traditional (von Neumann) axioms of quantum mechanics do not allow time-asymmetric boundary conditions for the dynamical differential equations (Schrödinger or Heisenberg), there is no quantum analogue of the radiation arrow of time. In this paper, we review consequences of a modification of a fundamental axiom of quantum mechanics. The new quantum theory is time asymmetric and accommodates an irreversible time evolution of isolated quantum systems.
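For reference, the von Neumann-Liouville equation mentioned above is the standard closed-system evolution law for the density operator; it is written out here in its usual form, which is all the abstract presupposes:

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% The von Neumann--Liouville equation for the density operator $\rho(t)$ of a
% closed system with Hamiltonian $H$:
\[
  i\hbar\,\frac{\partial \rho(t)}{\partial t} = [\,H, \rho(t)\,] .
\]
% Like the Schr\"odinger and Heisenberg equations, it is itself time symmetric;
% the time asymmetry discussed in the paper enters through the admissible
% boundary conditions, not through the form of the equation.
\end{document}
```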


Author(s):  
Ciarán M. Lee
John H. Selby

To date, there has been no experimental evidence that invalidates quantum theory. Yet it may only be an effective description of the world, in the same way that classical physics is an effective description of the quantum world. We ask whether there exists an operationally defined theory superseding quantum theory, but which reduces to it via a decoherence-like mechanism. We prove that no such post-quantum theory exists if it is demanded that it satisfy two natural physical principles: causality and purification. Causality formalizes the statement that information propagates from present to future, and purification that each state of incomplete information arises in an essentially unique way due to lack of information about an environment. Hence, our result can be viewed either as evidence that the fundamental theory of Nature is quantum or as showing in a rigorous manner that any post-quantum theory must abandon causality, purification or both.
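Within quantum theory itself, the purification principle invoked above takes a concrete and standard form; the statement below is that textbook special case, not the paper's general operational formulation:

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Purification in quantum theory: every mixed state extends to a pure state of a
% larger system, essentially uniquely.
\[
  \forall\, \rho_A \ \exists\, |\psi\rangle_{AB}:\quad
  \operatorname{Tr}_B\, |\psi\rangle\!\langle\psi|_{AB} = \rho_A ,
\]
% and any two purifications of $\rho_A$ on the same auxiliary system $B$ are
% related by a unitary (reversible) transformation acting on $B$ alone.
\end{document}
```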

