Scientific Realism and Primitive Ontology Or: The Pessimistic Induction and the Nature of the Wave Function

2018 ◽  
Vol 5 (1) ◽  
pp. 69-76 ◽  
Author(s):  
Valia Allori

In this paper, I wish to connect the recent debate in the philosophy of quantum mechanics concerning the nature of the wave function to the historical debate in the philosophy of science regarding the tenability of scientific realism. Advocating realism about quantum mechanics is particularly challenging when focusing on the wave function. According to the wave function ontology approach, the wave function is a concrete physical entity. In contrast, according to an alternative viewpoint, namely the primitive ontology approach, the wave function does not represent physical objects. In this paper, I argue that the primitive ontology approach can naturally be interpreted as an instance of so-called explanationist realism, which has been proposed as a response to the pessimistic meta-induction argument against scientific realism. If my arguments are sound, then one could conclude that: (1) contrary to what is commonly thought, if explanationist realism is a good response to the pessimistic meta-induction argument, it can be straightforwardly extended to the quantum domain; (2) the primitive ontology approach is in better shape than the wave function ontology approach in resisting the pessimistic meta-induction argument against scientific realism.

Author(s):  
Valia Allori

Scientific realism assumes that our best scientific theories can be regarded as (approximately) true. Quantum mechanics has long been regarded as at odds with scientific realism. It is now known that this is not true. However, scientific realists usually assume that the wave function represents physical entities. Chapter 11 discusses a particular approach which makes quantum mechanics compatible with scientific realism without assuming this: matter is instead represented by some spatio-temporal entity dubbed the primitive ontology. It shows how, within this framework, one develops a distinctive theory-construction schema, which allows us to perform a more informed theory evaluation by analyzing the various ingredients of the approach and their inter-relations.


2021 ◽  
Vol 11 (2) ◽  
Author(s):  
Florian J. Boge

Two powerful arguments have famously dominated the realism debate in philosophy of science: the No Miracles Argument (NMA) and the Pessimistic Meta-Induction (PMI). A standard response to the PMI is selective scientific realism (SSR), wherein only the working posits of a theory are considered worthy of doxastic commitment. Building on the recent debate over the NMA and the connections between the NMA and the PMI, I here consider a stronger inductive argument that poses a direct challenge for SSR: because it is sometimes exactly the working posits, i.e. those directly responsible for empirical success, which contradict each other, SSR cannot deliver a general explanation of scientific success.


2020 ◽  
Author(s):  
Douglas Michael Snyder

Generally, a central role has been assigned to an unavoidable physical interaction between the measuring instrument and the physical entity measured in the change in the wave function that often occurs in measurement in quantum mechanics. A survey of textbooks on quantum mechanics by authors such as Dicke and Witke (1960), Eisberg and Resnick (1985), Gasiorowicz (1974), Goswami (1992), and Liboff (1993) supports this point. Furthermore, in line with the view of Bohr and Feynman, the unavoidable interaction between a measuring instrument and the physical entity measured is generally considered responsible for the uncertainty principle. A gedankenexperiment using Feynman's double-hole interference scenario shows that physical interaction is not necessary to effect the change in the wave function that occurs in measurement in quantum mechanics. Instead, in the general case it is knowledge that is linked to the change in the wave function, not a physical interaction between the physical existent measured and the measuring instrument. Empirical work on electron shelving that involves null measurements, or what Renninger called negative observations (Zeitschrift für Physik, vol. 158, p. 417), supports these points. Work on electron shelving is reported by Dehmelt and his colleagues (Physical Review Letters, vol. 56, p. 2797), Wineland and his colleagues (Physical Review Letters, vol. 57, p. 1699), and Sauter, Neuhauser, Blatt, and Toschek (Physical Review Letters, vol. 57, p. 1696). Originally appeared in arXiv on December 19, 1995. It can be accessed at arXiv:quant-ph/9601006.
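As a minimal sketch of the null-measurement update the paper discusses (the state labels below are illustrative, not the author's notation): in the double-hole scenario the particle's wave function is a superposition over the two holes, and a detector at hole 1 that fails to fire still changes that wave function,

\[
|\psi\rangle = \alpha\,|\mathrm{hole\ 1}\rangle + \beta\,|\mathrm{hole\ 2}\rangle,
\qquad |\alpha|^{2}+|\beta|^{2}=1,
\]
\[
\text{detector at hole 1 does not fire}
\;\Longrightarrow\;
|\psi\rangle \;\to\; |\mathrm{hole\ 2}\rangle .
\]

The interference pattern is lost even though nothing touched the particle; on this reading, the change tracks the observer's gain in knowledge rather than a physical interaction.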


Quantum ◽  
2021 ◽  
Vol 5 ◽  
pp. 594
Author(s):  
Antoine Tilloy ◽  
Howard M. Wiseman

Spontaneous collapse models and Bohmian mechanics are two different solutions to the measurement problem plaguing orthodox quantum mechanics. They have, a priori, nothing in common. At a formal level, collapse models add a non-linear noise term to the Schrödinger equation and extract definite measurement outcomes either from the wave function (e.g. mass density ontology) or from the noise itself (flash ontology). Bohmian mechanics keeps the Schrödinger equation intact but uses the wave function to guide particles (or fields), which comprise the primitive ontology. Collapse models modify the predictions of orthodox quantum mechanics, whilst Bohmian mechanics can be argued to reproduce them. However, it turns out that collapse models and their primitive ontology can be exactly recast as Bohmian theories. More precisely, considering (i) a system described by a non-Markovian collapse model, and (ii) an extended system where a carefully tailored bath is added and described by Bohmian mechanics, the stochastic wave function of the collapse model is exactly the wave function of the original system conditioned on the Bohmian hidden variables of the bath. Further, the noise driving the collapse model is a linear functional of the Bohmian variables. The randomness that seems progressively revealed in the collapse models lies entirely in the initial conditions of the Bohmian-like theory. Our construction of the appropriate bath is not trivial and exploits an old result from the theory of open quantum systems. This reformulation of collapse models as Bohmian theories brings to the fore the question of whether there exist `unromantic' realist interpretations of quantum theory that cannot ultimately be rewritten this way, with some guiding law. It also points to important foundational differences between `true' (Markovian) collapse models and non-Markovian models.
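For orientation, the two formal ingredients contrasted above can be written schematically (generic textbook forms; the paper's non-Markovian equations are more involved). A collapse model with a single self-adjoint collapse operator $A$ and Wiener noise $W_t$ adds a non-linear stochastic term to the Schrödinger evolution, while Bohmian mechanics keeps the Schrödinger equation and adds a guidance equation for the configuration $Q$:

\[
d|\psi_t\rangle = \Big[ -\tfrac{i}{\hbar} H \, dt
+ \sqrt{\lambda}\,\big(A - \langle A \rangle_t\big)\, dW_t
- \tfrac{\lambda}{2}\,\big(A - \langle A \rangle_t\big)^{2}\, dt \Big] |\psi_t\rangle ,
\qquad \langle A \rangle_t = \langle \psi_t | A | \psi_t \rangle ,
\]
\[
\frac{dQ_k}{dt} = \frac{\hbar}{m_k}\,
\operatorname{Im}\!\left( \frac{\nabla_k \psi}{\psi} \right)\!\Bigg|_{\,Q_1,\dots,Q_N} .
\]

The result described in the abstract identifies the stochastic $\psi_t$ of the first kind of theory with a conditional wave function of the second.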


2020 ◽  
Vol 57 (2) ◽  
pp. 179-191
Author(s):  
Alexander A. Pechenkin

Two conceptions of the contemporary philosophy of science are considered: scientific realism and constructive empiricism. Scientific realism presupposes (1) the conception of truth as the correspondence of knowledge to reality, and (2) the real existence of the entities postulated by a theory. Constructive empiricism puts forward the idea of empirical adequacy: science aims to give us theories which are empirically adequate, and acceptance of a theory involves as belief only that it is empirically adequate. To compare the methodological resources of these two positions in the philosophy of science, the problem of the interpretation of quantum mechanics is considered. The ensemble interpretation of quantum mechanics is taken under consideration as a methodological realization of scientific realism. K. Popper's version


2018 ◽  
Vol 95 (3) ◽  
pp. 329-342 ◽  
Author(s):  
Seungbae Park

In contemporary philosophy of science, the no-miracles argument and the pessimistic induction are regarded as the strongest arguments for and against scientific realism, respectively. In this paper, the author constructs a new argument for scientific realism, which he calls the anti-induction for scientific realism. It holds that, since past theories were false, present theories are true. The author provides an example from the history of science to show that anti-inductions sometimes work in science. The anti-induction for scientific realism has several advantages over the no-miracles argument as a positive argument for scientific realism.


2014 ◽  
Vol 5 (3) ◽  
pp. 871-981 ◽  
Author(s):  
Pang Xiao Feng

We establish a nonlinear quantum mechanics motivated by difficulties and problems of the original quantum mechanics, in which microscopic particles have only a wave feature and no corpuscle feature, which is inconsistent with experimental results and with the traditional concept of a particle. In this theory the microscopic particles are no longer simply waves but are localized and possess a wave-corpuscle duality. This is expressed by the following facts: the solutions of the dynamic equation describing the particles consist of a mass center of constant size together with a carrier wave; they are localized and stable and have determinate mass, momentum and energy, which obey the general conservation laws of motion; their motions satisfy the Hamilton equation, the Euler-Lagrange equation and a Newton-type equation; their collisions satisfy the classical rules of collision of macroscopic particles; and the uncertainty of their position and momentum is given by the minimum uncertainty principle. At the same time, the microscopic particles in this theory propagate as solitary waves with definite frequency and amplitude and undergo reflection and transmission at interfaces, so they retain a wave feature, though one different from linear waves and from KdV solitary waves. The nonlinear quantum mechanics thus thoroughly changes the nature of microscopic particles through the nonlinear interactions. In this investigation we give systematically and completely the distinctions and variations between linear and nonlinear quantum mechanics, including the significance and representation of the wave function and of mechanical quantities, the superposition principle of the wave function, the properties of microscopic particles, the eigenvalue problem, the uncertainty relation, and the methods for solving the dynamic equations; from this we find that nonlinear quantum mechanics is entirely new and different from linear quantum mechanics. Finally, we further verify the correctness of the properties of microscopic particles described by nonlinear quantum mechanics using experimental results on optical solitons in fibers and on water solitons, both of which are described by the same nonlinear Schrödinger equation. We therefore affirm that nonlinear quantum mechanics is correct and useful, and that it can be used to study the real properties of microscopic particles in physical systems.
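For reference, the nonlinear Schrödinger equation invoked in the comparison with fiber-optic and water solitons has, in one standard dimensionless form (not necessarily the author's notation), localized solutions of exactly the "wave-corpuscle" type described above:

\[
i\,\frac{\partial \psi}{\partial t}
+ \frac{1}{2}\,\frac{\partial^{2}\psi}{\partial x^{2}}
+ |\psi|^{2}\psi = 0 ,
\qquad
\psi(x,t) = A\,\operatorname{sech}(Ax)\, e^{\,iA^{2}t/2} ,
\]

a bright soliton whose envelope (the "mass center") stays localized while it carries an oscillating phase (the "carrier wave").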


2018 ◽  
Vol 2 (2) ◽  
pp. 43-47
Author(s):  
A. Suparmi, C. Cari, Ina Nurhidayati

Abstract – The Schrödinger equation is one of the most frequently studied topics in quantum mechanics. In this paper, the Schrödinger equation based on the minimal length formalism is applied to a modified Coulomb potential. The resulting wave function and energy spectrum describe the characteristics and behavior of a sub-atomic particle. Using the hypergeometric method, approximate analytical solutions of the radial Schrödinger equation with minimal length are obtained for the modified Coulomb potential. The results show that the energy increases with both the minimal length parameter and the modified Coulomb potential parameter. Key words: Schrödinger equation, minimal length formalism (MLF), wave function, energy spectra, modified Coulomb potential
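For context, the minimal length formalism referred to in this abstract is usually built on a deformed commutation relation (shown here in a common textbook form; the deformation parameter $\beta$ is a stand-in and the paper's conventions may differ):

\[
[\hat{x},\hat{p}] = i\hbar\,\big(1 + \beta\,\hat{p}^{\,2}\big)
\;\Longrightarrow\;
\Delta x \,\Delta p \ge \frac{\hbar}{2}\,\big(1 + \beta\,(\Delta p)^{2}\big),
\qquad
(\Delta x)_{\min} = \hbar\sqrt{\beta} ,
\]

so the radial Schrödinger equation for the modified Coulomb potential picks up $\beta$-dependent correction terms, and the bound-state energies shift with $\beta$, in line with the trend reported above.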


2017 ◽  
Vol 26 (03) ◽  
pp. 1730008 ◽  
Author(s):  
Stephen D. H. Hsu

We explain the measure problem (cf. the origin of the Born probability rule) in no-collapse quantum mechanics. Everett defined maverick branches of the state vector as those on which the usual Born probability rule fails to hold — these branches exhibit highly improbable behaviors, including possibly the breakdown of decoherence or even the absence of an emergent semi-classical reality. Derivations of the Born rule which originate in decision theory or subjective probability (i.e. the reasoning of individual observers) do not resolve this problem, because they are circular: they assume, a priori, that the observer occupies a non-maverick branch. An ab initio probability measure is sometimes assumed to explain why we do not occupy a maverick branch. This measure is constrained, e.g. by Gleason's theorem or envariance, to be the usual Hilbert measure. However, this ab initio measure ultimately governs the allocation of a self or a consciousness to a particular branch of the wave function, and hence invokes primitives which lie beyond the Everett wave function and beyond what we usually think of as physics. The significance of this leap has been largely overlooked, but requires serious scrutiny.
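For reference, the Born rule at issue assigns to each outcome the squared amplitude of the corresponding branch, and Gleason's theorem (for Hilbert spaces of dimension three or more) singles this measure out among all countably additive measures on the closed subspaces:

\[
|\psi\rangle = \sum_i c_i\,|a_i\rangle
\quad\Longrightarrow\quad
\Pr(a_i) = |c_i|^{2} = |\langle a_i|\psi\rangle|^{2} .
\]

Maverick branches are then those on which the observed relative frequencies persistently deviate from these $|c_i|^{2}$ weights.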


Author(s):  
David Wallace

Decoherence is widely felt to have something to do with the quantum measurement problem, but getting clear on just what that connection is proves difficult, because the 'measurement problem', as traditionally presented in foundational and philosophical discussions, has become somewhat disconnected from the conceptual problems posed by real physics. This, in turn, is because quantum mechanics as discussed in textbooks and in foundational discussions has become somewhat removed from scientific practice, especially where the analysis of measurement is concerned. This paper has two goals: firstly (§§1–2), to present an account of how quantum measurements are actually dealt with in modern physics (hint: it does not involve a collapse of the wave function) and to state the measurement problem from the perspective of that account; and secondly (§§3–4), to clarify what role decoherence plays in modern measurement theory and what effect it has on the various strategies that have been proposed to solve the measurement problem.
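As a minimal illustration of the pre-measurement and decoherence step that modern measurement theory builds on (generic textbook notation, not the paper's own development): a unitary interaction correlates system states with pointer states, and tracing out the environment suppresses the interference terms,

\[
\Big( \sum_i c_i\,|s_i\rangle \Big) \otimes |M_0\rangle
\;\longrightarrow\;
\sum_i c_i\,|s_i\rangle \otimes |M_i\rangle ,
\qquad
\rho_{SM} \approx \sum_i |c_i|^{2}\, |s_i\rangle\langle s_i| \otimes |M_i\rangle\langle M_i| .
\]

The measurement problem, so stated, is why only one of these quasi-classical branches appears to be realized.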

