Identity in Physics

Philosophy ◽  
2014 ◽  
Author(s):  
Décio Krause ◽  
Jonas R. B. Arenhart

Traditionally, the problem of identity is closely associated with the problem of individuality: What is it that makes something what it is? Approaches to the problem may be classified into two classes: reductionism and transcendental identity. The first tries to reduce identity to some qualitative feature of the entities dealt with, while the second either grounds identity on some feature other than qualitative properties or else takes it to be primitive. The debate generally centres on the validity of the Principle of the Identity of Indiscernibles (PII), which states that qualitative indiscernibility amounts to numerical identity. If PII is valid, then reductionism concerning identity is at least viable; if PII is invalid, then reductionism seems less plausible and some form of transcendental identity seems required. It is common to say that objects in classical mechanics are individuals. This is exhibited by postulating that physical objects obey Maxwell-Boltzmann statistics: if we have containers A and B to accommodate two objects a and b, there are four equiprobable situations: (1) both objects in A, (2) both in B, (3) a in A and b in B, and (4) a in B and b in A. Since situations (3) and (4) differ, there may be something that makes the difference, whether a transcendental individuality or some quality. In quantum mechanics, assuming that we have two containers A and B to accommodate objects a and b, there are just three equiprobable situations for bosons: (1) both objects in A, (2) both in B, (3) one object in A and one in B. It makes no sense to say that it is a or b that is in A: switching them makes no difference. For fermions, the exclusion principle leaves only one possibility: one object in A and one in B. Again, switching them makes no difference whatsoever. The dispute in quantum mechanics is between non-individuality on the one side and individuality (whether reductionist or transcendental) on the other. 
That distinction is grounded in the fact that quantum particles may be qualitatively indiscernible and, as the statistics show, permutations are unobservable. The current debate concerns, on the one hand, whether some form of reductionism can survive in quantum mechanics or whether some form of transcendental identity should be adopted, and, on the other, whether non-individuality is a viable option. Furthermore, a third option, Ontic Structural Realism (OSR), proposes that we transcend the debate and choose a metaphysics of structures and relations, leaving the controversial individuals vs. non-individuals question behind.
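The counting argument above is easy to reproduce. As a minimal sketch (the container and particle labels are purely illustrative), enumerating the arrangements of two particles in two containers under each statistics recovers the 4/3/1 counts:

```python
from itertools import product, combinations_with_replacement

boxes = "AB"

# Maxwell-Boltzmann: particles a and b are labelled, so assignments are ordered pairs.
mb = list(product(boxes, repeat=2))                  # AA, AB, BA, BB
# Bose-Einstein: permutations are unobservable, so only occupation numbers count.
be = list(combinations_with_replacement(boxes, 2))   # AA, AB, BB
# Fermi-Dirac: the exclusion principle further forbids double occupancy.
fd = [s for s in be if s[0] != s[1]]                 # AB only

print(len(mb), len(be), len(fd))  # 4 3 1
```

The collapse from four states to three (or one) is precisely the loss of the distinction between situations (3) and (4) that drives the individuality debate.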

Author(s):  
Frank S. Levin

Surfing the Quantum World bridges the gap between in-depth textbooks and typical popular science books on quantum ideas and phenomena. Among its significant features is the description of a host of mind-bending phenomena, such as a quantum object being in two places at once or a certain minus sign being the most consequential in the universe. Much of its first part is historical, starting with the ancient Greeks and their concepts of light, and ending with the creation of quantum mechanics. The second part begins by applying quantum mechanics and its probabilistic nature to a pedagogical system, the one-dimensional box, an analog of which is a musical-instrument string. This is followed by a gentle introduction to the fundamental principles of quantum theory, whose core concepts and symbolic representations are the foundation for most of the subsequent chapters. For instance, it is shown how quantum theory explains the properties of the hydrogen atom and, via quantum spin and Pauli’s Exclusion Principle, how it accounts for the structure of the periodic table. White dwarf and neutron stars are seen to be gigantic quantum objects, while the maximum height of mountains is shown to have a quantum basis. Among the many other topics considered are a variety of interference phenomena, those that display the wave properties of particles like electrons and photons, and even of large molecules. The book concludes with a wide-ranging discussion of interpretational and philosophic issues, introduced in Chapter 14 by entanglement and in Chapter 15 by Schrödinger’s cat.


2021 ◽  
Vol 18 ◽  
pp. 1-29
Author(s):  
Jeffrey Boyd

Wave-particle duality is a cornerstone of quantum chemistry and quantum mechanics (QM). But there are experiments it cannot explain, such as a neutron interferometer experiment. If QM uses +Ψ as its wavefunction, several experiments suggest that nature uses -Ψ instead. The difference between -Ψ and +Ψ is that they describe entirely different pictures of how nature is organized. For example, with -Ψ quantum particles follow waves backwards, which is obviously incompatible with wave-particle duality. We call the -Ψ proposal the Theory of Elementary Waves (TEW). It unlocks opportunities for young scientists with no budget to conduct the basic research for a new, unexplored science. This is a dream come true for young scientists: the discovery of uncharted territory. We show how TEW explains the double-slit, Pfleegor-Mandel and Davisson-Germer experiments, Feynman diagrams, and the Bell test experiments. We provide innovative research designs for which -Ψ and +Ψ would predict divergent outcomes. What makes QM so accurate is its probability predictions, but Born’s rule would yield the same probabilities if it were changed from P = |+Ψ|² to P = |-Ψ|². This article is accompanied by a lively YouTube video, “6 reasons to discard wave particle duality.”
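Whatever one makes of the -Ψ proposal, the closing claim about Born's rule is simple arithmetic: squaring the modulus erases an overall sign. A minimal check (the amplitude value is arbitrary):

```python
# Born's rule assigns P = |Psi|^2; an overall sign flip cannot change any
# probability, since |-Psi|^2 = |Psi|^2 for every complex amplitude.
psi = complex(0.6, 0.8)     # arbitrary illustrative amplitude, |psi| = 1
p_plus = abs(psi) ** 2
p_minus = abs(-psi) ** 2
print(p_plus == p_minus)  # True
```

This is why the two wavefunctions are indistinguishable at the level of probabilities alone, and why the proposed experiments must target predictions other than Born-rule statistics.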


2017 ◽  
Vol 14 (06) ◽  
pp. 1750095 ◽  
Author(s):  
Salvatore Capozziello ◽  
Emmanuel N. Saridakis ◽  
Kazuharu Bamba ◽  
Alireza Sepehri ◽  
Farook Rahaman ◽  
...  

An emergence of cosmic space has been suggested by Padmanabhan [Emergence and expansion of cosmic space as due to the quest for holographic equipartition, arXiv:hep-th/1206.4916], who proposed that the expansion of the universe originates from a difference between the number of degrees of freedom on a holographic surface and the number in the emerged bulk. A natural question then arises: how would this proposal explain the production of fermions and the emergence of the Pauli exclusion principle during the evolution of the universe? We try to address this issue in a system of [Formula: see text]-branes. In this model, there is a high symmetry, and the system is composed of [Formula: see text]-branes to which only scalar fields are attached, representing scalar modes of the graviton. When [Formula: see text]-branes join each other and form [Formula: see text]-branes, this symmetry is broken and gauge fields are formed. These [Formula: see text]-branes then interact with the anti-[Formula: see text]-branes, and the force between them breaks a symmetry such that the lower and upper parts of these branes are not the same. Under these conditions, the gauge fields localized on the [Formula: see text]-branes and the scalars attached to them symmetrically decay into fermions with upper and lower spins, which attach to the upper and lower parts of the [Formula: see text]-branes anti-symmetrically. The curvature produced by the coupling of identical spins has the opposite sign to the curvature produced by non-identical spins, which leads to an attractive force between anti-parallel spins and a repulsive force between parallel spins, and hence to an emergence of the Pauli exclusion principle. 
As the [Formula: see text]-branes approach each other, the difference between the curvatures of parallel spins and the curvatures of anti-parallel spins increases, which leads to an inequality between the number of degrees of freedom on the surface and the number in the emerged bulk, and hence to the occurrence of cosmic expansion. As the [Formula: see text]-branes approach each other further, the square of the energy of the system becomes negative and tachyonic states arise. To remove these states, the [Formula: see text]-branes compactify, the sign of gravity changes, and anti-gravity emerges, which causes the branes to move away from each other. By joining [Formula: see text]-branes, [Formula: see text]-branes are produced, similar to the initial system, which oscillates between compacting and opening branches. Our universe is placed on one of these [Formula: see text]-branes, and it contracts or expands as the difference between the couplings of identical and non-identical spins changes.


2018 ◽  
Vol 18 (2) ◽  
pp. 275-303 ◽  
Author(s):  
Kazuya Yokohama

In Article 28 of the Statute of the International Criminal Court (ICC), there appear to be two kinds of omission, namely a failure to control on the one hand, and a failure to prevent, repress and submit on the other. However, the relationship between the two omissions has so far remained unclear. This is a controversial topic not only in the scholarly debate but also in the recent jurisprudence of the ICC. The core question is whether both omissions need to be proved separately (the twofold-failures approach), or whether proof of the latter omission alone could suffice for the superior to be held responsible (the single-failure approach). These two approaches can lead to different conclusions on several aspects of superior responsibility: the number of omissions that must be proved and the requirement of causality, for example. This article addresses the difference between these two approaches and demonstrates which approach should be adopted.


1975 ◽  
Vol 34 (02) ◽  
pp. 426-444 ◽  
Author(s):  
J Kahan ◽  
I Nohén

Summary

In 4 collaborative trials, involving a varying number of hospital laboratories in the Stockholm area, the coagulation activity of different test materials was estimated with the one-stage prothrombin tests routinely used in the laboratories, viz. Normotest, Simplastin-A and Thrombotest. The test materials included different batches of a lyophilized reference plasma, deep-frozen specimens of diluted and undiluted normal plasmas, and fresh and deep-frozen specimens from patients on long-term oral anticoagulant therapy.

Although a close relationship was found between different methods, Simplastin-A gave consistently lower values than Normotest, the difference being proportional to the estimated activity. The discrepancy was of about the same magnitude on all the test materials, and was probably due to a divergence between the manufacturers’ procedures used to set “normal percentage activity”, as well as to a varying ratio of measured activity to plasma concentration. The extent of discrepancy may vary with the batch-to-batch variation of thromboplastin reagents.

The close agreement between results obtained on different test materials suggests that the investigated reference plasma could be used to calibrate the examined thromboplastin reagents, and to compare the degree of hypocoagulability estimated by the examined PIVKA-insensitive thromboplastin reagents.

The assigned coagulation activity of different batches of the reference plasma agreed closely with experimentally obtained values. The stability of supplied batches was satisfactory as judged from the reproducibility of repeated measurements. The variability of test procedures was approximately the same on different test materials.


2017 ◽  
Vol 13 (1) ◽  
pp. 4522-4534
Author(s):  
Armando Tomás Canero

This paper presents sound propagation based on a transverse wave model which does not conflict with the interpretation of physical events based on the longitudinal wave model, but responds to the correspondence principle and allows interpreting a significant number of scientific experiments that do not follow the longitudinal wave model. Among the problems that are solved are: the interpretation of the location of nodes and antinodes in a Kundt tube in classical mechanics, the translation of phonons in the interparticle vacuum in quantum mechanics, and gravitational waves in relativistic mechanics.


1975 ◽  
Vol 14 (3) ◽  
pp. 370-375
Author(s):  
M. A. Akhtar

I am grateful to Abe, Fry, Min, Vongvipanond, and Yu (hereafter referred to as AFMVY) [1] for obliging me to reconsider my article [2] on the demand for money in Pakistan. Upon careful examination, I find that the AFMVY results are, in parts, misleading and that, on the whole, they add very little to those provided in my study. Nevertheless, the present exercise, as well as the one by AFMVY, is useful in that it furnishes us with an opportunity to view some of the fundamental problems involved in an empirical analysis of the demand for money function in Pakistan. Based on their elaborate critique, AFMVY reformulate the two hypotheses underlying my study, the substitution hypothesis and the complementarity hypothesis, and provide us with some alternative estimates of the demand for money in Pakistan. Briefly, their results, like those in my study, indicate that income and interest rates are important in determining the demand for money. However, unlike my results, they also suggest that the price variable is a highly significant determinant of the money demand function. Furthermore, while I found only weak support for the complementarity between money demand and physical capital, the results obtained by AFMVY appear to yield strong support for that relationship. The difference in results is only a natural consequence of alternative specifications of the theory and, therefore, I propose to devote most of this reply to the criticisms raised by AFMVY and the resulting reformulation of the two hypotheses.


2019 ◽  
Vol 67 (6) ◽  
pp. 483-492
Author(s):  
Seonghyeon Baek ◽  
Iljae Lee

The effects of leakage and blockage on the acoustic performance of particle filters have been examined by using one-dimensional acoustic analysis and experimental methods. First, the transfer matrix of a filter system connected to inlet and outlet pipes with conical sections is measured using a two-load method. Then, the transfer matrix of the particle filter alone is extracted from the experiments by applying the inverse matrices of the conical sections. In the analytical approach, a one-dimensional acoustic model for the leakage between the filter and the housing is developed. The predicted transmission loss shows good agreement with the experimental results. Compared to the baseline, the leakage between the filter and housing increases transmission loss at a certain frequency and its harmonics. In addition, the transmission loss for the system with a partially blocked filter is measured. The blockage of the filter also increases the transmission loss at higher frequencies. To simplify the experiments that identify leakage and blockage, the reflection coefficients at the inlet of the filter system have been measured using two different downstream conditions: an open pipe and a highly absorptive termination. The experiments show that with highly absorptive terminations, the difference between the baseline and the defects is easier to see.
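The extraction step described above can be sketched in a few lines. The 2×2 matrices below are made-up invertible examples, not measured data; the point is only the algebra T_filter = inv(T_in) · T_system · inv(T_out):

```python
import numpy as np

# Hypothetical 2x2 four-pole (transfer) matrices for the inlet cone, the
# filter element, and the outlet cone. Real values would come from the
# two-load measurement; these are illustrative only.
T_in = np.array([[1.0, 0.2], [0.1, 1.0]])
T_filter_true = np.array([[0.9, 0.5], [0.3, 1.1]])
T_out = np.array([[1.0, -0.1], [0.2, 1.0]])

# The two-load method yields the transfer matrix of the whole chain:
T_system = T_in @ T_filter_true @ T_out

# Applying the inverses of the conical sections isolates the filter:
T_filter = np.linalg.inv(T_in) @ T_system @ np.linalg.inv(T_out)

print(np.allclose(T_filter, T_filter_true))  # True
```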


Author(s):  
Sagar Suman Panda ◽  
Ravi Kumar B.V.V.

Three new analytical methods were optimized and validated for the estimation of tigecycline (TGN) in its injection formulation: a difference UV spectroscopic method, an area under the curve (AUC) method, and an ultrafast liquid chromatographic (UFLC) method. The difference spectrophotometric method relied on the measurement of amplitude when equal-concentration solutions of TGN in HCl are scanned against TGN in NaOH as reference. The measurements were made at 340 nm (maximum) and 410 nm (minimum). Further, the AUC under the maximum and the minimum was measured at 335-345 nm and 405-415 nm, respectively. The liquid chromatographic method utilized a reversed-phase column (150 mm × 4.6 mm, 5 µm) with a mobile phase of methanol and 0.01 M KH2PO4 buffer, pH 3.5 (adjusted with orthophosphoric acid), in the ratio 80:20, v/v. The flow rate was 1.0 ml/min, and diode array detection was performed at 349 nm. TGN eluted at 1.656 min. All the methods were validated for linearity, precision, accuracy, stability, and robustness, and the results were within the satisfactory limits of the ICH guidance. Further, these methods were applied to estimate the amount of TGN present in commercial lyophilized injection formulations, and the results were compared using a one-way ANOVA test. Overall, the methods are rapid, simple, and reliable for routine quality control of TGN in bulk and pharmaceutical dosage forms.
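The AUC step admits a compact numerical sketch. The absorbance values below are entirely hypothetical; only the integration windows (335-345 nm and 405-415 nm) come from the method description:

```python
import numpy as np

# Hypothetical difference spectrum with a maximum near 340 nm and a
# minimum near 410 nm, sampled at 1 nm steps.
wavelengths = np.arange(300.0, 451.0)
spectrum = (np.exp(-((wavelengths - 340) / 15) ** 2)
            - np.exp(-((wavelengths - 410) / 15) ** 2))

def auc(lo, hi):
    """Trapezoidal area under the spectrum between two wavelengths (nm)."""
    m = (wavelengths >= lo) & (wavelengths <= hi)
    x, y = wavelengths[m], spectrum[m]
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * (x[1:] - x[:-1])))

auc_max = auc(335, 345)  # area under the maximum (positive)
auc_min = auc(405, 415)  # area under the minimum (negative)
print(auc_max > 0, auc_min < 0)  # True True
```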

