Aspects of Geodesical Motion with Fisher-Rao Metric: Classical and Quantum

2018
Vol 25 (01)
pp. 1850005
Author(s):
Florio M. Ciaglia,
Fabio Di Cosmo,
Domenico Felice,
Stefano Mancini,
Giuseppe Marmo,
...

The purpose of this paper is to exploit the geometric structure of quantum mechanics and of statistical manifolds to study the qualitative effect that quantum properties have on the statistical description of a system. We show that the endpoints of geodesics in the classical setting coincide with the probability distributions that minimise Shannon’s entropy, i.e. with distributions of zero dispersion. In the quantum setting this happens only for particular initial conditions, which in turn correspond to classical submanifolds. This result can be interpreted as a geometric manifestation of the uncertainty principle.
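As an illustration of the classical side of this picture (a hedged sketch, not the paper's construction): on the manifold of Gaussian distributions N(mu, sigma^2), the Fisher-Rao metric is ds^2 = (dmu^2 + 2 dsigma^2)/sigma^2, which blows up as sigma -> 0, the zero-dispersion boundary. A short numerical check of the underlying Fisher information matrix:

```python
import numpy as np

def fisher_info_gaussian(mu, sigma, n=20001, span=12.0):
    # Estimate the 2x2 Fisher information matrix of N(mu, sigma^2) in the
    # coordinates (mu, sigma) by integrating E[score * score^T] on a wide grid.
    x = np.linspace(mu - span * sigma, mu + span * sigma, n)
    dx = x[1] - x[0]
    p = np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))
    s_mu = (x - mu) / sigma ** 2                          # d(log p)/d(mu)
    s_sigma = ((x - mu) ** 2 - sigma ** 2) / sigma ** 3   # d(log p)/d(sigma)
    I = np.empty((2, 2))
    I[0, 0] = np.sum(s_mu * s_mu * p) * dx
    I[0, 1] = I[1, 0] = np.sum(s_mu * s_sigma * p) * dx
    I[1, 1] = np.sum(s_sigma * s_sigma * p) * dx
    return I

# Closed form is diag(1/sigma^2, 2/sigma^2): the metric degenerates as sigma -> 0.
I = fisher_info_gaussian(0.0, 2.0)
```

For sigma = 2 the closed form gives diag(0.25, 0.5) with vanishing off-diagonal terms, which the quadrature reproduces.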

2020
Vol 18 (02)
pp. 1941002
Author(s):  
Philip Taranto

Understanding temporal processes and their correlations in time is of paramount importance for the development of near-term technologies that operate under realistic conditions. Capturing the complete multi-time statistics that define a stochastic process lies at the heart of any proper treatment of memory effects. This is well understood in classical theory, where a hierarchy of joint probability distributions completely characterizes the process at hand. However, attempting to generalize this notion to quantum mechanics is problematic: observing realizations of a quantum process necessarily disturbs the state of the system, breaking an implicit, and crucial, assumption in the classical setting. This issue can be overcome by separating the experimental interventions from the underlying process, enabling an unambiguous description of the process itself and accounting for all possible multi-time correlations for any choice of interrogating instruments. In this paper, using a novel framework for the characterization of quantum stochastic processes, we first solve the long-standing question of unambiguously describing the memory length of a quantum process. This is achieved by constructing a quantum Markov order condition, which naturally generalizes its classical counterpart for the quantification of finite-length memory effects. As measurements are inherently invasive in quantum mechanics, one has no choice but to define Markov order with respect to the interrogating instruments that are used to probe the process at hand: different memory effects are exhibited depending on how one addresses the system, in contrast to the standard classical setting. We then fully characterize the structural constraints imposed on quantum processes with finite Markov order, shedding light on a variety of memory effects that can arise through various examples.
Finally, we introduce an instrument-specific notion of memory strength that allows for a meaningful quantification of the temporal correlations between the history and the future of a process for a given choice of experimental intervention. These findings are directly relevant to both characterizing and exploiting memory effects that persist for a finite duration. In particular, immediate applications range from developing efficient compression and recovery schemes for the description of quantum processes with memory to designing coherent control protocols that efficiently perform information-theoretic tasks, amongst a plethora of others.
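For contrast with the quantum case, the classical notion being generalized is easy to state concretely: a process has Markov order 1 when conditioning on the most recent outcome renders the earlier history irrelevant. A minimal sketch with a toy two-state chain (an illustration, not taken from the paper):

```python
import numpy as np

# Toy classical process: a two-state Markov chain. Its full multi-time statistics
# are fixed by the initial distribution and the transition matrix, and it has
# Markov order 1: given x2, the outcome x3 is independent of x1.
T = np.array([[0.9, 0.1],
              [0.4, 0.6]])   # T[i, j] = P(next = j | current = i)
pi = np.array([0.5, 0.5])    # initial distribution

# Three-time joint distribution P(x1, x2, x3) = pi[x1] T[x1, x2] T[x2, x3].
joint = pi[:, None, None] * T[:, :, None] * T[None, :, :]

# Conditional P(x3 | x1, x2) for every history (x1, x2).
cond = joint / joint.sum(axis=2, keepdims=True)

# Markov order 1: cond[x1, x2, :] equals T[x2, :] for either value of x1.
```

In the quantum setting this construction fails precisely because extracting the realizations (x1, x2, x3) requires invasive measurements, which is what motivates the instrument-dependent definition above.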


Author(s):  
Frank S. Levin

The subject of Chapter 8 is the fundamental principles of quantum theory, the abstract extension of quantum mechanics. Two of the entities explored are kets and operators, with kets being representations of quantum states as well as a source of wave functions. The quantum box and quantum spin kets are specified, as are the quantum numbers that identify them. Operators are introduced and defined in part as the symbolic representations of observable quantities such as position, momentum and quantum spin. Eigenvalues and eigenkets are defined and discussed, with the former identified as the possible outcomes of a measurement. Bras, the counterpart to kets, are introduced as the means of forming probability amplitudes from kets. Products of operators are examined, as is their role underpinning Heisenberg’s Uncertainty Principle. A variety of symbol manipulations are presented. How measurements are believed to collapse linear superpositions to one term of the sum is explored.
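The operator machinery summarized here can be made concrete with spin-1/2 matrices (an illustrative sketch, not drawn from the book): the eigenvalues of an operator are the possible measurement outcomes, and a non-vanishing product-order difference (commutator) is what underpins the uncertainty principle.

```python
import numpy as np

# Spin-1/2 operators in units of hbar/2: the Pauli matrices act on
# two-component kets (column vectors).
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# Eigenvalues are the possible outcomes of a spin measurement: +1 and -1,
# i.e. +hbar/2 and -hbar/2.
vals = np.linalg.eigvalsh(sz)

# Products of operators depend on their order; the commutator
# [sx, sy] = sx@sy - sy@sx = 2i*sz is nonzero, so sx and sy cannot be
# measured simultaneously with arbitrary precision.
comm = sx @ sy - sy @ sx
```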


Author(s):  
Anurag Chapagain

Abstract: It is a well-known fact in physics that classical mechanics describes the macro-world and quantum mechanics describes the atomic and sub-atomic world. However, principles of quantum mechanics, such as Heisenberg's uncertainty principle, can create visible real-life effects. One of the most commonly known of these effects is the stability problem, whereby a one-dimensional point-based object in a gravity environment cannot remain stable beyond a certain time frame. This paper expands the stability question from a 1-dimensional rod to 2-dimensional highly symmetrical structures, such as even-sided polygons. Using principles of classical mechanics and Heisenberg's uncertainty principle, a stability equation is derived. The stability problem is discussed both quantitatively and qualitatively. Using graphical analysis of the result, the relation between stability time and the number of sides of the polygon is determined. In an environment where only gravitational forces act, it is determined that stability increases with the number of sides of a polygon. Applying the equation to circles, it was found that a circle has the highest degree of stability. These results and the numerical calculations can be utilized for architectural purposes and high-precision experiments. The result is also helpful in countering the perception that quantum mechanical effects are visible only in the atomic and subatomic world.
Keywords: quantum mechanics, Heisenberg uncertainty principle, degree of stability, polygon, the highest degree of stability


2021
Author(s):  
Muhammad Yasin

In 1927, Heisenberg formulated the uncertainty principle: "It is impossible to determine the position and momentum of a particle at the same time." The more accurately the momentum is measured, the more uncertain the position becomes; knowing the position exactly would make the momentum uncertain. Einstein opposed this principle until his death. He thought that particles obey some hidden rules. Einstein thought, "The uncertainty principle is incomplete. There is a mistake somewhere that has resulted in uncertainty." Many did not accept Einstein at the time. But I am sure Einstein was right: there are hidden rules for particles. Heisenberg's uncertainty principle is also 100% correct. I recently published a research paper named "Quantum Certainty Mechanics" [1], which shows the principle of measuring the momentum and position of particles by the quantum certainty principle. Why uncertainty arises from certainty is the main topic of this research paper. When the value of the energy absorbed by the electron in the laboratory is calculated, the uncertainty is removed. The details are discussed below.


2018
Vol 14 (3)
pp. 5865-5868
Author(s):  
Antonio Puccini

As is known, the weak nuclear force, or weak interaction (WI), acts between quarks (Qs) and leptons. The action of the WI is mediated by highly massive gauge bosons. How does a nuclear Q emit such a massive particle, approximately 16,000 or 40,000 times its mass? Who provides so much energy to an up Q or a down Q? It must be considered that, according to quantum mechanics, it is possible to borrow some energy temporarily, but under a precise and binding condition established by the uncertainty principle: the higher the energy borrowed, the shorter the duration of the loan. Our calculations show that the maximum distance these bosons can travel, i.e. the upper limit of their range, corresponds to 1.543×10⁻¹⁵ cm for the W⁺ and W⁻ particles and 1.36×10⁻¹⁵ cm for the Z⁰ particles.
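The energy loan described here is the textbook Yukawa-range estimate: borrowing ΔE = mc² for a time Δt ~ ħ/ΔE limits the range to roughly r ~ cΔt = ħc/(mc²). A minimal sketch of that order-of-magnitude estimate (boson masses are assumed PDG values; this is an illustration only, and the paper's own upper limits above come from its more detailed calculation, so the numbers need not coincide):

```python
# Order-of-magnitude Yukawa range r ~ (hbar*c) / (m*c^2) for the weak bosons.
# Assumed constants, not taken from the paper.
HBAR_C_MEV_FM = 197.327   # hbar*c in MeV*fm
M_W_MEV = 80377.0         # W boson rest energy m*c^2 in MeV
M_Z_MEV = 91188.0         # Z boson rest energy m*c^2 in MeV

def yukawa_range_cm(mass_mev):
    # Range in fm, converted to cm (1 fm = 1e-13 cm).
    return HBAR_C_MEV_FM / mass_mev * 1e-13

r_w = yukawa_range_cm(M_W_MEV)   # roughly 2.5e-16 cm
r_z = yukawa_range_cm(M_Z_MEV)   # roughly 2.2e-16 cm
```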


2020
Author(s):  
Douglas Michael Snyder

A central role in the change in the wave function that often occurs in measurement in quantum mechanics has generally been assigned to an unavoidable physical interaction between the measuring instrument and the physical entity measured. A survey of textbooks on quantum mechanics by authors such as Dicke and Witke (1960), Eisberg and Resnick (1985), Gasiorowicz (1974), Goswami (1992), and Liboff (1993) supports this point. Furthermore, in line with the view of Bohr and Feynman, the unavoidable interaction between a measuring instrument and the physical entity measured is generally considered responsible for the uncertainty principle. A gedankenexperiment using Feynman's double-hole interference scenario shows that physical interaction is not necessary to effect the change in the wave function that occurs in measurement in quantum mechanics. Instead, the general case is that knowledge is linked to the change in the wave function, not a physical interaction between the physical existent measured and the measuring instrument. Empirical work on electron shelving that involves null measurements, or what Renninger called negative observations (Zeitschrift für Physik, vol. 158, p. 417), supports these points. Work on electron shelving is reported by Dehmelt and his colleagues (Physical Review Letters, vol. 56, p. 2797), Wineland and his colleagues (Physical Review Letters, vol. 57, p. 1699), and Sauter, Neuhauser, Blatt, and Toschek (Physical Review Letters, vol. 57, p. 1696). Originally appeared in arXiv on December 19, 1995. It can be accessed at arXiv:quant-ph/9601006.


Author(s):  
Екатерина Геннадьевна Диденкулова,
Анна Витальевна Кокорина,
Алексей Викторович Слюняев

The details of the numerical scheme and the method of specifying the initial conditions for the simulation of the irregular dynamics of soliton ensembles within the framework of equations of the Korteweg-de Vries type are given, using the example of the modified Korteweg-de Vries equation with a focusing type of nonlinearity. The numerical algorithm is based on a pseudo-spectral method with implicit integration over time and uses the Crank-Nicolson scheme to improve the stability property. The aims of the research are to determine the relationship between the spectral composition of the waves (the Fourier spectrum or the spectrum of the associated scattering problem) and their probabilistic properties, and to describe transient processes and equilibrium states. The paper gives a qualitative description of the evolution of statistical characteristics for ensembles of solitons of the same and of different polarities, obtained as a result of numerical simulations; the probability distributions for wave amplitudes are also provided. The results of test experiments on the collision of a large number of solitons are discussed: the choice of optimal conditions and the manifestation of numerical artifacts caused by insufficient accuracy of the discretization. The numerical scheme used turned out to be well suited to the class of problems studied, since it ensures good accuracy in describing collisions of solitons within a short computation time.
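As a hedged sketch of the kind of solver described (the paper's scheme is implicit in time with a Crank-Nicolson step; here an explicit integrating-factor RK4 step is used for brevity), a minimal pseudo-spectral integrator for the focusing mKdV equation u_t + 6 u² u_x + u_xxx = 0, checked against an exact one-soliton solution:

```python
import numpy as np

# Periodic pseudo-spectral solver: spatial derivatives via FFT, the linear
# dispersive term u_xxx handled exactly by an integrating factor, the
# nonlinear term 6 u^2 u_x = (2 u^3)_x stepped with classical RK4.
N, L = 512, 60.0
x = np.linspace(0.0, L, N, endpoint=False)
k = 2.0 * np.pi * np.fft.fftfreq(N, d=L / N)
E = lambda t: np.exp(1j * k**3 * t)   # exact propagator of the linear part

def rhs(vhat, t):
    # In the variable v(t) = E(-t) * uhat(t): v_t = -2ik E(-t) FFT(u^3).
    u = np.real(np.fft.ifft(E(t) * vhat))
    return -2j * k * E(-t) * np.fft.fft(u**3)

def rk4(vhat, t, dt):
    a = rhs(vhat, t)
    b = rhs(vhat + 0.5 * dt * a, t + 0.5 * dt)
    c = rhs(vhat + 0.5 * dt * b, t + 0.5 * dt)
    d = rhs(vhat + dt * c, t + dt)
    return vhat + dt * (a + 2 * b + 2 * c + d) / 6.0

# One-soliton initial condition u = k0 sech(k0 (x - x0)); it travels at speed k0^2.
k0, x0 = 1.0, L / 2.0
u = k0 / np.cosh(k0 * (x - x0))
vhat, t, dt = np.fft.fft(u), 0.0, 0.005
for _ in range(200):                  # integrate to t = 1
    vhat = rk4(vhat, t, dt)
    t += dt
u_num = np.real(np.fft.ifft(E(t) * vhat))
u_exact = k0 / np.cosh(k0 * (x - x0 - k0**2 * t))
```

Ensembles are handled the same way, with many well-separated solitons in the initial condition; the grid resolution and time step then control the discretization artifacts mentioned above.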

