Generalized probability rules from a timeless formulation of Wigner's friend scenarios

Quantum, 2021, Vol. 5, p. 524
Author(s):  
Veronika Baumann
Flavio Del Santo
Alexander R. H. Smith
Flaminia Giacomini
Esteban Castro-Ruiz
...  

The quantum measurement problem can be regarded as the tension between the two alternative dynamics prescribed by quantum mechanics: the unitary evolution of the wave function and the state-update rule (or "collapse") at the instant a measurement takes place. The notorious Wigner's friend gedankenexperiment constitutes the paradoxical scenario in which different observers (one of whom is observed by the other) describe one and the same interaction differently: one, the Friend, via state-update, and the other, Wigner, unitarily. This can lead to Wigner and his Friend assigning different probabilities to the outcome of the same subsequent measurement. In this paper, we apply the Page-Wootters mechanism (PWM) as a timeless description of Wigner's-friend-like scenarios. We show that the standard rules for assigning two-time conditional probabilities within the PWM need to be modified to deal with the Wigner's friend gedankenexperiment. We identify three main definitions of such modified rules for assigning two-time conditional probabilities, all of which reduce to standard quantum theory for non-Wigner's-friend scenarios. However, when applied to the Wigner's friend setup, each rule assigns different conditional probabilities, potentially resolving the probability-assignment paradox in a different manner. Moreover, one rule imposes strict limits on when a joint probability distribution for the measurement outcomes of Wigner and his Friend is well defined, which single out those cases where Wigner's measurement does not disturb the Friend's memory and such a probability has an operational meaning in terms of collectible statistics. Interestingly, the same limits guarantee that said measurement outcomes fulfill the consistency condition of the consistent-histories framework.
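The standard two-time rule that the modified PWM assignments all reduce to in non-Wigner's-friend scenarios can be sketched for a single qubit. The Hadamard intermediate unitary below is an illustrative choice, not taken from the paper:

```python
import math

# Standard two-time conditional probability for a qubit:
# P(b at t2 | a at t1) = |<b| U |a>|^2, with U the unitary evolution
# between the two measurements (real amplitudes for brevity).

def matvec(U, v):
    """Apply a 2x2 matrix (list of rows) to a 2-vector."""
    return [U[0][0]*v[0] + U[0][1]*v[1],
            U[1][0]*v[0] + U[1][1]*v[1]]

def two_time_prob(U, a, b):
    """P(b at t2 | a at t1) = |<b|U|a>|^2."""
    Ua = matvec(U, a)
    amp = b[0]*Ua[0] + b[1]*Ua[1]
    return abs(amp)**2

H = [[1/math.sqrt(2),  1/math.sqrt(2)],   # Hadamard as the illustrative
     [1/math.sqrt(2), -1/math.sqrt(2)]]   # intermediate unitary

p0 = two_time_prob(H, [1, 0], [1, 0])  # start in |0>, ask for |0> later
p1 = two_time_prob(H, [1, 0], [0, 1])  # start in |0>, ask for |1> later
print(p0, p1)  # both 0.5: the two outcomes are equally likely
```

The paper's point is precisely that in Wigner's-friend setups this single expression splits into inequivalent candidate rules.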

2017, Vol. 73 (6), pp. 460-473
Author(s):  
Maria Cristina Burla
Benedetta Carrozzini
Giovanni Luca Cascarano
Carmelo Giacovazzo
Giampiero Polidori

Difference electron densities do not play a central role in modern phase refinement approaches, essentially because of the explosive success of EDM (electron-density modification) techniques, mainly based on observed electron-density syntheses. Difference densities, however, have recently been rediscovered in connection with the VLD (Vive la Difference) approach, because they provide strong support for strengthening EDM approaches and for ab initio crystal structure solution. In this paper the properties of the most documented difference electron densities, here denoted as F − Fp, mF − Fp and mF − DFp syntheses, are studied. In addition, a fourth new difference synthesis, here denoted as the F̄q synthesis, is proposed. It comes from the study of the same joint probability distribution function from which the VLD approach arose. The properties of the F̄q synthesis are studied and compared with those of the other three syntheses. The results suggest that the F̄q difference may be a useful tool for making modern phase refinement procedures more efficient.
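As a purely illustrative sketch of how a difference synthesis of the mF − DFp type is evaluated (a 1-D Fourier sum; the amplitudes, phases and scale factors below are made up, and the paper's F̄q coefficients are not reproduced):

```python
import cmath
import math

# Toy 1-D difference synthesis: coefficients (m|F| - D|Fp|) exp(i*phi_p)
# summed into a difference density.  All numbers are invented for
# illustration only.
F_obs = {1: 10.0, 2: 6.0, 3: 3.0}   # observed amplitudes |F| by index h
F_p   = {1: 9.0,  2: 7.0, 3: 2.0}   # amplitudes |Fp| from the partial model
phi_p = {1: 0.3,  2: 1.1, 3: 2.0}   # model phases (radians)
m, D  = 0.9, 0.95                   # weighting/scale factors (illustrative)

def diff_density(x):
    """Difference density at fractional coordinate x (real part of the sum)."""
    rho = 0.0
    for h in F_obs:
        coeff = (m*F_obs[h] - D*F_p[h]) * cmath.exp(1j*phi_p[h])
        rho += (coeff * cmath.exp(-2j*math.pi*h*x)).real
    return rho

print(diff_density(0.0))  # difference density at the origin
```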


Author(s):  
Andrei Khrennikov

Google's recent claim of a breakthrough in quantum computing is a signal for further analysis of the foundational roots of the (possible) superiority of some quantum algorithms over the corresponding classical algorithms. This note is a step in this direction. We start with a critical analysis of the rather common reference to entanglement and quantum nonlocality as the basic sources of quantum superiority. We elevate the role of Bohr's principle of complementarity (PCOM) by interpreting the Bell experiments as statistical tests of this principle. (Our analysis also includes a comparison of classical vs. genuine quantum entanglement.) After a brief presentation of PCOM and endowing it with the information interpretation, we analyze its computational counterpart. The main implication of PCOM is that by using the quantum representation of probability, one need not compute the joint probability distribution (jpd) for the observables involved in the process of computation. Calculating the jpd is exponentially time consuming. Consequently, classical probabilistic algorithms involving calculation of the jpd for n random variables can be outperformed by quantum algorithms (for large values of n). Quantum algorithms are based on quantum probability calculus. It is crucial that the latter modifies the classical formula of total probability (FTP). Probability inference based on the quantum version of FTP leads to constructive interference of probabilities, increasing the probabilities of some events. We also stress the basic feature of genuine quantum superposition compared with classical wave superposition: the generation of discrete events in measurements on superposition states. Finally, the problem of the superiority of quantum computations is coupled with the quantum measurement problem and the linearity of the dynamics of the quantum state update.
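Both points, the exponential size of the jpd and the interference term that separates the quantum FTP from the classical one, fit in a few lines. The two-alternative FTP form below is the standard textbook expression, with illustrative numbers:

```python
import math

# A joint probability distribution over n binary variables needs 2**n
# entries -- the exponential cost the note attributes to classical
# probabilistic algorithms.
n = 20
print(2**n)  # 1048576 table entries for just 20 variables

# Quantum probability calculus modifies the classical formula of total
# probability (FTP) with an interference term.  For two alternatives
# A1, A2 and relative phase theta:
def quantum_ftp(pA1, pB_A1, pA2, pB_A2, theta):
    classical_part = pA1*pB_A1 + pA2*pB_A2
    interference = 2*math.cos(theta)*math.sqrt(pA1*pB_A1*pA2*pB_A2)
    return classical_part + interference

classical    = quantum_ftp(0.5, 0.5, 0.5, 0.5, math.pi/2)  # cos = 0: classical FTP
constructive = quantum_ftp(0.5, 0.5, 0.5, 0.5, 0.0)        # constructive interference
print(classical, constructive)  # 0.5 vs 1.0
```

Constructive interference (theta = 0) doubles the probability of the event relative to the classical prediction, which is the mechanism the note highlights.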


2016, Vol. 26 (4), pp. 467-506
Author(s):  
K. Jeganathan
J. Sumathi
G. Mahalakshmi

This article presents a perishable stochastic inventory system under continuous review at a service facility consisting of two parallel queues with jockeying. Each server has its own queue, and jockeying among the queues is permitted. The capacity of each queue is of finite size L. The inventory is replenished according to an (s, S) inventory policy and the replenishing times are assumed to be exponentially distributed. Each customer is issued a demanded item after a random service time, which is exponentially distributed. The lifetime of each item is also assumed to be exponential. Customers arrive according to a Poisson process and on arrival join the shortest feasible queue. Moreover, if the inventory level is more than one and one queue is empty while more than one customer is waiting in the other queue, then the customer immediately behind the one being served in that queue is transferred to the empty queue. This prevents one server from being idle while customers are waiting in the other queue. Each waiting customer independently reneges from the system after an exponentially distributed amount of time. The joint probability distribution of the inventory level, the number of customers in both queues, and the status of the server is obtained in the steady state. Some important steady-state system performance measures are derived, as is the long-run total expected cost rate.
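The jockeying rule described above can be sketched as a small helper; the queue representation (head of list in service) and the function name are assumptions for illustration, not the paper's formalism:

```python
def jockey(q1, q2, inventory):
    """Sketch of the jockeying rule (queue heads are in service): if the
    inventory level exceeds one and exactly one queue is empty while the
    other holds more than one customer, the customer directly behind the
    one in service transfers to the empty queue."""
    if inventory > 1:
        if not q1 and len(q2) > 1:
            q1.append(q2.pop(1))   # customer behind the served one moves
        elif not q2 and len(q1) > 1:
            q2.append(q1.pop(1))
    return q1, q2

print(jockey([], ['a', 'b', 'c'], inventory=3))  # (['b'], ['a', 'c'])
print(jockey([], ['a', 'b'], inventory=1))       # ([], ['a', 'b']): low stock, no move
```

The transferred customer goes straight into service at the previously idle server, which is exactly how the rule keeps both servers busy.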


2021, Vol. 4 (1)
Author(s):  
Philippe Allard Guérin
Veronika Baumann
Flavio Del Santo
Časlav Brukner

The notorious Wigner's friend thought experiment (and modifications thereof) has received renewed interest, especially due to new arguments that force us to question some of the fundamental assumptions of quantum theory. In this paper, we formulate a no-go theorem for the persistent reality of Wigner's friend's perception, which allows us to conclude that the perceptions that the friend has of her own measurement outcomes at different times cannot "share the same reality", if seemingly natural quantum mechanical assumptions are met. More formally, this means that, in a Wigner's friend scenario, there is no joint probability distribution for the friend's perceived measurement outcomes at two different times that depends linearly on the initial state of the measured system and whose marginals reproduce the predictions of unitary quantum theory. This theorem entails that one must either (1) propose a nonlinear modification of the Born rule for two-time predictions, (2) sometimes prohibit the use of present information to predict the future, thereby reducing the predictive power of quantum theory, or (3) deny that unitary quantum mechanics makes valid single-time predictions for all observers. We briefly discuss which of the theorem's assumptions are more likely to be dropped within various popular interpretations of quantum mechanics.
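The marginal-consistency requirement appearing in the theorem, that a candidate two-time joint distribution reproduce the single-time predictions, can be checked mechanically. The distributions below are toy examples, not derived from any quantum model:

```python
def marginals(joint):
    """Split joint[(a, b)] into marginals over a (time t1) and b (time t2)."""
    pa, pb = {}, {}
    for (a, b), p in joint.items():
        pa[a] = pa.get(a, 0.0) + p
        pb[b] = pb.get(b, 0.0) + p
    return pa, pb

def reproduces(joint, pred_t1, pred_t2, tol=1e-9):
    """Does the candidate two-time joint reproduce both single-time predictions?"""
    pa, pb = marginals(joint)
    return (all(abs(pa.get(a, 0.0) - p) < tol for a, p in pred_t1.items())
            and all(abs(pb.get(b, 0.0) - p) < tol for b, p in pred_t2.items()))

# A perfectly correlated candidate matches uniform single-time predictions...
joint = {(0, 0): 0.5, (1, 1): 0.5}
print(reproduces(joint, {0: 0.5, 1: 0.5}, {0: 0.5, 1: 0.5}))  # True
# ...but fails if the theory predicts outcome 1 with certainty at t2.
print(reproduces(joint, {0: 0.5, 1: 0.5}, {1: 1.0}))  # False
```

The theorem's content is that, under its assumptions, no candidate joint passes this check for all initial states while also depending on them linearly.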


Author(s):  
André Luís Morosov
Reidar Brumer Bratvold

The exploratory phase of a hydrocarbon field is a period when decision-supporting information is scarce while the drilling stakes are high. Each new prospect drilled brings more knowledge about the area and might reveal reserves, hence choosing such prospects is essential for value creation. Drilling decisions must be made under uncertainty, as the available geological information is limited, and probability elicitation from geoscience experts is key in this process. This work proposes a novel use of geostatistics to help experts elicit geological probabilities more objectively, which is especially useful during the exploratory phase. The approach is simpler, more consistent with geologic knowledge, more comfortable for geoscientists to use and more comprehensible for decision-makers to follow than traditional methods. It is also flexible, working with any amount and type of information available. The workflow takes as input conceptual models describing the geology and uses geostatistics to generate spatial variability of geological properties in the vicinity of potential drilling prospects. The output is a set of stochastic realizations, which are processed into a joint probability distribution (JPD) containing all conditional probabilities of the process. Input models are interactively changed until the JPD satisfactorily represents the expert's beliefs. A 2D, yet realistic, implementation of the workflow is used as a proof of concept, demonstrating that even simple modeling might suffice for decision-making support. Derivative versions of the JPD are created and their effect on the decision process of selecting the drilling sequence is assessed. The findings from the method application suggest ways to define the input parameters by observing how they affect the JPD and the decision process.
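A minimal sketch of the tallying step, assuming a toy correlation parameter in place of a real geostatistical model: stochastic realizations of success/failure at two hypothetical prospects are counted into a JPD, from which a conditional probability for the drilling decision is read off:

```python
import random
from collections import Counter

random.seed(1)

def realization(p=0.3, rho=0.6):
    """One stochastic realization: outcome at prospect B is spatially
    correlated with prospect A via the toy parameter rho."""
    a = random.random() < p
    b = a if random.random() < rho else (random.random() < p)
    return a, b

# Tally many realizations into a joint probability distribution.
counts = Counter(realization() for _ in range(100_000))
total = sum(counts.values())
jpd = {outcome: c / total for outcome, c in counts.items()}

# Conditional probability of success at B given success at A, the kind of
# quantity that drives the drilling-sequence decision.
p_a = jpd.get((True, True), 0.0) + jpd.get((True, False), 0.0)
p_b_given_a = jpd.get((True, True), 0.0) / p_a
print(round(p_b_given_a, 2))  # well above the prior 0.3, as the correlation implies
```

In the paper's workflow the realizations come from a geostatistical simulation conditioned on the conceptual models, but the counting step is the same.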


2017, Vol. 31 (2), pp. 139-179
Author(s):  
Ioannis Dimitriou

We consider a single-server system accepting two types of retrial customers, which arrive according to two independent Poisson streams. The service station can handle at most one customer, and in case of blocking, a type i customer, i = 1, 2, is routed to a separate type i orbit queue of infinite capacity. Customers from the orbits try to access the server according to the constant retrial policy. We consider coupled orbit queues: when both orbit queues are non-empty, orbit queue i tries to re-dispatch a blocked customer of type i to the main service station after an exponentially distributed time with rate μi. If one orbit queue empties, the other orbit queue changes its re-dispatch rate from μi to μi*. We consider both exponentially and arbitrarily distributed service requirements, and show that the probability generating function of the joint stationary orbit queue length distribution can be determined using the theory of Riemann(–Hilbert) boundary value problems. For exponential service requirements, we also investigate the exact tail asymptotic behavior of the stationary joint probability distribution of the two orbits, with either an idle or a busy server, by using the kernel method. Performance metrics are obtained, computational issues are discussed and a simple numerical example is presented.
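A Monte Carlo sketch of the coupled-orbit dynamics under exponential service. All rates below are illustrative choices, not values from the paper, and the simulation only estimates one stationary quantity rather than the generating function or tail asymptotics:

```python
import random
from collections import Counter

random.seed(7)
lam1, lam2, mu = 0.3, 0.3, 2.0
mu1, mu2 = 1.0, 1.0      # constant retrial rates when both orbits are non-empty
mu1s, mu2s = 1.5, 1.5    # switched rates when the other orbit is empty

busy, n1, n2 = False, 0, 0
time_in = Counter()      # time spent in each (busy, n1, n2) state
t = 0.0
while t < 50_000:
    r1 = (mu1s if n2 == 0 else mu1) if n1 > 0 else 0.0
    r2 = (mu2s if n1 == 0 else mu2) if n2 > 0 else 0.0
    rates = [lam1, lam2, mu if busy else 0.0, r1, r2]
    total = sum(rates)
    dt = random.expovariate(total)      # time to the next event
    time_in[(busy, n1, n2)] += dt
    t += dt
    u, acc = random.random() * total, 0.0
    for ev, r in enumerate(rates):      # pick the event by its rate
        acc += r
        if u <= acc:
            break
    if ev == 0:                    # type-1 arrival
        if busy: n1 += 1           # blocked: routed to orbit 1
        else: busy = True
    elif ev == 1:                  # type-2 arrival
        if busy: n2 += 1
        else: busy = True
    elif ev == 2:                  # service completion
        busy = False
    elif ev == 3 and not busy:     # successful type-1 retrial
        n1 -= 1; busy = True
    elif ev == 4 and not busy:     # successful type-2 retrial
        n2 -= 1; busy = True

p_empty = time_in[(False, 0, 0)] / sum(time_in.values())
print(round(p_empty, 2))  # long-run fraction of time the system is empty
```

A retrial that finds the server busy simply fails and the customer stays in orbit, which is the constant retrial policy's behavior.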

