Non-locality, contextuality and valuation algebras: a general theory of disagreement

Author(s):  
Samson Abramsky ◽  
Giovanni Carù

We establish a strong link between two apparently unrelated topics: the study of conflicting information in the formal framework of valuation algebras, and the phenomena of non-locality and contextuality. In particular, we show that these peculiar features of quantum theory are mathematically equivalent to a general notion of disagreement between information sources. This result vastly generalizes previously observed connections between contextuality, relational databases, constraint satisfaction problems and logical paradoxes, and gives further proof that contextual behaviour is not a phenomenon limited to quantum physics, but pervades various domains of mathematics and computer science. The connection allows theorems, methods and algorithms to be translated from one field to the other, and paves the way for the application of generic inference algorithms to study contextuality. This article is part of the theme issue ‘Contextuality and probability in quantum mechanics and beyond’.
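The database side of this connection can be illustrated with a standard toy example (not taken from the article itself): three pairwise-consistent binary tables that admit no single "universal" relation joining them, the relational analogue of a contextual empirical model. All names below are illustrative.

```python
from itertools import product

# Three tables over attribute pairs, each over the domain {0, 1}.
# Each pair of tables agrees on its shared attribute, yet no single
# relation over (A, B, C) projects onto all three.
tables = {
    ("A", "B"): {(0, 0), (1, 1)},   # requires A = B
    ("B", "C"): {(0, 0), (1, 1)},   # requires B = C
    ("A", "C"): {(0, 1), (1, 0)},   # requires A != C
}

def global_sections():
    """All assignments (a, b, c) whose restrictions lie in every table."""
    return [
        (a, b, c)
        for a, b, c in product((0, 1), repeat=3)
        if (a, b) in tables[("A", "B")]
        and (b, c) in tables[("B", "C")]
        and (a, c) in tables[("A", "C")]
    ]

# Locally consistent, globally contradictory: A = B = C but A != C.
print(global_sections())  # → []
```

The empty result is the point: each table is individually satisfiable and the tables agree pairwise, yet no global assignment exists, mirroring the absence of a global section in a contextual model.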

2021 ◽  
Vol 11 (2) ◽  
pp. 79-87
Author(s):  
Meredith Carroll ◽  
Paige Sanchez ◽  
Donna Wilt

Abstract. The purpose of this study was to examine how pilots respond to conflicting information on the flight deck. In this study, 108 airline, corporate, and general aviation pilots completed an online questionnaire reporting weather, traffic, and navigation information conflicts experienced on the flight deck, including which information sources they trusted and acted on. Results indicated that weather information conflicts are most commonly experienced, and typically between a certified source in the panel and an uncertified electronic flight bag application. Most participants (a) trusted certified systems due to their accuracy, reliability, recency, and knowledge about the source, and (b) acted on the certified system due to trust, being trained and required to use it, and its indicating a more hazardous situation.


Author(s):  
Sergio Greco ◽  
Cristina Sirangelo ◽  
Irina Trubitsyna ◽  
Ester Zumpano

The objective of this article is to investigate the problems related to the extensional integration of information sources. In particular, we propose an approach for managing inconsistent databases, that is, databases violating integrity constraints. The problem of dealing with inconsistent information has recently assumed additional relevance as it plays a key role in all the areas in which duplicate information or conflicting information is likely to occur (Agarwal et al., 1995; Arenas, Bertossi & Chomicki, 1999; Bry, 1997; Dung, 1996; Lin & Mendelzon, 1999; Subrahmanian, 1994).


Author(s):  
Hubie Chen ◽  
Georg Gottlob ◽  
Matthias Lanzinger ◽  
Reinhard Pichler

Constraint satisfaction problems (CSPs) are an important formal framework for the uniform treatment of various prominent AI tasks, e.g., coloring or scheduling problems. Solving CSPs is, in general, known to be NP-complete and fixed-parameter intractable when parameterized by their constraint scopes. We give a characterization of those classes of CSPs for which the problem becomes fixed-parameter tractable. Our characterization significantly increases the utility of the CSP framework by making it possible to decide the fixed-parameter tractability of problems via their CSP formulations. We further extend our characterization to the evaluation of unions of conjunctive queries, a fundamental problem in databases. Furthermore, we provide some new insight on the frontier of PTIME solvability of CSPs. In particular, we observe that bounded fractional hypertree width is more general than bounded hypertree width only for classes that exhibit a certain type of exponential growth. The presented work resolves a long-standing open problem and yields powerful new tools for complexity research in AI and database theory.
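The uniform CSP framework the abstract refers to can be sketched with a small example: graph colouring encoded as variables, a shared domain, and binary constraint scopes. This is a generic brute-force sketch for illustration only, not the authors' fixed-parameter algorithms; all names are illustrative.

```python
from itertools import product

# A 4-cycle graph to be 3-coloured, phrased as a CSP:
# variables, a common domain, and constraints over two-variable scopes.
variables = ["x1", "x2", "x3", "x4"]
domain = [0, 1, 2]  # three colours
edges = [("x1", "x2"), ("x2", "x3"), ("x3", "x4"), ("x4", "x1")]

def solve():
    """Exhaustive search over all assignments (exponential in general,
    reflecting the NP-completeness the abstract mentions)."""
    for values in product(domain, repeat=len(variables)):
        assignment = dict(zip(variables, values))
        if all(assignment[u] != assignment[v] for u, v in edges):
            return assignment
    return None

print(solve())
```

Structural restrictions such as bounded (fractional) hypertree width, the subject of the abstract, identify classes of instances where search of this kind can be replaced by tractable algorithms.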


Author(s):  
Nicolas Gisin ◽  
Florian Fröwis

Quantum non-locality has been an extremely fruitful subject of research, leading the scientific revolution towards quantum information science, in particular, to device-independent quantum information processing. We argue that the time is ripe to work on another basic problem in the foundations of quantum physics, the quantum measurement problem, which should produce good physics in theoretical, mathematical, experimental and applied physics. We briefly review how quantum non-locality contributed to physics (including some outstanding open problems) and suggest ways in which questions around macroscopic quantumness could equally contribute to all aspects of physics. This article is part of a discussion meeting issue ‘Foundations of quantum mechanics and their impact on contemporary society’.


2016 ◽  
Vol 8 (2) ◽  
pp. 49-71
Author(s):  
Mark Germine

The genesis of actuality from potentiality, with the apparent role of the observer, is an important and unsolved problem which essentially defines science's view of reality in a variety of contexts. Observation then becomes lawful and not emergent. Panentheism is needed to provide a mechanism for order outside of blind efficient causality, in a Universal final causality. Classical physics is over a hundred years out of date, yet scientific models remain mechanistic and deterministic. Deism, a remnant of classical cosmology, is examined and rejected by scientists and philosophers, and certain pre-scientific notions of religion are scorned, putting the matter to rest. Quantum physics, in its basic form, is necessary if there is to be any philosophical or scientific notion of free will and self-determination, as potentiality. Quantum metaphysics is also needed because classical physics is fundamentally limited to localized external relations, lacking the internality and non-locality of relatedness. God, or the equivalent, is necessary to complete the equation. Physicists now tell us that reality is fundamentally mental and is created by observation. Observation is here taken to mean experience, with experience going all the way down to the lowest order of a Universal mentality.


Author(s):  
Sauro Succi ◽  
Peter V. Coveney

For it is not the abundance of knowledge, but the interior feeling and taste of things, which is accustomed to satisfy the desire of the soul. (Saint Ignatius of Loyola.) We argue that the boldest claims of big data (BD) are in need of revision and toning-down, in view of a few basic lessons learned from the science of complex systems. We point out that, once the most extravagant claims of BD are properly discarded, a synergistic merging of BD with big theory offers considerable potential to spawn a new scientific paradigm capable of overcoming some of the major barriers confronted by the modern scientific method originating with Galileo. These obstacles are due to the presence of nonlinearity, non-locality and hyperdimensions which one encounters frequently in multi-scale modelling of complex systems. This article is part of the theme issue ‘Multiscale modelling, simulation and computing: from the desktop to the exascale’.


Author(s):  
Gennady S. Mishuris ◽  
Alexander B. Movchan ◽  
Leonid I. Slepyan

This paper presents a unified approach to the modelling of elastic solids with embedded dynamic microstructures. General dependences are derived based on Green's kernel formulations. Specifically, we consider systems consisting of a master structure and continuously or discretely distributed oscillators. Several classes of connections between oscillators are studied. We examine how the microstructure affects the dispersion relations and determine the energy distribution between the master structure and microstructures, including the vibration shield phenomenon. Special attention is given to the comparative analysis of discrete and continuous distributions of the oscillators, and to the effects of non-locality and trapped vibrations. This article is part of the theme issue ‘Modelling of dynamic phenomena and localization in structured media (part 2)’.


2014 ◽  
Vol 16 (2) ◽  
pp. e60 ◽  
Author(s):  
Katri Hämeen-Anttila ◽  
Hedvig Nordeng ◽  
Esa Kokki ◽  
Johanna Jyrkkä ◽  
Angela Lupattelli ◽  
...  

Author(s):  
Robert B. Griffiths

In quantum physics, the term ‘contextual’ can be used in more than one way. One usage, here called ‘Bell contextual’ since the idea goes back to Bell, is that if A, B and C are three quantum observables, with A compatible (i.e. commuting) with B and also with C, whereas B and C are incompatible, a measurement of A might yield a different result (indicating that quantum mechanics is contextual) depending upon whether A is measured along with B (the {A, B} context) or with C (the {A, C} context). An analysis of what projective quantum measurements measure shows that quantum theory is Bell non-contextual: the outcome of a particular A measurement when A is measured along with B would have been exactly the same if A had, instead, been measured along with C. A different definition, here called ‘globally (non)contextual’, refers to whether there is (non-contextual) or is not (contextual) a single joint probability distribution that simultaneously assigns probabilities in a consistent manner to the outcomes of measurements of a certain collection of observables, not all of which are compatible. A simple example shows that such a joint probability distribution can exist even in a situation where the measurement probabilities cannot refer to properties of a quantum system, and hence lack physical significance, even though mathematically well defined. It is noted that the quantum sample space, a projective decomposition of the identity, required for interpreting measurements of incompatible properties in different runs of an experiment using different types of apparatus, has a tensor product structure, a fact sometimes overlooked. This article is part of the theme issue ‘Contextuality and probability in quantum mechanics and beyond’.


2020 ◽  
pp. 32-78
Author(s):  
Pieter Adriaans

A computational theory of meaning tries to understand the phenomenon of meaning in terms of computation. Here we give an analysis in the context of Kolmogorov complexity. This theory measures the complexity of a data set in terms of the length of the smallest program that generates the data set on a universal computer. As a natural extension, the set of all programs that produce a data set on a computer can be interpreted as the set of meanings of the data set. We give an analysis of the Kolmogorov structure function and some other attempts to formulate a mathematical theory of meaning in terms of two-part optimal model selection. We show that such theories will always be context dependent: the invariance conditions that make Kolmogorov complexity a valid theory of measurement fail for this more general notion of meaning. One cause is the notion of polysemy: one data set (i.e., a string of symbols) can have different programs, with no mutual information, that compress it. Another cause is the existence of recursive bijections between ℕ and ℕ² for which the two-part code is always more efficient. This generates vacuous optimal two-part codes. We introduce a formal framework to study such contexts in the form of a theory that generalizes the concept of Turing machines to learning agents that have a memory and have access to each other’s functions in terms of a possible world semantics. In such a framework, the notions of randomness and informativeness become agent dependent. We show that such a rich framework explains many of the anomalies of the correct theory of algorithmic complexity. It also provides perspectives for, among other things, the study of cognitive and social processes. Finally, we sketch some application paradigms of the theory.
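The compression intuition behind Kolmogorov complexity can be sketched with a computable proxy. Kolmogorov complexity itself is uncomputable, so this sketch (not from the chapter) uses zlib's compressed length as a rough upper bound: a highly regular string has a short generating program, while a random-looking one does not.

```python
import os
import zlib

def approx_K(s: bytes) -> int:
    """Compressed length in bytes: a crude, computable upper bound on
    Kolmogorov complexity (K itself is uncomputable)."""
    return len(zlib.compress(s, 9))

structured = b"ab" * 500        # generated by a very short program
random_like = os.urandom(1000)  # stands in for an incompressible string

# The regular string admits a far shorter description than the random one.
print(approx_K(structured), approx_K(random_like))
assert approx_K(structured) < approx_K(random_like)
```

Real-world compressors only capture a fixed repertoire of regularities, which is one way to see why the invariance and context-dependence issues the abstract raises matter in practice.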

