Introduction to Quantum Logical Information Theory: Talk

2018 ◽  
Vol 182 ◽  
pp. 02039
Author(s):  
David Ellerman

Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences, and distinguishability, and is formalized using the distinctions (“dits”) of a partition (a pair of points distinguished by the partition). All the definitions of simple, joint, conditional, and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The purpose of this talk is to outline the direct generalization to quantum logical information theory that similarly focuses on the pairs of eigenstates distinguished by an observable, i.e., “qudits” of an observable. The fundamental theorem for quantum logical entropy and measurement establishes a direct quantitative connection between the increase in quantum logical entropy due to a projective measurement and the eigenstates (cohered together in the pure superposition state being measured) that are distinguished by the measurement (decohered in the post-measurement mixed state). Both the classical and quantum versions of logical entropy have simple interpretations as “two-draw” probabilities for distinctions. The conclusion is that quantum logical entropy is the simple and natural notion of information for quantum information theory focusing on the distinguishing of quantum states.
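The “two-draw” reading of logical entropy in the abstract can be checked numerically. The sketch below (my own illustration, not from the talk; the names `logical_entropy` and `partition` are ad hoc) counts the dits of a partition directly and confirms that the fraction of ordered pairs landing in different blocks equals 1 − Σ p_i².

```python
from itertools import product

def logical_entropy(partition, universe):
    """Two-draw probability of a distinction: the fraction of ordered
    pairs (u, v) whose members fall in different blocks (i.e. the dits)."""
    n = len(universe)
    block_of = {u: i for i, block in enumerate(partition) for u in block}
    dits = sum(1 for u, v in product(universe, repeat=2)
               if block_of[u] != block_of[v])
    return dits / n**2

universe = [1, 2, 3, 4]
partition = [{1, 2}, {3}, {4}]   # block probabilities 1/2, 1/4, 1/4
h = logical_entropy(partition, universe)
# agrees with 1 - sum(p_i^2) = 1 - (1/4 + 1/16 + 1/16) = 5/8
print(h)  # 0.625
```

Dit-counting and the formula 1 − Σ p_i² agree because the same-block pairs occur with probability Σ p_i² under two independent equiprobable draws.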


Entropy ◽  
2018 ◽  
Vol 20 (9) ◽  
pp. 679 ◽  
Author(s):  
David Ellerman

Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences and distinguishability and is formalized using the distinctions (“dits”) of a partition (a pair of points distinguished by the partition). All the definitions of simple, joint, conditional and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The purpose of this paper is to give the direct generalization to quantum logical information theory that similarly focuses on the pairs of eigenstates distinguished by an observable, i.e., qudits of an observable. The fundamental theorem for quantum logical entropy and measurement establishes a direct quantitative connection between the increase in quantum logical entropy due to a projective measurement and the eigenstates (cohered together in the pure superposition state being measured) that are distinguished by the measurement (decohered in the post-measurement mixed state). Both the classical and quantum versions of logical entropy have simple interpretations as “two-draw” probabilities for distinctions. The conclusion is that quantum logical entropy is the simple and natural notion of information for quantum information theory focusing on the distinguishing of quantum states.
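The fundamental theorem described above can be illustrated in a minimal two-level example, assuming the quantum logical entropy takes the form h(ρ) = 1 − tr(ρ²) used in this line of work (the function name and the specific state are my own choices):

```python
import numpy as np

def quantum_logical_entropy(rho):
    """h(rho) = 1 - tr(rho^2): the probability that two independent
    projective measurements on rho yield two different eigenvalues."""
    return 1.0 - np.trace(rho @ rho).real

# Pure superposition |psi> = (|0> + |1>) / sqrt(2)
psi = np.array([1.0, 1.0]) / np.sqrt(2)
rho_pure = np.outer(psi, psi.conj())

# A projective measurement in the {|0>, |1>} basis decoheres the
# off-diagonal terms, leaving the mixed state diag(1/2, 1/2).
rho_mixed = np.diag(np.diag(rho_pure))

print(quantum_logical_entropy(rho_pure))   # ~0.0: pure state, no distinctions
print(quantum_logical_entropy(rho_mixed))  # ~0.5: the decohered pair of
                                           # eigenstates is now distinguished
```

The increase 0 → 1/2 is exactly the two-draw probability of distinguishing the pair of eigenstates that the measurement decoheres, matching the quantitative connection the abstract describes.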


2018 ◽  
Vol 225 (2) ◽  
pp. 425-450
Author(s):  
Dr. Karim Mousa Hussein Mezban

This research is devoted to elaborating the various models of probability theory adopted by a number of philosophers of science in the twentieth century, and to presenting the debate among them over the validity of probability theory as it appears in the philosophical literature. The research is divided into five sections, as follows. Part (1): the classical theory of probability, devoted to the basic model of probability theory. Part (2): the frequency theory of probability, adopted by the philosopher of science Hans Reichenbach to remedy the shortcomings of the basic model. Part (3): the theory of logical probability, adopted by the philosopher of science Rudolf Carnap to fill the logical deficit in Reichenbach's frequency theory; this part also covers the exchange between the two. Part (4): the propensity theory of probability, divided into two sections. Section 4.a: the pragmatist account adopted by the philosopher Charles Peirce, which builds the pragmatic notion of tendency into the concept of probability. Section 4.b: Karl Popper's propensity theory of probability, in which he defended the claim that probability cannot justify the method of induction. Part (5): the entropic theory of probability, devoted to the contemporary account that recasts all the previous models of probability in terms of information theory.


Author(s):  
Noboru Watanabe ◽  
Masahiro Muto

Transmitted complexity (mutual entropy) is one of the important measures for quantum information theory developed recently in several ways. We will review the fundamental concepts of the Kossakowski, Ohya and Watanabe entropy and define a transmitted complexity for quantum dynamical systems. This article is part of the themed issue ‘Second quantum revolution: foundational questions’.
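The Kossakowski–Ohya–Watanabe quantum mutual entropy requires the full channel formalism, but the classical quantity it generalizes is easy to state. As a baseline sketch (my own illustration; the function name and the toy joint distribution are assumptions, not from the article):

```python
import math

def mutual_entropy(joint):
    """Classical mutual entropy I(X;Y) = sum p(x,y) log2(p(x,y)/(p(x)p(y)))
    for a joint distribution given as a dict {(x, y): probability}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# A perfectly correlated bit transmits exactly one bit.
joint = {(0, 0): 0.5, (1, 1): 0.5}
print(mutual_entropy(joint))  # 1.0
```

The quantum versions reviewed in the article replace the joint distribution with a compound state built from the input state and the channel.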


Author(s):  
Vlatko Vedral

The main view promoted by this book is that underlying many different aspects of reality is some form of information processing. The theory of information started rather innocently, as the result of a very specific question that Shannon considered: how to maximize the capacity of communication between two users. Shannon showed that all we need is to associate a probability with an event, and he defined a measure that allows us to quantify the information content of that event. Interestingly, because of its simplicity and intuitiveness, Shannon’s view has been successfully applied to many other problems. We can view biological information through Shannon’s theory as communication in time (where the objective of natural selection is to propagate the gene pool into the future). But it is not only communications and biology that are trying to optimize information. In physics, systems arrange themselves so that entropy is maximized, and this entropy is quantified in the same way as Shannon’s information. We encounter the same form of information in other phenomena. Financial speculation is also governed by the same concept of entropy, and optimizing your profit is the same problem as optimizing your channel capacity. In social theory, society is governed by its interconnectedness or correlation, and this correlation itself is quantified by Shannon’s entropy. Underlying all these phenomena was the classical Boolean logic where events had clear outcomes: either yes or no, on or off, and so on. In our most accurate description of reality, given by quantum theory, we know that bits of information are an approximation to a much more precise concept of qubits. Qubits, unlike bits, can exist in a multitude of states, any combination of yes and no, on and off. Shannon’s information theory has been extended to account for quantum theory, and the resulting framework, quantum information theory, has already shown a number of advantages.
The greater power of quantum information theory is manifested in more secure cryptographic protocols, a completely new order of computing, quantum teleportation, and a number of other applications that were simply not possible according to Shannon’s view.
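Shannon’s move of “associating a probability to an event” to quantify its information content is just the surprisal −log₂ p, and his entropy is its average. A minimal sketch (function names are my own):

```python
import math

def surprisal(p):
    """Shannon's information content of an event with probability p, in bits."""
    return -math.log2(p)

def entropy(ps):
    """Average surprisal over a distribution: H = -sum p * log2(p)."""
    return sum(p * surprisal(p) for p in ps if p > 0)

print(surprisal(0.5))       # 1.0 bit: a fair coin flip
print(entropy([0.5, 0.5]))  # 1.0: the maximum for two outcomes
print(entropy([0.9, 0.1]))  # ~0.469: a biased coin carries less information
```

It is this single quantity that reappears, as the passage notes, in channel capacity, thermodynamic entropy, and measures of correlation.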


4open ◽  
2022 ◽  
Vol 5 ◽  
pp. 1
Author(s):  
David Ellerman

We live in the information age. Claude Shannon, as the father of the information age, gave us a theory of communications that quantified an “amount of information,” but, as he pointed out, “no concept of information itself was defined.” Logical entropy provides that definition. Logical entropy is the natural measure of the notion of information based on distinctions, differences, distinguishability, and diversity. It is the (normalized) quantitative measure of the distinctions of a partition on a set, just as the Boole–Laplace logical probability is the normalized quantitative measure of the elements of a subset of a set. And partitions and subsets are mathematically dual concepts – so the logic of partitions is dual in that sense to the usual Boolean logic of subsets, and hence the name “logical entropy.” The logical entropy of a partition has a simple interpretation as the probability that a distinction or dit (elements in different blocks) is obtained in two independent draws from the underlying set. The Shannon entropy is shown to also be based on this notion of information-as-distinctions; it is the average minimum number of binary partitions (bits) that need to be joined to make all the same distinctions of the given partition. Hence all the concepts of simple, joint, conditional, and mutual logical entropy can be transformed into the corresponding concepts of Shannon entropy by a uniform non-linear dit-bit transform. And finally, logical entropy linearizes naturally to the corresponding quantum concept. The quantum logical entropy of an observable applied to a state is the probability that two different eigenvalues are obtained in two independent projective measurements of that observable on that state.
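The dit-bit transform mentioned above can be made concrete: write h(p) = Σ p_i(1 − p_i) and substitute log₂(1/p_i) for each (1 − p_i) to obtain H(p) = Σ p_i log₂(1/p_i). A small check (my own sketch of the transform as Ellerman defines it; function names are ad hoc):

```python
import math

def logical_entropy(ps):
    """h(p) = sum p_i * (1 - p_i): two-draw probability of a distinction."""
    return sum(p * (1 - p) for p in ps)

def shannon_entropy(ps):
    """The dit-bit transform replaces each (1 - p_i) in h(p) with
    log2(1/p_i), giving H(p) = sum p_i * log2(1/p_i)."""
    return sum(p * math.log2(1 / p) for p in ps if p > 0)

ps = [0.5, 0.25, 0.25]
print(logical_entropy(ps))   # 0.625
print(shannon_entropy(ps))   # 1.5
```

The transform is term-by-term: both entropies are averages over the same block probabilities, differing only in which "measure of distinction" (1 − p versus log₂(1/p)) is averaged.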

