Introduction to Logical Entropy and its Relationship to Shannon Entropy

4open ◽  
2022 ◽  
Vol 5 ◽  
pp. 1
Author(s):  
David Ellerman

We live in the information age. Claude Shannon, as the father of the information age, gave us a theory of communications that quantified an “amount of information,” but, as he pointed out, “no concept of information itself was defined.” Logical entropy provides that definition. Logical entropy is the natural measure of the notion of information based on distinctions, differences, distinguishability, and diversity. It is the (normalized) quantitative measure of the distinctions of a partition on a set, just as the Boole–Laplace logical probability is the normalized quantitative measure of the elements of a subset of a set. Partitions and subsets are mathematically dual concepts, so the logic of partitions is dual in that sense to the usual Boolean logic of subsets; hence the name “logical entropy.” The logical entropy of a partition has a simple interpretation as the probability that a distinction or dit (a pair of elements in different blocks) is obtained in two independent draws from the underlying set. The Shannon entropy is shown to also be based on this notion of information-as-distinctions; it is the average minimum number of binary partitions (bits) that need to be joined to make all the same distinctions as the given partition. Hence all the concepts of simple, joint, conditional, and mutual logical entropy can be transformed into the corresponding concepts of Shannon entropy by a uniform non-linear dit-bit transform. Finally, logical entropy linearizes naturally to the corresponding quantum concept: the quantum logical entropy of an observable applied to a state is the probability that two different eigenvalues are obtained in two independent projective measurements of that observable on that state.
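As a concrete illustration, here is a minimal Python sketch (not code from the paper; the example partition is hypothetical): the logical entropy h = 1 - sum(p_i^2) is computed from block probabilities and checked against its two-draw interpretation, and the dit-bit transform, replacing each (1 - p_i) with log2(1/p_i), yields the Shannon entropy H = sum(p_i * log2(1/p_i)).

```python
from math import log2
import random

def block_probs(partition, n):
    """Block probabilities p_i = |B_i| / n for a partition of an n-element set."""
    return [len(block) / n for block in partition]

def logical_entropy(probs):
    """h = 1 - sum(p_i^2): the probability that two independent draws
    fall in different blocks, i.e., yield a distinction (dit)."""
    return 1.0 - sum(p * p for p in probs)

def shannon_entropy(probs):
    """H = sum(p_i * log2(1/p_i)): obtained from h = sum(p_i * (1 - p_i))
    by the dit-bit transform, which replaces (1 - p_i) with log2(1/p_i)."""
    return sum(p * log2(1.0 / p) for p in probs if p > 0)

# Hypothetical partition of {0,...,5} into blocks of sizes 3, 2, 1
partition = [{0, 1, 2}, {3, 4}, {5}]
probs = block_probs(partition, 6)
print(f"logical entropy h = {logical_entropy(probs):.4f}")  # 22/36 ~ 0.6111
print(f"Shannon entropy H = {shannon_entropy(probs):.4f}")

# Monte Carlo check of the two-draw interpretation of h
which = {x: i for i, block in enumerate(partition) for x in block}
elems = list(which)
trials = 100_000
dits = sum(which[random.choice(elems)] != which[random.choice(elems)]
           for _ in range(trials))
print(f"empirical P(distinction) ~ {dits / trials:.4f}")
```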


2013 ◽  
Vol 07 (02) ◽  
pp. 121-145 ◽  
Author(s):  
David Ellerman

The logical basis for information theory is the newly developed logic of partitions that is dual to the usual Boolean logic of subsets. The key concept is a "distinction" of a partition, an ordered pair of elements in distinct blocks of the partition. The logical concept of entropy based on partition logic is the normalized counting measure of the set of distinctions of a partition on a finite set — just as the usual logical notion of probability based on the Boolean logic of subsets is the normalized counting measure of the subsets (events). Thus logical entropy is a measure on the set of ordered pairs, and all the compound notions of entropy (join entropy, conditional entropy, and mutual information) arise in the usual way from the measure (e.g. the inclusion-exclusion principle) — just like the corresponding notions of probability. The usual Shannon entropy of a partition is developed by replacing the normalized count of distinctions (dits) by the average number of binary partitions (bits) necessary to make all the distinctions of the partition.
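To make the measure-theoretic picture concrete, here is a short Python sketch (the example partitions are assumptions, not taken from the paper) that builds the dit set of a partition as a set of ordered pairs, verifies that the dit set of the join of two partitions is the union of their dit sets, and recovers the compound entropies by inclusion-exclusion on the normalized counting measure.

```python
from itertools import product

def dit_set(partition):
    """Ordered pairs (x, y) whose elements lie in distinct blocks."""
    which = {x: i for i, block in enumerate(partition) for x in block}
    elems = list(which)
    return {(x, y) for x, y in product(elems, repeat=2) if which[x] != which[y]}

def join(pi, sigma):
    """Join (common refinement): nonempty intersections of blocks."""
    return [b & c for b in pi for c in sigma if b & c]

U = set(range(6))
pi = [{0, 1, 2}, {3, 4, 5}]
sigma = [{0, 3}, {1, 4}, {2, 5}]
n2 = len(U) ** 2

d_pi, d_sigma = dit_set(pi), dit_set(sigma)
d_join = dit_set(join(pi, sigma))

# dit(pi v sigma) = dit(pi) U dit(sigma), so the normalized counting
# measure obeys inclusion-exclusion:
#   h(pi v sigma) = h(pi) + h(sigma) - m(pi, sigma)
assert d_join == d_pi | d_sigma
h = lambda d: len(d) / n2
print(f"h(pi) = {h(d_pi):.4f}, h(sigma) = {h(d_sigma):.4f}")
print(f"joint h = {h(d_join):.4f}, mutual m = {h(d_pi & d_sigma):.4f}")
print(f"inclusion-exclusion holds: "
      f"{abs(h(d_join) - (h(d_pi) + h(d_sigma) - h(d_pi & d_sigma))) < 1e-12}")
```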


2012 ◽  
Vol 27 (28) ◽  
pp. 1250164
Author(s):  
J. Manuel García-Islas

In the three-dimensional spin foam model of quantum gravity with a cosmological constant, there exists a set of observables associated with spin network graphs. A set of probabilities is calculated from these observables, and hence the associated Shannon entropy can be defined. We present the Shannon entropy associated with these observables and find some interesting bounding inequalities. The problem relates measurements, entropy, and information theory in a simple way, which we explain.
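The paper's observables are specific to the spin foam model, but the step from observables to an entropy is generic. A small Python sketch under that assumption (the observable values below are hypothetical placeholders, not model data) normalizes non-negative observable values into probabilities and checks the elementary bound 0 <= H <= log2(N) that any such Shannon entropy must satisfy.

```python
import numpy as np

def shannon(p):
    """Shannon entropy H = -sum(p_j * log2(p_j)) in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Hypothetical non-negative observable values on spin network graphs
observables = np.array([3.0, 1.0, 2.0, 2.0])
p = observables / observables.sum()   # normalize to a probability distribution

H = shannon(p)
print(f"H = {H:.4f} bits, upper bound log2(N) = {np.log2(len(p)):.4f}")
assert 0.0 <= H <= np.log2(len(p)) + 1e-12   # elementary entropy bounds
```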


Mathematics ◽  
2021 ◽  
Vol 9 (9) ◽  
pp. 1034
Author(s):  
María Carmen Carnero

Due to the important advantages it offers, gamification is one of the fastest-growing industries in the world, and interest from the market and from users continues to grow. This has led to the development of more and more applications aimed at different fields, in particular the education sector. Choosing the most suitable application is increasingly difficult. To solve this problem, our study designed a model that innovatively combines the fuzzy Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) with the Measuring Attractiveness by a Categorical Based Evaluation Technique (MACBETH) and Shannon entropy theory, in order to choose the most suitable gamification application for the Industrial Manufacturing and Organisation Systems course in the degree programmes for Electrical Engineering and Industrial and Automatic Electronics at the Higher Technical School of Industrial Engineering of Ciudad Real, part of the University of Castilla-La Mancha. There is no precedent in the literature for combining MACBETH and fuzzy Shannon entropy to consider simultaneously the subjective and objective weights of criteria and so achieve a more accurate model. The objective weights computed from fuzzy Shannon entropy were compared with those calculated from De Luca and Termini entropy and exponential entropy. The validity of the proposed method is tested through the Preference Ranking Organisation METHod for Enrichment of Evaluations (PROMETHEE) II, ELimination and Choice Expressing REality (ELECTRE) III, and fuzzy VIKOR (VIsekriterijumska optimizacija i KOmpromisno Resenje) methods. The results show that Quizizz is the best option for this course, and it was used over two academic years. There are no precedents in the literature for using fuzzy multicriteria decision analysis techniques to select the most suitable gamification application for a degree-level university course.
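The entropy-weighting step can be sketched in its crisp (non-fuzzy) form; the paper itself uses fuzzy Shannon entropy and the scores below are hypothetical, but the mechanics of deriving objective criterion weights are the same: column-normalize the decision matrix, compute each criterion's entropy, and weight criteria by their divergence 1 - E_j.

```python
import numpy as np

def entropy_weights(X):
    """Objective criterion weights via the Shannon entropy method
    (crisp version; the paper applies a fuzzy variant).
    X: m alternatives x n criteria, benefit-type, positive values."""
    m, _ = X.shape
    P = X / X.sum(axis=0)                      # column-normalize to probabilities
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(P > 0, np.log(P), 0.0)
    E = -(P * logs).sum(axis=0) / np.log(m)    # entropy of each criterion in [0, 1]
    d = 1.0 - E                                # degree of divergence per criterion
    return d / d.sum()                         # normalized objective weights

# Hypothetical scores: 4 gamification apps rated on 3 criteria
X = np.array([
    [7.0, 8.0, 6.0],
    [9.0, 5.0, 7.0],
    [6.0, 9.0, 8.0],
    [8.0, 7.0, 9.0],
])
print("entropy weights:", np.round(entropy_weights(X), 4))
```

Criteria whose scores vary more across alternatives carry more information and thus receive larger objective weights, which the model then blends with the subjective MACBETH weights.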


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Guanglei Xu ◽  
William S. Oates

Restricted Boltzmann Machines (RBMs) have been proposed for developing neural networks for a variety of unsupervised machine learning applications such as image recognition, drug discovery, and materials design. The Boltzmann probability distribution is used as a model to identify network parameters by optimizing the likelihood of predicting an output given hidden states trained on available data. Training such networks often requires sampling over a large probability space that must be approximated during gradient-based optimization. Quantum annealing has been proposed as a means to search this space more efficiently, and has been experimentally investigated on D-Wave hardware. The D-Wave implementation requires selecting an effective inverse temperature or hyperparameter (β) within the Boltzmann distribution, which can strongly influence optimization. Here, we show how this parameter can be estimated as a hyperparameter applied to D-Wave hardware during neural network training by maximizing the likelihood or minimizing the Shannon entropy. We find that both methods improve the training of RBMs, validated experimentally on D-Wave hardware with an image recognition problem. Neural network image reconstruction errors are evaluated using Bayesian uncertainty analysis, which illustrates more than an order of magnitude lower image reconstruction error using the maximum likelihood method compared with manually optimizing the hyperparameter. The maximum likelihood method is also shown to outperform minimizing the Shannon entropy for image reconstruction.
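As a schematic of the β-estimation idea, here is a toy maximum-likelihood sketch (not the authors' code and not the D-Wave API; the 6-spin model and its couplings are invented so the partition function can be enumerated exactly, whereas real annealer problems require approximation): given energies of sampled states, the effective inverse temperature is the β that maximizes the Boltzmann likelihood of those samples.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
n = 6
J = np.triu(rng.normal(scale=0.5, size=(n, n)), 1)   # random Ising couplings

# Enumerate all 2^n spin states and their energies E(s) = s^T J s
states = np.array(list(itertools.product([-1, 1], repeat=n)))
energies = np.einsum("si,ij,sj->s", states, J, states)

def neg_log_likelihood(beta, sample_energies):
    """Per-sample NLL of p(s) = exp(-beta * E(s)) / Z(beta),
    with Z computed exactly by summing over the enumerated states."""
    logZ = np.log(np.exp(-beta * energies).sum())
    return beta * sample_energies.mean() + logZ

# Draw "hardware" samples at a hidden true beta, then recover it by MLE
true_beta = 1.3
p = np.exp(-true_beta * energies)
p /= p.sum()
idx = rng.choice(len(states), size=5000, p=p)
sample_E = energies[idx]

betas = np.linspace(0.1, 3.0, 300)
nll = [neg_log_likelihood(b, sample_E) for b in betas]
print(f"true beta = {true_beta}, MLE beta ~ {betas[int(np.argmin(nll))]:.3f}")
```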

