Dancing with entropy: form attributes, children, and representation

2004 ◽  
Vol 60 (2) ◽  
pp. 144-163 ◽  
Author(s):  
Jodi Kearns ◽  
Brian O'Connor

This study explores the use of the information theory entropy equation in representations of videos for children. The calculated rates of information in the videos are calibrated to the corresponding perceived rates of information as elicited from the 12 seven- to ten-year-old girls who were shown video documents. Entropy measures are calculated for several video elements: set time, set incidence, verbal time, verbal incidence, set constraint, nonverbal dependence, and character appearance. As hypothesized, the mechanically calculated entropy measure (CEM) was found to be sufficiently similar to the perceived entropy measure (PEM) made by the children that it can serve as a useful and predictive element of representations of children's videos. The relationships between the CEM and the PEM show that the CEM could stand in for the PEM, enriching representations of video documents for this age group.
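
As a rough sketch of what a mechanically calculated entropy measure over one of these elements could look like (the element categories and counts below are illustrative, not data from the study), the Shannon entropy equation can be applied to incidence counts:

```python
import math
from collections import Counter

def shannon_entropy(counts):
    """Shannon entropy H = -sum(p * log2 p) of a frequency distribution."""
    total = sum(counts)
    probs = [c / total for c in counts if c > 0]
    return -sum(p * math.log2(p) for p in probs)

# Hypothetical incidence counts for one video element (e.g., which set/location
# each shot uses); these numbers are illustrative only.
set_incidence = Counter({"kitchen": 12, "garden": 7, "classroom": 5, "street": 2})

cem = shannon_entropy(set_incidence.values())
print(f"Calculated entropy measure (set incidence): {cem:.3f} bits")
```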

2015 ◽  
Vol 15 (4) ◽  
pp. 13-26 ◽  
Author(s):  
Jun Ye

Due to certain drawbacks of the existing cross entropy between Single Valued Neutrosophic Sets (SVNSs) in dealing with decision-making problems, it can exhibit asymmetrical behaviour or produce an undefined (meaningless) value in some situations. To overcome these disadvantages, this paper proposes an improved cross entropy measure of SVNSs, investigates its properties, and then extends it to a cross entropy measure between interval neutrosophic sets (INSs). Furthermore, the cross entropy measures are applied to multicriteria decision-making problems with single valued neutrosophic information and interval neutrosophic information. In the decision-making methods, the weighted cross entropy measure between each alternative and the ideal alternative yields the ranking order of all alternatives and the best one. The decision-making methods using the proposed cross entropy measures can efficiently handle problems with the incomplete, indeterminate and inconsistent information that usually exists in real situations. Finally, two illustrative examples demonstrate the application and efficiency of the developed decision-making approaches under single valued neutrosophic and interval neutrosophic environments.
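
As a sketch of the kind of construction involved (a symmetric, discrimination-based cross entropy applied componentwise to truth, indeterminacy and falsity, which is a common formulation but not necessarily the paper's exact improved measure), the ranking-by-cross-entropy step might look like this:

```python
import math

def _discrimination(a, b, eps=1e-12):
    """Fuzzy discrimination of a from b (Vlachos-Sergiadis style), clipped so
    it stays finite when memberships hit 0 or 1."""
    a, b = min(max(a, eps), 1 - eps), min(max(b, eps), 1 - eps)
    return (a * math.log(2 * a / (a + b)) +
            (1 - a) * math.log(2 * (1 - a) / (2 - a - b)))

def svns_cross_entropy(x, y):
    """Symmetric cross entropy between two single-valued neutrosophic values
    x = (T, I, F) and y = (T, I, F): discrimination summed over the three
    components in both directions, so the measure is symmetric in x and y."""
    return sum(_discrimination(a, b) + _discrimination(b, a)
               for a, b in zip(x, y))

# Toy comparison of two alternatives against an ideal alternative (1, 0, 0);
# a smaller cross entropy means the alternative is closer to the ideal.
ideal = (1.0, 0.0, 0.0)
alternatives = {"A1": (0.7, 0.2, 0.1), "A2": (0.5, 0.4, 0.3)}
ranking = sorted(alternatives, key=lambda k: svns_cross_entropy(alternatives[k], ideal))
print(ranking)
```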


Entropy ◽  
2018 ◽  
Vol 20 (11) ◽  
pp. 844 ◽  
Author(s):  
Wen-Hua Cui ◽  
Jun Ye

In order to quantify the fuzziness in the simplified neutrosophic setting, this paper proposes a generalized distance-based entropy measure and a dimension root entropy measure of simplified neutrosophic sets (NSs) (covering both interval-valued and single-valued NSs) and verifies their properties. Then, a comparison with existing interval-valued NS entropy measures through a numerical example demonstrates the feasibility and rationality of the presented generalized distance-based entropy and dimension root entropy measures of simplified NSs. Lastly, a decision-making example illustrates their applicability, and the decision results indicate that the presented entropy measures are effective and reasonable. Hence, this study enriches simplified neutrosophic entropy theory and its measure approaches.
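
A minimal sketch of the distance-based idea, assuming a simple Minkowski distance to the maximally fuzzy element (0.5, 0.5, 0.5) with a generalization parameter p; the paper's generalized measures may be defined differently:

```python
def distance_based_entropy(svns, p=2):
    """Distance-based entropy of a single-valued neutrosophic set given as a
    list of (T, I, F) triples: 1 minus twice the normalized Minkowski distance
    to the maximally fuzzy element (0.5, 0.5, 0.5). Returns a value in [0, 1]."""
    n = len(svns)
    total = sum(abs(c - 0.5) ** p for t, i, f in svns for c in (t, i, f))
    distance = (total / (3 * n)) ** (1 / p)   # lies in [0, 0.5]
    return 1 - 2 * distance

crisp = [(1.0, 0.0, 0.0), (0.0, 0.0, 1.0)]   # no fuzziness -> entropy 0
fuzzy = [(0.5, 0.5, 0.5), (0.5, 0.5, 0.5)]   # maximal fuzziness -> entropy 1
print(distance_based_entropy(crisp), distance_based_entropy(fuzzy))
```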


Entropy ◽  
2020 ◽  
Vol 22 (8) ◽  
pp. 845 ◽  
Author(s):  
Aadel Howedi ◽  
Ahmad Lotfi ◽  
Amir Pourabdollah

This paper presents anomaly detection in activities of daily living based on entropy measures. It is shown that the proposed approach identifies anomalies when visitors are present, i.e., when the environment temporarily becomes multi-occupant. Residents often receive visits from family members or health care workers; the residents' activity is therefore expected to differ when a visitor is present, which can be considered an abnormal activity pattern. Identifying anomalies is essential for healthcare management, as it enables early action to avoid prospective problems and supports residents' ability to live safely and independently in their own homes. Entropy analysis is an established method for detecting disorder or irregularity in many applications; however, it has rarely been applied in the context of activities of daily living. An experimental evaluation is conducted on data obtained from a real home environment. Experimental results demonstrate the effectiveness of the entropy measures employed in detecting anomalies in the resident's activity and identifying visiting times in the same environment.
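
As a minimal illustration of the idea (not the authors' pipeline; the activity labels and the threshold are made up), Shannon entropy can be computed over the distribution of activity labels in a time window, and windows with unusually high entropy flagged:

```python
import math
from collections import Counter

def activity_entropy(activity_sequence):
    """Shannon entropy (bits) of the distribution of activity labels recorded
    over one time window (e.g., one day of sensor-derived activities)."""
    counts = Counter(activity_sequence)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def flag_anomalous_days(days, threshold):
    """Flag windows whose activity entropy exceeds a threshold; higher entropy
    suggests more varied/irregular activity, e.g. when a visitor is present."""
    return {day: h for day, seq in days.items()
            if (h := activity_entropy(seq)) > threshold}

# Illustrative daily activity logs; with these toy logs only "tue" (the day
# with visits) exceeds the threshold and is flagged.
days = {
    "mon": ["sleep"] * 8 + ["cook"] * 2 + ["watch_tv"] * 6,
    "tue": ["sleep"] * 7 + ["cook"] * 3 + ["watch_tv", "visit", "visit",
            "cook", "walk", "visit"],
}
print(flag_anomalous_days(days, threshold=1.5))
```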


Entropy ◽  
2021 ◽  
Vol 23 (12) ◽  
pp. 1618
Author(s):  
Rubem P. Mondaini ◽  
Simão C. de Albuquerque Neto

The Khinchin–Shannon generalized inequalities for entropy measures in Information Theory are a paradigm which can be used to test the synergy of the probability distributions of occurrence in physical systems. The rich algebraic structure associated with the introduction of escort probabilities seems to be essential for deriving these inequalities for the two-parameter Sharma–Mittal set of entropy measures. We also emphasize the derivation of these inequalities for the special cases of the one-parameter Havrda–Charvat, Rényi and Landsberg–Vedral entropy measures.
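
For reference, the two-parameter Sharma–Mittal family and its one-parameter limits can be written as follows (the (α, β) parameterization used here may differ from the paper's notation):

```latex
% Sharma–Mittal entropy of a probability distribution p = (p_1, ..., p_n),
% with \alpha > 0, \alpha \neq 1, \beta \neq 1:
S_{\alpha,\beta}(p) \;=\; \frac{1}{1-\beta}
  \left[ \left( \sum_{k=1}^{n} p_k^{\alpha} \right)^{\frac{1-\beta}{1-\alpha}} - 1 \right].

% One-parameter special cases:
% \beta \to \alpha : Havrda–Charvat (Tsallis) entropy
%   \frac{1}{1-\alpha}\left( \sum_k p_k^{\alpha} - 1 \right)
% \beta \to 1      : Rényi entropy
%   \frac{1}{1-\alpha} \ln \sum_k p_k^{\alpha}
% \beta = 2-\alpha : Landsberg–Vedral entropy
%   \frac{1}{\alpha-1}\left[ \left(\sum_k p_k^{\alpha}\right)^{-1} - 1 \right]
% \alpha, \beta \to 1 : Shannon entropy  -\sum_k p_k \ln p_k
```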


2021 ◽  
Vol 53 ◽  
Author(s):  
Vikas Kumar ◽  
Rekha Rani ◽  
Nirdesh Singh

Non-additive entropy measures are important for many applications. In this paper, we introduce a quantile-based non-additive entropy measure based on Tsallis entropy and study its properties. Relationships of this measure with well-known reliability measures and ageing classes are studied, and some characterization results are presented. The concept of quantile-based shift-independent entropy measures is also introduced and various of its properties are studied.
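
As a reminder of how such a quantile-based form arises (the paper's exact definition and normalization may differ), substituting x = Q(u) into the continuous Tsallis entropy replaces the density f with the quantile density q(u) = Q'(u):

```latex
% Continuous Tsallis entropy of order \alpha (\alpha > 0, \alpha \neq 1),
% for a non-negative random variable X with density f:
S_{\alpha}(X) \;=\; \frac{1}{\alpha - 1}\left( 1 - \int_{0}^{\infty} f^{\alpha}(x)\,dx \right).

% Substituting x = Q(u), dx = q(u)\,du and using f(Q(u)) = 1/q(u):
\int_{0}^{\infty} f^{\alpha}(x)\,dx \;=\; \int_{0}^{1} \bigl(q(u)\bigr)^{1-\alpha}\,du,

% so the quantile-based Tsallis entropy depends only on the quantile density:
S_{\alpha}(X) \;=\; \frac{1}{\alpha - 1}\left( 1 - \int_{0}^{1} \bigl(q(u)\bigr)^{1-\alpha}\,du \right).
```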


1999 ◽  
pp. 599-611
Author(s):  
M. Matić ◽  
Charles E. M. Pearce ◽  
Josip Pečarić

2022 ◽  
Vol 24 (1) ◽  
pp. 105-118
Author(s):  
Mervat Mahdy ◽  
Dina S. Eltelbany ◽  
Hoda Mohammed ◽  
...  

Entropy measures the amount of uncertainty and dispersion of an unknown or random quantity. The concept, first introduced by Shannon (1948), is important for studies in many areas: in information theory, entropy measures the amount of information in each message received; in physics, entropy is the basic concept that measures the disorder of a thermodynamical system; and so on. In this paper, we introduce an alternative measure of entropy, called HN-entropy; unlike Shannon entropy, this proposed measure of order α and β is more flexible. Then, the cumulative residual HN-entropy, the cumulative HN-entropy, and weighted versions are introduced. Finally, a comparison between Shannon entropy and HN-entropy, together with numerical results, is presented.
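
For orientation, the "cumulative residual" and "weighted" constructions mentioned here follow the pattern of Rao et al.'s cumulative residual entropy, which replaces the density in Shannon's formula with the survival function; the HN-entropy analogues of order α and β are defined in the paper itself:

```latex
% Shannon (differential) entropy of a non-negative random variable X with density f:
H(X) = -\int_{0}^{\infty} f(x)\,\log f(x)\,dx.

% Cumulative residual entropy: the density is replaced by the survival
% function \bar{F}(x) = P(X > x):
\mathcal{E}(X) = -\int_{0}^{\infty} \bar{F}(x)\,\log \bar{F}(x)\,dx.

% A weighted version attaches the weight x to emphasize larger values:
\mathcal{E}^{w}(X) = -\int_{0}^{\infty} x\,\bar{F}(x)\,\log \bar{F}(x)\,dx.
```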


2021 ◽  
Vol 10 (2) ◽  
pp. 82-102
Author(s):  
Omdutt Sharma ◽  
Pratiksha Tiwari ◽  
Priti Gupta

Information theory is a tool for measuring uncertainty; it is now used to solve various challenging problems that involve hybridizing information theory with fuzzy sets, rough sets, vague sets, etc. To address challenging problems in scientific data analysis and visualization, various authors have recently been working on hybrid measures of information theory. In this paper, using the relations between information measures, some measures are proposed for the fuzzy rough set. First, an entropy measure is derived from the fuzzy rough similarity measure; then, corresponding to this entropy measure, other measures such as a mutual information measure, a joint entropy measure, and a conditional entropy measure are also proposed. Some properties of these measures are studied. The proposed measure is then compared with some existing measures to demonstrate its efficiency. Further, the proposed measures are applied to pattern recognition, medical diagnosis, and a real-life decision-making problem concerning the incorporation of software into the curriculum at the Department of Statistics.
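
One standard way to derive an entropy measure from a similarity measure, shown here for an ordinary fuzzy set as a hedged illustration (the paper works with a fuzzy rough similarity measure, whose exact form is not reproduced here), is to define the entropy of a set as its similarity to its own complement:

```python
def complement(memberships):
    """Standard fuzzy complement: mu -> 1 - mu."""
    return [1 - m for m in memberships]

def similarity(a, b):
    """A simple fuzzy similarity measure based on normalized Hamming distance."""
    n = len(a)
    return 1 - sum(abs(x - y) for x, y in zip(a, b)) / n

def similarity_based_entropy(memberships):
    """Entropy of a fuzzy set defined as its similarity to its own complement:
    crisp sets (memberships 0/1) get entropy 0, the half-everywhere set gets 1."""
    return similarity(memberships, complement(memberships))

print(similarity_based_entropy([0.0, 1.0, 1.0, 0.0]))  # 0.0 (crisp)
print(similarity_based_entropy([0.5, 0.5, 0.5, 0.5]))  # 1.0 (maximally fuzzy)
```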


Author(s):  
Hang Tian ◽  
Jiaru Li ◽  
Fangwei Zhang ◽  
Yujuan Xu ◽  
Caihong Cui ◽  
...  

This paper identifies four variables to reveal the internal mechanisms of entropy measures on intuitionistic fuzzy sets (IFSs) and interval-valued intuitionistic fuzzy sets (IVIFSs). First, the four variables are used to propose a pair of generalized entropy measures on IFSs and IVIFSs. Second, three specific entropy measures are put forward to illustrate the effectiveness of the generalized entropy measure functions. Third, a novel multiple attribute decision-making approach under an intuitionistic fuzzy environment is proposed; its advantage is that the attribute weight values are obtained from their related entropy measures. Finally, the performance of the proposed entropy measures on IFSs and IVIFSs is illustrated through a mode assessment example on open communities.
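
A small sketch of the entropy-weighting step described above, using a deliberately simple illustrative entropy for intuitionistic fuzzy values rather than the paper's generalized measures:

```python
def ifv_entropy(mu, nu):
    """A simple illustrative entropy of an intuitionistic fuzzy value (mu, nu):
    1 when mu == nu (maximal ambiguity about the direction), 0 when crisp.
    This is only a stand-in for the paper's generalized measures."""
    return 1 - abs(mu - nu)

def entropy_weights(decision_matrix):
    """Attribute weights from entropy: attributes whose column carries less
    fuzziness (lower average entropy) receive larger weight."""
    n_attrs = len(decision_matrix[0])
    col_entropy = [sum(ifv_entropy(*row[j]) for row in decision_matrix) / len(decision_matrix)
                   for j in range(n_attrs)]
    divergence = [1 - e for e in col_entropy]
    total = sum(divergence)
    return [d / total for d in divergence]

# Rows = alternatives, columns = attributes, entries = (membership, non-membership);
# the numbers are illustrative only.
matrix = [
    [(0.7, 0.2), (0.5, 0.4), (0.6, 0.3)],
    [(0.6, 0.3), (0.4, 0.4), (0.8, 0.1)],
    [(0.8, 0.1), (0.5, 0.5), (0.5, 0.2)],
]
print(entropy_weights(matrix))
```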


Entropy ◽  
2019 ◽  
Vol 21 (5) ◽  
pp. 482 ◽  
Author(s):  
Modjtaba Ghorbani ◽  
Matthias Dehmer ◽  
Mina Rajabi-Parsa ◽  
Abbe Mowshowitz ◽  
Frank Emmert-Streib

In this paper, we study several distance-based entropy measures on fullerene graphs. These include the topological information content of a graph I_a(G), a degree-based entropy measure, the eccentric entropy I_{fσ}(G), the Hosoya entropy H(G) and, finally, the radial centric information entropy H_ecc. We compare these measures on two infinite classes of fullerene graphs denoted by A_{12n+4} and B_{12n+6}. We have chosen these measures as they are easily computable and capture meaningful graph properties. To demonstrate the utility of these measures, we investigate the Pearson correlation between them on the fullerene graphs.
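
As a small, hedged example of one of the simpler measures mentioned (a degree-based entropy; the paper's exact functionals may differ), and of why distance- and eccentricity-based measures are needed on regular graphs such as fullerenes:

```python
import math
import networkx as nx

def degree_entropy(G):
    """Degree-based graph entropy: the Shannon entropy of the distribution
    obtained by normalizing vertex degrees by the total degree sum.
    (One common degree-based measure; the paper's functionals may differ.)"""
    degrees = [d for _, d in G.degree()]
    total = sum(degrees)
    probs = [d / total for d in degrees if d > 0]
    return -sum(p * math.log2(p) for p in probs)

# Fullerene graphs are 3-regular, so all degrees are equal and the degree-based
# entropy reduces to log2(n); distance- and eccentricity-based measures are
# needed to discriminate between such graphs.
C20 = nx.dodecahedral_graph()   # the smallest fullerene, C20, is 3-regular
print(degree_entropy(C20), math.log2(C20.number_of_nodes()))
```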

