Some Characterization Results on Dynamic Cumulative Residual Tsallis Entropy

2015 ◽  
Vol 2015 ◽  
pp. 1-8 ◽  
Author(s):  
Madan Mohan Sati ◽  
Nitin Gupta

We propose a generalized cumulative residual information measure based on Tsallis entropy and its dynamic version. We study the characterizations of the proposed information measure and define new classes of life distributions based on this measure. Some applications are provided in relation to weighted and equilibrium probability models. Finally, the empirical cumulative Tsallis entropy is proposed to estimate the new information measure.
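As a rough illustration of the plug-in idea behind such an empirical estimator, the sketch below assumes the form ξ_α(X) = (1/(α−1))·[1 − ∫₀^∞ S(x)^α dx] for the cumulative residual Tsallis entropy and replaces the survival function S with its empirical counterpart; this particular form, and the function name, are assumptions for illustration, not taken from the paper.

```python
def empirical_crte(sample, alpha):
    """Plug-in estimate of the cumulative residual Tsallis entropy.

    Assumes the form (1/(alpha-1)) * (1 - integral of S(x)**alpha dx),
    with S replaced by the empirical survival function, which equals
    (n - i)/n on the interval [x_(i), x_(i+1)).
    """
    if alpha == 1:
        raise ValueError("alpha must differ from 1")
    xs = sorted(sample)
    n = len(xs)
    integral = 0.0
    for i in range(n - 1):
        surv = (n - (i + 1)) / n  # empirical survival on [xs[i], xs[i+1])
        integral += surv ** alpha * (xs[i + 1] - xs[i])
    return (1.0 - integral) / (alpha - 1.0)
```

For a two-point sample {0, 1} with α = 2 the empirical survival is 1/2 on [0, 1), so the estimate is (1 − 1/4)/1 = 0.75.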

Entropy ◽  
2021 ◽  
Vol 24 (1) ◽  
pp. 9
Author(s):  
Muhammed Rasheed Irshad ◽  
Radhakumari Maya ◽  
Francesco Buono ◽  
Maria Longobardi

Tsallis introduced a non-logarithmic generalization of Shannon entropy, namely Tsallis entropy, which is non-extensive. Sati and Gupta proposed a cumulative residual information measure based on this non-extensive entropy, namely the cumulative residual Tsallis entropy (CRTE), and its dynamic version, the dynamic cumulative residual Tsallis entropy (DCRTE). In the present paper, we propose non-parametric kernel-type estimators for CRTE and DCRTE where the considered observations satisfy a ρ-mixing dependence condition. Asymptotic properties of the estimators are established under suitable regularity conditions. A numerical evaluation of the proposed estimators is exhibited, and a Monte Carlo simulation study is carried out.
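The paper's estimators are built for ρ-mixing data; purely as a loose, independent-data sketch of the kernel plug-in idea, one can smooth the survival function with a Gaussian kernel and integrate numerically. The Sati–Gupta form of CRTE, the Silverman-style bandwidth, and all names below are assumptions for illustration, not the paper's construction.

```python
import math
import random

def kernel_survival(x, data, h):
    """Gaussian-kernel estimate of the survival function P(X > x).
    Phi (the standard normal CDF) is written via erf."""
    return sum(0.5 * (1.0 - math.erf((x - xi) / (h * math.sqrt(2.0))))
               for xi in data) / len(data)

def kernel_crte(data, alpha, grid_points=400):
    """Kernel-smoothed plug-in estimate of CRTE, assuming the form
    (1/(alpha-1)) * (1 - integral_0^inf S(x)**alpha dx)."""
    n = len(data)
    mean = sum(data) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in data) / n)
    h = 1.06 * sd * n ** (-0.2)  # Silverman-style bandwidth (an assumption)
    upper = max(data) + 4.0 * h
    step = upper / grid_points
    # Trapezoidal rule on [0, upper].
    integral = 0.0
    prev = kernel_survival(0.0, data, h) ** alpha
    for k in range(1, grid_points + 1):
        cur = kernel_survival(k * step, data, h) ** alpha
        integral += 0.5 * (prev + cur) * step
        prev = cur
    return (1.0 - integral) / (alpha - 1.0)

random.seed(42)
sample = [random.expovariate(1.0) for _ in range(500)]
est = kernel_crte(sample, alpha=2.0)  # true value for Exp(1), alpha=2, is 0.5
```

For an Exp(1) lifetime and α = 2 the assumed form gives (1 − 1/2)/1 = 0.5, which the smoothed estimate should approach up to boundary bias near zero.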


2020 ◽  
Vol 1 (11) ◽  
pp. 5-23
Author(s):  
Alexander V. Spesivtsev ◽  
Sergey A. Spesivtsev ◽  
Vasily A. Spesivtsev ◽  
...  

The article substantiates the prerequisites for the introduction of the fuzzy-probabilistic approach in the construction of mathematical models in personality psychology. The specificity of this field of psychology lies in the impossibility of constructing effective mathematical models by deterministic methods, including statistical ones. Statistical methods are intended for processing large sets of initial data and are used successfully in social psychology, which describes the trends of a certain phenomenon for a group of people «in general and on average», but they are not applicable to an individual personality. The main position of the authors is the need to use the knowledge and experience of highly qualified specialist psychologists who practice direct counseling of clients to solve personal psychological problems. The proposed apparatus for constructing fuzzy-probabilistic models is presented in the form of an algorithm that describes all the necessary steps, from working with an expert to interpreting the models and using them to obtain qualitatively new information about the client's psychological state. The application of the fuzzy-probabilistic approach is illustrated by the example of the synthesis of a mathematical model for assessing and predicting a woman's psychological readiness to create a stable family.


Entropy ◽  
2020 ◽  
Vol 22 (6) ◽  
pp. 709 ◽  
Author(s):  
Abdolsaeed Toomaj ◽  
Antonio Di Crescenzo

The generalized cumulative residual entropy is a recently defined dispersion measure. In this paper, we obtain some further results for such a measure, in relation to the generalized cumulative residual entropy and the variance of random lifetimes. We show that it has an intimate connection with the non-homogeneous Poisson process. We also get new expressions, bounds and stochastic comparisons involving such measures. Moreover, the dynamic version of the mentioned notions is studied through the residual lifetimes and suitable aging notions. In this framework we achieve some findings of interest in reliability theory, such as a characterization for the exponential distribution, various results on k-out-of-n systems, and a connection to the excess wealth order. We also obtain similar results for the generalized cumulative entropy, which is a dual measure to the generalized cumulative residual entropy.
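For a concrete sense of the measure discussed here, the sketch below assumes the Psarrakos–Navarro form of the generalized cumulative residual entropy of integer order n, (1/n!)·∫₀^∞ S(x)·(−log S(x))ⁿ dx, and evaluates it with the empirical survival function; the form and the function name are assumptions for illustration, not reproduced from the paper.

```python
import math

def empirical_gcre(sample, order):
    """Empirical generalized cumulative residual entropy of integer
    order n, assuming the form (1/n!) * integral of
    S(x) * (-log S(x))**n dx, with S replaced by the empirical
    survival function (equal to (n - i)/n on [x_(i), x_(i+1)))."""
    xs = sorted(sample)
    n = len(xs)
    total = 0.0
    for i in range(n - 1):
        surv = (n - (i + 1)) / n
        if surv > 0.0:
            total += surv * (-math.log(surv)) ** order * (xs[i + 1] - xs[i])
    return total / math.factorial(order)
```

With order 1 this reduces to the empirical cumulative residual entropy; for the two-point sample {0, 1} it gives (1/2)·log 2 ≈ 0.3466.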


Author(s):  
Litegebe Wondie ◽  
Satish Kumar

We present a relation between Tsallis entropy and the generalized Kerridge inaccuracy, known as the generalized Shannon inequality, a well-known generalization in information theory, and then give its application in coding theory. The objective of the paper is to establish a noiseless coding theorem for the proposed mean code length in terms of the generalized information measure of order ξ.
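The paper's generalized quantities of order ξ are specific to its construction; purely as a reminder of the classical ingredients it builds on, here are the standard Tsallis entropy of order ξ and the Kerridge inaccuracy (the function names are mine, and these are not the paper's generalized versions).

```python
import math

def tsallis_entropy(p, xi):
    """Tsallis entropy of order xi (xi != 1):
    (1/(xi - 1)) * (1 - sum_i p_i**xi)."""
    return (1.0 - sum(pi ** xi for pi in p)) / (xi - 1.0)

def kerridge_inaccuracy(p, q):
    """Kerridge inaccuracy -sum_i p_i * log q_i. It equals the Shannon
    entropy when q == p, and the Shannon inequality says it is
    minimized there over all distributions q."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q))
```

For a fair coin, tsallis_entropy([0.5, 0.5], 2.0) gives (1 − 1/2)/1 = 0.5, and any mismatched q increases the inaccuracy above log 2.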


Author(s):  
Héctor Fernando Gómez-García ◽  
José L. Marroquín ◽  
Johan Van Horebeek

2009 ◽  
Vol 2009 ◽  
pp. 1-13
Author(s):  
Fritz Wysotzki ◽  
Peter Geibel

This article describes how the costs of misclassification given with the individual training objects for classification learning can be used in the construction of decision trees for minimal cost instead of minimal error class decisions. This is demonstrated by defining modified, cost-dependent probabilities, a new, cost-dependent information measure, and using a cost-sensitive extension of the CAL5 algorithm for learning decision trees. The cost-dependent information measure ensures the selection of the (local) next best, that is, cost-minimizing, discriminating attribute in the sequential construction of the classification trees. This is shown to be a cost-dependent generalization of the classical information measure introduced by Shannon, which only depends on classical probabilities. It is therefore of general importance and extends classic information theory, knowledge processing, and cognitive science, since subjective evaluations of decision alternatives can be included in entropy and the transferred information. Decision trees can then be viewed as cost-minimizing decoders for class symbols emitted by a source and coded by feature vectors. Experiments with two artificial datasets and one application example show that this approach is more accurate than a method which uses class dependent costs given by experts a priori.
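The CAL5-specific machinery is not reproduced here; as a minimal, generic sketch of the idea of cost-dependent probabilities, one can re-weight class frequencies by misclassification costs before computing the entropy used to pick a splitting attribute. The re-weighting rule and the function name are assumptions for illustration, not the paper's exact formulation.

```python
import math

def cost_weighted_entropy(counts, costs):
    """Shannon entropy of class probabilities re-weighted by
    misclassification costs: p'_k = c_k * n_k / sum_j c_j * n_j.
    A generic sketch of 'cost-dependent probabilities', not the
    exact CAL5 formulation from the paper."""
    weights = [c * n for c, n in zip(costs, counts)]
    total = sum(weights)
    probs = [w / total for w in weights]
    return -sum(p * math.log2(p) for p in probs if p > 0.0)
```

With equal costs this reduces to the ordinary class entropy (1 bit for a 50/50 node); raising one class's cost skews the distribution and lowers the entropy, steering attribute selection toward splits that isolate the expensive class.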

