Entropy Estimator
Recently Published Documents

TOTAL DOCUMENTS: 86 (five years: 15)
H-INDEX: 10 (five years: 1)

Entropy ◽ 2021 ◽ Vol 23 (5) ◽ pp. 561
Author(s): Lianet Contreras Rodríguez, Evaristo José Madarro-Capó, Carlos Miguel Legón-Pérez, Omar Rojas, Guillermo Sosa-Gómez

Entropy makes it possible to measure the uncertainty about an information source from the distribution of its output symbols. It is known that the maximum Shannon entropy of a discrete information source is reached when its symbols follow a uniform distribution. In cryptography, such sources have important applications, since they allow the highest security standards to be reached. In this work, the most effective estimator is selected for estimating the entropy of short samples of bytes and bits with maximum entropy. To this end, 18 estimators were compared, and results of the comparisons between these estimators published in the literature are discussed. The most suitable estimator is determined experimentally, based on its bias and mean square error for short samples of bytes and bits.
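For orientation, the sketch below shows the simplest estimator typically included in such comparisons, the plug-in (maximum-likelihood) estimator, together with a Monte Carlo estimate of its bias and mean square error on short samples from a uniform byte source; the sample size, trial count, and seed are illustrative assumptions, not values from the paper.

```python
# Minimal sketch: plug-in Shannon entropy estimate and its bias/MSE on short
# samples from a maximum-entropy (uniform) byte source. Sample size, number of
# trials, and seed are assumptions for illustration.
import numpy as np

def plugin_entropy(sample, alphabet_size=256):
    """Plug-in Shannon entropy estimate (bits) from observed symbol frequencies."""
    counts = np.bincount(sample, minlength=alphabet_size)
    p = counts[counts > 0] / len(sample)
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
true_h = 8.0                       # entropy of a uniform source over 256 symbols
n, trials = 64, 2000               # short samples of 64 bytes, 2000 repetitions
estimates = np.array([plugin_entropy(rng.integers(0, 256, n)) for _ in range(trials)])
print(f"bias = {estimates.mean() - true_h:.3f} bits, "
      f"MSE = {np.mean((estimates - true_h) ** 2):.4f}")
```

On samples this short the plug-in estimate is noticeably biased downward, which is why bias-corrected alternatives are compared in work such as this.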


2021 ◽ Vol 11 (8) ◽ pp. 3369
Author(s): Edgar F. Sierra-Alonso, Julian Caicedo-Acosta, Álvaro Ángel Orozco Gutiérrez, Héctor F. Quintero, German Castellanos-Dominguez

Vibration-condition monitoring aims to detect incipient bearing failures in rotating machinery, mainly through time–frequency methods because of their efficient analysis of nonstationary signals. However, for failures with impulsive behavior, short-term events tend to be diluted under variable-speed conditions, while information on frequency changes tends to be lost. Here, we introduce an approach for highlighting impulsive bearing failures by measuring short-term spectral components in variable-speed vibrations. The short-term estimator employs two sliding windows: a small window that measures the instantaneous amplitude level and tracks impulsive components, and a large interval that evaluates the average background amplitude. To characterize cyclo-non-stationary processes with impulsive behavior, a high-order estimator based on the principle of spectral entropy is introduced. For evaluation, both visual inspection and classifier performance are assessed, contrasting the spectral-entropy estimator with the widely used spectral-kurtosis approach for dealing with impulsive signals. Validation of short-time/-angle spectral analysis on three variable-speed datasets showed that the proposed spectral-entropy estimator is a promising indicator for emphasizing bearing failures with impulsive behavior.
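A minimal sketch of the two-window idea described above, as one possible interpretation rather than the authors' exact estimator: spectral entropy is computed over a short analysis window to flag impulsive, broadband events and compared with its average over a longer background window. The synthetic signal, sampling rate, and window lengths are assumptions.

```python
# Sketch (interpretation only): short-window spectral entropy vs. a long-window
# background average, used to flag impulsive events in a vibration-like signal.
import numpy as np
from scipy.signal import stft

def spectral_entropy(psd):
    """Normalised Shannon entropy of a power spectrum (1 = flat/broadband)."""
    p = psd / psd.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / np.log(psd.size)

fs = 20_000
t = np.arange(0, 1.0, 1 / fs)
x = np.sin(2 * np.pi * 50 * t) + 0.05 * np.random.default_rng(0).normal(size=t.size)
x[::2000] += 3.0                                   # synthetic bearing-like impulses

_, _, Z = stft(x, fs=fs, nperseg=256)              # short sliding analysis window
short_se = np.array([spectral_entropy(np.abs(Z[:, k]) ** 2) for k in range(Z.shape[1])])
background = np.convolve(short_se, np.ones(32) / 32, mode="same")  # long-window average
indicator = short_se - background                  # positive peaks flag impulsive events
```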


2021 ◽ Vol 14 (3) ◽ pp. 97
Author(s): Farzad Alavi Fard, Firmin Doko Tchatoka, Sivagowry Sriananthakumar

In this paper, we propose a maximum entropy estimator for the asymptotic distribution of the hedging error for options. Perfect replication of financial derivatives is not possible, due to market incompleteness and discrete-time hedging. We derive the asymptotic hedging error for options under a generalised jump-diffusion model with kernel bias, which nests a number of important processes in finance. We then obtain an estimate of the distribution of the hedging error by maximising Shannon's entropy subject to a set of moment constraints, which in turn yields the value-at-risk and expected shortfall of the hedging error. The significance of this approach lies in the fact that the maximum entropy estimator allows us to obtain a consistent estimate of the asymptotic distribution of the hedging error, despite the non-normality of the underlying distribution of returns.
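A minimal numerical sketch of the moment-constrained maximum entropy step (not the paper's derivation): a density of exponential-polynomial form is fitted to a set of assumed moments of the hedging error, and value-at-risk and expected shortfall are then read off the fitted distribution. The support grid, moment values, and confidence level are illustrative assumptions.

```python
# Sketch: maximum-entropy density exp(-1 - sum_k lambda_k x^k) matched to assumed
# moment constraints, then VaR and expected shortfall of the fitted distribution.
import numpy as np
from scipy.optimize import minimize

x = np.linspace(-5, 5, 2001)                     # support grid for the hedging error
dx = x[1] - x[0]
moments = np.array([1.0, 0.0, 1.0, 0.5, 4.0])    # assumed E[x^0..x^4] (non-normal)

def dual(lam):
    # Dual objective of the maximum-entropy problem with polynomial moment constraints.
    z = np.exp(-1 - sum(l * x**k for k, l in enumerate(lam)))
    return z.sum() * dx + lam @ moments

lam = minimize(dual, np.zeros(moments.size), method="Nelder-Mead",
               options={"maxiter": 5000}).x
pdf = np.exp(-1 - sum(l * x**k for k, l in enumerate(lam)))
pdf /= pdf.sum() * dx

alpha = 0.05
cdf = np.cumsum(pdf) * dx
var = x[np.searchsorted(cdf, alpha)]                    # value-at-risk at level alpha
es = (x[x <= var] * pdf[x <= var]).sum() * dx / alpha   # expected shortfall
print(f"VaR(5%) = {var:.3f}, ES(5%) = {es:.3f}")
```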


2020 ◽ Vol 7 (2) ◽ pp. 107-112
Author(s): Marian Manciu, Sorour Hosseini, Joscelyne Guzman-Gonzalez

Background: Statistical methods commonly used in survival analysis typically provide the probability that the difference between groups is due to chance, but do not offer a reliable estimate of the average survival-time difference between groups (the difference between median survival times is usually reported instead). Objective: We propose a Maximum-Entropy estimator for the average Survival Time Difference (MESTD) between groups. Methods: The estimator is based on the extra survival time that should be added to each member of a group to produce the maximum entropy of the result (so that the groups become most similar). The estimator is calculated only from time-to-event data, does not necessarily assume hazard proportionality, and provides the magnitude of the clinical difference between the groups. Results: Monte Carlo simulations show that, even at low sample sizes (much lower than those needed to show that the two groups differ statistically), the MESTD estimator is a reliable predictor of the clinical difference between the groups, and can therefore be used to estimate from preliminary, small-sample data whether a large-sample experiment is worth pursuing. Conclusion: By providing a reasonable estimate of the efficacy of a treatment (e.g., for cancer) even from small samples, it may offer useful insight for testing new treatment approaches (for example, quick testing of multiple combinations of cancer drugs).
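The sketch below illustrates one plausible reading of the construction, not necessarily the authors' exact algorithm: a constant extra time delta is added to the shorter-surviving group, and the delta that maximises the entropy of group membership within pooled time bins, i.e. the shift under which the two groups look most alike, is reported. The synthetic data, binning, and the absence of censoring are assumptions.

```python
# Sketch (one plausible reading): the extra time delta added to one group that
# maximises the average entropy of group labels within pooled time bins, i.e.
# the shift making the two groups most similar. Censoring is ignored here.
import numpy as np

def label_entropy(delta, group_a, group_b, bins=15):
    """Sample-weighted average entropy of group membership across time bins."""
    pooled = np.concatenate([group_a + delta, group_b])
    labels = np.concatenate([np.zeros(group_a.size), np.ones(group_b.size)])
    edges = np.histogram_bin_edges(pooled, bins=bins)
    idx = np.clip(np.digitize(pooled, edges) - 1, 0, bins - 1)
    total = 0.0
    for b in range(bins):
        lab = labels[idx == b]
        if lab.size == 0:
            continue
        p = np.array([np.mean(lab == 0), np.mean(lab == 1)])
        p = p[p > 0]
        total += lab.size * -np.sum(p * np.log(p))
    return total / pooled.size

rng = np.random.default_rng(1)
group_a = rng.exponential(scale=10.0, size=40)   # shorter-surviving group
group_b = rng.exponential(scale=14.0, size=40)   # longer-surviving group

deltas = np.linspace(0.0, 15.0, 151)
mestd = deltas[np.argmax([label_entropy(d, group_a, group_b) for d in deltas])]
print(f"maximum-entropy estimate of the survival-time difference: {mestd:.2f}")
```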


2020 ◽ Vol 34 (04) ◽ pp. 5013-5020
Author(s): Chien Lu, Jaakko Peltonen

An improved, ellipsoid-based kNN entropy estimator for high-dimensional distributions, computed from random samples, is developed. We argue that the inaccuracy of the classical kNN estimator in high-dimensional spaces results from the local uniformity assumption, and the proposed method mitigates this assumption through two crucial extensions: a local ellipsoid-based volume correction and a correction acceptance-testing procedure. Relevant theoretical contributions are provided, and several experiments, from simple to complicated cases, show that the proposed estimator can effectively reduce the bias, especially in high dimensions, outperforming current state-of-the-art alternative estimators.
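For reference, the sketch below implements the classical Kozachenko–Leonenko kNN estimator that the proposed method improves on; the ellipsoid-based volume correction and the acceptance test themselves are not reproduced here. The sample size, dimensionality, and k are assumptions.

```python
# Sketch: classical Kozachenko-Leonenko kNN differential-entropy estimator
# (Euclidean metric); the paper's ellipsoid correction is not included.
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def knn_entropy(x, k=3):
    """Kozachenko-Leonenko estimate of differential entropy (nats) from samples x of shape (n, d)."""
    n, d = x.shape
    eps = cKDTree(x).query(x, k=k + 1)[0][:, k]   # distance to k-th neighbour (self excluded)
    log_ball = d * np.log(eps) + (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    return digamma(n) - digamma(k) + np.mean(log_ball)

rng = np.random.default_rng(0)
d = 5
x = rng.normal(size=(2000, d))                    # d-dimensional standard normal
true_h = 0.5 * d * np.log(2 * np.pi * np.e)       # analytic entropy for comparison
print(f"kNN estimate = {knn_entropy(x):.3f} nats, true = {true_h:.3f} nats")
```

The bias of this baseline, which stems from the local uniformity assumption, grows with dimension; that is the behavior the ellipsoid correction described above targets.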


Author(s): Basim Shlaibah Msallam, Saifaldin Hashim Kamar

It is well known that the Generalized Maximum Entropy method can be used to fit linear regression models, in particular because it is not restricted by the conditions that must be verified for other classical methods. In this paper, a new method for estimating the parameters of the four-parameter Weibull growth model is proposed, using the Generalized Maximum Entropy function to fit data based on the Haar matrix used in the wavelet method. The suggested and classical entropy estimators for the Weibull growth model parameters were compared by simulation, and the preference for the suggested estimator was shown. The Modified Generalized Maximum Entropy estimator was applied to real data representing annual Iraqi oil production for the period 2010–2017, and Iraqi crude oil production for the year 2022 was predicted to be 4.4 million bb/day.
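A minimal sketch of the Generalized Maximum Entropy idea the paper builds on, shown for a simple linear model rather than the four-parameter Weibull growth model; the Haar-matrix modification is not reproduced. Each coefficient and each error term is written as a probability-weighted sum over a fixed support, and the Shannon entropy of those probabilities is maximised subject to the data constraints. The supports and toy data are assumptions.

```python
# Sketch: Generalized Maximum Entropy (GME) estimation of a simple linear model.
# Coefficients and errors are expressed over fixed supports with unknown
# probabilities whose entropy is maximised subject to the data constraints.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 15
xs = np.linspace(0, 10, n)
y = 2.0 + 0.8 * xs + rng.normal(0, 0.5, n)       # true intercept 2.0, slope 0.8
X = np.column_stack([np.ones(n), xs])

z = np.array([-10.0, 0.0, 10.0])                 # support for each coefficient
v = np.array([-2.0, 0.0, 2.0])                   # support for each error term
K, M, J = X.shape[1], z.size, v.size

def unpack(theta):
    p = theta[:K * M].reshape(K, M)              # coefficient probabilities
    w = theta[K * M:].reshape(n, J)              # error probabilities
    return p, w

def neg_entropy(theta):
    t = np.clip(theta, 1e-10, 1.0)
    return np.sum(t * np.log(t))

constraints = [
    # data constraints: y_i = X_i (Z p) + v . w_i
    {"type": "eq", "fun": lambda th: X @ (unpack(th)[0] @ z) + unpack(th)[1] @ v - y},
    # each probability vector must sum to one
    {"type": "eq", "fun": lambda th: unpack(th)[0].sum(axis=1) - 1.0},
    {"type": "eq", "fun": lambda th: unpack(th)[1].sum(axis=1) - 1.0},
]
theta0 = np.concatenate([np.full(K * M, 1 / M), np.full(n * J, 1 / J)])
res = minimize(neg_entropy, theta0, method="SLSQP", constraints=constraints,
               bounds=[(1e-10, 1.0)] * theta0.size, options={"maxiter": 500})
print("GME estimates (intercept, slope):", np.round(unpack(res.x)[0] @ z, 3))
```

The estimated coefficients are recovered as the support values weighted by the fitted probabilities, which is what makes the approach usable without the distributional conditions classical estimators require.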


2019 ◽ Vol 14 (15) ◽ pp. 67-73
Author(s): Aleksandr Martynenko, Gianfranko Raimondi, N. Budreiko, ...
