Fisher Information and Uncertainty Principle for Skew-Gaussian Random Variables
Fisher information is a measure used to quantify information and to estimate system-defining parameters. Its scaling and uncertainty properties, linked with Shannon entropy, are useful for characterizing signals through the Fisher–Shannon plane. Moreover, since assuming Gaussianity in evolving systems is often unrealistic, several non-Gaussian distributions that accommodate asymmetry and heavy tails have been proposed as more suitable models. This has motivated the study, in this paper, of Fisher information and the uncertainty principle for skew-Gaussian random variables. We describe the effect of the skew-Gaussian distribution on the uncertainty principle, from which the Fisher information, the Shannon entropy power, and the Fisher divergence are derived. Results indicate that the flexibility of the skew-Gaussian distribution, governed by a shape parameter, allows explicit expressions of these measures to be derived and a new Fisher–Shannon information plane to be defined. The performance of the proposed methodology is illustrated by numerical results and by applications to condition factor time series.
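To make the quantities in the abstract concrete, the following sketch (not the paper's own derivation) numerically evaluates the Fisher information J = ∫ f′(x)²/f(x) dx and the Shannon entropy power N = exp(2H)/(2πe) for the standard skew-normal density f(x) = 2φ(x)Φ(αx), and checks Stam's uncertainty inequality N·J ≥ 1, which holds with equality only in the Gaussian case (α = 0). The function name and grid are illustrative choices; the density comes from SciPy's `skewnorm`.

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.stats import skewnorm

def fisher_shannon(alpha, grid=np.linspace(-12, 12, 200001)):
    """Numerically compute (J, N) for the standard skew-normal with shape alpha."""
    # Skew-normal density f(x) = 2*phi(x)*Phi(alpha*x)
    f = skewnorm.pdf(grid, alpha)
    df = np.gradient(f, grid)          # numerical derivative f'(x)
    mask = f > 1e-300                  # avoid division by underflowed densities
    # Fisher information J = integral of f'(x)^2 / f(x)
    J = trapezoid(df[mask] ** 2 / f[mask], grid[mask])
    # Shannon differential entropy H = -integral of f log f
    H = -trapezoid(f[mask] * np.log(f[mask]), grid[mask])
    # Shannon entropy power N = exp(2H) / (2*pi*e)
    N = np.exp(2.0 * H) / (2.0 * np.pi * np.e)
    return J, N

J0, N0 = fisher_shannon(0.0)   # Gaussian case: N*J = 1 (equality in Stam's inequality)
J3, N3 = fisher_shannon(3.0)   # skewed case: N*J > 1 strictly
```

The product N·J is scale-invariant, which is why the Fisher–Shannon plane (plotting one measure against the other) can separate distributions by shape rather than by scale.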