The Metrics of Info-Metrics
In this chapter I present the key ideas and develop the essential quantitative metrics needed for modeling and inference with limited information. I provide the necessary tools for studying the traditional maximum-entropy principle, which is the cornerstone of info-metrics. The chapter starts by defining the primary notions of information and entropy as they relate to probabilities and uncertainty. The unique properties of entropy are explained. The derivations and discussion are then extended to multivariable entropies and informational quantities. For completeness, I also discuss the full set of Shannon-Khinchin axioms underlying the entropy measure. An additional derivation of information and entropy, due to Wiener's independently developed work, is provided as well.
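As a concrete preview of the entropy measure developed in this chapter, the sketch below computes the Shannon entropy of a discrete probability distribution. The function name and interface are illustrative choices, not taken from the text; the formula is the standard H(p) = -Σ p_i log p_i.

```python
import math

def shannon_entropy(p, base=2):
    """Shannon entropy H(p) = -sum_i p_i log p_i of a discrete distribution.

    Outcomes with p_i = 0 contribute nothing, since p log p -> 0 as p -> 0.
    With base=2 the result is measured in bits; use base=math.e for nats.
    """
    if not math.isclose(sum(p), 1.0):
        raise ValueError("probabilities must sum to 1")
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

# Uniform distribution over four outcomes: maximal uncertainty,
# H = log2(4) = 2 bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))

# Degenerate distribution: the outcome is certain, so H = 0 bits.
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))
```

The uniform distribution attains the maximum entropy over a fixed number of outcomes, which is precisely the property the maximum-entropy principle exploits when choosing a distribution consistent with limited information.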