Algorithm AS 292: Fisher Information Matrix for the Extreme Value, Normal and Logistic Distributions and Censored Data
Author(s): Luis A. Escobar, William Q. Meeker

Entropy, 2019, Vol. 21 (2), pp. 110
Author(s): Stephen Taylor

Information geometry provides a correspondence between differential geometry and statistics through the Fisher information matrix. In particular, given two models from the same parametric family of distributions, one can define the distance between them as the length of the geodesic connecting them in a Riemannian manifold whose metric is given by the model's Fisher information matrix. One limitation that has hindered the adoption of this similarity measure in practical applications is that the Fisher distance is typically difficult to compute in a robust manner. We review such complications and provide a general form for the distance function for one-parameter models. We next focus on higher-dimensional extreme value models, including the generalized Pareto and generalized extreme value distributions, which we use in financial risk applications. Specifically, we first develop a technique to identify the nearest neighbors of a target security, in the sense that their best-fit model distributions have minimal Fisher distance to the target. Second, we develop a hierarchical clustering technique that utilizes the Fisher distance. In particular, we compare generalized extreme value distributions fitted to block maxima of a set of equity loss distributions and group together securities whose worst single-day yearly loss distributions exhibit similarities.
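As a rough illustration of the workflow the abstract describes, the sketch below is a minimal, self-contained approximation rather than the paper's implementation. It evaluates the standard one-parameter Fisher distance d(θ₁, θ₂) = |∫ √I(θ) dθ| numerically, then fits GEV models to synthetic block maxima and hierarchically clusters them. The ticker names and loss series are hypothetical, and a Monte Carlo symmetrized-KL surrogate is used in place of the paper's geodesic Fisher distance.

```python
# Minimal sketch, not the paper's implementation: a numerical one-parameter
# Fisher distance, plus GEV fits to (synthetic) block maxima clustered with
# a Monte Carlo surrogate for the Fisher distance between fitted models.
import numpy as np
from scipy.integrate import quad
from scipy.stats import genextreme
from scipy.spatial.distance import squareform
from scipy.cluster.hierarchy import linkage, fcluster


def fisher_distance_1d(theta_a, theta_b, fisher_info):
    """One-parameter Fisher distance d(a, b) = | int_a^b sqrt(I(t)) dt |."""
    val, _ = quad(lambda t: np.sqrt(fisher_info(t)), theta_a, theta_b)
    return abs(val)


# Sanity check: for an exponential rate parameter, I(lam) = 1 / lam**2,
# so the distance between rates 1 and 2 is log(2).
assert np.isclose(fisher_distance_1d(1.0, 2.0, lambda lam: 1.0 / lam**2),
                  np.log(2.0))

# Hypothetical block-maxima data: one array of worst single-day yearly
# losses per ticker (synthetic GEV draws stand in for real equity returns).
tickers = ["AAA", "BBB", "CCC", "DDD"]
block_maxima = {
    t: genextreme.rvs(c=-0.1, loc=3.0 + 0.5 * i, scale=1.0, size=25,
                      random_state=100 + i)
    for i, t in enumerate(tickers)
}

# Fit a GEV model (scipy shape c, loc, scale) to each ticker's block maxima.
gev_params = {t: genextreme.fit(x) for t, x in block_maxima.items()}


def surrogate_fisher_distance(p, q, n_mc=4000, seed=1):
    """Stand-in for the geodesic Fisher distance between two fitted GEV
    models: the square root of a Monte Carlo symmetrized KL divergence,
    which matches the Fisher metric to second order for nearby models."""
    xp = genextreme.rvs(*p, size=n_mc, random_state=seed)
    xq = genextreme.rvs(*q, size=n_mc, random_state=seed + 1)
    lq = genextreme.logpdf(xp, *q)
    lp = genextreme.logpdf(xq, *p)
    # Drop samples outside the other model's support (logpdf = -inf).
    kl_pq = np.mean((genextreme.logpdf(xp, *p) - lq)[np.isfinite(lq)])
    kl_qp = np.mean((genextreme.logpdf(xq, *q) - lp)[np.isfinite(lp)])
    return np.sqrt(max(kl_pq + kl_qp, 0.0))


n = len(tickers)
dist = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        d = surrogate_fisher_distance(gev_params[tickers[i]],
                                      gev_params[tickers[j]])
        dist[i, j] = dist[j, i] = d

# Agglomerative clustering on the pairwise distance matrix; the nearest
# neighbors of a target ticker are the smallest entries in its row of dist.
Z = linkage(squareform(dist), method="average")
print(dict(zip(tickers, fcluster(Z, t=2, criterion="maxclust"))))
```

Swapping the surrogate for an exact geodesic computation under the Fisher information metric, as the paper develops, only changes the distance function; the fitting and clustering steps are unaffected.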

