SEMIPARAMETRIC IDENTIFICATION AND FISHER INFORMATION

2021 ◽ pp. 1-38 ◽ Author(s): Juan Carlos Escanciano

This paper provides a systematic approach to semiparametric identification that is based on statistical information as a measure of the “quality” of identification. Identification can be regular or irregular, depending on whether the Fisher information for the parameter is positive or zero, respectively. I first characterize these cases in models with densities linear in an infinite-dimensional parameter. I then introduce a novel “generalized Fisher information.” If positive, it implies (possibly irregular) identification when other conditions hold. If zero, it implies impossibility results for rates of estimation. Three examples illustrate the applicability of the general results. First, I consider the canonical example of average densities. Second, I show irregular identification of the median willingness to pay in contingent valuation studies. Finally, I study identification of the discount factor and average measures of risk aversion in a nonparametric Euler equation with nonparametric measurement error in consumption.
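For orientation, here is a standard worked sketch of the regular case for the first example, the average density, in an i.i.d. sample from a bounded density f; the path and notation below are illustrative conventions, not taken verbatim from the paper.

\[
\theta(f) = \int f^{2}(x)\,dx, \qquad f_{t} = f\,(1 + t\,h), \quad E_{f}[h(X)] = 0, \quad h \text{ bounded},
\]
\[
\frac{d}{dt}\,\theta(f_{t})\Big|_{t=0} = 2\int f^{2}(x)\,h(x)\,dx = 2\,E_{f}\big[(f(X) - \theta)\,h(X)\big],
\]
so the efficient influence function is \(2\,(f(X) - \theta)\) and the Fisher information for \(\theta\) is
\[
I_{\theta} = \big(4\,\mathrm{Var}_{f}\, f(X)\big)^{-1},
\]
which is positive whenever \(f(X)\) is non-degenerate: identification is regular and \(\sqrt{n}\)-rate estimation is attainable. When the corresponding information is zero, the parameter is at best irregularly identified and only slower rates are possible.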

2021 ◽ Vol 47 (2) ◽ pp. 1-34 ◽ Author(s): Umberto Villa, Noemi Petra, Omar Ghattas

We present an extensible software framework, hIPPYlib, for the solution of large-scale deterministic and Bayesian inverse problems governed by partial differential equations (PDEs) with (possibly) infinite-dimensional parameter fields (which are high-dimensional after discretization). hIPPYlib overcomes the prohibitively expensive nature of Bayesian inversion for this class of problems by implementing state-of-the-art scalable algorithms for PDE-based inverse problems that exploit the structure of the underlying operators, notably the Hessian of the log-posterior. The key property of the algorithms implemented in hIPPYlib is that the solution of the inverse problem is computed at a cost, measured in linearized forward PDE solves, that is independent of the parameter dimension. The mean of the posterior is approximated by the MAP point, which is found by minimizing the negative log-posterior with an inexact matrix-free Newton-CG method. The posterior covariance is approximated by the inverse of the Hessian of the negative log-posterior evaluated at the MAP point; its construction is made tractable by invoking a low-rank approximation of the Hessian of the log-likelihood. Scalable tools for sample generation are also discussed. hIPPYlib makes all of these advanced algorithms easily accessible to domain scientists and provides an environment that expedites the development of new algorithms.
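To make the low-rank Laplace construction concrete, here is a minimal dense NumPy sketch of the underlying linear algebra, with toy random SPD matrices standing in for hIPPYlib's matrix-free PDE operators; this is not hIPPYlib code, and all names are illustrative.

import numpy as np
from scipy.linalg import eigh

# Toy stand-ins: in hIPPYlib these would be matrix-free PDE operators.
rng = np.random.default_rng(0)
n, r = 50, 5                                   # parameter dimension, retained rank
A = rng.standard_normal((n, n))
G_prior = A @ A.T / n + np.eye(n)              # prior covariance (SPD)
B = rng.standard_normal((n, r))
H_misfit = B @ B.T                             # data-misfit Hessian, low effective rank

# Generalized eigenproblem  H_misfit v = lam * G_prior^{-1} v;
# scipy returns eigenvectors orthonormal w.r.t. G_prior^{-1}.
lam, V = eigh(H_misfit, np.linalg.inv(G_prior))
order = np.argsort(lam)[::-1][:r]              # keep the r dominant eigenpairs
lam_r, V_r = lam[order], V[:, order]

# Low-rank Laplace approximation of the posterior covariance:
#   G_post ~ G_prior - V_r diag(lam_i / (1 + lam_i)) V_r^T
D = np.diag(lam_r / (1.0 + lam_r))
G_post = G_prior - V_r @ D @ V_r.T

# Sanity check against the exact (dense) Laplace covariance.
G_exact = np.linalg.inv(H_misfit + np.linalg.inv(G_prior))
print(np.linalg.norm(G_post - G_exact) / np.linalg.norm(G_exact))

Because the data-misfit Hessian typically has a rapidly decaying spectrum, only a handful of Hessian-vector products (i.e., linearized forward and adjoint PDE solves) are needed to capture the dominant eigenpairs, which is what makes the cost independent of the parameter dimension.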


2016 ◽ Vol 219 (5) ◽ pp. 731-742 ◽ Author(s): V. A. Ershov, I. A. Ibragimov

Test ◽ 2019 ◽ Vol 29 (4) ◽ pp. 966-988 ◽ Author(s): Francesco Bravo

This paper considers estimation and inference for a class of varying coefficient models in which some of the responses and some of the covariates are missing at random and outliers are present. The paper proposes two general estimators—and a computationally attractive and asymptotically equivalent one-step version of them—that combine inverse probability weighting and robust local linear estimation. The paper also considers inference for the unknown infinite-dimensional parameter and proposes two Wald statistics that are shown to have power under a sequence of local Pitman drifts and are consistent as the drifts diverge. The results of the paper are illustrated with three examples: robust local generalized estimating equations, robust local quasi-likelihood and robust local nonlinear least squares estimation. A simulation study shows that the proposed estimators and test statistics have competitive finite sample properties, whereas two empirical examples illustrate the applicability of the proposed estimation and testing methods.
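As an illustration of combining inverse probability weighting with robust local linear estimation, here is a minimal Python sketch for a toy varying coefficient model Y = beta(U) X + error with responses missing at random; the function names, the Huber loss, and the Gaussian kernel are illustrative choices, not the paper's exact estimator.

import numpy as np

def huber_weights(r, c=1.345):
    """Huber psi(r)/r weights: 1 for small residuals, c/|r| for large ones."""
    a = np.abs(r)
    w = np.ones_like(a)
    big = a > c
    w[big] = c / a[big]
    return w

def ipw_robust_local_linear(u0, U, X, Y, delta, pi, h, c=1.345, iters=25):
    """IPW-weighted robust local linear estimate of beta(u0) in Y = beta(U) X + e,
    where Y is observed iff delta == 1 and pi are the observation probabilities."""
    K = np.exp(-0.5 * ((U - u0) / h) ** 2)           # Gaussian kernel weights
    base_w = delta * K / np.clip(pi, 1e-3, None)     # inverse probability weights
    Z = np.column_stack([X, X * (U - u0)])           # local linear design
    y = np.where(delta == 1, Y, 0.0)                 # missing responses carry zero weight anyway
    theta = np.zeros(2)
    for _ in range(iters):                           # IRLS with Huber weights
        resid = y - Z @ theta
        scale = 1.4826 * np.median(np.abs(resid[delta == 1])) + 1e-12
        w = base_w * huber_weights(resid / scale, c)
        theta = np.linalg.solve(Z.T @ (Z * w[:, None]) + 1e-8 * np.eye(2),
                                Z.T @ (w * y))
    return theta[0]                                  # local intercept estimates beta(u0)

# Toy data: beta(u) = sin(2*pi*u), heavy-tailed errors, responses missing at random.
rng = np.random.default_rng(1)
n = 2000
U = rng.uniform(size=n)
X = rng.standard_normal(n)
Y = np.sin(2 * np.pi * U) * X + rng.standard_t(df=3, size=n)
pi = 1.0 / (1.0 + np.exp(-(0.5 + X)))                # missingness depends on covariates only
delta = rng.binomial(1, pi)
print(ipw_robust_local_linear(0.25, U, X, Y, delta, pi, h=0.1))   # should be near sin(pi/2) = 1

A one-step variant, in the spirit of the asymptotically equivalent estimator mentioned in the abstract, would replace the full reweighting loop with a single Newton-type update from a preliminary fit; the loop above is just the simplest way to exhibit the weighting.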

