NONPARAMETRIC INSTRUMENTAL REGRESSION WITH ERRORS IN VARIABLES

2018 ◽  
Vol 34 (6) ◽  
pp. 1256-1280 ◽  
Author(s):  
Karun Adusumilli ◽  
Taisuke Otsu

This paper considers nonparametric instrumental variable regression when the endogenous variable is contaminated with classical measurement error. Existing methods are inconsistent in the presence of measurement error. We propose a wavelet deconvolution estimator for the structural function that modifies the generalized Fourier coefficients of the orthogonal series estimator to take into account the measurement error. We establish the convergence rates of our estimator for the cases of mildly and severely ill-posed models and of ordinary smooth and supersmooth measurement errors. We characterize how the presence of measurement error slows down the convergence rates of the estimator. We also study the case where the measurement error density is unknown and needs to be estimated, and show that the estimation error of the measurement error density is negligible under mild conditions as long as the measurement error density is symmetric.
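The Fourier-domain logic behind deconvolution estimators of this kind can be illustrated with a toy sketch (my own construction, not the authors' wavelet estimator): with a known Gaussian error density, the characteristic function of the latent variable is the empirical characteristic function of the contaminated observations divided by the error characteristic function, with the frequency grid truncated as a crude regularization.

```python
import numpy as np

# Toy deconvolution sketch (assumed setup, not the paper's wavelet method):
# W = X + error, error ~ N(0, 0.5^2) with known density. The CF of X is
# recovered by dividing empirical CFs; truncating |t| acts as regularization.
rng = np.random.default_rng(0)
n = 50_000
x = rng.normal(0.0, 1.0, n)            # latent error-free variable
w = x + rng.normal(0.0, 0.5, n)        # contaminated observation

t = np.linspace(-3.0, 3.0, 61)         # truncated frequency grid
ecf_w = np.exp(1j * np.outer(t, w)).mean(axis=1)   # empirical CF of W
cf_err = np.exp(-0.5 * (0.5 * t) ** 2)             # known error CF (Gaussian)
ecf_x = ecf_w / cf_err                             # deconvolved CF of X

# X is standard normal here, so its true CF is exp(-t^2 / 2).
max_err = np.max(np.abs(ecf_x - np.exp(-0.5 * t ** 2)))
```

Dividing by `cf_err` amplifies sampling noise at high frequencies, which is why the frequency cutoff (here, the hard-coded grid) must grow slowly with the sample size; the paper's rates quantify exactly this trade-off.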

2021 ◽  
pp. 1-22
Author(s):  
Daisuke Kurisu ◽  
Taisuke Otsu

This paper studies the uniform convergence rates of Li and Vuong’s (1998, Journal of Multivariate Analysis 65, 139–165; hereafter LV) nonparametric deconvolution estimator and its regularized version by Comte and Kappus (2015, Journal of Multivariate Analysis 140, 31–46) for the classical measurement error model, where repeated noisy measurements on the error-free variable of interest are available. In contrast to LV, our assumptions allow unbounded supports for the error-free variable and measurement errors. Compared to Bonhomme and Robin (2010, Review of Economic Studies 77, 491–533) specialized to the measurement error model, our assumptions do not require existence of the moment generating functions of the square and product of repeated measurements. Furthermore, by utilizing a maximal inequality for the multivariate normalized empirical characteristic function process, we derive uniform convergence rates that are faster than the ones derived in these papers under such weaker conditions.


2011 ◽  
Vol 49 (4) ◽  
pp. 901-937 ◽  
Author(s):  
Xiaohong Chen ◽  
Han Hong ◽  
Denis Nekipelov

Measurement errors in economic data are pervasive and nontrivial in size. The presence of measurement errors causes biased and inconsistent parameter estimates and leads to erroneous conclusions to various degrees in economic analysis. While linear errors-in-variables models are usually handled with well-known instrumental variable methods, this article provides an overview of recent research papers that derive estimation methods that provide consistent estimates for nonlinear models with measurement errors. We review models with both classical and nonclassical measurement errors, and with misclassification of discrete variables. For each of the methods surveyed, we describe the key ideas for identification and estimation, and discuss its application whenever it is currently available. (JEL C20, C26, C50)


Biometrika ◽  
2019 ◽  
Vol 107 (1) ◽  
pp. 238-245
Author(s):  
Zhichao Jiang ◽  
Peng Ding

Instrumental variable methods can identify causal effects even when the treatment and outcome are confounded. We study the problem of imperfect measurements of the binary instrumental variable, treatment and outcome. We first consider nondifferential measurement errors, that is, the mismeasured variable does not depend on other variables given its true value. We show that the measurement error of the instrumental variable does not bias the estimate, that the measurement error of the treatment biases the estimate away from zero, and that the measurement error of the outcome biases the estimate toward zero. Moreover, we derive sharp bounds on the causal effects without additional assumptions. These bounds are informative because they exclude zero. We then consider differential measurement errors, and focus on sensitivity analyses in those settings.
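The three bias directions can be checked with elementary arithmetic on the Wald ratio (my own numbers; the conditional probabilities below are assumptions, not taken from the paper):

```python
# Wald ratio = reduced-form contrast / first-stage contrast, with a binary
# instrument Z, treatment D, and outcome Y, and nondifferential flips.
pD = {0: 0.2, 1: 0.8}   # P(D = 1 | Z = z), first stage (assumed values)
pY = {0: 0.4, 1: 0.7}   # P(Y = 1 | Z = z), reduced form (assumed values)
wald_true = (pY[1] - pY[0]) / (pD[1] - pD[0])   # = 0.5

eps = 0.1   # flip probability of the mismeasured variable

# Instrument flipped (with P(Z = 1) = 0.5): both contrasts shrink by the
# same factor (1 - 2 * eps), so the ratio is unchanged.
wald_z = ((1 - 2 * eps) * (pY[1] - pY[0])) / ((1 - 2 * eps) * (pD[1] - pD[0]))

# Treatment flipped: only the first stage (denominator) shrinks -> away from 0.
wald_d = (pY[1] - pY[0]) / ((1 - 2 * eps) * (pD[1] - pD[0]))

# Outcome flipped: only the reduced form (numerator) shrinks -> toward 0.
wald_y = ((1 - 2 * eps) * (pY[1] - pY[0])) / (pD[1] - pD[0])
```

With these numbers the true ratio is 0.5, instrument error leaves it at 0.5, treatment error inflates it to 0.625, and outcome error attenuates it to 0.4, matching the paper's qualitative results.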


2010 ◽  
Vol 27 (3) ◽  
pp. 522-545 ◽  
Author(s):  
Jan Johannes ◽  
Sébastien Van Bellegem ◽  
Anne Vanhems

This paper studies the estimation of a nonparametric function φ from the inverse problem r = Tφ given estimates of the function r and of the linear transform T. We show that rates of convergence of the estimator are driven by two types of assumptions expressed in a single Hilbert scale. The two assumptions quantify the prior regularity of φ and the prior link existing between T and the Hilbert scale. The approach provides a unified framework that allows us to compare various sets of structural assumptions found in the econometric literature. Moreover, general upper bounds are also derived for the risk of the estimator of the structural function φ as well as that of its derivatives. It is shown that the bounds cover and extend known results given in the literature. Two important applications are also studied. The first is the blind nonparametric deconvolution on the real line, and the second is the estimation of the derivatives of the nonparametric instrumental regression function via an iterative Tikhonov regularization scheme.
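The basic shape of the r = Tφ problem and its Tikhonov solution can be sketched in a discretized toy version (my own assumed setup, not the paper's Hilbert-scale machinery):

```python
import numpy as np

# Minimal Tikhonov sketch: recover phi from a noisy estimate of r = T phi
# by solving (T'T + alpha I) phi_alpha = T' r, where T is a smoothing
# (hence ill-posed) integral operator. All choices below are illustrative.
rng = np.random.default_rng(1)
k = 200
grid = np.linspace(0.0, 1.0, k)
phi = np.sin(2.0 * np.pi * grid)                      # structural function

# Gaussian-kernel integral operator: it strongly damps high frequencies.
T = np.exp(-((grid[:, None] - grid[None, :]) ** 2) / 0.02) / k
r = T @ phi + rng.normal(0.0, 1e-5, k)                # noisy estimate of r

alpha = 1e-6                                           # regularization level
phi_hat = np.linalg.solve(T.T @ T + alpha * np.eye(k), T.T @ r)

max_err = np.max(np.abs(phi_hat - phi))                # small for this alpha
```

The rate results in the paper describe how the best achievable error behaves as the noise level shrinks and α is tuned accordingly, with the regularity of φ and the smoothing strength of T both measured on one Hilbert scale.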


2020 ◽  
Vol 36 (4) ◽  
pp. 658-706 ◽  
Author(s):  
Andrii Babii

This article develops inferential methods for a very general class of ill-posed models in econometrics, encompassing nonparametric instrumental variable regression, various functional regressions, and density deconvolution. We focus on uniform confidence sets for the parameter of interest estimated with Tikhonov regularization, as in Darolles et al. (2011, Econometrica 79, 1541–1565). Since it is impossible to have inferential methods based on the central limit theorem, we develop two alternative approaches relying on the concentration inequality and bootstrap approximations. We show that expected diameters and coverage properties of resulting sets have uniform validity over a large class of models, that is, constructed confidence sets are honest. Monte Carlo experiments illustrate that introduced confidence sets have reasonable width and coverage properties. Using U.S. data, we provide uniform confidence sets for Engel curves for various commodities.


2017 ◽  
Vol 928 (10) ◽  
pp. 58-63 ◽  
Author(s):  
V.I. Salnikov

The initial subject of study is consistent sums of measurement errors. The errors are assumed to follow the normal law, but with a limit on the marginal error, Δpred = 2m. It is known that for each confidence interval there is a number of terms nᵢ at which the value of the sum equals zero. The paradox is that the probability of such an event is zero; it is therefore impossible to determine the value of nᵢ at which the sum becomes zero. The article proposes instead to consider the event that a sum of errors varies within the ±2m limits with a confidence level of 0.954. Within the group, all the sums then have a limit error. These tolerances are proposed for use with the discrepancies in geodesy instead of 2m·√nᵢ. The concept of “the law of the truncated normal distribution with Δpred = 2m” is suggested to be introduced.
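The numbers implicit in this setup are standard normal arithmetic (my own note; the symbols m and n below are illustrative, not taken from the article):

```python
import math

# For one normal error with standard deviation m, the confidence level
# attached to a +/- 2m tolerance is P(|X| <= 2m) ~ 0.954. A sum of n
# independent such errors has standard deviation m * sqrt(n), which is
# where the conventional tolerance 2m * sqrt(n) at the same level comes from.
level = math.erf(2.0 / math.sqrt(2.0))         # P(|X| <= 2 * sigma)

m_err, n = 1.0, 16                             # illustrative values only
conventional_tol = 2.0 * m_err * math.sqrt(n)  # 2m * sqrt(n) = 8.0 here
```

The article's proposal amounts to replacing the √n growth of the tolerance with the fixed bound 2m, justified by truncating the normal law at Δpred = 2m.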


Author(s):  
Radu Boţ ◽  
Guozhi Dong ◽  
Peter Elbau ◽  
Otmar Scherzer

Recently, there has been a great interest in analysing dynamical flows, where the stationary limit is the minimiser of a convex energy. Particular flows of great interest have been continuous limits of Nesterov’s algorithm and the fast iterative shrinkage-thresholding algorithm, respectively. In this paper, we approach the solutions of linear ill-posed problems by dynamical flows. Because the squared norm of the residual of a linear operator equation is a convex functional, the theoretical results from convex analysis for energy minimising flows are applicable. However, in the restricted situation of this paper they can often be significantly improved. Moreover, since we show that the proposed flows for minimising the norm of the residual of a linear operator equation are optimal regularisation methods and that they provide optimal convergence rates for the regularised solutions, the given rates can be considered the benchmarks for further studies in convex analysis.
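The simplest such flow is the gradient flow of the residual energy; a hedged finite-dimensional toy (my own sketch, using an explicit-Euler/Landweber discretization rather than the accelerated flows studied in the paper) looks like this:

```python
import numpy as np

# Explicit Euler discretization of the gradient flow x'(t) = -A^T (A x - b)
# for the convex residual energy ||A x - b||^2 / 2. In the ill-posed setting
# the stopping time plays the role of the regularization parameter; here the
# toy system is well conditioned, so the flow simply reaches the solution.
rng = np.random.default_rng(2)
A = rng.normal(size=(50, 20))
x_true = rng.normal(size=20)
b = A @ x_true                           # noiseless data for A x = b

x = np.zeros(20)
step = 1.0 / np.linalg.norm(A, 2) ** 2   # step below 1/L keeps Euler stable
for _ in range(10_000):
    x = x - step * (A.T @ (A @ x - b))   # one Euler step of the flow

final_gap = np.linalg.norm(x - x_true)   # flow approaches the minimiser
```

The paper's contribution is to show that, for linear operator equations, suitably chosen flows of this kind are optimal regularisation methods with optimal convergence rates, sharpening what generic convex-analysis results give.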


Author(s):  
Alice R. Carter ◽  
Eleanor Sanderson ◽  
Gemma Hammerton ◽  
Rebecca C. Richmond ◽  
George Davey Smith ◽  
...  

Mediation analysis seeks to explain the pathway(s) through which an exposure affects an outcome. Traditional, non-instrumental variable methods for mediation analysis experience a number of methodological difficulties, including bias due to confounding between an exposure, mediator and outcome, and measurement error. Mendelian randomisation (MR) can be used to improve causal inference for mediation analysis. We describe two approaches that can be used for mediation analysis with MR: multivariable MR (MVMR) and two-step MR. We outline the approaches and provide code to demonstrate how they can be used in mediation analysis. We review issues that can affect analyses, including confounding, measurement error, weak instrument bias, interactions between exposures and mediators and analysis of multiple mediators. Description of the methods is supplemented by simulated and real data examples. Although MR relies on large sample sizes and strong assumptions, such as having strong instruments and no horizontally pleiotropic pathways, our simulations demonstrate that these methods are unaffected by confounders of the exposure or mediator and the outcome and non-differential measurement error of the exposure or mediator. Both MVMR and two-step MR can be implemented in both individual-level MR and summary data MR. MR mediation methods require different assumptions to be made, compared with non-instrumental variable mediation methods. Where these assumptions are more plausible, MR can be used to improve causal inference in mediation analysis.
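A minimal simulation of the two-step MR idea (my own sketch; the variable names, effect sizes, and instruments g1, g2 are all assumptions, not the authors' code) shows the indirect effect as a product of two Wald ratios and the robustness to an unobserved confounder:

```python
import numpy as np

# Two-step MR: estimate exposure -> mediator with instrument g1, then
# mediator -> outcome with instrument g2; the indirect effect along the
# exposure -> mediator -> outcome path is the product of the two ratios.
rng = np.random.default_rng(3)
n = 500_000
g1 = rng.binomial(2, 0.3, n).astype(float)    # instrument for the exposure
g2 = rng.binomial(2, 0.3, n).astype(float)    # instrument for the mediator
u = rng.normal(size=n)                        # unobserved confounder

x = g1 + u + rng.normal(size=n)               # exposure
m = 0.4 * x + g2 + u + rng.normal(size=n)     # mediator
y = 0.3 * m + u + rng.normal(size=n)          # outcome (no direct effect)

def wald(z, treat, out):
    """Ratio of instrument-outcome to instrument-treatment covariances."""
    return np.cov(z, out)[0, 1] / np.cov(z, treat)[0, 1]

beta_xm = wald(g1, x, m)          # exposure -> mediator, close to 0.4
beta_my = wald(g2, m, y)          # mediator -> outcome, close to 0.3
indirect = beta_xm * beta_my      # close to 0.4 * 0.3 = 0.12

naive = np.cov(m, y)[0, 1] / np.var(m)   # confounded OLS slope, biased up
```

The naive regression of the outcome on the mediator is inflated by the confounder u, while the product of Wald ratios recovers the indirect effect, illustrating the simulation findings summarised in the abstract.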

