Nonlinear Models of Measurement Errors

2011 ◽  
Vol 49 (4) ◽  
pp. 901-937 ◽  
Author(s):  
Xiaohong Chen ◽  
Han Hong ◽  
Denis Nekipelov

Measurement errors in economic data are pervasive and nontrivial in size. The presence of measurement errors causes biased and inconsistent parameter estimates and leads to erroneous conclusions, to varying degrees, in economic analysis. While linear errors-in-variables models are usually handled with well-known instrumental variable methods, this article provides an overview of recent research papers that derive estimation methods yielding consistent estimates for nonlinear models with measurement errors. We review models with both classical and nonclassical measurement errors, and with misclassification of discrete variables. For each of the methods surveyed, we describe the key ideas for identification and estimation, and discuss their applications whenever they are currently available. (JEL C20, C26, C50)
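The classical linear benchmark this survey builds on can be illustrated with a short simulation: OLS on a mismeasured regressor is attenuated toward zero, while a second independent measurement of the same variable, used as an instrument, restores consistency. This is a minimal sketch of the standard instrumental-variable fix, not a method from the article; all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50_000
x = rng.normal(0.0, 1.0, n)              # latent true regressor
y = 3.0 * x + rng.normal(0.0, 1.0, n)    # outcome, true slope = 3

w1 = x + rng.normal(0.0, 1.0, n)         # mismeasured regressor (classical error)
w2 = x + rng.normal(0.0, 1.0, n)         # independent repeat measurement = instrument

# OLS of y on w1 is attenuated: plim = 3 * var(x) / (var(x) + var(u)) = 1.5
beta_ols = (w1 @ y) / (w1 @ w1)

# IV using w2: the instrument is correlated with x but not with w1's error
beta_iv = (w2 @ y) / (w2 @ w1)
```

The attenuation factor var(x)/(var(x)+var(u)) is the "reliability ratio"; here it is 0.5, so OLS recovers only half the true slope.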

2000 ◽  
Vol 37 (1) ◽  
pp. 113-124 ◽  
Author(s):  
Prasad A. Naik ◽  
Chih-Ling Tsai

Commercial market research firms provide information on advertising variables of interest, such as brand awareness or gross rating points, that are likely to contain measurement errors. This unreliability of measured variables induces bias in the estimated parameters of dynamic models of advertising. Consequently, advertisers either under- or overspend on advertising to maintain a desired level of brand awareness. Monte Carlo studies show that the magnitude of the bias can be serious when conventional estimation methods, such as ordinary least squares and errors-in-variables regression, are employed to obtain parameter estimates. The authors therefore develop two new approaches that either reduce or eliminate the parameter bias. Using these methods, advertisers can determine an unbiased optimal advertising budget even if advertising variables are measured with error. The application of these methods to estimating the extent of measurement noise in empirical advertising data is illustrated.
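The bias the abstract describes can be reproduced in a minimal Monte Carlo sketch: in a simple AR(1) awareness-carryover model (Nerlove–Arrow style), noise in measured awareness attenuates the estimated carryover coefficient, which would distort the implied optimal budget. This is an illustrative simulation under assumed parameter values, not the authors' estimators.

```python
import numpy as np

rng = np.random.default_rng(3)
T, lam, reps = 200, 0.7, 500        # series length, true carryover, replications

naive = []
for _ in range(reps):
    a = np.zeros(T)
    for t in range(1, T):           # true awareness dynamics: a_t = lam*a_{t-1} + shock
        a[t] = lam * a[t - 1] + rng.normal()
    m = a + rng.normal(0.0, 1.0, T)  # awareness as measured by a research firm, with noise
    # OLS of measured awareness on its own lag (no intercept; means are ~0)
    naive.append((m[1:] @ m[:-1]) / (m[:-1] @ m[:-1]))

carryover_hat = float(np.mean(naive))  # attenuated well below the true 0.7
```

With these values the probability limit is roughly 0.7 times var(a)/(var(a)+1) ≈ 0.46, so the advertiser would badly underestimate how long advertising effects persist.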


2019 ◽  
Vol 3 (1) ◽  
pp. 73-101 ◽  
Author(s):  
Naoto Kunitomo ◽  
Naoki Awaya ◽  
Daisuke Kurisu

Abstract. We investigate estimation methods for multivariate non-stationary errors-in-variables models that contain non-stationary trend components and measurement-error (noise) components. We compare the maximum likelihood (ML) estimation and the separating information maximum likelihood (SIML) estimation. The latter was proposed by Kunitomo and Sato (Trend, seasonality and economic time series: the nonstationary errors-in-variables models. MIMS-RBP-SDS-3, MIMS, Meiji University. http://www.mims.meiji.ac.jp/, 2017) and Kunitomo et al. (Separating information maximum likelihood method for high-frequency financial data. Springer, Berlin, 2018). We find that the Gaussian likelihood function can have a non-concave shape in some cases, and that the ML method works only when the Gaussianity of the non-stationary and stationary components holds with some restrictions on the parameter space, such as the signal-noise variance ratio. The SIML estimation has asymptotically robust properties in more general situations. We explore the finite-sample and asymptotic properties of the ML and SIML methods for non-stationary errors-in-variables models.


2018 ◽  
Vol 34 (6) ◽  
pp. 1256-1280 ◽  
Author(s):  
Karun Adusumilli ◽  
Taisuke Otsu

This paper considers nonparametric instrumental variable regression when the endogenous variable is contaminated with classical measurement error. Existing methods are inconsistent in the presence of measurement error. We propose a wavelet deconvolution estimator for the structural function that modifies the generalized Fourier coefficients of the orthogonal series estimator to take into account the measurement error. We establish the convergence rates of our estimator for the cases of mildly/severely ill-posed models and ordinary/super smooth measurement errors. We characterize how the presence of measurement error slows down the convergence rates of the estimator. We also study the case where the measurement error density is unknown and needs to be estimated, and show that the estimation error of the measurement error density is negligible under mild conditions, as long as the measurement error density is symmetric.
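The deconvolution principle underlying such estimators can be seen at a single frequency: dividing the empirical characteristic function of the contaminated variable by the (known) characteristic function of the error recovers that of the latent variable. Below is a hedged numerical sketch with a Laplace error, which is "ordinary smooth" in the paper's terminology; the distributions and frequency point are assumptions for illustration, not the paper's wavelet estimator.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200_000
x = rng.normal(2.0, 1.0, n)              # latent variable of interest, N(2, 1)
u = rng.laplace(0.0, 0.5, n)             # "ordinary smooth" measurement error
w = x + u                                # contaminated observation

t = 0.8                                  # one frequency point
phi_w = np.exp(1j * t * w).mean()        # empirical characteristic function of W
phi_u = 1.0 / (1.0 + (0.5 * t) ** 2)     # Laplace(0, b) cf: 1 / (1 + b^2 t^2)
phi_x_hat = phi_w / phi_u                # deconvolution step: cf of X recovered
phi_x_true = np.exp(1j * t * 2.0 - 0.5 * t ** 2)   # cf of N(2, 1) at t
```

Because the Laplace characteristic function decays only polynomially in t, the division stays numerically stable; a super smooth (e.g. Gaussian) error would shrink the denominator exponentially, which is what slows the convergence rates the paper characterizes.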


Methodology ◽  
2015 ◽  
Vol 11 (3) ◽  
pp. 89-99 ◽  
Author(s):  
Leslie Rutkowski ◽  
Yan Zhou

Abstract. Given a consistent interest in comparing achievement across sub-populations in international assessments such as TIMSS, PIRLS, and PISA, it is critical that sub-population achievement is estimated reliably and with sufficient precision. As such, we systematically examine the limitations of the current estimation methods used by these programs. Using a simulation study along with empirical results from the 2007 cycle of TIMSS, we show that a combination of missing and misclassified data in the conditioning model induces biases in sub-population achievement estimates, the magnitude and direction of which can be readily explained by data quality. Importantly, estimated biases in sub-population achievement are limited to the conditioning variable with poor-quality data, while other sub-population achievement estimates are unaffected. Findings are generally in line with theory on missing and error-prone covariates. The current research adds to a small body of literature that has noted some of the limitations of sub-population estimation.
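The core mechanism, misclassified group labels biasing sub-population comparisons, can be demonstrated in a few lines: randomly flipping a share of binary group labels shrinks the estimated achievement gap between the groups. This is a generic illustration with invented numbers, not TIMSS data or the authors' conditioning model.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
group = rng.integers(0, 2, n)                    # true sub-population label
score = rng.normal(500.0 + 40.0 * group, 50.0)   # achievement: true gap of 40 points

flip = rng.random(n) < 0.15                      # 15% of labels misclassified
observed = np.where(flip, 1 - group, group)

true_gap = score[group == 1].mean() - score[group == 0].mean()
obs_gap = score[observed == 1].mean() - score[observed == 0].mean()
# With symmetric misclassification at rate p and equal group sizes,
# the observed gap shrinks by roughly a factor of (1 - 2p): here 40 * 0.7 = 28.
```

Note that only the comparison on the error-prone label is distorted; a correctly measured grouping variable in the same data would yield an unbiased gap, mirroring the paper's finding that biases are confined to the poor-quality conditioning variable.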


1975 ◽  
Vol 35 (1) ◽  
pp. 1-7 ◽  
Author(s):  
Sylvia L. Thrupp

For anyone on the green side of fifty who didn't start historical browsing in the playpen it may be quite hard to see the present appeal of statistical theory and method in perspective. To one lucky enough to have been a student abroad in the 1920's, it is merely one of the consequences of a fundamental shift, which was firming in that decade, in conceptions of the economic historian's job. Essentially the shift consisted in making the economy and the social institutions in which it is embedded analytically distinct. Voices from the Polanyite school still claim that this step was as wrong as Adam's eating of the apple. Milder critics complain only that some of us let economic analysis run away with the ball to the neglect of social analysis and of the interplay between the two. For workers on the recent past this is defensible, because the heavy fall-out of purely economic data clamors to be dealt with in its own terms. The preindustrialist, who has to dig harder for data, and seldom turns up such pure economic ore, is more inclined to think in terms of interplay.


Solar Energy ◽  
1991 ◽  
Vol 47 (1) ◽  
pp. 1-16 ◽  
Author(s):  
B. Bourges ◽  
A. Rabl ◽  
B. Leide ◽  
M.J. Carvalho ◽  
M. Collares-Pereira

1999 ◽  
Vol 56 (7) ◽  
pp. 1234-1240
Author(s):  
W R Gould ◽  
L A Stefanski ◽  
K H Pollock

All catch-effort estimation methods implicitly assume that catch and effort are known quantities, whereas in many cases they have been estimated and are subject to error. We evaluate the application of a simulation-based estimation procedure for measurement error models (J.R. Cook and L.A. Stefanski. 1994. J. Am. Stat. Assoc. 89: 1314-1328) in catch-effort studies. The technique involves a simulation component and an extrapolation step, hence the name SIMEX estimation. We describe SIMEX estimation in general terms and illustrate its use with applications to real and simulated catch and effort data. Correcting for measurement error with SIMEX estimation resulted in population size and catchability coefficient estimates that were, in some cases, substantially smaller than naive estimates that ignore measurement errors. In a simulation of the procedure, we compared SIMEX estimators with naive estimators that ignore measurement errors in catch and effort to determine the ability of SIMEX to produce bias-corrected estimates. The SIMEX estimators were less biased than the naive estimators but in some cases were also more variable. Despite the bias reduction, the SIMEX estimator had a larger mean squared error than the naive estimator for one of the two artificial populations studied. However, our results suggest the SIMEX estimator may outperform the naive estimator in terms of bias and precision for larger populations.
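The simulation and extrapolation steps that give SIMEX its name can be sketched for a simple linear model with known measurement-error variance: add extra noise at several multiples λ of the error variance, average the resulting naive estimates, fit a curve in λ, and extrapolate back to λ = −1 (zero measurement error). The model and parameter values below are illustrative assumptions, not the catch-effort application.

```python
import numpy as np

rng = np.random.default_rng(0)
n, b1, sigma_u = 2000, 2.0, 0.8
x = rng.normal(0.0, 1.0, n)
y = b1 * x + rng.normal(0.0, 0.5, n)
w = x + rng.normal(0.0, sigma_u, n)          # error-contaminated regressor

def slope(wv, yv):
    return np.cov(wv, yv)[0, 1] / np.var(wv, ddof=1)

b1_naive = slope(w, y)                       # attenuated toward zero

# Simulation step: add extra noise with variance lam * sigma_u^2 and
# average the naive slope over many pseudo-datasets at each lam.
lams = np.array([0.5, 1.0, 1.5, 2.0])
means = [np.mean([slope(w + rng.normal(0.0, np.sqrt(lam) * sigma_u, n), y)
                  for _ in range(200)])
         for lam in lams]

# Extrapolation step: fit a quadratic in lam, evaluate at lam = -1.
coefs = np.polyfit(np.r_[0.0, lams], np.r_[b1_naive, means], 2)
b1_simex = float(np.polyval(coefs, -1.0))
```

The quadratic extrapolant is only an approximation to the true bias curve, which is why SIMEX reduces bias rather than eliminating it, and the extrapolation inflates variance, consistent with the bias/MSE trade-off the abstract reports.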


Author(s):  
Marisa Faggini ◽  
Bruna Bruno ◽  
Anna Parziale

Abstract. Following the reverse engineering (RE) approach, to analyse an economic complex system is to infer how its underlying mechanism works. The main factors that determine the difficulty of RE are the number of variable components in the system and, most importantly, the interdependence of the components and their nonlinear dynamics. All of these aspects characterize the economic complex systems within which economic agents make their choices. RE methods can be used to understand, predict, and model the dynamics of complex economic systems, and thereby to describe and control the economic environment. With the RE approach, economic data can be used to peek into the internal workings of an economic complex system, providing information about its underlying nonlinear dynamics. This paper arises from the aim to deepen the comprehension of this approach and to highlight the potential of tools and methodologies based on it for treating economic complex systems. An overview of the literature on RE is presented, focusing on its definition and the state of the art of the research; we then consider two tools that could translate the methodological issues of RE into economic analysis, evidencing their advantages and disadvantages: recurrence analysis and the agent-based model (ABM).


2014 ◽  
Vol 47 (3) ◽  
pp. 2335-2340 ◽  
Author(s):  
Arne Dankers ◽  
Paul M.J. Van den Hof ◽  
Xavier Bombois ◽  
Peter S.C. Heuberger
