nuisance parameters
Recently Published Documents

Total documents: 370 (five years: 36)
H-index: 31 (five years: 3)
Mathematics, 2022, Vol. 10 (2), p. 171
Author(s): Nicolas Hardy

Are traditional tests of forecast evaluation well behaved when the competing (nested) model is biased? No, they are not. In this paper, we show analytically and via simulations that, under the null hypothesis of no encompassing, a bias in the nested model may severely distort the size properties of traditional out-of-sample tests in economic forecasting. Not surprisingly, these size distortions depend on the magnitude of the bias and the persistence of the additional predictors. We consider two different cases: (i) There is both in-sample and out-of-sample bias in the nested model. (ii) The bias is present exclusively out-of-sample. To address the former case, we propose a modified encompassing test (MENC-NEW) robust to a bias in the null model. Akin to the ENC-NEW statistic, the asymptotic distribution of our test is a functional of stochastic integrals of quadratic Brownian motions. While this distribution is not pivotal, we can easily estimate the nuisance parameters. To address the second case, we derive the new asymptotic distribution of the ENC-NEW, showing that critical values may differ remarkably. Our Monte Carlo simulations reveal that the MENC-NEW (and the ENC-NEW with adjusted critical values) is reasonably well-sized even when the ENC-NEW (with standard critical values) exhibits rejection rates three times higher than the nominal size.
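For illustration, a minimal sketch of an ENC-NEW-style encompassing statistic computed from out-of-sample forecast errors. This is the textbook moment P · mean(ê₁(ê₁ − ê₂)) scaled by the unrestricted model's error variance, not the authors' MENC-NEW; the simulated errors are purely illustrative.

```python
import numpy as np

def enc_type_statistic(e1, e2):
    """Encompassing-type statistic: P * mean(e1 * (e1 - e2)) / var(e2),
    where e1 are out-of-sample errors of the nested (restricted) model
    and e2 those of the larger (unrestricted) model."""
    d = e1 * (e1 - e2)
    return len(d) * d.mean() / e2.var()

rng = np.random.default_rng(0)
e1 = rng.standard_normal(500)             # nested-model forecast errors
e2 = e1 + 0.1 * rng.standard_normal(500)  # larger model adds pure noise (null)
stat = enc_type_statistic(e1, e2)
print(stat)
```

Under the null the statistic follows a nonstandard limiting distribution, and the paper's point is that its critical values must additionally account for bias in the nested model.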


2021, Vol. 87 (10), pp. 717-733
Author(s): Radhika Ravi, Ayman Habib

This article proposes a solution to special least squares adjustment (LSA) models with a rank-deficient weight matrix, which are commonly encountered in geomatics. The two sources of rank deficiency in weight matrices are discussed: naturally occurring due to the inherent characteristics of LSA mathematical models and artificially induced to eliminate nuisance parameters from LSA estimation. The physical interpretation of the sources of rank deficiency is demonstrated using a case study to solve the problem of 3D line fitting, which is often encountered in geomatics but has not been addressed fully to date. Finally, some geomatics-related applications—mobile lidar system calibration, point cloud registration, and single-photo resection—are discussed along with respective experimental results, to emphasize the need to assess LSA models and their weight matrices to draw inferences regarding the effective contribution of observations. The discussion and results demonstrate the vast applications of this research in geomatics as well as other engineering domains.
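As an illustration of the 3D line-fitting case study, a minimal orthogonal-distance (total least squares) fit via SVD: the line direction is the first right singular vector of the centered point cloud. This is the common textbook construction, not the article's rank-deficient-weight LSA formulation; the data below are simulated.

```python
import numpy as np

def fit_3d_line(points):
    """Fit a 3D line to an (N, 3) point array by total least squares.
    Returns (point on line, unit direction vector)."""
    centroid = points.mean(axis=0)
    # First right singular vector spans the direction of largest spread.
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[0]

rng = np.random.default_rng(1)
true_dir = np.array([1.0, 2.0, -1.0])
true_dir /= np.linalg.norm(true_dir)
t = np.linspace(0.0, 1.0, 50)
pts = np.array([0.5, -1.0, 2.0]) + np.outer(t, true_dir)
pts += 0.01 * rng.standard_normal((50, 3))  # small isotropic noise
p0, d = fit_3d_line(pts)
print(abs(d @ true_dir))  # alignment with the true direction
```

Note that a 3D line has only four degrees of freedom, so the (point, direction) parameterization above is over-parameterized; that redundancy is one natural way rank deficiency enters the adjustment.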


Author(s): Murtala Adam Muhammad, Junjuan Hu

In this paper, we derive the asymptotic distribution of the Fourier ESTAR (FKSS) test proposed by [1], which was not given in the original paper. Results show that the asymptotic distributions are functionals of Brownian motion, depend only on K, and are free from nuisance parameters.
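A sketch of one common Fourier-KSS-type construction, assumed here for illustration (the paper's exact FKSS regression may differ): remove a single-frequency Fourier deterministic component, then compute the t-statistic on δ in Δv_t = δ·v_{t-1}³ + e_t.

```python
import numpy as np

def fourier_kss_stat(y, k=1):
    """Fourier-KSS-type unit-root statistic (illustrative construction):
    detrend y with intercept plus sin/cos terms of frequency k, then run
    the KSS cubic regression on the residuals and return the t-ratio."""
    T = len(y)
    t = np.arange(1, T + 1)
    Z = np.column_stack([np.ones(T),
                         np.sin(2 * np.pi * k * t / T),
                         np.cos(2 * np.pi * k * t / T)])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    v = y - Z @ beta                      # Fourier-detrended series
    dv, v3 = np.diff(v), v[:-1] ** 3
    delta = (v3 @ dv) / (v3 @ v3)
    resid = dv - delta * v3
    se = np.sqrt(resid @ resid / (len(dv) - 1) / (v3 @ v3))
    return delta / se

rng = np.random.default_rng(6)
y = np.cumsum(rng.standard_normal(300))   # random walk: the unit-root null
stat = fourier_kss_stat(y)
print(stat)
```

The limiting distribution of such a statistic is nonstandard, which is why derived critical values depending only on K matter in practice.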


2021
Author(s): Pavel Logacev, Noyan Dokudan

In the field of sentence processing, speakers' preferred interpretation of ambiguous sentences is often determined using a variant of a discrete choice task, in which participants are asked to indicate their preferred meaning of an ambiguous sentence. We discuss participants' degree of attentiveness as a potential source of bias and variability in such tasks. We show that it may distort the estimates of the preference of a particular interpretation obtained in such experiments and may thus complicate the interpretation of the results as well as the comparison of the results of several experiments. We propose an analysis method based on multinomial processing tree (MPT) models (Batchelder and Riefer, 1999), which can correct for this bias and allows for a separation of parameters of theoretical importance from nuisance parameters. We test two variants of the MPT-based model on experimental data from English and Turkish and demonstrate that our method can provide deeper insight into the processes underlying participants' answering behavior and their interpretation preferences than analyses based on raw percentages.
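A minimal hypothetical two-branch MPT of this kind, for illustration only (the authors' models are richer): with probability a the participant attends and chooses interpretation 1 with preference probability p; otherwise they guess uniformly. The marginal choice probability is q = a·p + (1 − a)/2, fitted here by grid-search maximum likelihood on simulated counts.

```python
import numpy as np

def mpt_loglik(a, p, n1, n2):
    """Binomial log-likelihood of the toy attentiveness MPT:
    q = a*p + (1 - a)*0.5 is the probability of choosing interpretation 1."""
    q = a * p + (1 - a) * 0.5
    return n1 * np.log(q) + n2 * np.log(1 - q)

# Simulated counts consistent with true a = 0.8, p = 0.7 (so q = 0.66).
n1, n2 = 660, 340
grid = np.linspace(0.01, 0.99, 99)
A, P = np.meshgrid(grid, grid)
ll = mpt_loglik(A, P, n1, n2)
i, j = np.unravel_index(ll.argmax(), ll.shape)
a_hat, p_hat = A[i, j], P[i, j]
print(a_hat, p_hat)
```

Note that from a single condition only q is identified (many (a, p) pairs give the same likelihood), which is exactly why MPT analyses pool several conditions or response categories in one tree to separate attentiveness from preference.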


Author(s): Alessandro Baldi Antognini, Marco Novelli, Maroussa Zagoraiou

The present paper discusses drawbacks and limitations of likelihood-based inference in sequential clinical trials for treatment comparisons managed via Response-Adaptive Randomization. Taking into account the most common statistical models for the primary outcome—namely binary, Poisson, exponential and normal data—we derive the conditions under which (i) the classical confidence intervals degenerate and (ii) the Wald test becomes inconsistent and strongly affected by the nuisance parameters, also displaying a non-monotonic power. To overcome these drawbacks, we provide a very simple solution that could preserve the fundamental properties of likelihood-based inference. Several illustrative examples and simulation studies are presented in order to confirm the relevance of our results and provide some practical recommendations.
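To make the setting concrete, a toy simulation of one trial under a simple response-adaptive rule (allocate the next patient to arm A with probability proportional to its current smoothed success estimate) followed by the usual two-sample Wald statistic for binary outcomes. The allocation rule is an illustrative stand-in, not one of the paper's designs.

```python
import numpy as np

rng = np.random.default_rng(2)

def rar_trial(p_a, p_b, n):
    """Sequential binary trial under a toy Response-Adaptive Randomization
    rule; returns success/failure counts per arm."""
    s = {"A": 0, "B": 0}
    f = {"A": 0, "B": 0}
    for _ in range(n):
        phat_a = (s["A"] + 1) / (s["A"] + f["A"] + 2)  # add-one smoothing
        phat_b = (s["B"] + 1) / (s["B"] + f["B"] + 2)
        arm, p = ("A", p_a) if rng.random() < phat_a / (phat_a + phat_b) else ("B", p_b)
        if rng.random() < p:
            s[arm] += 1
        else:
            f[arm] += 1
    return s, f

s, f = rar_trial(0.9, 0.3, 400)
n_a, n_b = s["A"] + f["A"], s["B"] + f["B"]
pa_hat, pb_hat = s["A"] / n_a, s["B"] / n_b
# Naive Wald statistic for H0: p_a = p_b; under RAR the sample sizes are
# themselves random and data-dependent, which is what distorts its behavior.
se = np.sqrt(pa_hat * (1 - pa_hat) / n_a + pb_hat * (1 - pb_hat) / n_b)
wald = (pa_hat - pb_hat) / se
print(n_a, n_b, wald)
```

The skewed, outcome-dependent allocation (n_a ≫ n_b when arm A is superior) is precisely the feature that breaks the i.i.d. assumptions behind the classical Wald asymptotics.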


Econometrics, 2021, Vol. 9 (2), p. 15
Author(s): Jau-er Chen, Chien-Hsun Huang, Jia-Jyun Tien

In this study, we investigate the estimation and inference on a low-dimensional causal parameter in the presence of high-dimensional controls in an instrumental variable quantile regression. Our proposed econometric procedure builds on the Neyman-type orthogonal moment conditions of a previous study (Chernozhukov et al. 2018) and is thus relatively insensitive to the estimation of the nuisance parameters. The Monte Carlo experiments show that the estimator copes well with high-dimensional controls. We also apply the procedure to empirically reinvestigate the quantile treatment effect of 401(k) participation on accumulated wealth.
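The orthogonalization idea can be illustrated in the simplest setting: plain partialling-out for a linear mean regression (not the paper's instrumental variable quantile estimator, and with OLS rather than high-dimensional machine learning nuisance fits). Residualizing both outcome and treatment on the controls makes the final estimate first-order insensitive to errors in those nuisance regressions.

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 500, 5
X = rng.standard_normal((n, p))                           # controls
d = X @ rng.standard_normal(p) + rng.standard_normal(n)   # treatment
y = 2.0 * d + X @ rng.standard_normal(p) + rng.standard_normal(n)

def residualize(v, X):
    """OLS residuals of v on X: the partialling-out (nuisance) step."""
    beta, *_ = np.linalg.lstsq(X, v, rcond=None)
    return v - X @ beta

# Orthogonal moment: regress residualized outcome on residualized treatment.
ry, rd = residualize(y, X), residualize(d, X)
theta = (rd @ ry) / (rd @ rd)
print(theta)
```

In the high-dimensional version, the residualizing regressions are replaced by regularized learners plus cross-fitting, and the same Neyman-orthogonality logic keeps the causal estimate insensitive to their estimation error.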


Author(s): Stefan Wunsch, Simon Jörger, Roger Wolf, Günter Quast

Data analysis in science, e.g., high-energy particle physics, is often subject to an intractable likelihood if the observables and observations span a high-dimensional input space. Typically the problem is solved by reducing the dimensionality using feature engineering and histograms, whereby the latter allows one to build the likelihood using Poisson statistics. However, in the presence of systematic uncertainties represented by nuisance parameters in the likelihood, an optimal dimensionality reduction with a minimal loss of information about the parameters of interest is not known. This work presents a novel strategy to construct the dimensionality reduction with neural networks for feature engineering and a differential formulation of histograms, so that the full workflow can be optimized with the result of the statistical inference, e.g., the variance of a parameter of interest, as objective. We discuss how this approach results in an estimate of the parameters of interest that is close to optimal, and the applicability of the technique is demonstrated with a simple example based on pseudo-experiments and a more complex example from high-energy particle physics.
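A generic sketch of the differentiable-histogram ingredient, assuming a common sigmoid-gate construction (the paper's exact formulation may differ): each bin count is a sum of smooth indicator functions, which approaches the hard histogram as the temperature τ → 0 while remaining differentiable for gradient-based optimization.

```python
import numpy as np

def soft_histogram(x, edges, tau=0.1):
    """Differentiable histogram: bin b counts sum_i [sigma((x_i - lo_b)/tau)
    - sigma((x_i - hi_b)/tau)], a smooth relaxation of hard binning."""
    def s(z):
        # Clip to avoid overflow in exp; sigmoid saturates anyway.
        return 1.0 / (1.0 + np.exp(-np.clip(z, -60.0, 60.0)))
    lo, hi = edges[:-1], edges[1:]
    return (s((x[:, None] - lo) / tau) - s((x[:, None] - hi) / tau)).sum(axis=0)

rng = np.random.default_rng(4)
x = rng.standard_normal(10000)
edges = np.linspace(-3.0, 3.0, 7)
h = soft_histogram(x, edges, tau=0.01)
print(h, h.sum())
```

Because every bin content is a smooth function of the network output, a Poisson likelihood built from these counts (including its nuisance parameters) can serve directly as the training objective.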


Mathematics, 2021, Vol. 9 (1), p. 89
Author(s): Michal Pešta

Linear relations, containing measurement errors in input and output data, are considered. Parameters of these so-called errors-in-variables models can change at some unknown moment. The aim is to test whether such an unknown change has occurred or not. For instance, detecting a change in trend for a randomly spaced time series is a special case of the investigated framework. The designed changepoint tests are shown to be consistent and involve neither nuisance parameters nor tuning constants, which makes the testing procedures effortlessly applicable. A changepoint estimator is also introduced and its consistency is proved. A boundary issue is avoided, meaning that the changepoint can be detected when being close to the extremities of the observation regime. As a theoretical basis for the developed methods, a weak invariance principle for the smallest singular value of the data matrix is provided, assuming weakly dependent and non-stationary errors. The results are presented in a simulation study, which demonstrates computational efficiency of the techniques. The completely data-driven tests are illustrated through problems coming from calibration and insurance; however, the methodology can be applied to other areas such as clinical measurements, dietary assessment, computational psychometrics, or environmental toxicology as manifested in the paper.
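To illustrate the key ingredient, a toy sketch of the smallest singular value of the growing data matrix for a noisy linear (errors-in-variables) relation with a slope break. This is only a caricature of the idea, not the paper's test statistic: while the relation is stable the smallest singular value grows slowly with the noise, and a changepoint inflates it sharply.

```python
import numpy as np

def smallest_sv_path(X):
    """Smallest singular value of the column-centered data matrix built from
    the first k rows, for k = 2..n. It equals the square root of the total
    least squares residual sum of squares of a single linear relation."""
    out = []
    for k in range(2, len(X) + 1):
        Z = X[:k] - X[:k].mean(axis=0)
        out.append(np.linalg.svd(Z, compute_uv=False)[-1])
    return np.array(out)

rng = np.random.default_rng(5)
t = np.arange(200.0)
x = t + 0.1 * rng.standard_normal(200)                       # noisy input
y = np.where(t < 120, 2 * t, 2 * t + 1.5 * (t - 120))        # slope break at 120
y += 0.1 * rng.standard_normal(200)                          # noisy output
path = smallest_sv_path(np.column_stack([x, y]))
print(path[117], path[-1])  # pre-break level vs. post-break inflation
```

The paper's actual tests standardize such a statistic so that, under weakly dependent non-stationary errors, no nuisance parameters or tuning constants remain.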

