nuisance parameter
Recently Published Documents

TOTAL DOCUMENTS: 155 (five years: 40)
H-INDEX: 26 (five years: 3)

MAUSAM, 2021, Vol 61 (2), pp. 197-202
Author(s): J. K. S. YADAV, R. K. GIRI, D. K. MALIK

The Global Positioning System (GPS) estimates the total propagation delay of the neutral atmosphere in the zenith direction, which is influenced by the water vapour present in the troposphere. Geodesists treated this total delay as a nuisance parameter for many years. The delay has two parts, a dry delay and a wet delay, known as the Zenith Hydrostatic Delay (ZHD) and the Zenith Wet Delay (ZWD) respectively. The Integrated Precipitable Water Vapour (IPWV) overlying a ground-based receiver is estimated from the ZWD. The accuracy of these estimates depends on the quality of the predicted satellite orbits, which differs from satellite to satellite. The India Meteorological Department (IMD) operationally estimates IPWV in near real time at five stations, and the estimates agree fairly well (error ~6.7 mm) with radiosonde (RS) data. This paper examines the effect of the International GPS Service (IGS) precise orbits and the near-real-time rapid or broadcast orbits supplied by the Scripps Orbit and Permanent Array Center (SOPAC) on Zenith Total Delay (ZTD) and IPWV estimates, by calculating the mean bias and root mean square error (RMSE) of ZTD and IPWV (in mm) at all five stations. The observed ZTD bias is below 1 mm in most cases, with an RMSE below 6 mm; similarly, the bias of the derived IPWV is almost negligible, with an RMSE below 1 mm.
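The ZWD-to-IPWV step described in the abstract is, in its standard formulation (Bevis et al.), a multiplication by a dimensionless factor Π that depends on the weighted mean temperature of the atmosphere, Tm. A minimal sketch, assuming the commonly quoted refractivity constants (exact values vary slightly across studies), with `t_m` an illustrative example value:

```python
# Sketch of the standard Bevis-style ZWD -> IPWV conversion.
# Constants are the commonly quoted refractivity values, in SI units
# (K/Pa and K^2/Pa); they vary slightly between studies.

RHO_W = 1000.0    # density of liquid water, kg/m^3
R_V = 461.5       # specific gas constant of water vapour, J/(kg K)
K2_PRIME = 0.221  # K/Pa   (22.1 K/hPa)
K3 = 3.739e3      # K^2/Pa (3.739e5 K^2/hPa)

def pi_factor(t_m: float) -> float:
    """Dimensionless conversion factor for mean atmospheric temperature t_m (K)."""
    return 1.0e6 / (RHO_W * R_V * (K3 / t_m + K2_PRIME))

def zwd_to_ipwv(zwd_mm: float, t_m: float) -> float:
    """IPWV (mm of precipitable water) from ZWD (mm) at mean temperature t_m (K)."""
    return pi_factor(t_m) * zwd_mm
```

For typical mean temperatures, Π is about 0.15, so a 150 mm wet delay corresponds to roughly 23 mm of precipitable water.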


Sensors, 2021, Vol 21 (21), pp. 7159
Author(s): Jun Li, Kutluyil Dogancay, Hatem Hmam

This paper investigates the hybrid source localization problem using differential received signal strength (DRSS) and angle of arrival (AOA) measurements. The main advantage of hybrid measurements is improved localization accuracy relative to a single sensor modality. For sufficiently short wavelengths, AOA sensors can be built with size, weight, power and cost (SWAP-C) requirements in mind, making the proposed hybrid DRSS-AOA sensing feasible at low cost. First, the maximum likelihood estimation solution is derived; it is computationally expensive and likely to become unstable at large noise levels. Then a novel closed-form pseudolinear estimation method is developed by incorporating the AOA measurements into a linearized form of the DRSS equations. This method eliminates the nuisance parameter associated with the linearized DRSS equations, improving estimation performance. The estimation bias arising from the injection of measurement noise into the pseudolinear data matrix is examined, and the method of instrumental variables is employed to reduce it. Because the performance of the resulting weighted instrumental variable (WIV) estimator depends on the correlation between the IV matrix and the data matrix, a selected-hybrid-measurement WIV (SHM-WIV) estimator is proposed to maintain a strong correlation. The superior bias and mean-squared-error performance of the new SHM-WIV estimator is illustrated with simulation examples.
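The abstract's hybrid DRSS-AOA estimator is not reproduced here, but the AOA-only pseudolinear (Stansfield-type) step illustrates what "pseudolinear" means and where the noise-in-data-matrix bias comes from: each bearing measurement is rearranged into an equation linear in the unknown source position, and the stack is solved by least squares. A minimal sketch:

```python
import numpy as np

def pseudolinear_aoa(sensors: np.ndarray, bearings: np.ndarray) -> np.ndarray:
    """Closed-form pseudolinear (Stansfield-type) AOA localization.

    sensors:  (N, 2) sensor positions
    bearings: (N,) bearing angles in radians, measured from the x-axis
    Returns the least-squares estimate of the 2D source position.
    """
    # Each bearing theta_i gives: sin(theta_i)*x - cos(theta_i)*y
    #                             = sin(theta_i)*s_ix - cos(theta_i)*s_iy
    a = np.column_stack([np.sin(bearings), -np.cos(bearings)])  # data matrix
    b = a[:, 0] * sensors[:, 0] + a[:, 1] * sensors[:, 1]
    # Bearing noise enters the data matrix 'a' itself, which is the source of
    # the estimation bias that IV/WIV methods are designed to reduce.
    est, *_ = np.linalg.lstsq(a, b, rcond=None)
    return est

# Noiseless check: bearings from three sensors to a source at (4, 3).
sensors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
source = np.array([4.0, 3.0])
bearings = np.arctan2(source[1] - sensors[:, 1], source[0] - sensors[:, 0])
est = pseudolinear_aoa(sensors, bearings)
```

With noiseless bearings the overdetermined system is consistent and the source is recovered exactly.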


2021, pp. 096228022110370
Author(s): Brice Ozenne, Esben Budtz-Jørgensen, Julien Péron

The benefit–risk balance is critical information when evaluating a new treatment. The Net Benefit has been proposed as a metric for benefit–risk assessment, and has been applied in oncology to consider simultaneously gains in survival and possible side effects of chemotherapy. With complete data, one can construct a U-statistic estimator for the Net Benefit and obtain its asymptotic distribution using standard results of U-statistic theory. However, real data are often subject to right-censoring, e.g. patient drop-out in clinical trials. It is then possible to estimate the Net Benefit with a modified U-statistic that involves the survival function; the latter can be seen as a nuisance parameter affecting the asymptotic distribution of the Net Benefit estimator. We present how existing asymptotic results on U-statistics can be applied to estimate the distribution of the Net Benefit estimator, and assess their validity in finite samples. The methodology generalizes to other statistics obtained from generalized pairwise comparisons, such as the win ratio. It is implemented in the R package BuyseTest (version 2.3.0 and later), available on the Comprehensive R Archive Network.
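In the complete-data case mentioned above, the Net Benefit for a single outcome is a U-statistic over all (treated, control) pairs. A simplified sketch for one continuous outcome with a clinically relevant threshold τ (the BuyseTest package additionally handles censoring and multiple prioritized outcomes):

```python
import numpy as np

def net_benefit(treated: np.ndarray, control: np.ndarray, tau: float = 0.0) -> float:
    """Complete-data Net Benefit for one continuous outcome.

    A (treated, control) pair is favorable if the treated outcome exceeds the
    control outcome by more than tau, unfavorable if it falls short by more
    than tau, and neutral otherwise. The Net Benefit is the proportion of
    favorable pairs minus the proportion of unfavorable pairs (a U-statistic).
    """
    diff = treated[:, None] - control[None, :]   # all pairwise differences
    favorable = np.mean(diff > tau)
    unfavorable = np.mean(diff < -tau)
    return favorable - unfavorable
```

For example, with treated outcomes (5, 3), control outcomes (1, 4) and τ = 1, two of the four pairs are favorable and none is unfavorable, giving a Net Benefit of 0.5.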


Mathematics, 2021, Vol 9 (20), pp. 2534
Author(s): Tolga Omay, Aysegul Corakci, Esra Hasdemir

In this study, we consider the hybrid nonlinear features of the Exponential Smooth Transition Autoregressive–Fractional Fourier Function (ESTAR-FFF) form unit root test. As is well known, when developing a unit root test for the ESTAR model, linearization is performed by a Taylor approximation, thereby eliminating the nuisance parameter problem. Although this linearization causes some loss of information in the unit root testing equation, it also makes the resulting test more accessible and consistent. The method we propose here contributes to the literature in three important ways. First, it reduces the information loss arising from the Taylor expansion. Second, research to date has tended to misinterpret the Fourier function used with the Kapetanios, Shin and Snell (2003) (KSS) unit root test, considering it to capture multiple smooth structural breaks. The simulation studies carried out here clearly show that the Fourier function only restores the Taylor residuals of the ESTAR-type function, rather than accounting for smooth structural breaks. Third, the new nonlinear unit root test developed in this paper has very strong power in the highly persistent near-unit-root environment that financial data exhibit. An application of the Kapetanios–Shin–Snell Fractional Fourier (KSS-FF) test to ex-post real interest rate data for 11 OECD countries over country-specific sample periods shows that the new test detects nonlinear stationarity in many more countries than the KSS test itself.
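For context on the linearization discussed above: the KSS test Taylor-expands the exponential transition function around zero, yielding the auxiliary regression Δy_t = δ·y_{t−1}³ + error, and tests the unit-root null H0: δ = 0 with the t-statistic on δ against nonstandard critical values tabulated in KSS (2003). A minimal sketch of the raw (no deterministic terms) statistic, with an illustrative simulated series (parameter values are assumptions for the example):

```python
import numpy as np

def kss_tstat(y: np.ndarray) -> float:
    """t-statistic for delta in the KSS auxiliary regression
    dy_t = delta * y_{t-1}**3 + e_t (raw case, no deterministic terms)."""
    dy = np.diff(y)
    x = y[:-1] ** 3
    delta = (x @ dy) / (x @ x)            # OLS slope, single regressor
    resid = dy - delta * x
    s2 = (resid @ resid) / (len(dy) - 1)  # residual variance
    se = np.sqrt(s2 / (x @ x))
    return delta / se

# Example: a globally stationary ESTAR-like series (locally near a unit root,
# mean-reverting through the cubic term) should give a large negative t-stat.
rng = np.random.default_rng(0)
y = np.zeros(1000)
for t in range(1, 1000):
    y[t] = y[t - 1] - 0.5 * y[t - 1] ** 3 + 0.1 * rng.standard_normal()
tstat = kss_tstat(y)
```

Under the null the statistic has a nonstandard distribution, so it must be compared to simulated critical values rather than the usual Student-t quantiles.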


2021, Vol 71 (5), pp. 1309-1318
Author(s): Abbas Eftekharian, Morad Alizadeh

Abstract The problem of finding optimal tests in the family of uniform distributions is investigated. The general forms of the uniformly most powerful and generalized likelihood ratio tests are derived. Moreover, the problem of finding the uniformly most powerful unbiased test of a two-sided hypothesis in the presence of a nuisance parameter is investigated, and it is shown that such a test is equivalent to the generalized likelihood ratio test for the same problem. A simulation study is performed to evaluate the power functions of the tests.
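The uniform family is one of the classical cases where a uniformly most powerful test exists even for a two-sided alternative. As a concrete illustration (the abstract does not spell out its exact hypotheses, so this is the textbook Uniform(0, θ) case): the size-α UMP test of H0: θ = θ0 against θ ≠ θ0 rejects when the sample maximum exceeds θ0 or falls below θ0·α^(1/n). A quick Monte Carlo check of the size:

```python
import numpy as np

def ump_reject(x: np.ndarray, theta0: float, alpha: float = 0.05) -> bool:
    """Textbook UMP test of H0: theta = theta0 for X_i ~ Uniform(0, theta):
    reject if max(x) > theta0 or max(x) < theta0 * alpha**(1/n)."""
    n = len(x)
    m = x.max()
    return m > theta0 or m < theta0 * alpha ** (1.0 / n)

# Under H0, P(max < theta0 * alpha**(1/n)) = alpha and P(max > theta0) = 0,
# so the empirical rejection rate should be close to alpha.
rng = np.random.default_rng(1)
theta0, n, reps = 2.0, 10, 20000
rejections = sum(ump_reject(rng.uniform(0, theta0, n), theta0) for _ in range(reps))
size = rejections / reps
```

The empirical size settles near the nominal 5% level, as the exact calculation predicts.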


2021, Vol 2021 (9)
Author(s): Forrest Flesher, Katherine Fraser, Charles Hutchison, Bryan Ostdiek, Matthew D. Schwartz

Abstract One of the key tasks of any particle collider is measurement. In practice, this is often done by fitting data to a simulation that depends on many parameters. Sometimes, when the effects of varying different parameters are highly correlated, a large ensemble of data may be needed to resolve parameter-space degeneracies. An important example is measuring the top-quark mass, where other physical and unphysical parameters in the simulation must be profiled when fitting the top-quark mass parameter. We compare four methodologies for top-quark mass measurement: a classical histogram fit, similar to one commonly used in experiment, augmented by soft-drop jet grooming; a 2D profile likelihood fit with a nuisance parameter; a machine-learning method called DCTR; and a linear regression approach, either using a least-squares fit or a dense linearly-activated neural network. Although individual events are totally uncorrelated, we find that the linear regression methods work most effectively when we input an ensemble of events sorted by mass, rather than training on individual events. While all methods provide robust extraction of the top-quark mass parameter, the linear network does marginally best and is remarkably simple. For the top study, we conclude that the Monte-Carlo-based uncertainty on current extractions of the top-quark mass from LHC data could be reduced significantly (perhaps by a factor of 2) using networks trained on sorted event ensembles. More generally, machine learning from ensembles for parameter estimation has broad potential for collider physics measurements.
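The idea of regressing on sorted ensembles can be shown with a toy stand-in (not the paper's simulation; all numbers here are hypothetical): each "event" is a draw whose distribution depends on the parameter, the sorted ensemble (its order statistics) is the feature vector, and ordinary least squares maps it to the parameter. Sorting works because each order statistic has a mean that shifts linearly with the location parameter, so a linear model can average over the whole ensemble shape:

```python
import numpy as np

rng = np.random.default_rng(2)
n_events, sigma = 50, 2.0  # toy ensemble size and event-level spread

def make_ensemble(m: float) -> np.ndarray:
    """One sorted ensemble of toy 'event masses' drawn around parameter m."""
    return np.sort(rng.normal(m, sigma, n_events))

# Training set: ensembles generated at known parameter values.
train_m = rng.uniform(170.0, 175.0, 200)
X = np.stack([make_ensemble(m) for m in train_m])
X = np.column_stack([X, np.ones(len(X))])          # sorted features + intercept
coef, *_ = np.linalg.lstsq(X, train_m, rcond=None)

# Test set: predict the parameter for unseen ensembles at a known true value.
Xt = np.stack([make_ensemble(172.5) for _ in range(50)])
Xt = np.column_stack([Xt, np.ones(len(Xt))])
pred = Xt @ coef
```

Averaged over test ensembles, the linear fit recovers the true parameter to well within the single-ensemble statistical spread.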


2021, Vol 50 (Supplement_1)
Author(s): Jonathan Huang, Xiang Meng

Abstract Background Flexible, data-adaptive algorithms (machine learning; ML) for nuisance parameter estimation in epidemiologic causal inference have promising asymptotic properties for complex, high-dimensional data. However, recently proposed applications (e.g. targeted maximum likelihood estimation; TMLE) may produce biased parameter and standard-error estimates in common real-world cohort settings. The relative performance of these novel estimators over simpler approaches in such settings is unclear. Methods We apply double-crossfit TMLE, augmented inverse probability weighting (AIPW), and standard inverse probability weighting (IPW) to simple simulations (5 covariates) and to "real-world" data using covariate-structure-preserving ("plasmode") simulations of 1,178 subjects and 331 covariates from a longitudinal birth cohort. We evaluate various data-generating and estimation scenarios, including under- and over-identification (e.g. excess orthogonal covariates), poor data support, near-instruments, and mis-specified biological interactions. We also track representative computation times. Results We replicate the optimal performance of cross-fit, doubly robust estimators in simple data-generating processes. However, in nearly every real-world-based scenario, estimators fit with parametric learners outperform those that include non-parametric learners in terms of mean bias and confidence interval coverage. Even when correctly specified, estimators fit with non-parametric algorithms (xgboost, random forest) performed poorly (e.g. 24% bias, 57% coverage vs. 10% bias, 79% coverage for the parametric fit), at times underperforming simple IPW. Conclusions In typical epidemiologic data sets, double-crossfit estimators fit with simple smooth, parametric learners may be the optimal solution, taking 2-5 times less computation time than flexible non-parametric models while having equal or better performance. No approach is optimal in all settings, and estimators should be compared on simulations close to the source data.
Key messages In epidemiologic studies, the use of flexible non-parametric algorithms for effect estimation should be strongly justified (e.g. by high-dimensional covariates) and performed with care. Parametric learners may be a safer option with few drawbacks.
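The simplest estimator compared above, IPW, illustrates the role of the nuisance parameter: the propensity score is estimated first (here with a parametric logistic model, in line with the abstract's conclusion favoring simple parametric learners) and then used to reweight outcomes. A self-contained sketch on simulated data with a known treatment effect; the simulation design and numbers are illustrative, not from the study:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5000
x = rng.normal(size=n)                     # single confounder
p_true = 1 / (1 + np.exp(-x))              # true propensity score
t = rng.binomial(1, p_true)                # treatment assignment
y = 2.0 * t + x + rng.normal(size=n)       # outcome; true effect = 2

# Nuisance parameter: the propensity score, estimated with a parametric
# learner (logistic regression, fit here by Newton-Raphson).
X = np.column_stack([np.ones(n), x])
beta = np.zeros(2)
for _ in range(25):
    e = 1 / (1 + np.exp(-X @ beta))
    grad = X.T @ (t - e)
    hess = (X * (e * (1 - e))[:, None]).T @ X
    beta += np.linalg.solve(hess, grad)
e = 1 / (1 + np.exp(-X @ beta))

# Hajek (self-normalized) IPW estimate of the average treatment effect.
w1, w0 = t / e, (1 - t) / (1 - e)
ate_ipw = (w1 @ y) / w1.sum() - (w0 @ y) / w0.sum()
```

With a correctly specified parametric propensity model, the weighted contrast recovers the true effect; AIPW and TMLE add an outcome model on top of this weighting step.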


2021
Author(s): Leo Michelis

This paper examines the asymptotic null distributions of the J and Cox non-nested tests in the framework of two linear regression models with nearly orthogonal non-nested regressors. The analysis is based on the concept of near population orthogonality (NPO), according to which the non-nested regressors in the two models are nearly uncorrelated in the population distribution from which they are drawn. New distributional results emerge under NPO. The J and Cox tests tend to two different random variables asymptotically, each of which is expressible as a function of a nuisance parameter c, a N(0,1) variate and a χ²(q) variate, where q is the number of non-nested regressors in the alternative model. The Monte Carlo method is used to show the relevance of the new results in finite samples and to compute alternative critical values for the two tests under NPO by plugging consistent estimates of c into the relevant asymptotic expressions. An empirical example illustrates the 'plug in' procedure.

