Asymptotic Mean Squared Error
Recently Published Documents


TOTAL DOCUMENTS: 22 (FIVE YEARS: 10)
H-INDEX: 5 (FIVE YEARS: 1)

Author(s): Yulia Kotlyarova, Marcia M. A. Schafgans, Victoria Zinde-Walsh

Abstract: In this paper we summarize results on the convergence rates of various kernel-based nonparametric and semiparametric estimators, focusing on the impact of insufficient distributional smoothness, of smoothness that is unknown, and even of the non-existence of a density. We survey methods of safeguarding against uncertainty about smoothness, with emphasis on nonconvex model averaging. This approach can be implemented via a combined estimator whose weights are selected by minimizing the asymptotic mean squared error. We argue that evaluations of the finite-sample performance of these and similar estimators should account for a possible lack of smoothness.
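
As a rough illustration of such a combined estimator, the sketch below averages kernel density estimates computed with deliberately different bandwidths, choosing weights that sum to one but may be negative (the "nonconvex" aspect of the averaging) to minimize a bootstrap stand-in for the asymptotic mean squared error. The Gaussian kernel, the bandwidth grid, and the bootstrap size are assumptions made here for concreteness, not the authors' recipe.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
x = rng.standard_normal(500)        # sample; in practice the density's smoothness is unknown
grid = np.linspace(-3, 3, 61)
bandwidths = [0.1, 0.3, 0.9]        # deliberately spread-out candidate bandwidths

def kde(data, h, pts):
    """Gaussian-kernel density estimate at the points pts."""
    u = (pts[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u ** 2).mean(axis=1) / (h * np.sqrt(2 * np.pi))

# Bootstrap estimate of the joint mean squared error matrix of the candidates.
full = np.array([kde(x, h, grid) for h in bandwidths])
boot = np.array([[kde(rng.choice(x, x.size), h, grid) for h in bandwidths]
                 for _ in range(200)])
dev = boot - full[None, :, :]
M = np.einsum('bij,bkj->ik', dev, dev) / (200 * grid.size)

# Combined estimator: weights sum to one but are not restricted to be
# nonnegative, so the combination may leave the convex hull of the candidates.
cons = {'type': 'eq', 'fun': lambda w: w.sum() - 1}
w0 = np.full(len(bandwidths), 1 / len(bandwidths))
w = minimize(lambda w: w @ M @ w, w0, constraints=cons).x
combined = w @ full
print("selected weights:", np.round(w, 3))
```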


Extremes, 2021
Author(s): Laura Fee Schneider, Andrea Krajina, Tatyana Krivobokova

Abstract: Threshold selection plays a key role in various aspects of statistical inference for rare events. In this work, two new threshold selection methods are introduced. The first measures the fit of the exponential approximation above a threshold and achieves good performance in small samples. The second smoothly estimates the asymptotic mean squared error of the Hill estimator and performs consistently well over a wide range of processes. Both methods are analyzed theoretically, compared with existing procedures in an extensive simulation study, and applied to a dataset of financial losses in which the underlying extreme value index is assumed to vary over time.
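
To see what minimizing the asymptotic mean squared error of the Hill estimator involves, consider the crude sketch below. The variance term γ²/k is standard; the squared-bias proxy (drift of the Hill path from a small-k baseline) is a stand-in invented here for illustration and is much cruder than the smooth estimator the paper constructs.

```python
import numpy as np

rng = np.random.default_rng(1)
gamma_true = 0.5                          # extreme value index of the simulated data
x = rng.pareto(1 / gamma_true, 2000) + 1  # classical Pareto sample

def hill(data, k):
    """Hill estimator based on the k largest order statistics."""
    xs = np.sort(data)[::-1]
    return np.log(xs[:k]).mean() - np.log(xs[k])

ks = np.arange(10, 800)
est = np.array([hill(x, k) for k in ks])

# AMSE(k) ~ gamma^2 / k + bias(k)^2.  As a crude proxy for the paper's smooth
# bias estimate, measure the drift of the Hill path from a small-k baseline.
baseline = est[:50].mean()
amse_proxy = est ** 2 / ks + (est - baseline) ** 2
k_star = ks[amse_proxy.argmin()]
print(f"k* = {k_star}, Hill estimate = {hill(x, k_star):.3f} (truth {gamma_true})")
```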


Risks, 2021, Vol 9 (4), pp. 70
Author(s): Małgorzata Just, Krzysztof Echaust

The appropriate choice of a threshold level, which separates the tails of the probability distribution of a random variable from its middle part, is a complex and challenging task. This paper provides an empirical study of various methods of optimal tail selection in risk measurement. The results indicate which methods may be useful in practice for investors and for financial and regulatory institutions: some methods that perform well in simulation studies based on theoretical distributions may not perform well on real data. We analyze twelve methods, with different parameters, for forty-eight world indices using returns from 2000 through Q1 2020 and four sub-periods. The research objective is to compare the methods and to identify those that can be recognized as useful in risk measurement. The results suggest that only four tail selection methods may be useful in practical applications: the Path Stability algorithm, the minimization of the Asymptotic Mean Squared Error approach, the automated Eyeball method with carefully selected tuning parameters, and the Hall single bootstrap procedure.
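
Of the four recommended methods, the automated Eyeball method is the easiest to sketch: pick the smallest number of upper order statistics k at which the Hill path becomes stable. The window length, tolerance, and stability fraction below are placeholder tuning parameters for illustration, not the carefully selected values the paper arrives at.

```python
import numpy as np

def hill_path(data, k_max):
    """Hill estimates for k = 1, ..., k_max."""
    xs = np.sort(data)[::-1]
    logs = np.log(xs[:k_max + 1])
    ks = np.arange(1, k_max + 1)
    return np.cumsum(logs[:-1]) / ks - logs[1:]

def eyeball(data, k_max=400, window=40, eps=0.3, frac=0.9):
    """Automated Eyeball: smallest k whose next `window` Hill estimates stay
    within +/- eps * hill(k) for at least a fraction `frac` of the window."""
    path = hill_path(data, k_max + window)
    for i in range(k_max):
        w = path[i + 1:i + 1 + window]
        if np.mean(np.abs(w - path[i]) < eps * path[i]) >= frac:
            return i + 1, path[i]
    return k_max, path[k_max - 1]            # fall back to the largest k tried

rng = np.random.default_rng(2)
losses = rng.pareto(4.0, 5000) + 1            # simulated heavy-tailed losses
k_star, gamma_hat = eyeball(losses)
print(k_star, gamma_hat)                      # true extreme value index is 0.25
```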


Sankhya A, 2021
Author(s): Kira Alhorn, Holger Dette, Kirsten Schorning

Abstract: In this paper we construct optimal designs for frequentist model averaging estimation. We derive the asymptotic distribution of the model averaging estimate with fixed weights in the case where the competing models are non-nested. A Bayesian optimal design minimizes the expectation, taken with respect to a suitable prior distribution, of the asymptotic mean squared error of the model averaging estimate. We derive a necessary condition for the optimality of a given design with respect to this new criterion and demonstrate that Bayesian optimal designs can improve the accuracy of model averaging substantially. The derived designs also improve the accuracy of estimation in a model chosen by model selection and of model averaging estimates with random weights.
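
A brute-force Monte Carlo rendering of the design criterion may clarify the idea. Everything below is an assumption made for illustration: two specific non-nested models (linear and logarithmic mean), a normal prior on the parameters of the data-generating model, equal fixed weights, and a squared-error target at a single point x0. The paper works instead with the asymptotic mean squared error in closed form.

```python
import numpy as np

rng = np.random.default_rng(3)

# Two non-nested candidate models for E[y | x], both linear in parameters.
lin_basis = lambda x: np.column_stack([np.ones_like(x), x])
log_basis = lambda x: np.column_stack([np.ones_like(x), np.log(x)])

def fit_predict(design, y, x0, basis):
    coef, *_ = np.linalg.lstsq(basis(design), y, rcond=None)
    return float(basis(np.array([x0])) @ coef)

def criterion(design, x0=5.0, w=0.5, n_mc=2000, sigma=0.5):
    """Monte Carlo version of the Bayesian design criterion: expected squared
    error of the fixed-weight model averaging estimate of E[y | x0], averaged
    over an (illustrative) prior and the observation noise."""
    errs = np.empty(n_mc)
    for i in range(n_mc):
        a, b = rng.normal([1.0, 0.5], 0.2)             # prior draw
        truth = a + b * np.log(x0)                     # data come from the log model
        y = a + b * np.log(design) + rng.normal(0, sigma, design.size)
        est = (w * fit_predict(design, y, x0, lin_basis)
               + (1 - w) * fit_predict(design, y, x0, log_basis))
        errs[i] = (est - truth) ** 2
    return errs.mean()

uniform = np.linspace(1, 10, 10)
clustered = np.array([1.0] * 3 + [5.0] * 4 + [10.0] * 3)
print(criterion(uniform), criterion(clustered))   # the smaller value wins
```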


Biometrika, 2020
Author(s): D Barreiro-Ures, R Cao, M Francisco-Fernández, J D Hart

Abstract Hall & Robinson (2009) proposed and analyzed the use of bagged cross-validation to choose the bandwidth of a kernel density estimator. They established that bagging greatly reduces the noise inherent in ordinary cross-validation, and hence leads to a more efficient bandwidth selector. The asymptotic theory of Hall & Robinson (2009) assumes that N, the number of bagged subsamples, is ∞. We expand upon their theoretical results by allowing N to be finite, as it is in practice. Our results indicate an important difference in the rate of convergence of the bagged cross-validation bandwidth for the cases N = ∞ and N < ∞. Simulations quantify the improvement in statistical efficiency and computational speed that can result from using bagged cross-validation as opposed to a binned implementation of ordinary crossvalidation. The performance of the bagged bandwidth is also illustrated on a real, very large, data set. Finally, a byproduct of our study is the correction of errors appearing in the Hall & Robinson (2009) expression for the asymptotic mean squared error of the bagging selector.


Author(s): Nelson Kiprono Bii, Christopher Ouma Onyango, John Odhiambo

Developing finite population estimators of parameters such as the mean, the variance, and the asymptotic mean squared error has been one of the core objectives of sample survey theory and practice. Survey practitioners need to assess the properties of these estimators so that better ones can be adopted. In survey sampling, the occurrence of nonresponse affects inference and the optimality of estimators of finite population parameters: it introduces bias and may cause samples to deviate from the distributions obtained under the original sampling design. To compensate for random nonresponse, various imputation methods have been proposed, but the asymptotic bias and variance of finite population mean estimators remain high under these techniques. In this paper, a data transformation weighting technique is proposed. The resulting estimator is shown to be asymptotically consistent under mild assumptions, and simulations show that it outperforms its rival estimators for all the mean functions considered.
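
The abstract does not spell out the transformation-of-data weighting, so the sketch below makes the general point with a generic inverse response-probability adjustment; the population model, the response mechanism, and the weights are all hypothetical stand-ins rather than the authors' estimator.

```python
import numpy as np

rng = np.random.default_rng(5)
N_pop = 10_000
x = rng.uniform(0, 1, N_pop)                   # auxiliary variable, observed for the sample
y = 2 + 3 * x + rng.normal(0, 0.5, N_pop)      # survey variable
s = rng.choice(N_pop, 1000, replace=False)     # simple random sample

# Random nonresponse that depends on x: units with large x drop out more often.
p_resp = 1 / (1 + np.exp(2 * x[s] - 1.5))
resp = rng.random(s.size) < p_resp

naive = y[s][resp].mean()                      # biased: ignores the nonresponse
w = 1 / p_resp[resp]                           # inverse response-probability weights
adjusted = np.sum(w * y[s][resp]) / np.sum(w)
print(f"true mean {y.mean():.3f} | naive {naive:.3f} | weighted {adjusted:.3f}")
```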


2020
Author(s): Mohitosh Kejriwal, Xuewen Yu

Summary: This paper develops a new approach to forecasting a highly persistent time series, employing feasible generalized least squares (FGLS) estimation of the deterministic components in conjunction with Mallows model averaging. Within a local-to-unity asymptotic framework, we derive analytical expressions for the asymptotic mean squared error and the one-step-ahead mean squared forecast risk of the proposed estimator, and we show that the optimal FGLS weights differ from their ordinary least squares (OLS) counterparts. We also provide theoretical justification for a generalized Mallows averaging estimator that incorporates lag order uncertainty in the construction of the forecast. Monte Carlo simulations demonstrate that the proposed procedure yields considerably lower finite-sample forecast risk than OLS averaging. An application to U.S. macroeconomic time series illustrates the efficacy of the method in practice and finds that both persistence and lag order uncertainty have important implications for forecast accuracy.
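
The FGLS weights derived in the paper have no two-line form, but the OLS-based Mallows averaging that serves as the benchmark can be sketched. The AR lag set, the persistence level, and the restriction of the weights to the simplex are assumptions of this illustration.

```python
import numpy as np
from scipy.optimize import minimize

def ar_fits(y, p_max):
    """Fitted values and parameter counts for AR(1), ..., AR(p_max) with an
    intercept, all aligned on the common sample t = p_max, ..., T-1."""
    T = y.size
    Y = y[p_max:]
    fitted, k = [], []
    for p in range(1, p_max + 1):
        X = np.column_stack([np.ones(T - p_max)]
                            + [y[p_max - j:T - j] for j in range(1, p + 1)])
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        fitted.append(X @ beta)
        k.append(p + 1)
    return Y, np.column_stack(fitted), np.array(k)

def mallows_weights(Y, F, k, sigma2):
    """Mallows averaging: minimize ||Y - Fw||^2 + 2 sigma^2 k'w on the simplex."""
    obj = lambda w: np.sum((Y - F @ w) ** 2) + 2 * sigma2 * (k @ w)
    cons = ({'type': 'eq', 'fun': lambda w: w.sum() - 1},)
    w0 = np.full(len(k), 1 / len(k))
    return minimize(obj, w0, bounds=[(0, 1)] * len(k), constraints=cons).x

rng = np.random.default_rng(6)
y = np.zeros(300)
for t in range(1, 300):                        # highly persistent AR(1)
    y[t] = 0.95 * y[t - 1] + rng.standard_normal()
Y, F, k = ar_fits(y, p_max=4)
sigma2 = np.mean((Y - F[:, -1]) ** 2)          # residual variance of the largest model
print("Mallows weights:", np.round(mallows_weights(Y, F, k, sigma2), 3))
```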


2019, Vol 22 (3), pp. 995-1008
Author(s): M. N. M. van Lieshout

Abstract: We investigate the asymptotic mean squared error of kernel estimators of the intensity function of a spatial point process. We derive expansions for the bias and variance in the scenario where $n$ independent copies of a point process in $\mathbb{R}^d$ are superposed. When the same bandwidth is used in all $d$ dimensions, we show that an optimal bandwidth exists and is of order $n^{-1/(d+4)}$ under appropriate smoothness conditions on the true intensity function.
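
A minimal simulation of the setting, assuming a Gaussian kernel, an illustrative intensity on the unit square, and no edge correction; only the bandwidth rate $n^{-1/(d+4)}$ comes from the paper, while the constant in front of it is an arbitrary choice here.

```python
import numpy as np

rng = np.random.default_rng(7)
d, n = 2, 50                                   # dimension and number of superposed copies
lam = lambda p: 100 * np.exp(-3 * p[:, 0])     # illustrative intensity on [0, 1]^2
lam_max = 100.0

def draw_copy():
    """One inhomogeneous Poisson process on [0, 1]^d via thinning."""
    m = rng.poisson(lam_max)
    pts = rng.uniform(0, 1, (m, d))
    return pts[rng.random(m) < lam(pts) / lam_max]

points = np.vstack([draw_copy() for _ in range(n)])

# Gaussian-kernel intensity estimate with the rate-optimal bandwidth
# h = c * n^(-1/(d+4)); the constant c is not identified by the rate alone.
h = 0.5 * n ** (-1 / (d + 4))

def intensity_hat(p):
    u = np.linalg.norm(points - p, axis=1) / h
    return np.exp(-0.5 * u ** 2).sum() / (n * (2 * np.pi) ** (d / 2) * h ** d)

p0 = np.array([0.2, 0.5])
print(intensity_hat(p0), "vs true", 100 * np.exp(-3 * 0.2))
```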


2019, Vol 1 (2), pp. 162-180
Author(s): Yong Siah Teo, Hyunseok Jeong, Jaroslav Řeháček, Zdeněk Hradil, Luis L. Sánchez-Soto, ...

Ideal photon-number-resolving detectors form a class of important optical components in quantum optics and quantum information theory. In this article, we theoretically investigate multiport devices whose reconstruction performance approaches that of the Fock-state measurement. Recognizing that all multiport devices are minimally complete, we first provide a general analytical framework to describe their tomographic accuracy. Next, we show that a perfect multiport device with an infinite number of output ports functions as the Fock-state measurement when photon losses are absent, and as binomial mixtures of Fock-state measurements when photon losses are present, and we derive the respective expressions for the tomographic transfer function. This function is the scaled asymptotic mean squared error of the reconstructed photon-number distributions, uniformly averaged over all distributions in the probability simplex. We then supply more general analytical formulas for the transfer function for finite numbers of output ports, both in the absence and in the presence of photon losses. The effects of photon losses on the photon-number-resolving power of both infinite- and finite-size multiport devices are also investigated.
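
The transfer function for the ideal, lossless case can be mimicked by a toy Monte Carlo: model the Fock-state measurement as a multinomial experiment and average the scaled reconstruction error over a flat prior on the simplex. The truncation D and the number of detection events below are illustrative; the paper derives this quantity analytically for general multiport devices.

```python
import numpy as np

rng = np.random.default_rng(8)
D = 5                 # truncated photon-number space (illustrative)
N_clicks = 10_000     # number of detection events per distribution

def scaled_mse(n_dist=2000):
    """Scaled MSE of reconstructed photon-number distributions, uniformly
    averaged over the probability simplex (flat Dirichlet prior), with the
    ideal Fock-state measurement modeled as a multinomial experiment."""
    ps = rng.dirichlet(np.ones(D), size=n_dist)
    err = 0.0
    for p in ps:
        freq = rng.multinomial(N_clicks, p) / N_clicks
        err += np.sum((freq - p) ** 2)
    return N_clicks * err / n_dist

# For the multinomial model the limit is E[1 - sum_k p_k^2] = (D - 1) / (D + 1).
print(scaled_mse(), (D - 1) / (D + 1))
```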


Biometrika, 2019, Vol 106 (3), pp. 665-682
Author(s): K Alhorn, K Schorning, H Dette

Summary: We consider the problem of designing experiments for estimating a target parameter in regression analysis when there is uncertainty about the parametric form of the regression function. A new optimality criterion is proposed that chooses the experimental design to minimize the asymptotic mean squared error of the frequentist model averaging estimate. Necessary conditions for the optimal solution of the locally and Bayesian optimal design problems are established. The results are illustrated in several examples, and it is demonstrated that Bayesian optimal designs can reduce the mean squared error of the model averaging estimator by up to 45%.

