Bounds for the normal approximation of the maximum likelihood estimator from m-dependent random variables

2017 · Vol 129 · pp. 171-181
Author(s): Andreas Anastasiou

1995 · Vol 11 (3) · pp. 437-483
Author(s): Lung-Fei Lee

In this article, we investigate a bias in an asymptotic expansion of the simulated maximum likelihood estimator introduced by Lerman and Manski (pp. 305–319 in C. Manski and D. McFadden (eds.), Structural Analysis of Discrete Data with Econometric Applications, Cambridge: MIT Press, 1981) for the estimation of discrete choice models. This bias arises from the nonlinearity of the derivatives of the log likelihood function combined with the statistically independent simulation errors of the choice probabilities across observations. It can be the dominating bias in an asymptotic expansion of the simulated maximum likelihood estimator when the number of simulated random variables per observation does not increase at least as fast as the sample size. Indeed, the properly normalized simulated maximum likelihood estimator has an asymptotic bias in its limiting distribution if the number of simulated random variables increases only as fast as the square root of the sample size. A bias adjustment that can reduce this bias is introduced, and Monte Carlo experiments demonstrate the usefulness of the bias-adjustment procedure.
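
To make the mechanism concrete, here is a minimal sketch, not the paper's setup: simulated maximum likelihood for a binary probit model with the Lerman–Manski frequency simulator and R draws per observation. Because the log likelihood is nonlinear in the simulated choice probability, the estimator carries a bias of order 1/R; the grid-search optimizer and all parameter values below are illustrative assumptions.

```python
# A minimal sketch (assumed setup): simulated maximum likelihood for a binary
# probit model using the Lerman-Manski frequency simulator with R draws per
# observation.  log() is nonlinear in the simulated probability, so the
# estimator carries a bias of order 1/R.
import numpy as np

rng = np.random.default_rng(0)
n, beta_true = 2000, 1.0
x = rng.normal(size=n)
y = (x * beta_true + rng.normal(size=n) > 0).astype(float)

def sml_estimate(R):
    # Fixed simulation draws, statistically independent across observations.
    u = rng.normal(size=(n, R))
    def neg_sim_loglik(beta):
        # Frequency-simulated choice probabilities, clipped so log() is finite.
        p = ((x[:, None] * beta + u) > 0).mean(axis=1).clip(1e-3, 1 - 1e-3)
        return -(y * np.log(p) + (1 - y) * np.log(1 - p)).sum()
    # The simulated objective is a step function in beta, so a simple grid
    # search is used instead of a derivative-based optimizer (illustrative).
    grid = np.linspace(0.2, 2.5, 231)
    return grid[np.argmin([neg_sim_loglik(b) for b in grid])]

for R in (5, 50, 500):
    print(f"R = {R:3d}: beta_hat = {sml_estimate(R):.3f}")
```

Running this for increasing R shows the estimate settling toward the true coefficient as the number of draws grows relative to the sample size.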


2014 · Vol 24 (2) · pp. 283-291
Author(s): Milan Jovanovic, Vesna Rajic

In this paper, we estimate the probability P{X < Y}, where X and Y are independent random variables following the gamma and exponential distributions, respectively. We obtain the maximum likelihood estimator of this probability and its asymptotic distribution, and we perform a simulation study.
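
As an illustration of the plug-in approach, assume X ~ Gamma(shape k, rate θ) and Y ~ Exp(rate λ); this parameterization is an assumption, since the paper's exact conventions are not shown here. Conditioning on X gives R = P{X < Y} = E[exp(−λX)] = (θ/(θ+λ))^k, the gamma moment generating function at −λ, so an estimate of R follows from the MLEs of the component parameters:

```python
# A minimal sketch of the plug-in MLE of R = P{X < Y}, assuming
# X ~ Gamma(shape k, rate theta) and Y ~ Exp(rate lam), independent.
# Conditioning on X: R = E[exp(-lam*X)] = (theta / (theta + lam))**k.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
k, theta, lam = 2.0, 1.5, 0.8                  # true parameters (illustrative)
x = rng.gamma(shape=k, scale=1.0 / theta, size=500)
y = rng.exponential(scale=1.0 / lam, size=500)

# Gamma MLE via scipy (location fixed at 0; scipy's scale = 1/rate).
k_hat, _, scale_hat = stats.gamma.fit(x, floc=0)
theta_hat = 1.0 / scale_hat
lam_hat = 1.0 / y.mean()                       # exponential-rate MLE

R_true = (theta / (theta + lam)) ** k
R_hat = (theta_hat / (theta_hat + lam_hat)) ** k_hat
print(f"true R = {R_true:.4f}, MLE R_hat = {R_hat:.4f}")
```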


Author(s): Hazim Mansour Gorgees, Bushra Abdualrasool Ali, Raghad Ibrahim Kathum

In this paper, the maximum likelihood estimator and the Bayes estimator of the reliability function for the negative exponential distribution are derived, and a Monte Carlo simulation technique is then employed to compare the performance of these estimators. The integral mean square error (IMSE) is used as the criterion for this comparison. The simulation results show that the Bayes estimator performs better than the maximum likelihood estimator for the different sample sizes considered.
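
A hedged Monte Carlo sketch of such a comparison is given below for the exponential reliability function R(t) = exp(−λt). The Gamma(a, b) prior on λ and the squared-error loss are assumptions, since the abstract does not state the prior; under them the posterior of λ is Gamma(a + n, b + S) with S = Σxᵢ, so the Bayes estimate of R(t) is the posterior mean ((b + S)/(b + S + t))^(a+n). The IMSE is approximated by integrating the squared error over a grid of t values and averaging over replications.

```python
# A hedged Monte Carlo sketch comparing the MLE and a Bayes estimator of the
# exponential reliability R(t) = exp(-lam*t) by integrated mean square error.
# The Gamma(a, b) prior and squared-error loss are assumptions, not taken
# from the paper.  Posterior: lam | x ~ Gamma(a + n, b + S) with S = sum(x),
# so E[exp(-lam*t) | x] = ((b + S) / (b + S + t))**(a + n).
import numpy as np

rng = np.random.default_rng(2)
lam, n, a, b = 1.0, 20, 2.0, 2.0
t = np.linspace(0.05, 3.0, 60)              # grid for the integrated MSE
dt = t[1] - t[0]
R_true = np.exp(-lam * t)

ise_mle, ise_bayes = [], []
for _ in range(2000):
    x = rng.exponential(scale=1.0 / lam, size=n)
    S = x.sum()
    R_mle = np.exp(-(n / S) * t)                    # plug in lam_hat = n / S
    R_bayes = ((b + S) / (b + S + t)) ** (a + n)    # posterior mean of R(t)
    ise_mle.append(((R_mle - R_true) ** 2).sum() * dt)
    ise_bayes.append(((R_bayes - R_true) ** 2).sum() * dt)

print("IMSE (MLE):  ", np.mean(ise_mle))
print("IMSE (Bayes):", np.mean(ise_bayes))
```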


2021
Author(s): Jakob Raymaekers, Peter J. Rousseeuw

Many real data sets contain numerical features (variables) whose distribution is far from normal (Gaussian). Instead, their distribution is often skewed. In order to handle such data it is customary to preprocess the variables to make them more normal. The Box–Cox and Yeo–Johnson transformations are well-known tools for this. However, the standard maximum likelihood estimator of their transformation parameter is highly sensitive to outliers, and will often try to move outliers inward at the expense of the normality of the central part of the data. We propose a modification of these transformations as well as an estimator of the transformation parameter that is robust to outliers, so the transformed data can be approximately normal in the center and a few outliers may deviate from it. It compares favorably to existing techniques in an extensive simulation study and on real data.
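
The sensitivity described above is easy to reproduce with the classical (non-robust) maximum likelihood estimate of the Yeo–Johnson parameter, as implemented in scipy.stats.yeojohnson; the robust estimator proposed in the paper is not reproduced in this sketch, and the contamination pattern below is an illustrative assumption.

```python
# A minimal sketch of the outlier sensitivity of the classical maximum
# likelihood estimate of the Yeo-Johnson parameter, via scipy.  The paper's
# robust estimator is not reproduced here; the contamination is illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
clean = rng.lognormal(mean=0.0, sigma=0.6, size=1000)      # right-skewed data

_, lam_clean = stats.yeojohnson(clean)

outliers = np.full(10, 50.0)                               # 1% far outliers
_, lam_contam = stats.yeojohnson(np.concatenate([clean, outliers]))

print(f"lambda-hat (clean):        {lam_clean:.3f}")
print(f"lambda-hat (contaminated): {lam_contam:.3f}")
# The ML fit shifts to pull the outliers inward, at the expense of the
# normality of the central part of the data.
```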

