Strong consistency of the maximum likelihood estimator for finite mixtures of location-scale distributions when the scale parameters are exponentially small

Bernoulli ◽  
2006 ◽  
Vol 12 (6) ◽  
pp. 1003-1017 ◽  
Author(s):  
Kentaro Tanaka ◽  
Akimichi Takemura
2014 ◽  
Vol 519-520 ◽  
pp. 878-882
Author(s):  
Chang Ming Yin ◽  
Bo Hong Chen ◽  
Shuang Hua Liu

For the exponential sequential model, we show that the maximum likelihood estimator of the regression parameter vector exists asymptotically and is strongly consistent under mild conditions.
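The abstract above does not reproduce the model's exact form, but the flavor of the result can be illustrated with a minimal sketch: maximum likelihood for an exponential regression in which the rate of each response is exp(x'β), fitted by Newton-Raphson. The design, the true parameter, and the sample size below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: exponential responses with rate exp(x @ beta),
# i.e. mean exp(-x @ beta). Design and true beta are illustrative.
n, beta_true = 500, np.array([0.5, -1.0])
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = rng.exponential(scale=np.exp(-X @ beta_true))

def mle_newton(X, y, iters=25):
    """Newton-Raphson for the exponential-regression log-likelihood
    l(beta) = sum(X @ beta - y * exp(X @ beta)), which is concave."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        rate = np.exp(X @ beta)
        grad = X.T @ (1.0 - y * rate)                  # score vector
        hess = -(X * (y * rate)[:, None]).T @ X        # negative definite
        beta -= np.linalg.solve(hess, grad)
    return beta

beta_hat = mle_newton(X, y)
```

As the sample size grows, `beta_hat` concentrates around `beta_true`, which is the empirical face of the strong-consistency statement.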


2020 ◽  
Vol 15 (2) ◽  
pp. 2335-2348
Author(s):  
Issa Cherif Geraldo

In this paper, we study the maximum likelihood estimator (MLE) of the parameter vector of a discrete multivariate crash-frequency model used in the statistical analysis of the effectiveness of a road safety measure. We derive a closed-form expression for the MLE, then prove its strong consistency and obtain the exact variance of all components of the MLE except one, whose variance is approximated via the delta method.
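The delta method mentioned in the abstract can be sketched in a few lines. The crash-frequency model itself is not reproduced here; the example below is a hypothetical stand-in (a Poisson mean θ and the smooth function g(θ) = exp(-θ)) chosen only to show the variance approximation Var(g(θ̂)) ≈ g'(θ̂)² · Var(θ̂).

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setting: theta is a Poisson mean, g(theta) = exp(-theta)
# estimates P(X = 0). The paper's model is different; only the delta
# method itself is being illustrated.
n, theta = 2000, 1.5
x = rng.poisson(theta, size=n)

theta_hat = x.mean()                    # MLE of the Poisson mean
var_theta_hat = theta_hat / n           # plug-in Var(mean) = theta / n
g_hat = np.exp(-theta_hat)              # estimated P(X = 0)
# Delta method: Var(g(theta_hat)) ~ g'(theta)^2 * Var(theta_hat),
# with g'(theta) = -exp(-theta).
var_g = np.exp(-theta_hat) ** 2 * var_theta_hat
```

The same recipe applies to the one MLE component in the paper whose exact variance is unavailable: differentiate the transformation, square, and multiply by the estimated variance of the underlying estimator.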


2017 ◽  
Vol 15 (1) ◽  
pp. 1539-1548
Author(s):  
Haiyan Xuan ◽  
Lixin Song ◽  
Muhammad Amin ◽  
Yongxia Shi

Abstract This paper studies the quasi-maximum likelihood estimator (QMLE) for the generalized autoregressive conditional heteroscedastic (GARCH) model based on Laplace (1,1) residuals. The QMLE is first proposed for the parameter vector of the GARCH model with Laplace (1,1) residuals. Under certain conditions, the strong consistency and asymptotic normality of the QMLE are then established. A real example with Laplace and normal distributions is analyzed to evaluate the performance of the QMLE, and some comparison results are given. Finally, the proofs of the theorems are presented.
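A minimal sketch of the estimator the abstract describes: a GARCH(1,1) recursion for the conditional variance, with standardized residuals scored by a unit-variance Laplace density rather than the usual Gaussian. This is an illustrative implementation under simplifying assumptions (variance-targeted initialization, a stationarity constraint, Nelder-Mead optimization), not the paper's exact estimator.

```python
import numpy as np
from scipy.optimize import minimize

def laplace_qmle_nll(params, y):
    """Negative quasi-log-likelihood of a GARCH(1,1) model
        sigma2_t = omega + alpha * y_{t-1}^2 + beta * sigma2_{t-1},
    scoring standardized residuals e_t = y_t / sigma_t with a
    unit-variance Laplace density (scale b = 1/sqrt(2))."""
    omega, alpha, beta = params
    if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1:
        return np.inf                      # enforce positivity/stationarity
    n = len(y)
    sigma2 = np.empty(n)
    sigma2[0] = y.var()                    # simple initialization
    for t in range(1, n):
        sigma2[t] = omega + alpha * y[t - 1] ** 2 + beta * sigma2[t - 1]
    b = 1.0 / np.sqrt(2.0)
    e = y / np.sqrt(sigma2)
    return np.sum(np.log(2 * b) + 0.5 * np.log(sigma2) + np.abs(e) / b)

# Simulate a GARCH(1,1) path with unit-variance Laplace innovations.
rng = np.random.default_rng(2)
n, omega0, alpha0, beta0 = 3000, 0.1, 0.1, 0.8
y = np.empty(n)
s2 = omega0 / (1 - alpha0 - beta0)
for t in range(n):
    y[t] = np.sqrt(s2) * rng.laplace(scale=1.0 / np.sqrt(2.0))
    s2 = omega0 + alpha0 * y[t] ** 2 + beta0 * s2

res = minimize(laplace_qmle_nll, x0=[0.05, 0.05, 0.7], args=(y,),
               method="Nelder-Mead")
```

With a long enough simulated path, the minimizer recovers parameters close to `(omega0, alpha0, beta0)`, mirroring the consistency result established in the paper.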


Author(s):  
Hazim Mansour Gorgees ◽  
Bushra Abdualrasool Ali ◽  
Raghad Ibrahim Kathum

In this paper, the maximum likelihood estimator and the Bayes estimator of the reliability function for the negative exponential distribution have been derived; a Monte Carlo simulation technique was then employed to compare the performance of these estimators. The integral mean square error (IMSE) was used as the criterion for this comparison. The simulation results show that the Bayes estimator performs better than the maximum likelihood estimator across different sample sizes.
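The comparison described above can be sketched directly. For exponential lifetimes with rate λ, the reliability function is R(t) = exp(-λt); its MLE plugs in λ̂ = n/Σx, and under a hypothetical conjugate Gamma(a, b) prior on λ the posterior mean of R(t) has the closed form ((b+S)/(b+S+t))^(a+n) with S = Σx. The prior hyperparameters below are illustrative assumptions, not the paper's choices.

```python
import numpy as np

rng = np.random.default_rng(3)

lam, n, reps = 0.5, 20, 2000          # true rate, sample size, MC replications
a, b = 2.0, 4.0                        # hypothetical Gamma(a, b) prior on lam
t_grid = np.linspace(0.1, 5.0, 50)    # grid over which MSE is integrated
R_true = np.exp(-lam * t_grid)

sq_err_mle = np.zeros_like(t_grid)
sq_err_bayes = np.zeros_like(t_grid)
for _ in range(reps):
    x = rng.exponential(scale=1.0 / lam, size=n)
    S = x.sum()
    R_mle = np.exp(-(n / S) * t_grid)                   # plug-in MLE of R(t)
    R_bayes = ((b + S) / (b + S + t_grid)) ** (a + n)   # posterior mean of R(t)
    sq_err_mle += (R_mle - R_true) ** 2
    sq_err_bayes += (R_bayes - R_true) ** 2

dt = t_grid[1] - t_grid[0]
imse_mle = sq_err_mle.sum() * dt / reps     # Riemann-sum IMSE, MLE
imse_bayes = sq_err_bayes.sum() * dt / reps # Riemann-sum IMSE, Bayes
```

With a prior centered near the true rate, the Bayes estimator's IMSE comes out below the MLE's at this small sample size, in line with the paper's simulation finding.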


2021 ◽  
Author(s):  
Jakob Raymaekers ◽  
Peter J. Rousseeuw

Abstract Many real data sets contain numerical features (variables) whose distribution is far from normal (Gaussian). Instead, their distribution is often skewed. In order to handle such data it is customary to preprocess the variables to make them more normal. The Box–Cox and Yeo–Johnson transformations are well-known tools for this. However, the standard maximum likelihood estimator of their transformation parameter is highly sensitive to outliers, and will often try to move outliers inward at the expense of the normality of the central part of the data. We propose a modification of these transformations as well as an estimator of the transformation parameter that is robust to outliers, so the transformed data can be approximately normal in the center and a few outliers may deviate from it. It compares favorably to existing techniques in an extensive simulation study and on real data.
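The sensitivity the authors address is easy to demonstrate with the standard (non-robust) estimator, available as `scipy.stats.yeojohnson`, which returns the ML estimate of the transformation parameter λ. The sample below is an illustrative skewed data set, not from the paper; appending a handful of gross outliers visibly shifts the estimated λ.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Right-skewed sample that Yeo-Johnson can make roughly normal.
x = rng.lognormal(mean=0.0, sigma=0.7, size=500)

# Standard ML estimate of the Yeo-Johnson parameter lambda.
_, lam_clean = stats.yeojohnson(x)

# Same sample with ten gross outliers appended: the non-robust MLE
# of lambda moves to accommodate them, at the expense of the center.
x_out = np.concatenate([x, np.full(10, 50.0)])
_, lam_out = stats.yeojohnson(x_out)
```

The robust estimator proposed in the paper is designed so that `lam_out` would stay close to `lam_clean` in this situation, leaving the central part of the data approximately normal while letting the outliers stand apart.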

