Genetic variation detection using maximum likelihood estimator

Author(s):  
Abdullah K. Alqallaf ◽  
Ahmed H. Tewfik ◽  
Scott B. Selleck


Author(s):  
Hazim Mansour Gorgees ◽  
Bushra Abdualrasool Ali ◽  
Raghad Ibrahim Kathum

In this paper, the maximum likelihood estimator and the Bayes estimator of the reliability function for the negative exponential distribution are derived. A Monte Carlo simulation technique was then employed to compare the performance of these estimators, using the integral mean square error (IMSE) as the comparison criterion. The simulation results show that the Bayes estimator performed better than the maximum likelihood estimator across different sample sizes.
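As a rough illustration of the comparison described above (the paper's own prior and loss function are not stated here, so a conjugate Gamma prior and squared-error loss are assumed), the following sketch estimates the IMSE of both estimators of the exponential reliability function R(t) = exp(-λt):

```python
import numpy as np

rng = np.random.default_rng(0)

# True exponential rate; a and b are hypothetical Gamma(a, b) prior
# hyperparameters (an assumption -- the paper's prior is not given here).
lam, a, b = 1.0, 2.0, 2.0
n, reps = 20, 2000
t_grid = np.linspace(0.1, 3.0, 30)
dt = t_grid[1] - t_grid[0]
R_true = np.exp(-lam * t_grid)

imse_mle, imse_bayes = 0.0, 0.0
for _ in range(reps):
    x = rng.exponential(1.0 / lam, size=n)
    s = x.sum()
    # MLE: lambda_hat = n / sum(x), plugged into R(t) = exp(-lambda * t)
    R_mle = np.exp(-(n / s) * t_grid)
    # Bayes under squared-error loss: the posterior is Gamma(a + n, b + s),
    # so E[exp(-lambda * t) | data] = ((b + s) / (b + s + t)) ** (a + n)
    R_bayes = ((b + s) / (b + s + t_grid)) ** (a + n)
    imse_mle += np.sum((R_mle - R_true) ** 2) * dt
    imse_bayes += np.sum((R_bayes - R_true) ** 2) * dt

print(f"IMSE (MLE):   {imse_mle / reps:.5f}")
print(f"IMSE (Bayes): {imse_bayes / reps:.5f}")
```

With a prior centered near the true rate, the Bayes estimator's shrinkage typically yields a smaller IMSE at small sample sizes, consistent with the abstract's conclusion.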


2021 ◽  
Author(s):  
Jakob Raymaekers ◽  
Peter J. Rousseeuw

Many real data sets contain numerical features (variables) whose distribution is far from normal (Gaussian). Instead, their distribution is often skewed. In order to handle such data it is customary to preprocess the variables to make them more normal. The Box–Cox and Yeo–Johnson transformations are well-known tools for this. However, the standard maximum likelihood estimator of their transformation parameter is highly sensitive to outliers, and will often try to move outliers inward at the expense of the normality of the central part of the data. We propose a modification of these transformations as well as an estimator of the transformation parameter that is robust to outliers, so the transformed data can be approximately normal in the center and a few outliers may deviate from it. It compares favorably to existing techniques in an extensive simulation study and on real data.
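The outlier sensitivity of the standard maximum likelihood estimate of the transformation parameter can be demonstrated with a small sketch (using SciPy's Box–Cox MLE; the data and the contamination below are invented for illustration, and this does not reproduce the authors' robust estimator):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Lognormal data: the Box-Cox parameter that normalizes it is lambda = 0 (log).
x_clean = rng.lognormal(mean=0.0, sigma=0.5, size=500)
_, lam_clean = stats.boxcox(x_clean)

# Contaminate with a few far right-tail outliers: the ML estimate of lambda
# moves to pull the outliers inward, at the expense of the central bulk.
x_dirty = np.concatenate([x_clean, np.full(10, 50.0)])
_, lam_dirty = stats.boxcox(x_dirty)

print(f"lambda (clean):         {lam_clean:.3f}")  # near 0
print(f"lambda (with outliers): {lam_dirty:.3f}")  # pulled below the clean value
```

Even 2% contamination shifts the estimated parameter noticeably, which is exactly the fragility the robust estimator in this paper is designed to avoid.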


2013 ◽  
Vol 55 (3) ◽  
pp. 643-652
Author(s):  
Gauss M. Cordeiro ◽  
Denise A. Botter ◽  
Alexsandro B. Cavalcanti ◽  
Lúcia P. Barroso

2020 ◽  
Vol 28 (3) ◽  
pp. 183-196
Author(s):  
Kouacou Tanoh ◽  
Modeste N’zi ◽  
Armel Fabrice Yodé

We are interested in bounds on the large deviations probability and Berry–Esseen-type inequalities for the maximum likelihood estimator and the Bayes estimator of the parameter appearing linearly in the drift of a nonhomogeneous stochastic differential equation driven by fractional Brownian motion.


Stats ◽  
2021 ◽  
Vol 4 (1) ◽  
pp. 88-107
Author(s):  
Alfio Marazzi

The distance constrained maximum likelihood procedure (DCML) optimally combines a robust estimator with the maximum likelihood estimator, with the aim of improving the robust estimator's small-sample efficiency while preserving a good level of robustness. It was published for the linear model and is now extended to the GLM. Monte Carlo experiments are used to explore the performance of this extension in the Poisson regression case. Several published robust candidates for the DCML are compared; the modified conditional maximum likelihood estimator, started from a very robust minimum density power divergence estimator, is selected as the best candidate. It is shown empirically that the DCML markedly improves small-sample efficiency without loss of robustness. An example using real hospital length-of-stay data fitted by the negative binomial regression model is discussed.
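A minimal sketch of the underlying sensitivity problem (not the DCML itself): a plain Poisson maximum likelihood fit shifts when a single gross outlier is injected, which is the fragility that the robust candidates, and the DCML combination built on them, are meant to address. All data here are simulated for illustration:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)

# Simulated Poisson regression: log mu = b0 + b1 * x, true (b0, b1) = (0.5, 1.0)
n = 200
x = rng.uniform(0, 1, n)
X = np.column_stack([np.ones(n), x])
y = rng.poisson(np.exp(0.5 + 1.0 * x))

def nll(beta, X, y):
    # Poisson negative log-likelihood, up to an additive constant in y
    eta = X @ beta
    return np.sum(np.exp(eta) - y * eta)

beta_clean = minimize(nll, np.zeros(2), args=(X, y)).x

# One gross outlier is enough to move the ML coefficients noticeably.
y_out = y.copy()
y_out[0] = 100
beta_dirty = minimize(nll, np.zeros(2), args=(X, y_out)).x

print("clean fit:", beta_clean)
print("with outlier:", beta_dirty)
```

A robust starting estimator would downweight the aberrant observation; the DCML then recovers efficiency by constraining the fit to stay within a distance of the maximum likelihood solution.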

