Properties of the Maximum Likelihood Estimates and Bias Reduction for Logistic Regression Model

OALib ◽  
2017 ◽  
Vol 04 (05) ◽  
pp. 1-12 ◽  
Author(s):  
Nuri H. Salem Badi

2018 ◽
Vol 48 (3) ◽  
pp. 199-204 ◽  
Author(s):  
R. LI ◽  
J. ZHOU ◽  
L. WANG

In this paper, the non-parametric bootstrap and the non-parametric Bayesian bootstrap are applied to parameter estimation in the binary logistic regression model. A real-data study and a simulation study are conducted to compare the non-parametric bootstrap, the non-parametric Bayesian bootstrap, and the maximum likelihood methods. The results show that all three methods are effective for parameter estimation in the binary logistic regression model. In the small-sample case, the non-parametric Bayesian bootstrap performs somewhat better than the non-parametric bootstrap and the maximum likelihood method.
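As a rough illustration of the comparison described above (not the authors' code), the sketch below fits a logistic regression by plain maximum likelihood, by the non-parametric bootstrap, and by the non-parametric Bayesian bootstrap with Dirichlet(1, ..., 1) weights. The simulated data, sample size, and number of replicates are assumptions made only for the example.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulated binary data as a stand-in for the paper's real data set
n = 200
true_beta = np.array([-0.5, 1.0])
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ true_beta)))

def weighted_mle(w):
    """Maximise the w-weighted logistic log-likelihood (w = 1 gives the plain MLE)."""
    nll = lambda b: -np.sum(w * (y * (X @ b) - np.log1p(np.exp(X @ b))))
    return minimize(nll, np.zeros(X.shape[1]), method="BFGS").x

mle = weighted_mle(np.ones(n))

B = 500
boot, bayes = [], []
for _ in range(B):
    # Non-parametric bootstrap: multinomial resampling counts used as weights
    boot.append(weighted_mle(rng.multinomial(n, np.ones(n) / n)))
    # Non-parametric Bayesian bootstrap: Dirichlet(1, ..., 1) weights on the rows
    bayes.append(weighted_mle(n * rng.dirichlet(np.ones(n))))

print("Maximum likelihood :", mle)
print("Bootstrap mean     :", np.mean(boot, axis=0))
print("Bayesian bootstrap :", np.mean(bayes, axis=0))
```

Both resampling schemes reduce to weighted maximum likelihood fits, with multinomial counts for the ordinary bootstrap and Dirichlet weights for the Bayesian bootstrap, which is why a single weighted_mle helper covers all three estimators.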


2021 ◽  
Vol 2106 (1) ◽  
pp. 012001
Author(s):  
P R Sihombing ◽  
S R Rohimah ◽  
A Kurnia

This study aims to compare the efficacy of the logistic regression model for identifying risk factors of low-birth-weight babies in Indonesia using maximum likelihood estimation (MLE) and Bayesian estimation methods. The data are secondary data derived from the 2017 Indonesian Demographic and Health Survey, with a total sample of 16,344 newborn babies. Selection of the best logistic regression model was based on the smaller Bayesian Schwarz Information Criterion (BIC) value. The logistic regression model estimated with the Bayesian method has a smaller BIC value than the MLE model. Twin births, female sex, at-risk maternal age, birth spacing that is too close, iron deficiency, low education, low economic status, and inadequate drinking-water sources are associated with a higher risk of low birth weight.
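A minimal sketch of the model-comparison idea, assuming simulated covariates in place of the survey variables: the logistic model is fit by maximum likelihood (with BIC reported by statsmodels) and by a simple Bayesian point estimate (MAP under a weakly informative normal prior, standing in for the full Bayesian estimation used in the study), and the two BIC values are compared. The sample size, prior scale tau, and covariates are assumptions for the example.

```python
import numpy as np
import statsmodels.api as sm
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Toy stand-in for the survey covariates (twin birth, maternal age, etc.)
n, p = 1000, 4
X = sm.add_constant(rng.normal(size=(n, p)))
beta_true = rng.normal(scale=0.5, size=p + 1)
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta_true)))

# Maximum likelihood fit; statsmodels reports the BIC directly
mle_fit = sm.Logit(y, X).fit(disp=0)

# Simple Bayesian point estimate: MAP under beta ~ N(0, tau^2 I)
tau = 2.0
def neg_log_posterior(b):
    eta = X @ b
    log_lik = np.sum(y * eta - np.log1p(np.exp(eta)))
    log_prior = -0.5 * np.sum(b ** 2) / tau ** 2
    return -(log_lik + log_prior)

map_est = minimize(neg_log_posterior, np.zeros(p + 1), method="BFGS").x
log_lik_map = np.sum(y * (X @ map_est) - np.log1p(np.exp(X @ map_est)))
bic_map = (p + 1) * np.log(n) - 2.0 * log_lik_map   # BIC = k*ln(n) - 2*logL

print("BIC, maximum likelihood:", mle_fit.bic)
print("BIC, Bayesian (MAP)    :", bic_map)
```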


2020 ◽  
Vol 36 (4) ◽  
pp. 1253-1259
Author(s):  
Autcha Araveeporn ◽  
Yuwadee Klomwises

The Markov Chain Monte Carlo (MCMC) method is a popular way to learn about a probability distribution, here used to estimate the posterior distribution by Gibbs sampling. Standard methods, namely maximum likelihood and logistic ridge regression, serve as comparisons for MCMC. The maximum likelihood method is the classical approach, estimating the parameters of the logistic regression model by differentiating the log-likelihood function. Logistic ridge regression depends on the choice of ridge parameter, which is selected by cross-validation when computing the estimator from the penalized objective. This paper uses maximum likelihood, logistic ridge regression, and MCMC to estimate the parameters of the logit function, which are then transformed into probabilities. The logistic regression model predicts the probability of observing a phenomenon, and prediction accuracy is evaluated as the percentage of correct predictions of a binary event. A simulation study generates a binary response variable from 2, 4, and 6 explanatory variables drawn from a multivariate normal distribution with positive and negative correlation coefficients (the multicollinearity problem). The methods are compared by maximum predictive accuracy, and the results show that MCMC performs best in all situations.
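The sketch below, under assumed simulation settings, mirrors the three estimators being compared: unpenalized maximum likelihood, logistic ridge regression with the penalty chosen by cross-validation, and an MCMC posterior mean. A random-walk Metropolis sampler is used here as a simpler stand-in for the Gibbs sampler in the paper, and predictive accuracy on held-out data is the comparison criterion; the correlation level, prior, and step size are illustrative choices.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression, LogisticRegressionCV

rng = np.random.default_rng(2)

# Correlated explanatory variables (multicollinearity) and a binary response
n, p, rho = 400, 4, 0.8
cov = rho * np.ones((p, p)) + (1 - rho) * np.eye(p)
X = rng.multivariate_normal(np.zeros(p), cov, size=n)
beta_true = np.array([1.0, -1.0, 0.5, -0.5])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-(X @ beta_true))))
Xtr, Xte, ytr, yte = X[:300], X[300:], y[:300], y[300:]

# (1) Maximum likelihood: a huge C makes the default L2 penalty negligible
mle = LogisticRegression(C=1e6, max_iter=1000).fit(Xtr, ytr)

# (2) Logistic ridge regression, ridge parameter picked by 5-fold cross-validation
ridge = LogisticRegressionCV(Cs=20, cv=5, max_iter=1000).fit(Xtr, ytr)

# (3) MCMC posterior mean under a weak N(0, 10^2) prior on the coefficients
def log_post(b):
    eta = b[0] + Xtr @ b[1:]
    return np.sum(ytr * eta - np.log1p(np.exp(eta))) - np.sum(b ** 2) / 200.0

b = np.zeros(p + 1)
lp = log_post(b)
draws = []
for t in range(6000):
    prop = b + 0.1 * rng.normal(size=p + 1)   # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept/reject
        b, lp = prop, lp_prop
    if t >= 1000:                              # discard burn-in draws
        draws.append(b)
b_mcmc = np.mean(draws, axis=0)

def acc(intercept, coef):
    prob = 1.0 / (1.0 + np.exp(-(intercept + Xte @ coef)))
    return np.mean((prob > 0.5) == yte)

print("Accuracy, maximum likelihood:", acc(mle.intercept_[0], mle.coef_[0]))
print("Accuracy, logistic ridge    :", acc(ridge.intercept_[0], ridge.coef_[0]))
print("Accuracy, MCMC posterior    :", acc(b_mcmc[0], b_mcmc[1:]))
```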

