Recently Published Documents
Jackknife empirical likelihood of error variance for partially linear varying-coefficient model with missing covariates
Statistical Inference for the Heteroscedastic Partially Linear Varying-Coefficient Errors-in-Variables Model with Missing Censoring Indicators
In this paper, we focus on heteroscedastic partially linear varying-coefficient errors-in-variables models under right-censored data with censoring indicators missing at random. Based on regression calibration, imputation, and inverse probability weighting, we define a class of modified profile least-squares estimators of the parameter and local linear estimators of the coefficient function, which are then used to construct estimators of the error variance function. To improve estimation accuracy and account for the heteroscedastic error, we develop reweighted estimators of the parameter and the coefficient function. We also apply the empirical likelihood method to construct confidence regions and maximum empirical likelihood estimators of the parameter. Under appropriate assumptions, we establish the asymptotic normality of the proposed estimators, derive the strong uniform convergence rate of the error variance function estimators, and prove that the empirical log-likelihood ratio statistics are asymptotically chi-squared. A simulation study evaluates the finite-sample performance of the proposed estimators, and a real-data example illustrates the methods.
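The inverse probability weighting idea behind one class of estimators above can be illustrated with a minimal sketch on hypothetical simulated data. Here the observation probability is taken as known for simplicity, whereas in the paper it must be estimated; the response, covariate, and missingness model below are illustrative assumptions, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: y is observed only when r == 1, and the chance of
# observation depends on the covariate x (missing at random).
n = 2000
x = rng.normal(size=n)
y = 2.0 + 1.5 * x + rng.normal(size=n)      # true mean of y is 2.0
p = 1.0 / (1.0 + np.exp(-(0.5 + x)))        # observation probability (known here)
r = rng.binomial(1, p)

# The complete-case mean is biased because large-x (hence large-y) units
# are observed more often; IPW corrects this by reweighting each observed
# y by the inverse of its observation probability.
naive = y[r == 1].mean()
ipw = np.sum(r * y / p) / np.sum(r / p)
```

With these weights, units that were unlikely to be observed stand in for similar unobserved units, which removes the selection bias of the naive complete-case average.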
Correction to “Empirical‐likelihood‐based criteria for model selection on marginal analysis of longitudinal data with dropout missingness,” by Chen, C., Shen, B., Zhang, L., Xue, Y. and Wang, M.; 75(3), 950–965, 2019
Jackknife empirical likelihood confidence intervals for assessing heterogeneity in meta-analysis of rare binary event data
In this paper, we introduce a robust version of the empirical likelihood estimator for semiparametric moment condition models. The estimator is obtained by minimizing the modified Kullback–Leibler divergence, in its dual form, using truncated orthogonality functions. We prove the robustness and consistency of the new estimator and illustrate its performance through examples based on Monte Carlo simulations.
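As background, the classical (non-robust) empirical likelihood that this estimator modifies can be sketched for the simplest moment condition, the mean. The sketch below solves the standard dual problem for the Lagrange multiplier by bisection; the truncation and divergence modification introduced in the paper are not implemented here.

```python
import numpy as np

def el_log_ratio(x, mu):
    """Empirical log-likelihood ratio statistic -2*log R(mu) for the mean of x.

    Dual form: find lam solving sum_i z_i / (1 + lam*z_i) = 0 with z_i = x_i - mu,
    which gives weights p_i = 1 / (n*(1 + lam*z_i)). Requires min(x) < mu < max(x).
    """
    z = np.asarray(x, dtype=float) - mu
    # lam must keep every 1 + lam*z_i > 0; the dual objective's derivative
    # g(lam) is strictly decreasing on this bracket, so bisection works.
    lo = -1.0 / z.max() + 1e-10
    hi = -1.0 / z.min() - 1e-10
    g = lambda lam: np.sum(z / (1.0 + lam * z))
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    # -2*log R(mu) = 2 * sum log(1 + lam*z_i); asymptotically chi-squared(1).
    return 2.0 * np.sum(np.log1p(lam * z))
```

The statistic is zero at the sample mean (where lam = 0) and grows as mu moves away from it, which is what makes Wilks-type chi-squared calibration of confidence regions possible.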