Inadmissibility of the maximum likelihood estimator for a multivariate normal distribution when some observations are missing

1982 ◽  
Vol 11 (8) ◽  
pp. 941-955 ◽  
Author(s):  
R. Radhakrishnan

1997 ◽  
Vol 47 (3-4) ◽  
pp. 167-180 ◽  
Author(s):  
Nabendu Pal ◽  
Jyh-Jiuan Lin

Assume i.i.d. observations are available from a p-dimensional multivariate normal distribution with an unknown mean vector μ and an unknown p.d. dispersion matrix Σ. Here we address the problem of mean estimation in a decision theoretic setup. It is well known that the unbiased as well as the maximum likelihood estimator of μ is inadmissible when p ≥ 3 and is dominated by the famous James-Stein estimator (JSE). A few estimators better than the JSE have been reported in the literature, but in this paper we derive wide classes of estimators uniformly better than the JSE. We use some of these estimators for a further risk study.
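As an illustrative sketch (not taken from the paper), the following Python simulation compares the empirical quadratic risk of the usual estimator with that of the James-Stein estimator for a single observation X ~ N_p(μ, I_p) with known identity dispersion; the dimension p = 10, the mean vector, and the replication count are arbitrary choices made for the demonstration.

```python
import numpy as np

# Monte Carlo comparison of quadratic risk: usual estimator X (the MLE)
# versus the James-Stein estimator
#   JSE(X) = (1 - (p - 2) / ||X||^2) * X
# for X ~ N_p(mu, I_p), p >= 3.  Settings below are arbitrary.

rng = np.random.default_rng(0)
p = 10                      # dimension (domination requires p >= 3)
mu = np.full(p, 0.5)        # true mean vector (arbitrary choice)
n_rep = 100_000             # number of replications

X = rng.normal(loc=mu, scale=1.0, size=(n_rep, p))  # one observation per replication

# Squared-error loss of the usual estimator: ||X - mu||^2
loss_mle = np.sum((X - mu) ** 2, axis=1)

# James-Stein shrinkage toward the origin
shrink = 1.0 - (p - 2) / np.sum(X ** 2, axis=1)
js = shrink[:, None] * X
loss_js = np.sum((js - mu) ** 2, axis=1)

print(f"estimated risk, usual estimator: {loss_mle.mean():.3f}")  # close to p = 10
print(f"estimated risk, James-Stein   : {loss_js.mean():.3f}")    # strictly smaller
```

The average loss of the usual estimator concentrates near p, while the James-Stein average is uniformly smaller, with the largest gain when μ is near the shrinkage target.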


1970 ◽  
Vol 13 (3) ◽  
pp. 391-393 ◽  
Author(s):  
B. K. Kale

Lehmann [1] in his lecture notes on estimation shows that for estimating the unknown mean of a normal distribution, N(θ, 1), the usual estimator X̄ is neither minimax nor admissible if it is known that θ belongs to a finite closed interval [a, b] and the loss function is squared error. It is shown that θ̂ = max(a, min(X̄, b)), the maximum likelihood estimator (MLE) of θ, has uniformly smaller mean squared error (MSE) than that of X̄. It is natural to ask whether the MLE of θ in N(θ, 1) is admissible if it is known that θ ∊ [a, b]. The answer turns out to be negative, and the purpose of this note is to present this result in a slightly generalized form.
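As an illustrative sketch (not taken from the note), the following Python simulation evaluates the MSE of the unrestricted observation X ~ N(θ, 1) and of the truncated MLE max(a, min(X, b)) over a grid of θ values in [a, b]; the interval and replication count are arbitrary choices for the demonstration.

```python
import numpy as np

# For X ~ N(theta, 1) with theta restricted to [a, b], the MLE is the
# truncation of X to [a, b].  The simulation shows its MSE is uniformly
# smaller than that of X itself across the interval.

rng = np.random.default_rng(0)
a, b = -1.0, 1.0            # restriction interval (arbitrary choice)
n_rep = 200_000             # number of replications

for theta in np.linspace(a, b, 5):
    x = rng.normal(theta, 1.0, size=n_rep)
    mle = np.clip(x, a, b)                 # truncated (restricted) MLE
    mse_usual = np.mean((x - theta) ** 2)  # approximately 1 for every theta
    mse_mle = np.mean((mle - theta) ** 2)
    print(f"theta={theta:+.2f}  MSE(X)={mse_usual:.3f}  MSE(MLE)={mse_mle:.3f}")
```

The truncated estimator improves on X at every θ in [a, b], most noticeably near the endpoints, which is the domination the note builds on before showing that the truncated MLE is itself inadmissible.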

