The Regression of the Sample Variance on the Sample Mean

1946 ◽  
Vol s1-21 (1) ◽  
pp. 22-28 ◽  
Author(s):  
M. C. K. Tweedie
1985 ◽  
Vol 15 (2) ◽  
pp. 103-121 ◽  
Author(s):  
William S. Jewell ◽  
Rene Schnieper

Credibility theory refers to the use of linear least-squares theory to approximate the Bayesian forecast of the mean of a future observation; families are known where the credibility formula is exact Bayesian. Second-moment forecasts are also of interest, for example, in assessing the precision of the mean estimate. For some of these same families, the second-moment forecast is exact in linear and quadratic functions of the sample mean. On the other hand, for the normal distribution with normal-gamma prior on the mean and variance, the exact forecast of the variance is a linear function of the sample variance and the squared deviation of the sample mean from the prior mean. Bühlmann has given a credibility approximation to the variance in terms of the sample mean and sample variance.

In this paper, we present a unified approach to estimating both first and second moments of future observations using linear functions of the sample mean and two sample second moments; the resulting least-squares analysis requires the solution of a 3 × 3 linear system, using 11 prior moments from the collective and giving joint predictions of all moments of interest. Previously developed special cases follow immediately. For many analytic models of interest, 3-dimensional joint prediction is significantly better than independent forecasts using the “natural” statistics for each moment when the number of samples is small. However, the expected squared errors of the forecasts become comparable as the sample size increases.
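The 3 × 3 system the abstract refers to is the usual normal-equation system of best linear prediction. The sketch below is illustrative only: it assumes the prior moments (E_t, C_tt, c_ty, all placeholder values here) have already been computed from the collective, which is where the paper's 11 prior moments would enter.

```python
# A minimal sketch (not the authors' code) of the linear least-squares
# ("credibility") forecast: predict a future moment y from three
# statistics t = (sample mean, sample variance, squared deviation of
# the sample mean from the prior mean). All numeric priors below are
# hypothetical placeholders.
import numpy as np

def credibility_forecast(t, E_t, E_y, C_tt, c_ty):
    """Best linear predictor of y given the statistics vector t.

    E_t  : prior mean of the statistics vector, shape (3,)
    E_y  : prior mean of the quantity to forecast
    C_tt : 3x3 prior covariance matrix of the statistics
    c_ty : length-3 prior covariance vector between t and y
    """
    b = np.linalg.solve(C_tt, c_ty)                  # the 3x3 linear system
    return E_y + b @ (np.asarray(t) - np.asarray(E_t))

# Hypothetical prior inputs, for illustration only:
E_t  = np.array([0.0, 1.0, 0.1])
C_tt = np.array([[0.50, 0.05, 0.02],
                 [0.05, 0.80, 0.01],
                 [0.02, 0.01, 0.30]])
c_ty = np.array([0.10, 0.40, 0.05])
print(credibility_forecast([0.3, 1.2, 0.25], E_t, E_y=1.0,
                           C_tt=C_tt, c_ty=c_ty))
```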


2019 ◽  
Vol 629 ◽  
pp. A143 ◽  
Author(s):  
Nicolas Clerc ◽  
Edoardo Cucchetti ◽  
Etienne Pointecouteau ◽  
Philippe Peille

Context. X-ray observations of galaxy clusters provide insights into the nature of gaseous turbulent motions, their physical scales, and the fundamental processes to which they are related. Spatially resolved, high-resolution spectral measurements of X-ray emission lines provide diagnostics on the nature of turbulent motions in emitting atmospheres. Since these motions act on scales comparable to the size of the objects, the uncertainty on the derived physical parameters is limited by the number of observational measurements, through sample variance.

Aims. We propose a different and complementary approach to repeated numerical simulations for the computation of sample variance (i.e. Monte-Carlo sampling) by introducing new analytical developments for line diagnostics.

Methods. We considered the model of a “turbulent gas cloud”, consisting of isotropic and uniform turbulence described by a universal Kolmogorov power spectrum with random amplitudes and phases in an optically thin medium. Following a simple prescription for the four-term correlation of Fourier coefficients, we derived generic expressions for the sample mean and variance of the line centroid shift, line broadening, and projected velocity structure function. We performed a numerical validation based on Monte-Carlo simulations for two popular models of gas emissivity based on the β-model.

Results. Generic expressions for the sample variance of line centroid shifts and broadening in arbitrary apertures are derived and match the simulations within their range of applicability. Generic expressions for the mean and variance of the structure function are provided and verified against simulations. An application to the Athena/X-IFU (Advanced Telescope for High-ENergy Astrophysics/X-ray Integral Field Unit) and XRISM/Resolve (X-ray Imaging and Spectroscopy Mission) instruments forecasts the potential of sensitive, spatially resolved spectroscopy to probe the inertial range of turbulent velocity cascades in a Coma-like galaxy cluster.

Conclusions. The formulas provided are of generic relevance and can be implemented in forecasts for upcoming or current X-ray instrumentation and observing programmes.
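As a rough illustration of the Monte-Carlo baseline that the analytical formulas are meant to replace, the sketch below draws 1-D Gaussian velocity fields with a Kolmogorov spectrum, projects them with an assumed β-model-like emissivity, and estimates the sample variance of the line centroid shift. The grid size, box length, and emissivity profile are placeholders, not the paper's configuration.

```python
# Simplified Monte-Carlo estimate (not the paper's code) of the sample
# variance of the line centroid shift: random Fourier amplitudes and
# phases with a Kolmogorov power spectrum P(k) ~ k^(-11/3), projected
# along one line of sight with an emissivity weight.
import numpy as np

rng = np.random.default_rng(0)
n, L = 256, 1.0                                # grid points, box size (arbitrary)
k = np.fft.rfftfreq(n, d=L / n)                # spatial frequencies
P = np.where(k > 0, k ** (-11.0 / 3.0), 0.0)   # Kolmogorov spectrum, no DC term

z = np.linspace(-L / 2, L / 2, n)
emissivity = (1.0 + (z / 0.2) ** 2) ** (-1.5)  # beta-model-like weight (placeholder)

def centroid_shift():
    # One velocity-field realisation with random amplitudes and phases.
    coeff = rng.normal(size=k.size) + 1j * rng.normal(size=k.size)
    v = np.fft.irfft(coeff * np.sqrt(P), n=n)
    # Emission-weighted line-of-sight velocity = line centroid shift.
    return np.sum(emissivity * v) / np.sum(emissivity)

shifts = np.array([centroid_shift() for _ in range(2000)])
print("sample variance of centroid shift:", shifts.var())
```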


1982 ◽  
Vol 36 (3) ◽  
pp. 176 ◽  
Author(s):  
Jonathan J. Shuster

2013 ◽  
Vol 46 (3) ◽  
pp. 663-671 ◽  
Author(s):  
Tine Straasø ◽  
Dirk Müter ◽  
Henning Osholm Sørensen ◽  
Jens Als-Nielsen

A statistical method to determine the background level and to separate signal from background in a data set with a Poisson-distributed background is described. The algorithm eliminates the pixel with the highest intensity value in an iterative manner until the sample variance equals the sample mean within the estimated uncertainties. The eliminated pixels then contain signal superimposed on the background, so the integrated signal can be obtained by summation or, with a simple extension, by profile fitting, depending on the user's preference. Two additional steps remove ‘outliers’ and correct for the underestimated extension of the peak area, respectively. The algorithm can easily be modified to specific needs, and an application to crystal truncation rods, dealing with a sloping background, is presented.
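A minimal sketch of the elimination rule as described (not the authors' code): pixels are discarded from the top of the intensity distribution until variance and mean agree, which is the defining property of Poisson data. The stopping tolerance below, a normal-approximation standard error for the sample variance, is an assumption; the paper specifies the exact criterion.

```python
# Iteratively remove the brightest pixel until the remaining data are
# Poisson-compatible (sample variance == sample mean within tolerance).
import numpy as np

def separate_signal(pixels, n_sigma=2.0):
    """Return (background_pixels, signal_pixels) from a 1-D pixel array."""
    data = np.sort(np.asarray(pixels, dtype=float))   # brightest pixels last
    removed = 0
    while data.size - removed > 2:
        bkg = data[: data.size - removed]             # candidate background
        mean, var = bkg.mean(), bkg.var(ddof=1)
        # Assumed tolerance: approximate standard error of the sample
        # variance of Poisson data (normal approximation).
        var_err = np.sqrt(2.0 / (bkg.size - 1)) * mean
        if abs(var - mean) <= n_sigma * var_err:
            break                                     # Poisson-compatible
        removed += 1                                  # drop brightest pixel
    return data[: data.size - removed], data[data.size - removed:]

# Synthetic test: Poisson background (mean 5) plus three signal pixels.
bkg, sig = separate_signal(np.concatenate(
    [np.random.default_rng(1).poisson(5.0, 500), [40, 55, 62]]))
print("background level ~", bkg.mean(), "| signal pixels:", sig)
```

The background level is then the mean of the retained pixels, and the integrated signal is the sum of the eliminated pixels above that level, in line with the summation option the abstract mentions.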


2016 ◽  
Vol 38 (3) ◽  
Author(s):  
Mohammad Fraiwan Al-Saleh ◽  
Adil Eltayeb Yousif

Unlike the mean, the standard deviation σ is a vague concept. In this paper, several properties of σ are highlighted. These properties include the minimum and the maximum of σ, its relationship to the mean absolute deviation and the range of the data, and its role in Chebyshev’s inequality and the coefficient of variation. The hidden information in the formula itself is extracted. The confusion about the denominator of the sample variance being n − 1 is also addressed. Some properties of the sample mean and variance of normal data are carefully explained. Pointing out these and other properties in classrooms may have significant effects on the understanding and the retention of the concept.
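Two of these properties are easy to check numerically, e.g. in Python: with the 1/n denominator, σ is bounded below by the mean absolute deviation and above by half the range, and switching the denominator from n to n − 1 is exactly what removes the bias of the sample variance. A small illustration (not from the paper):

```python
# (i) MAD <= sigma <= range/2 when sigma uses the 1/n denominator;
# (ii) the 1/n variance estimator is biased low, the 1/(n-1) one is not.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(10.0, 3.0, size=50)

sigma = x.std(ddof=0)                       # 1/n ("population") form
mad = np.abs(x - x.mean()).mean()           # mean absolute deviation
half_range = (x.max() - x.min()) / 2.0
print(mad <= sigma <= half_range)           # True: MAD <= sigma <= R/2

# Bias check with many samples of size n = 5 from N(10, 3^2), true var = 9.
samples = rng.normal(10.0, 3.0, size=(100_000, 5))
print(samples.var(axis=1, ddof=0).mean())   # ~7.2: biased low by (n-1)/n = 4/5
print(samples.var(axis=1, ddof=1).mean())   # ~9.0: unbiased
```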


1974 ◽  
Vol 75 (2) ◽  
pp. 219-234 ◽  
Author(s):  
Y. H. Wang

Let X1, X2, …, Xn be n (n ≥ 2) independent observations on a one-dimensional random variable X with distribution function F. Let X̄ = (X1 + ⋯ + Xn)/n be the sample mean and S² = Σi (Xi − X̄)²/(n − 1) be the sample variance. In 1925, Fisher (2) showed that if the distribution function F is normal then X̄ and S² are stochastically independent. This property was used to derive Student's t-distribution, which has played a very important role in statistics. In 1936, Geary (3) proved that the independence of X̄ and S² is a sufficient condition for F to be a normal distribution, under the assumption that F has moments of all orders. Later, Lukacs (14) proved this result assuming only the existence of the second moment of F. The assumption of the existence of moments of F was subsequently dropped in the proofs given by Kawata and Sakamoto (7) and by Zinger (27). Thus the independence of X̄ and S² is a characterizing property of the normal distribution.
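A quick simulation (not part of the paper) makes the characterization tangible: for normal samples the correlation between X̄ and S² is essentially zero, consistent with independence, while for a skewed law such as the exponential it is far from zero, so the two statistics cannot be independent there.

```python
# Empirical correlation between the sample mean and sample variance
# for normal vs exponential samples of size n = 10.
import numpy as np

rng = np.random.default_rng(0)
n, reps = 10, 200_000

for name, draw in [("normal", lambda: rng.normal(0.0, 1.0, (reps, n))),
                   ("exponential", lambda: rng.exponential(1.0, (reps, n)))]:
    x = draw()
    xbar = x.mean(axis=1)                 # sample means, one per replicate
    s2 = x.var(axis=1, ddof=1)            # sample variances
    print(name, np.corrcoef(xbar, s2)[0, 1])
# normal      ~ 0.00  (independent, as Fisher showed)
# exponential ~ 0.7   (nonzero correlation rules out independence)
```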

