XX.—On the Estimation of Variance and Covariance

Author(s):  
E. H. Lloyd

Synopsis
Suppose we have a number of independent pairs of observations $(X_i, Y_i)$ on two correlated variates $(X, Y)$, which have constant variances and covariance, and whose expected values are of known linear form with unknown coefficients: say $E(X_i)=\sum_j p_{ij}a_j$ and $E(Y_i)=\sum_j q_{ij}b_j$ respectively. The $p_{ij}$ and the $q_{ij}$ are known; the $a_j$ and the $b_j$ are unknown. The paper discusses the estimation of the coefficients, and of the variances and the covariance, and evaluates the sampling variances of the estimates. The argument is entirely free of distributional assumptions.
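As a rough illustration of this setting (not the paper's own derivation), the sketch below estimates the coefficients by least squares on each variate separately and then estimates the variances and the covariance from the paired residuals. The design matrices, sample size, and the degrees-of-freedom correction for the covariance are assumptions invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented example: n paired observations, known design matrices P and Q.
n, p, q = 50, 2, 3
P = rng.normal(size=(n, p))            # known coefficients p_ij
Q = rng.normal(size=(n, q))            # known coefficients q_ij
a_true, b_true = np.array([1.0, -2.0]), np.array([0.5, 1.5, -1.0])

# Correlated errors with constant variances and covariance. Normal errors are
# used only to simulate data; the estimators themselves are distribution-free.
cov = np.array([[1.0, 0.6], [0.6, 2.0]])
errs = rng.multivariate_normal([0.0, 0.0], cov, size=n)
X = P @ a_true + errs[:, 0]
Y = Q @ b_true + errs[:, 1]

# Least-squares estimates of the unknown coefficients a and b.
a_hat, *_ = np.linalg.lstsq(P, X, rcond=None)
b_hat, *_ = np.linalg.lstsq(Q, Y, rcond=None)

# Moment estimates of the variances and the covariance from the residuals,
# with simple degrees-of-freedom corrections.
rX, rY = X - P @ a_hat, Y - Q @ b_hat
var_X = rX @ rX / (n - p)
var_Y = rY @ rY / (n - q)
cov_XY = rX @ rY / (n - max(p, q))     # one plausible correction; the paper's choice may differ

print(a_hat, b_hat, var_X, var_Y, cov_XY)
```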

Author(s):  
B. D. Athey ◽  
A. L. Stout ◽  
M. F. Smith ◽  
J. P. Langmore

Although there is general agreement that inactive chromatin fibers consist of helically packed nucleosomes, the pattern of packing is still undetermined. Only one of the proposed models, the crossed-linker model, predicts a variable diameter dependent on the length of DNA between nucleosomes. Measurements of the fiber diameter of negatively stained and frozen-hydrated chromatin from Thyone sperm (87 bp linker) and Necturus erythrocytes (48 bp linker) have been previously reported from this laboratory. We now introduce a more reliable method of measuring the diameters of electron images of fibrous objects. The procedure uses a modified version of the computer program TOTAL, which takes a two-dimensional projection of the fiber density (represented by the micrograph itself) and projects it down the fiber axis onto one dimension. We illustrate this method using high-contrast, in-focus STEM images of TMV and chromatin from Thyone and Necturus. The measured diameters are in quantitative agreement with the expected values for the crossed-linker model of chromatin structure.
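As a rough sketch of the projection idea (not the TOTAL program itself), the code below takes a digitized micrograph, assumes the fiber axis has already been aligned with the image rows, sums the density down that axis to obtain a one-dimensional transverse profile, and reads off a diameter as the width of the profile above a background threshold. The function name, threshold choice, and pixel size are assumptions for the illustration.

```python
import numpy as np

def fiber_diameter(image, pixel_nm, background_frac=0.1):
    """Estimate a fiber diameter from a 2-D density image whose fiber axis
    runs along the rows (axis 1). Purely illustrative, not the TOTAL algorithm."""
    profile = image.sum(axis=1)                 # project density down the fiber axis
    profile = profile - profile.min()           # crude background removal
    threshold = background_frac * profile.max() # invented threshold choice
    above = np.flatnonzero(profile > threshold)
    if above.size == 0:
        return 0.0
    width_px = above[-1] - above[0] + 1         # transverse extent above threshold
    return width_px * pixel_nm

# Synthetic test: a uniform cylinder of radius 15 nm sampled at 1 nm/pixel.
y = np.arange(-50, 51)[:, None]                 # transverse coordinate (nm)
r = 15.0
chord = np.where(np.abs(y) <= r, 2 * np.sqrt(r**2 - y**2), 0.0)
image = np.repeat(chord, 200, axis=1)           # extend along the fiber axis
print(fiber_diameter(image, pixel_nm=1.0))      # close to the expected 30 nm
```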


2021 ◽  
Vol 8 (1) ◽  
Author(s):  
Thomas B. Lynch ◽  
Jeffrey H. Gove ◽  
Timothy G. Gregoire ◽  
Mark J. Ducey

Abstract
Background: A new variance estimator is derived and tested for big BAF (Basal Area Factor) sampling, a forest inventory system that utilizes Bitterlich sampling (point sampling) with two BAF sizes: a small BAF for tree counts and a larger BAF on which tree measurements are made, usually including the DBHs and heights needed for volume estimation.
Methods: The new estimator is derived using the Delta method from an existing formulation of the big BAF estimator as consisting of three sample means. The new formula is compared to existing big BAF estimators, including a popular estimator based on Bruce's formula.
Results: Several computer simulation studies were conducted comparing the new variance estimator to all known variance estimators for big BAF currently in the forest inventory literature. In simulations the new estimator performed well and comparably to existing variance formulas.
Conclusions: A possible advantage of the new estimator is that it does not require the assumption of negligible correlation between basal area counts on the small BAF factor and volume-basal area ratios based on the large BAF factor selection trees, an assumption required by all previous big BAF variance estimation formulas. Although this correlation was negligible on the simulation stands used in this study, it is conceivable that the correlation could be significant in some forest types, such as those in which the DBH-height relationship can be affected substantially by density, perhaps through competition. We derived a formula that can be used to estimate the covariance between estimates of mean basal area and the ratio of estimates of mean volume and mean basal area. We also mathematically derived expressions for bias in the big BAF estimator that can be used to show the bias approaches zero in large samples, on the order of $\frac{1}{n}$ where $n$ is the number of sample points.
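As a hedged illustration of the Delta-method idea, the sketch below treats the big BAF volume estimate as the product of two per-point sample means (count-based basal area from the small BAF and mean volume-to-basal-area ratio from the big BAF trees) and approximates its variance with the first-order Delta method, keeping the cross-covariance term that older Bruce-type formulas neglect. This is a simplified two-mean version, not the authors' three-sample-mean formulation, and the function name and data are invented.

```python
import numpy as np

def big_baf_delta_variance(ba_count, vbar_mean):
    """First-order Delta-method variance of the product of two per-point means.

    ba_count  : per-point basal area per hectare from the small-BAF counts
    vbar_mean : per-point mean volume-to-basal-area ratio from the big-BAF trees
    Simplified two-mean sketch, not the three-mean formulation in the paper.
    """
    ba_count = np.asarray(ba_count, dtype=float)
    vbar_mean = np.asarray(vbar_mean, dtype=float)
    n = ba_count.size
    xbar, ybar = ba_count.mean(), vbar_mean.mean()
    var_x = ba_count.var(ddof=1) / n          # variance of the mean
    var_y = vbar_mean.var(ddof=1) / n
    cov_xy = np.cov(ba_count, vbar_mean, ddof=1)[0, 1] / n
    estimate = xbar * ybar                    # volume-per-hectare estimate
    # Delta method for a product, retaining the covariance term.
    variance = ybar**2 * var_x + xbar**2 * var_y + 2 * xbar * ybar * cov_xy
    return estimate, variance

# Invented example data for 10 sample points.
rng = np.random.default_rng(1)
ba = rng.gamma(shape=5.0, scale=6.0, size=10)      # m^2/ha from counts
vbar = rng.normal(loc=8.0, scale=0.8, size=10)     # m^3 per m^2 of basal area
print(big_baf_delta_variance(ba, vbar))
```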


Author(s):  
Ana Belén Ramos-Guajardo

Abstract
A new clustering method for random intervals that are measured in the same units over the same group of individuals is provided. It takes into account the degree of similarity between the expected values of the random intervals, which can be analyzed by means of a two-sample similarity bootstrap test. Thus, the expectations of each pair of random intervals are compared through that test and a p-value matrix is finally obtained. The suggested clustering algorithm operates on this matrix, where each p-value can be seen at the same time as a kind of similarity between the random intervals. The algorithm is iterative and includes an objective stopping criterion that leads to clusters that are statistically similar internally and different from each other. Simulations are developed to show the empirical performance of the proposal, and the approach is applied to two real-life situations.
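As a loose sketch of this kind of procedure (not the author's exact algorithm), the code below treats a symmetric matrix of pairwise p-values as similarities and agglomerates clusters greedily, merging the most similar pair while the merged p-values stay above a significance level alpha, which serves as the objective stopping criterion. The average-linkage rule, the value of alpha, and the example matrix are assumptions for the illustration.

```python
import numpy as np

def pvalue_clustering(p_matrix, alpha=0.05):
    """Greedy agglomerative clustering on a symmetric p-value matrix.

    Clusters are merged while the average p-value between them exceeds alpha,
    so the stopping rule is 'no two remaining clusters look statistically similar'.
    Illustrative only; the published algorithm may use a different linkage and rule.
    """
    P = np.asarray(p_matrix, dtype=float)
    clusters = [[i] for i in range(P.shape[0])]
    while len(clusters) > 1:
        best, best_sim = None, alpha
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                # average-linkage similarity between clusters a and b
                sim = P[np.ix_(clusters[a], clusters[b])].mean()
                if sim > best_sim:
                    best, best_sim = (a, b), sim
        if best is None:            # every remaining pair differs significantly
            break
        a, b = best
        clusters[a] = clusters[a] + clusters[b]
        del clusters[b]
    return clusters

# Invented p-value matrix for five random intervals: {0,1,2} alike, {3,4} alike.
P = np.array([
    [1.00, 0.80, 0.70, 0.01, 0.02],
    [0.80, 1.00, 0.65, 0.02, 0.01],
    [0.70, 0.65, 1.00, 0.03, 0.02],
    [0.01, 0.02, 0.03, 1.00, 0.90],
    [0.02, 0.01, 0.02, 0.90, 1.00],
])
print(pvalue_clustering(P))   # expected: [[0, 1, 2], [3, 4]]
```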


Geosciences ◽  
2021 ◽  
Vol 11 (7) ◽  
pp. 296
Author(s):  
Richard H. Groshong

This paper is a personal account of the origin and development of the twinned-calcite strain gauge, its experimental verification, and its relationship to stress analysis. The method allows the calculation of the three-dimensional deviatoric strain tensor from five or more twin sets. A minimum of about 25 twin sets should provide a reasonably accurate result for the magnitude and orientation of the strain tensor. The opposite-signed strain axis orientation is the most accurately located. Where one strain axis is appreciably different from the other two, that axis is generally within about 10° of the correct value. Experiments confirm a magnitude accuracy of 1% strain over the range of 1–12% axial shortening, and show that samples with more than 40% negative expected values imply multiple or rotational deformations. If two deformations are at a high angle to one another, the strain calculated separately from the positive and negative expected values provides a good estimate of both deformations. Most stress analysis techniques do not provide useful magnitudes, although most provide a good estimate of the principal strain axis directions. Stress analysis based on the number of twin sets per grain provides a better than order-of-magnitude approximation to the differential stress magnitude in a constant-strain-rate experiment.
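To make the inversion step concrete, here is a rough least-squares sketch of recovering a deviatoric strain tensor from five or more twin sets. It assumes each twin set is reduced to a twin-plane normal n, a glide direction g, and a measured tensor shear strain related linearly to the strain tensor by gᵀEn; this is a generic small-strain resolution, not the specific formulation of the published strain-gauge program, and the function name and synthetic data are invented.

```python
import numpy as np

def strain_from_twin_sets(normals, glides, shears):
    """Least-squares deviatoric strain tensor from >= 5 twin sets.

    normals : (m, 3) unit normals of the twin planes
    glides  : (m, 3) unit glide (shear) directions within those planes
    shears  : (m,)   measured tensor shear strains, assumed equal to g.E.n
    Generic small-strain sketch, not the published strain-gauge code.
    """
    g = np.asarray(glides, float)
    n = np.asarray(normals, float)
    shears = np.asarray(shears, float)
    # Unknowns: [E11, E22, E12, E13, E23], with E33 = -(E11 + E22).
    A = np.column_stack([
        g[:, 0] * n[:, 0] - g[:, 2] * n[:, 2],
        g[:, 1] * n[:, 1] - g[:, 2] * n[:, 2],
        g[:, 0] * n[:, 1] + g[:, 1] * n[:, 0],
        g[:, 0] * n[:, 2] + g[:, 2] * n[:, 0],
        g[:, 1] * n[:, 2] + g[:, 2] * n[:, 1],
    ])
    x, *_ = np.linalg.lstsq(A, shears, rcond=None)
    E = np.array([
        [x[0], x[2], x[3]],
        [x[2], x[1], x[4]],
        [x[3], x[4], -(x[0] + x[1])],
    ])
    # Principal strain magnitudes and axes of the symmetric tensor.
    values, axes = np.linalg.eigh(E)
    return E, values, axes

# Synthetic check: recover a known deviatoric tensor from 6 random twin sets.
rng = np.random.default_rng(2)
E_true = np.diag([0.04, -0.01, -0.03])                 # traceless by construction
N = rng.normal(size=(6, 3)); N /= np.linalg.norm(N, axis=1, keepdims=True)
t = rng.normal(size=(6, 3))
G = t - (t * N).sum(axis=1, keepdims=True) * N         # project into twin planes
G /= np.linalg.norm(G, axis=1, keepdims=True)
shears = np.einsum('ki,ij,kj->k', G, E_true, N)
E_hat, vals, axes = strain_from_twin_sets(N, G, shears)
print(np.round(E_hat, 4))                              # recovers E_true
```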


2015 ◽  
Author(s):  
Dário Ferreira ◽  
Sandra S. Ferreira ◽  
Célia Nunes ◽  
João T. Mexia

1982 ◽  
Vol 14 (7) ◽  
pp. 869-888 ◽  
Author(s):  
P F Lesse

This paper deals with a class of models which describe spatial interactions and are based on Jaynes's principle. The variables entering these models can be partitioned into four groups: (a) probability density distributions (for example, relative traffic flows), (b) expected values (average cost of travel), (c) their duals (Lagrange multipliers, traffic impedance coefficient), and (d) operators transforming probabilities into expected values. The paper presents several dual formulations replacing the problem of maximizing entropy in terms of the group of variables (a) by equivalent extreme problems involving groups (b)–(d). These problems form the basis of a phenomenological theory. The theory makes it possible to derive useful relationships between groups (b) and (c). Two topics are discussed: (1) the practical application of the theory (with examples), and (2) the relationship between socioeconomic modelling and statistical mechanics.
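As a small, self-contained illustration of the duality between expected values and their Lagrange multipliers (in the spirit of these models, though not taken from the paper), the sketch below maximizes entropy over trip probabilities subject to a prescribed average travel cost. The multiplier beta plays the role of the traffic impedance coefficient and is found by solving the dual condition that the model's expected cost equals the target; the function name, cost matrix, and target cost are invented.

```python
import numpy as np

def max_entropy_distribution(costs, target_mean_cost, tol=1e-10):
    """Entropy-maximizing probabilities p_ij proportional to exp(-beta * c_ij)
    whose expected cost equals target_mean_cost. beta (the Lagrange multiplier,
    i.e. the impedance coefficient) is found by bisection on the dual condition.
    Illustrative sketch only."""
    c = np.asarray(costs, float).ravel()

    def expected_cost(beta):
        w = np.exp(-beta * (c - c.min()))      # shift for numerical stability
        p = w / w.sum()
        return p @ c

    lo, hi = -50.0, 50.0                       # bracket for beta (assumed wide enough)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        # the expected cost decreases as beta increases
        if expected_cost(mid) > target_mean_cost:
            lo = mid
        else:
            hi = mid
    beta = 0.5 * (lo + hi)
    w = np.exp(-beta * (c - c.min()))
    return beta, (w / w.sum()).reshape(np.shape(costs))

# Invented 3x3 cost matrix (e.g. travel costs between zones) and target average cost.
costs = np.array([[1.0, 4.0, 6.0],
                  [4.0, 1.0, 3.0],
                  [6.0, 3.0, 1.0]])
beta, p = max_entropy_distribution(costs, target_mean_cost=2.5)
print(beta)                                  # impedance coefficient dual to the constraint
print(np.round(p, 3), (p * costs).sum())     # probabilities and their expected cost (about 2.5)
```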

