Intermediate Stars in Extragalactic Radiosource Fields: Astrometric Measurements

2000 ◽  
Vol 180 ◽  
pp. 92-96
Author(s):  
Mirel Birlan ◽  
Gheorghe Bocsa

The statistical analysis of the O − C residuals and of the standard errors for the astrometric RRS2 standard stars, together with the analysis of the standard errors for the intermediate PIRS stars, is presented for 75 extragalactic radiosource fields. This study was performed at Bucharest Observatory.

1945 ◽  
Vol 35 (3) ◽  
pp. 158-162 ◽  
Author(s):  
W. Bolton ◽  
R. W. Hale

An experiment involving twelve pens of twenty-five pullets demonstrated that two types of dried potatoes compared favourably with cereals, as regards their effects on egg production, when up to 42½% of each product was included in the mash. The records of the experiment, as regards (a) the number of eggs laid in 12 months, (b) the mean weight of eggs during the year, and (c) the change in live weight during the year, were studied by analysis of variance, using the approximate method of Yates (1933) for obtaining standard errors of pen means from the variance within pens. Similar analyses of the number of eggs laid per month were carried out for each month separately. These analyses indicated the months in which increased production was obtained from the superior rations, and showed that duplicate pens had reacted similarly to the experimental feeding. The results of the statistical analyses are set out in some detail as an example of method.
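The Yates-style calculation described in this abstract, a standard error for a pen mean derived from the pooled within-pen variance, can be sketched as follows. This is an illustration only, not the authors' original computation, and the egg-count data are hypothetical:

```python
import math

def pooled_within_pen_variance(pens):
    """Pooled within-pen variance across all pens."""
    ss = 0.0
    df = 0
    for pen in pens:
        m = sum(pen) / len(pen)
        ss += sum((x - m) ** 2 for x in pen)
        df += len(pen) - 1
    return ss / df

def se_of_pen_mean(pens):
    """Approximate standard error of a pen mean (Yates-style):
    sqrt(pooled within-pen variance / birds per pen)."""
    n = len(pens[0])  # assumes equal pen sizes
    return math.sqrt(pooled_within_pen_variance(pens) / n)

# hypothetical annual egg counts for three pens of four pullets
pens = [[200, 210, 190, 205], [220, 215, 225, 230], [180, 185, 175, 190]]
se = se_of_pen_mean(pens)
```

The key idea is that the variability among birds within a pen, which is unaffected by the treatment applied to the pen, supplies the error estimate for comparing pen means.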


2006 ◽  
Vol 38 (5) ◽  
pp. 425-434 ◽  
Author(s):  
Caroline J. DORÉ ◽  
Mariette S. COLE ◽  
David L. HAWKSWORTH

In order to ascertain the extent of possible host-specialization in the Nesolechia oxyspora complex, as a pilot study, ascospores from 20 collections from different parmelioid hosts representing seven genera were drawn and measured, and their length:breadth ratios calculated. The data were then subjected to multiple regression analysis using Huber-White sandwich estimators of standard errors (apparently not previously used in mycology), which take account of the fact that spores are not necessarily independent observations, as they may come from the same ascus. Significant differences between collections from the seven genera were found. While the sample size was too small to reach definite conclusions, it is clear that there is a finer degree of host-relatedness than hitherto expected, which may be geno- or phenotypic. A more extensive study including species from a wider range of hosts and complemented by molecular methods will be necessary to further elucidate degrees of specificity and cryptic co-speciation in the complex. A list of the 63 reported lichen hosts is included; these are distributed through 19 genera.
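The clustering idea behind the Huber-White sandwich estimator can be sketched directly: ordinary least squares gives the coefficients, while the "sandwich" covariance sums score contributions per cluster (here, per ascus) so that within-cluster correlation inflates the standard errors appropriately. A minimal numpy illustration with hypothetical spore data, not the authors' analysis:

```python
import numpy as np

def ols_cluster_robust(X, y, groups):
    """OLS estimates with Huber-White cluster-robust (sandwich) standard
    errors: observations sharing a group label (e.g. spores from the same
    ascus) may be correlated rather than independent."""
    X, y = np.asarray(X, float), np.asarray(y, float)
    groups = np.asarray(groups)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    bread = np.linalg.inv(X.T @ X)
    meat = np.zeros((X.shape[1], X.shape[1]))
    for g in np.unique(groups):
        score = X[groups == g].T @ resid[groups == g]
        meat += np.outer(score, score)
    cov = bread @ meat @ bread          # the "sandwich"
    return beta, np.sqrt(np.diag(cov))

# hypothetical spore lengths: intercept plus a host dummy, clustered by ascus
X = np.column_stack([np.ones(6), [0, 0, 0, 1, 1, 1]])
y = np.array([10.0, 11.0, 12.0, 14.0, 15.0, 16.0])
groups = np.array([0, 0, 1, 1, 2, 2])
beta, se = ols_cluster_robust(X, y, groups)
```

With truly independent observations the sandwich collapses toward the usual OLS covariance; with correlated spores from one ascus it widens the intervals instead of understating them.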


2017 ◽  
Vol 16 (1) ◽  
pp. 55-65
Author(s):  
PATRICK WHITE ◽  
STEPHEN GORARD

Recent concerns about a shortage of capacity for statistical and numerical analysis skills among social science students and researchers have prompted a range of initiatives aiming to improve teaching in this area. However, these projects have rarely re-evaluated the content of what is taught to students and have instead focussed primarily on delivery. The emphasis has generally been on increased use of complex techniques, specialist software and, most importantly in the context of this paper, a continued focus on inferential statistical tests, often at the expense of other types of analysis. We argue that this ‘business as usual’ approach to the content of statistics teaching is problematic for several reasons. First, the assumptions underlying inferential statistical tests are rarely met, meaning that students are being taught analyses that should only be used very rarely. Second, all of the most common outputs of inferential statistical tests – p-values, standard errors and confidence intervals – suffer from a similar logical problem that renders them at best useless and at worst misleading. Eliminating inferential statistical tests from statistics teaching (and practice) would avoid the creation of another generation of researchers who either do not understand, or knowingly misuse, these techniques. It would also have the benefit of removing one of the key barriers to students’ understanding of statistical analysis. First published May 2017 at Statistics Education Research Journal Archives.


2016 ◽  
Vol 26 (03n04) ◽  
pp. 93-101 ◽  
Author(s):  
K. Kataoka ◽  
T. Yamada ◽  
T. Saunders ◽  
T. Takatsuji ◽  
K. Sera ◽  
...  

The objective of this study is to examine the variations in hair mineral concentrations measured by particle induced X-ray emission (PIXE). Because the concentrations reflect the historical profile of minerals in blood over periods from a few days to several years, hair mineral measurements vary among the hair strands of a sample as well as among the locations on a strand where the PIXE beam is irradiated. PIXE analysis provides not only the mineral measurements but also their associated standard errors, which arise from uncertainty in the fitting analysis of the X-ray spectrum. We developed statistical methods to decompose the intra-individual variance into the component due to measurement location and the component due to the fitting analysis. This study describes the results of the statistical analysis of the variations in six minerals (Cu, Fe, Sr, Ca, Cl and Pb) and compares the results for a better understanding of the variations in hair mineral measurements obtained by PIXE.
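A variance decomposition of this kind can be sketched with a simple method-of-moments argument: the observed variance among repeated measurements equals the location variance plus the average fitting variance, so the location component is the difference. This is an illustrative sketch under that assumption, not the authors' published method, and the data are hypothetical:

```python
def decompose_variance(measurements, fit_ses):
    """Method-of-moments split of the observed variance of repeated
    measurements along a hair strand into a component due to measurement
    location and a component due to spectrum-fitting uncertainty."""
    n = len(measurements)
    mean = sum(measurements) / n
    total = sum((x - mean) ** 2 for x in measurements) / (n - 1)
    fitting = sum(se * se for se in fit_ses) / n    # mean fitting variance
    location = max(total - fitting, 0.0)            # clamp at zero
    return total, fitting, location

# hypothetical Cu concentrations (ppm) at four spots, with PIXE fit errors
total, fitting, location = decompose_variance([10.0, 12.0, 11.0, 13.0],
                                              [0.5, 0.5, 0.5, 0.5])
```

The clamp at zero matters in practice: when the fitting errors are large relative to the spread of the measurements, the naive difference can go negative.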


Author(s):  
D. J. Dowrick ◽  
E. G. C. Smith

This paper gives a list of magnitudes on the surface wave scale for a selection of larger New Zealand earthquakes that occurred in the period 1901-1988. Most of the events considered were of shallow origin (h < 45 km), and the magnitudes ranged from about 5 to 7.8. The Analysis of Variance method of statistical analysis was used to correct the large set of station observations so as to provide consistent mean magnitudes for each event. The resulting station terms and standard errors are given. Comparisons between the results of this study and the relatively few previous Ms determinations show little change, except for one or two important events. In particular, the magnitude of the 1968 Inangahua earthquake was found to be 7.4 (± 0.07), which is somewhat greater than previous estimates.
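The correction described here amounts to fitting an additive model, observed magnitude = event magnitude + station term, by least squares. A minimal sketch of that idea (not the authors' actual procedure; the readings are hypothetical), with the station terms constrained to sum to zero so the model is identifiable:

```python
import numpy as np

def event_station_fit(obs):
    """Least-squares fit of m_ij = M_i + s_j (event magnitude plus station
    term), with the station terms forced to sum to zero via a heavily
    weighted pseudo-observation. obs maps (event, station) -> magnitude."""
    events = sorted({e for e, _ in obs})
    stations = sorted({s for _, s in obs})
    ne, ns = len(events), len(stations)
    rows, y = [], []
    for (e, s), m in obs.items():
        row = np.zeros(ne + ns)
        row[events.index(e)] = 1.0
        row[ne + stations.index(s)] = 1.0
        rows.append(row)
        y.append(m)
    constraint = np.zeros(ne + ns)
    constraint[ne:] = 1e6              # forces sum of station terms to zero
    rows.append(constraint)
    y.append(0.0)
    sol, *_ = np.linalg.lstsq(np.array(rows), np.array(y), rcond=None)
    return dict(zip(events, sol[:ne])), dict(zip(stations, sol[ne:]))

# hypothetical Ms readings for two events at three stations
obs = {('A', 'X'): 6.1, ('A', 'Y'): 5.9, ('A', 'Z'): 6.0,
       ('B', 'X'): 7.1, ('B', 'Y'): 6.9, ('B', 'Z'): 7.0}
M, s = event_station_fit(obs)
```

Once the station terms are estimated, each event's mean magnitude is consistent across the network even when different events were recorded by different subsets of stations.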


Author(s):  
G G Lucas ◽  
G G Lucas ◽  
A L Emtage

A new look is taken at the analytical procedure for reducing the results of the coast-down test. A three-degree-of-freedom model is discussed, and a derivative-based correction algorithm is shown to account for changes in ambient conditions. A parameter optimization technique for coefficient extraction is described, but not validated, owing to deficiencies in the available data. The calculation of the standard errors in the extracted coefficients provides the basis for a statistical analysis of coast-down tests. This type of analysis is applied to simulated coast-down data that contain artificial pseudo-random errors, and this leads to a discussion of the accuracy requirements of the coast-down test.
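The general shape of such an analysis, extracting road-load coefficients from coast-down data and attaching standard errors to them, can be sketched with an ordinary least-squares fit of a quadratic drag law. The model and the simulated data below are illustrative assumptions, not the paper's three-degree-of-freedom model:

```python
import numpy as np

def fit_coastdown(v, decel):
    """Least-squares fit of deceleration = a + b*v + c*v**2 to coast-down
    data; coefficient standard errors come from the estimated covariance
    matrix sigma^2 * (X'X)^-1."""
    X = np.column_stack([np.ones_like(v), v, v ** 2])
    coef, *_ = np.linalg.lstsq(X, decel, rcond=None)
    resid = decel - X @ coef
    sigma2 = resid @ resid / (len(v) - X.shape[1])
    cov = sigma2 * np.linalg.inv(X.T @ X)
    return coef, np.sqrt(np.diag(cov))

# simulated coast-down data with artificial pseudo-random errors, echoing
# the approach described in the abstract
rng = np.random.default_rng(1)
v = np.linspace(5.0, 30.0, 40)                     # road speed, m/s
decel = 0.1 + 0.01 * v + 0.002 * v ** 2 + rng.normal(0.0, 0.01, v.size)
coef, se = fit_coastdown(v, decel)
```

Repeating the fit over many simulated runs with different pseudo-random errors is exactly the kind of exercise that reveals how measurement accuracy propagates into the extracted coefficients.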


Weed Science ◽  
2015 ◽  
Vol 63 (SP1) ◽  
pp. 166-187 ◽  
Author(s):  
Christian Ritz ◽  
Andrew R. Kniss ◽  
Jens C. Streibig

There are various reasons for using statistics, but perhaps the most important is that the biological sciences are empirical sciences. There is always an element of variability that can only be dealt with by applying statistics. Essentially, statistics is a way to summarize the variability of data so that we can confidently say whether there is a difference among treatments or among regression parameters and tell others about the variability of the results. To that end, we must use the most appropriate statistics to get a “correct” picture of the experimental variability, and the best way of doing that is to report the size of the parameters or the means and their associated standard errors or confidence intervals. Simply declaring that the yields were 1 or 2 ton ha⁻¹ does not mean anything without associated standard errors for those yields. Another driving force is that no journal will accept publications without the data having been subjected to some kind of statistical analysis.
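The basic reporting practice the authors recommend, a mean with its standard error, is a one-line calculation. A minimal sketch with hypothetical plot yields:

```python
import math

def mean_and_se(values):
    """Sample mean and its standard error, s / sqrt(n)."""
    n = len(values)
    m = sum(values) / n
    s2 = sum((x - m) ** 2 for x in values) / (n - 1)
    return m, math.sqrt(s2 / n)

# hypothetical plot yields (ton/ha) for one treatment
mean_yield, se_yield = mean_and_se([1.1, 0.9, 1.0, 1.2])
```

Reporting the pair, e.g. 1.05 ± SE ton ha⁻¹, conveys both the size of the effect and how precisely it was measured.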


2021 ◽  
pp. 000370282098760
Author(s):  
Samantha Remigi ◽  
Tullio Mancini ◽  
Simona Ferrando ◽  
Maria Luce Frezzotti

Raman spectroscopy has been used extensively to calculate CO2 fluid density in many geological environments, based on the measurement of the Fermi diad split (Δ, cm⁻¹) in the CO2 spectrum. While recent research has allowed the calibration of several Raman CO2 densimeters, there is a limit to the inter-laboratory application of published equations: they calculate two classes of density values for the same measured Δ, with a deviation of 0.09 ± 0.02 g/cm³ on average. To elucidate the influence of experimental parameters on the calibration of Raman CO2 densimeters, we propose a bottom-up approach, beginning with the calibration of a new equation to evaluate a possible instrument-dependent variability induced by experimental conditions. We then develop bootstrapped confidence intervals for the density estimates of existing equations, to move the statistical analysis from a sample-specific to a population level. We find that Raman densimeter equations calibrated from spectra acquired at similar spectral resolution calculate CO2 density values lying within the standard errors of the equations and are suitable for inter-laboratory application. The statistical analysis confirms that equations calibrated at similar spectral resolution calculate CO2 densities equivalent at 95% confidence, and that each Raman densimeter has a limit of applicability, statistically defined by a minimum Δ value, below which the error in calculated CO2 densities is too high.
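A bootstrapped confidence interval for a densimeter prediction can be sketched by resampling the calibration points with replacement, refitting each time, and taking percentiles of the predictions. The sketch below uses a simple linear calibration and invented calibration data purely for illustration; published densimeter equations are typically higher-order polynomials in Δ:

```python
import numpy as np

def bootstrap_density_ci(delta, density, delta_new, n_boot=2000, seed=0):
    """Percentile-bootstrap 95% confidence interval for the CO2 density
    predicted at a new Fermi diad split delta_new from a linear
    calibration fit."""
    rng = np.random.default_rng(seed)
    delta = np.asarray(delta, float)
    density = np.asarray(density, float)
    preds = np.empty(n_boot)
    for i in range(n_boot):
        idx = rng.integers(0, delta.size, delta.size)   # resample pairs
        slope, intercept = np.polyfit(delta[idx], density[idx], 1)
        preds[i] = slope * delta_new + intercept
    return np.percentile(preds, [2.5, 97.5])

# hypothetical calibration points: diad split (cm^-1) vs density (g/cm^3)
delta = np.linspace(102.7, 104.0, 8)
density = 0.7 * (delta - 102.7) + np.array(
    [0.01, -0.01, 0.005, -0.005, 0.01, -0.01, 0.005, -0.005])
lo, hi = bootstrap_density_ci(delta, density, 103.5)
```

Because the interval is built from the calibration data themselves, it reflects the population-level uncertainty of the equation rather than the scatter of a single sample.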


1966 ◽  
Vol 24 ◽  
pp. 188-189
Author(s):  
T. J. Deeming

If we make a set of measurements, such as narrow-band or multicolour photo-electric measurements, which are designed to improve a scheme of classification, and in particular if they are designed to extend the number of dimensions of classification, i.e. the number of classification parameters, then some important problems of analytical procedure arise. First, it is important not to reproduce the errors of the classification scheme which we are trying to improve. Second, when trying to extend the number of dimensions of classification we have little or nothing with which to test the validity of the new parameters.

Problems similar to these have occurred in other areas of scientific research (notably psychology and education) and the branch of Statistics called Multivariate Analysis has been developed to deal with them. The techniques of this subject are largely unknown to astronomers, but, if carefully applied, they should at the very least ensure that the astronomer gets the maximum amount of information out of his data and does not waste his time looking for information which is not there. More optimistically, these techniques are potentially capable of indicating the number of classification parameters necessary and giving specific formulas for computing them, as well as pinpointing those particular measurements which are most crucial for determining the classification parameters.
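One standard multivariate tool for the question "how many classification parameters do the measurements actually support?" is principal component analysis. The sketch below, offered only as an illustration of the kind of technique the abstract points to, counts the components needed to explain a chosen fraction of the variance in standardized photometric indices (the data are invented):

```python
import numpy as np

def n_classification_parameters(X, var_threshold=0.95):
    """PCA on standardized measurements: returns the number of components
    needed to explain var_threshold of the total variance, plus the
    loadings that define those components."""
    X = np.asarray(X, float)
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    vals, vecs = np.linalg.eigh(np.cov(Z, rowvar=False))
    order = np.argsort(vals)[::-1]                 # largest variance first
    vals, vecs = vals[order], vecs[:, order]
    explained = np.cumsum(vals) / vals.sum()
    k = int(np.searchsorted(explained, var_threshold) + 1)
    return k, vecs[:, :k]

# five stars, three hypothetical photometric indices; the first two indices
# are perfectly correlated, so two components carry all the information
t = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
u = np.array([2.0, 1.0, 3.0, 5.0, 4.0])
k, loadings = n_classification_parameters(np.column_stack([t, 2 * t, u]))
```

The loadings are exactly the "specific formulas for computing them" the abstract mentions: each retained component is a weighted combination of the original measurements.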


Author(s):  
Gianluigi Botton ◽  
Gilles L'espérance

As interest in parallel EELS spectrum imaging grows in laboratories equipped with commercial spectrometers, different approaches have been used in recent years by a few research groups in the development of the technique of spectrum imaging, as reported in the literature. Spectrum images can now be obtained either by controlling both the microscope and the spectrometer with a personal computer, or by using more powerful workstations interfaced to conventional multichannel analysers with commercially available programs to control the microscope and the spectrometer. Work on the limits of the technique, in terms of its quantitative performance, was reported by the present author in a systematic study of artifacts, detection limits, statistical errors as a function of the desired spatial resolution, and the range of chemical elements that can be studied in a map. The aim of the present paper is to show an application of quantitative parallel EELS spectrum imaging in which statistical analysis is performed at each pixel, interpretation is carried out using criteria established from that statistical analysis, and variations in composition are analyzed with the help of information retrieved from t/γ maps so that artifacts are avoided.

