Strategies for Data Handling and Statistical Analysis in Metabolomics Studies

Author(s):
Marianne Defernez, Gwénaëlle Le Gall

Author(s):
Simon Lin, Salvatore Mungal, Richard Haney, Edward F. Patz Jr., Patrick McConnell

This chapter provides a rudimentary review of the field of proteomics as it applies to mass spectrometry, data handling, and analysis. It points out the potential significance of the field, noting that the study of nucleic acids has its limitations and that proteomics by mass spectrometry, in tandem with transcription studies, could help elucidate the link between RNA transcription and concomitant protein expression. Furthermore, we describe the fundamentals of proteomics with mass spectrometry and expound the methodology necessary to manage the vast amounts of data generated in order to facilitate statistical analysis. We explore the underlying technologies with the intention of demystifying the complexities of this nascent field and of fueling readers' interest.
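As an illustration of the kind of linkage the chapter alludes to, the sketch below correlates matched transcript and protein abundances. The data, the lognormal noise model, and the sample size are assumptions for illustration, not material from the chapter.

```python
# Minimal sketch (data are simulated, not from the chapter): probing the link
# between RNA transcription and protein expression by rank-correlating
# hypothetical transcript abundances with matched protein abundances.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
transcript = rng.lognormal(mean=2.0, sigma=1.0, size=200)            # mRNA levels (arbitrary units)
protein = transcript * rng.lognormal(mean=0.0, sigma=0.8, size=200)  # noisy protein response

rho, p_value = spearmanr(transcript, protein)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3g}")
```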


1989, Vol. 67 (1), pp. 55-60
Author(s):
L. R. Linton, E. S. Edgington, Ronald W. Davies

Published values of niche overlap (similarity) are generally point estimates of the similarity between population centroids. A number of shortcomings are associated with this method of data presentation: (i) confidence intervals on the estimates are lacking, (ii) no statistical procedures are used to test for significant differences between estimates, and (iii) the estimates tend to be biased. These problems arise primarily as a result of the manner in which data are pooled for the calculations. We illustrate an alternative method of data handling, involving calculation of similarity between individuals, which is more biologically reasonable and which eliminates these problems. A permutation test procedure is also introduced for use on large sets of data.
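The paper's exact randomization procedure is not reproduced in the abstract, so the sketch below shows a generic two-sample permutation test on per-individual overlap scores in the same spirit. The group labels, sample sizes, and beta-distributed scores are hypothetical.

```python
# Minimal sketch of a permutation test for a difference in mean niche overlap
# computed per individual rather than from pooled population centroids.
import numpy as np

rng = np.random.default_rng(1)
overlap_site_a = rng.beta(5, 2, size=30)  # per-individual overlap scores, site A (hypothetical)
overlap_site_b = rng.beta(4, 3, size=30)  # per-individual overlap scores, site B (hypothetical)

observed = overlap_site_a.mean() - overlap_site_b.mean()
pooled = np.concatenate([overlap_site_a, overlap_site_b])
n_a = len(overlap_site_a)

n_perm = 10_000
count = 0
for _ in range(n_perm):
    rng.shuffle(pooled)                               # random relabelling of individuals
    diff = pooled[:n_a].mean() - pooled[n_a:].mean()
    if abs(diff) >= abs(observed):
        count += 1

p_value = (count + 1) / (n_perm + 1)  # add-one correction gives a valid p-value
print(f"observed difference = {observed:.3f}, permutation p = {p_value:.4f}")
```

Because only labels are shuffled, the test makes no distributional assumption about the overlap scores, which is what makes it attractive for large observational data sets.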


1988, Vol. 24 (3), pp. 343-353
Author(s):
D. J. Finney

SUMMARY: Good data handling is a prerequisite for good statistical analysis and interpretation. This paper begins by distinguishing types of numerical data. Care for original data and the risks involved in any process of copying data are emphasized. After discussion of the entry of data into a computer file, problems of scrutiny and checking are discussed at length. An attempt is made to suggest what action should follow when such scrutiny discloses anomalies in some values; if explanations can be found, or if clear evidence indicates corrections that are needed, no problem arises, but otherwise various subjective responses need consideration. Advice is offered on the numbers of digits to be retained in computations and reports. A final section presents ideas on the filing of data and analyses from an experiment.
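A minimal sketch of the kind of automated scrutiny the paper argues for, applied as data enter a computer file. The field names, plausibility limits, and CSV layout are hypothetical; real checks would be tailored to the experiment.

```python
# Minimal sketch: range and consistency checks on freshly entered data,
# flagging anomalous values for human review rather than silently "fixing" them.
import csv

CHECKS = {
    "plot_yield": lambda v: 0.0 <= float(v) <= 20.0,                 # tonnes/ha, assumed plausible range
    "plant_count": lambda v: float(v).is_integer() and float(v) >= 0,  # must be a non-negative integer
}

def scrutinize(path):
    """Return (row_number, field, value) for every value failing a check."""
    anomalies = []
    with open(path, newline="") as fh:
        for i, row in enumerate(csv.DictReader(fh), start=2):  # row 1 is the header
            for field, ok in CHECKS.items():
                try:
                    if not ok(row[field]):
                        anomalies.append((i, field, row[field]))
                except (ValueError, KeyError):                  # unparseable or missing value
                    anomalies.append((i, field, row.get(field)))
    return anomalies
```

As the paper stresses, a flagged value should be corrected only when an explanation or clear evidence supports the correction; the script's job ends at disclosure.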


1966, Vol. 24, pp. 188-189
Author(s):
T. J. Deeming

If we make a set of measurements, such as narrow-band or multicolour photo-electric measurements, which are designed to improve a scheme of classification, and in particular if they are designed to extend the number of dimensions of classification, i.e. the number of classification parameters, then some important problems of analytical procedure arise. First, it is important not to reproduce the errors of the classification scheme which we are trying to improve. Second, when trying to extend the number of dimensions of classification we have little or nothing with which to test the validity of the new parameters.

Problems similar to these have occurred in other areas of scientific research (notably psychology and education) and the branch of Statistics called Multivariate Analysis has been developed to deal with them. The techniques of this subject are largely unknown to astronomers, but, if carefully applied, they should at the very least ensure that the astronomer gets the maximum amount of information out of his data and does not waste his time looking for information which is not there. More optimistically, these techniques are potentially capable of indicating the number of classification parameters necessary and giving specific formulas for computing them, as well as pinpointing those particular measurements which are most crucial for determining the classification parameters.
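One standard multivariate technique that does what the abstract describes, indicating how many classification parameters the measurements support and giving explicit formulas for computing them, is principal component analysis. The sketch below applies it to simulated narrow-band magnitudes; the number of stars, bands, and latent parameters are all assumptions, and the abstract does not single out this particular method.

```python
# Minimal sketch: PCA via SVD on simulated multicolour photometry.
# 300 stars, 6 hypothetical narrow-band magnitudes driven by 2 latent parameters.
import numpy as np

rng = np.random.default_rng(2)
latent = rng.normal(size=(300, 2))                      # true classification parameters
loadings = rng.normal(size=(2, 6))                      # how each band responds to them
magnitudes = latent @ loadings + 0.05 * rng.normal(size=(300, 6))  # measurements + noise

centered = magnitudes - magnitudes.mean(axis=0)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / np.sum(s**2)
print("variance explained per component:", np.round(explained, 3))

# Rows of Vt are the "specific formulas": each recovered classification
# parameter is a fixed linear combination of the original measurements.
scores = centered @ Vt[:2].T   # the two recovered classification parameters
```

A sharp drop in explained variance after the first few components is the signal that no further classification parameters are supported by the data.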


Author(s):
Gianluigi Botton, Gilles L'espérance

As interest in parallel EELS spectrum imaging grows in laboratories equipped with commercial spectrometers, several approaches to the technique have been reported in the literature in recent years. Spectrum images can now be obtained either by controlling both the microscope and the spectrometer with a personal computer, or by using more powerful workstations interfaced to conventional multichannel analysers with commercially available programs to control the microscope and the spectrometer. The limits of the technique in terms of quantitative performance were reported by the present author in a systematic study of artifacts, detection limits, statistical errors as a function of desired spatial resolution, and the range of chemical elements that can be studied in a map. The aim of the present paper is to show an application of quantitative parallel EELS spectrum imaging in which a statistical analysis is performed at each pixel, interpretation is carried out using criteria established from that analysis, and variations in composition are analyzed with the help of information retrieved from t/λ maps so that artifacts are avoided.
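The paper's actual per-pixel criteria are not given in the abstract, so the sketch below shows a generic decision rule of the kind described: a counting-statistics detection threshold combined with a mask on the relative-thickness (t/λ) map. All arrays, the 3-sigma criterion, and the t/λ cutoff are assumptions for illustration.

```python
# Minimal sketch: keep a composition estimate only where the edge signal
# exceeds a statistical detection limit, and mask pixels whose relative
# thickness t/lambda is high enough that plural scattering creates artifacts.
import numpy as np

rng = np.random.default_rng(3)
shape = (64, 64)
signal = rng.poisson(lam=40.0, size=shape).astype(float)      # background-subtracted edge counts (simulated)
background = rng.poisson(lam=400.0, size=shape).astype(float) # extrapolated background counts (simulated)
t_over_lambda = rng.uniform(0.1, 1.2, size=shape)             # relative thickness map (simulated)

# Simple 3-sigma detection criterion from Poisson counting statistics
sigma = np.sqrt(signal + background)
detected = signal > 3.0 * sigma
thin_enough = t_over_lambda < 0.8    # cutoff is an assumed example value

composition_map = np.where(detected & thin_enough, signal, np.nan)
print(f"quantifiable pixels: {np.count_nonzero(detected & thin_enough)} / {signal.size}")
```

Masking on both criteria means the final map stays silent where the statistics cannot support a claim, which is the spirit of the artifact-avoidance strategy the paper describes.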

