A novel bioinformatics approach to identify the consistently well-performing normalization strategy for current metabolomic studies

2019 ◽  
Vol 21 (6) ◽  
pp. 2142-2152 ◽  
Author(s):  
Qingxia Yang ◽  
Jiajun Hong ◽  
Yi Li ◽  
Weiwei Xue ◽  
Song Li ◽  
...  

Abstract: Unwanted experimental/biological variation and technical error are frequently encountered in current metabolomics, which requires the employment of normalization methods for removing undesired data fluctuations. To ensure the ‘thorough’ removal of unwanted variations, the collective consideration of multiple criteria (‘intragroup variation’, ‘marker stability’ and ‘classification capability’) is essential. However, due to the limited number of available normalization methods, it is extremely challenging to discover an appropriate one that can meet all these criteria. Herein, a novel approach was proposed to discover the normalization strategies that are consistently well performing (CWP) under all criteria. Based on various benchmarks, all normalization methods popular in current metabolomics were ‘first’ found to be non-CWP. ‘Then’, 21 new strategies that combined a ‘sample’-based method with a ‘metabolite’-based one were found to be CWP. ‘Finally’, a variety of currently available methods (such as cubic splines, range scaling, level scaling, EigenMS, cyclic loess and mean) were identified to be CWP when combined with another normalization method. In conclusion, this study not only discovered several strategies that performed consistently well under all criteria, but also proposed a novel approach that can ensure the identification of CWP strategies for future biological problems.
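As an illustration of the kind of combined strategy evaluated here, the following is a minimal Python sketch that chains a sample-based normalization (total-sum scaling) with a metabolite-based one (range scaling). The function names and simulated data are hypothetical; this is not the authors' implementation, only a sketch of the 'sample + metabolite' idea.

```python
import numpy as np

def sum_normalize(X):
    """Sample-based step: scale each sample (row) so its total intensity equals 1."""
    return X / X.sum(axis=1, keepdims=True)

def range_scale(X):
    """Metabolite-based step: centre each metabolite (column) and divide by its range."""
    centred = X - X.mean(axis=0)
    spread = X.max(axis=0) - X.min(axis=0)
    return centred / np.where(spread == 0, 1.0, spread)

def combined_normalize(X):
    """Chain a sample-based method with a metabolite-based one, mirroring the
    combined strategies assessed against all three criteria in the study."""
    return range_scale(sum_normalize(X))

# Simulated peak table: 20 samples x 50 metabolites of log-normal intensities
rng = np.random.default_rng(0)
X = rng.lognormal(mean=5.0, sigma=1.0, size=(20, 50))
X_norm = combined_normalize(X)
```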

Author(s):  
Arjun Bhattacharya ◽  
Alina M Hamilton ◽  
Helena Furberg ◽  
Eugene Pietzak ◽  
Mark P Purdue ◽  
...  

Abstract: The NanoString RNA counting assay for formalin-fixed paraffin-embedded samples is unique in its sensitivity, technical reproducibility and robustness for analysis of clinical and archival samples. While commercial normalization methods are provided by NanoString, they are not optimal for all settings, particularly when samples exhibit strong technical or biological variation or where housekeeping genes have variable performance across the cohort. Here, we develop and evaluate a more comprehensive normalization procedure for NanoString data with steps for quality control, selection of housekeeping targets, normalization and iterative data visualization and biological validation. The approach was evaluated using a large cohort (N = 1649) from the Carolina Breast Cancer Study, two cohorts of moderate sample size (N = 359 and N = 130) and a small published dataset (N = 12). The iterative process developed here eliminates technical variation (e.g. from different study phases or sites) more reliably than the three other methods, including NanoString’s commercial package, without diminishing biological variation, especially in long-term longitudinal multiphase or multisite cohorts. We also find that probe sets validated for nCounter, such as the PAM50 gene signature, are impervious to batch issues. This work emphasizes that systematic quality control, normalization and visualization of NanoString nCounter data are an imperative component of study design that influences results in downstream analyses.
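For readers unfamiliar with housekeeping-based scaling, the sketch below shows the general idea in Python: keep the most stable candidate housekeeping genes and scale each sample by its housekeeping geometric mean. This is a simplified, hypothetical illustration, not NanoString's commercial algorithm nor the authors' published pipeline, and the gene names and counts are invented.

```python
import numpy as np
import pandas as pd

def select_housekeepers(counts, candidates, n_keep=5):
    """Keep the candidate housekeeping genes with the most stable log2 expression (lowest CV)."""
    log2 = np.log2(counts.loc[candidates] + 1)
    cv = log2.std(axis=1) / log2.mean(axis=1)
    return cv.nsmallest(n_keep).index.tolist()

def housekeeping_normalize(counts, housekeepers):
    """Scale each sample so its housekeeping geometric mean matches the cohort average."""
    geo_mean = np.exp(np.log(counts.loc[housekeepers] + 1).mean(axis=0))
    factors = geo_mean.mean() / geo_mean
    return counts * factors

# Toy example: 4 genes x 3 samples of raw nCounter-style counts (hypothetical)
counts = pd.DataFrame(
    {"s1": [120, 80, 1500, 30], "s2": [240, 160, 2900, 65], "s3": [60, 40, 800, 15]},
    index=["ACTB", "GAPDH", "ERBB2", "ESR1"],
)
hk = select_housekeepers(counts, candidates=["ACTB", "GAPDH"], n_keep=2)
normalized = housekeeping_normalize(counts, hk)
```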


2019 ◽  
Vol 160 (44) ◽  
pp. 1727-1734
Author(s):  
László Tamási ◽  
Ágnes Miksi ◽  
Zsófia Kardos ◽  
Ágnes Flórián ◽  
Zoltán Szekanecz

Abstract: The authors discuss the musculoskeletal aspects of obesity from a novel perspective. Biochemical changes associated with obesity, and especially with metabolic syndrome, may have a great impact on the function of bones, joints and muscles. Therefore, a new view and new strategies are needed in rheumatic diseases. Obesity-associated metabolic changes should be considered both when following the course of inflammatory rheumatic diseases and when selecting their treatment. Individualised treatment is also necessary because of associated comorbidities. Orv Hetil. 2019; 160(44): 1727–1734.


Author(s):  
Ava Farley ◽  
Gary J. Slater ◽  
Karen Hind

Athletic populations require high-precision body composition assessments to identify true change. Least significant change determines technical error via same-day consecutive tests but does not integrate biological variation, which is more relevant for longitudinal monitoring. The aim of this study was to assess biological variation using least significant change measures from body composition methods used on athletes, including surface anthropometry (SA), air displacement plethysmography (BOD POD), dual-energy X-ray absorptiometry (DXA), and bioelectrical impedance spectroscopy (BIS). Thirty-two athletic males (age = 31 ± 7 years; stature = 183 ± 7 cm; mass = 92 ± 10 kg) underwent three testing sessions over 2 days using the four methods. Least significant change values were calculated from the differences between Day 1 Test 1 and Day 1 Test 2 (same-day precision) and between Day 1 Test 1 and Day 2 (consecutive-day precision). There was high agreement between same-day and consecutive-day fat mass and fat-free mass measurements for all methods. Compared with the same-day precision error, the consecutive-day precision error for fat mass estimates was 50% higher from BIS (3,607 vs. 2,331 g), 25% higher from BOD POD (1,943 vs. 1,448 g) and DXA (1,615 vs. 1,204 g), but negligible from SA (442 vs. 586 g). For fat-free mass, the consecutive-day precision error was 50% higher from BIS (3,966 vs. 2,276 g) and SA (1,159 vs. 568 g) and 25% higher from BOD POD (1,894 vs. 1,450 g) and DXA (1,967 vs. 1,461 g) than the same-day precision error. Precision error from consecutive-day analysis captures both technical error and biological variation, enhancing the identification of small yet significant changes in the body composition of resistance-trained male athletes. Given that changes in physique are likely to be small in this population, the use of DXA, BOD POD, or SA is recommended.
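As a rough illustration of how least significant change is derived from paired tests, here is a minimal Python sketch using the common convention LSC ≈ 2.77 × precision error at 95% confidence, with the precision error taken as the root-mean-square SD of paired differences. The numbers are hypothetical; same-day pairs isolate technical error, while consecutive-day pairs also fold in biological variation, as argued above.

```python
import numpy as np

def precision_error(test_a, test_b):
    """Root-mean-square SD of paired measurements: sqrt(sum(d^2) / (2n))."""
    d = np.asarray(test_a, float) - np.asarray(test_b, float)
    return np.sqrt(np.sum(d ** 2) / (2 * len(d)))

def least_significant_change(test_a, test_b, k=2.77):
    """LSC at ~95% confidence, conventionally 2.77 x precision error."""
    return k * precision_error(test_a, test_b)

# Hypothetical fat-mass estimates (g) for four athletes
day1_test1 = [14200, 18900, 12100, 20500]
day1_test2 = [14050, 19150, 12300, 20280]   # same-day repeat -> technical error only
day2_test1 = [14800, 18300, 12900, 19900]   # consecutive day -> technical + biological

lsc_same_day = least_significant_change(day1_test1, day1_test2)
lsc_consecutive = least_significant_change(day1_test1, day2_test1)
```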


2021 ◽  
Author(s):  
Yusheng Liu ◽  
Yiwei Zhang ◽  
Falong Lu ◽  
Jiaqiang Wang

Abstract: The normalization of high-throughput RNA sequencing (RNA-seq) data is needed to accurately analyze gene expression levels. Traditional normalization methods can either correct differences in sequencing depth alone, or correct both sequencing depth and other unwanted variation introduced during sequencing library preparation through exogenous spike-ins [1-4]. However, exogenous spike-ins are themselves prone to variation [5,6]. Therefore, a better normalization approach with a more appropriate reference is still needed. In this study, we demonstrated that mitochondrial mRNA (mRNA encoded by the mitochondrial genome) can serve as a steady endogenous reference for RNA-seq data analysis, and that it performs better than exogenous spike-ins. We also found that using mitochondrial mRNA as a reference can reduce batch effects in RNA-seq data. These results provide a simple and practical normalization strategy for RNA-seq data, which will serve as a valuable tool widely applicable to transcriptomic studies.
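A minimal Python sketch of the idea, assuming mitochondrial transcripts can be identified by a name prefix (e.g. 'MT-' in human annotation): each sample is scaled by its total mitochondrial mRNA counts, analogous to how spike-in totals are commonly used. The function and the toy counts are hypothetical, not the authors' code.

```python
import pandas as pd

def mito_normalize(counts, mito_prefix="MT-", scale=1e6):
    """Scale each sample (column) by its total mitochondrial mRNA counts,
    using mitochondrial transcripts as an endogenous reference."""
    is_mito = counts.index.str.startswith(mito_prefix)
    mito_totals = counts.loc[is_mito].sum(axis=0)
    return counts.div(mito_totals, axis=1) * scale

# Toy counts table: genes x samples (hypothetical)
counts = pd.DataFrame(
    {"s1": [500, 300, 12000, 8000], "s2": [900, 650, 26000, 17500]},
    index=["ACTB", "GAPDH", "MT-CO1", "MT-ND1"],
)
normalized = mito_normalize(counts)
```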


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Azzurra Valerio ◽  
C. Steven Borrego ◽  
Luigi Boitani ◽  
Luca Casadei ◽  
Alessandro Giuliani ◽  
...  

Abstract: Few field tests have assessed the effects of predator-induced stress on prey fitness, particularly in large carnivore-ungulate systems. Because traditional measures of stress present limitations when applied to free-ranging animals, new strategies and systemic methodologies are needed. Recent studies have shown that stress- and anxiety-related behaviors can influence the metabolic activity of the gut microbiome in mammal hosts, and these metabolic alterations may aid in the identification of stress. In this study, we used NMR-based fecal metabolomic fingerprinting to compare the fecal metabolome, a functional readout of the gut microbiome, of cattle herds grazing in low vs. high wolf-impacted areas within three wolf pack territories. Additionally, we evaluated whether factors other than wolf presence (e.g., cattle nutritional state, climate, landscape) were related to the variation in cattle metabolism. By collecting longitudinal fecal samples from GPS-collared cattle, we found relevant metabolic differences between cattle herds in areas where the probability of wolf pack interaction was higher. Moreover, cattle distance to GPS-collared wolves was the factor most strongly correlated with this difference in cattle metabolism, potentially reflecting variation in wolf predation risk. We further validated our results with a regression model that reconstructed cattle distances to GPS-collared wolves from the metabolic difference between cattle herds. Although further research is needed to explore whether similar patterns also hold at a finer scale, our results suggest that fecal metabolomic fingerprinting is a promising tool for assessing the physiological responses of prey to predation risk. This novel approach will help improve our knowledge of the consequences of predators beyond the direct effect of predation.
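The abstract does not specify the regression model used; as a loose illustration of the workflow only, the sketch below cross-validates a partial least squares regression of wolf distance on binned NMR fingerprints using scikit-learn. The data are randomly generated placeholders (so the correlation will be near zero); only the structure of the analysis is shown.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Placeholder data: rows = fecal samples, columns = binned NMR intensities
rng = np.random.default_rng(1)
X = rng.normal(size=(60, 120))            # metabolomic fingerprints (simulated)
y = rng.uniform(200, 5000, size=60)       # distance (m) to nearest GPS-collared wolf (simulated)

pls = PLSRegression(n_components=3)
y_hat = cross_val_predict(pls, X, y, cv=5).ravel()
r = np.corrcoef(y, y_hat)[0, 1]           # agreement between observed and reconstructed distances
```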


2019 ◽  
Vol 806 ◽  
pp. 87-92
Author(s):  
Arseniy Portnyagin ◽  
Alexey Golikov ◽  
Evgenii K. Papynov ◽  
Valentin Avramenko

Temperature-programmed reduction (TPR) is a widely used method for characterizing oxide-based catalysts, sorbents, and functional materials, but its results often lack quantitative assessment. Here, we present a novel approach to kinetic analysis of TPR data that can be applied to a large variety of systems involving multiple limiting stages. Using cubic splines to approximate the rate constant vs. conversion dependencies obtained from several TPR curves recorded at different heating rates yields a set of kinetic parameters (activation energies and preexponential factors) for all reduction stages. A relationship between the preexponential factor of the first reduction stage and the specific surface area of the sample has also been established. The reduction of hematite was studied to demonstrate the performance of the developed kinetic analysis technique.
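The sketch below is not the authors' spline-based rate-constant procedure but a generic model-free (Friedman-type) isoconversional analysis in the same spirit: cubic splines smooth the conversion curves recorded at several heating rates, and Arrhenius fits at fixed conversions return apparent activation energies and preexponential factors (here assuming first-order kinetics). The simulated curves and parameter values are hypothetical.

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.integrate import cumulative_trapezoid

R = 8.314  # J mol^-1 K^-1

def friedman_analysis(curves, alphas=np.linspace(0.1, 0.9, 9)):
    """Model-free isoconversional analysis of TPR curves.
    curves: list of (beta [K/s], T [K], alpha) tuples, one per heating rate.
    Returns conversions, apparent activation energies (J/mol) and ln A values."""
    Ea, lnA = [], []
    for a in alphas:
        inv_T, ln_rate = [], []
        for beta, T, alpha in curves:
            spline = CubicSpline(T, alpha)            # smooth alpha(T)
            T_a = np.interp(a, alpha, T)              # temperature at this conversion
            rate = beta * spline.derivative()(T_a)    # d(alpha)/dt = beta * d(alpha)/dT
            inv_T.append(1.0 / T_a)
            ln_rate.append(np.log(rate))
        slope, intercept = np.polyfit(inv_T, ln_rate, 1)
        Ea.append(-slope * R)
        lnA.append(intercept - np.log(1.0 - a))       # assumes f(alpha) = 1 - alpha
    return np.asarray(alphas), np.asarray(Ea), np.asarray(lnA)

def simulate_tpr(beta, Ea=90e3, A=1e7, T=np.linspace(450.0, 750.0, 600)):
    """Synthetic first-order reduction curve at heating rate beta (K/s)."""
    k = A * np.exp(-Ea / (R * T))
    g = cumulative_trapezoid(k, T, initial=0.0) / beta
    return beta, T, np.clip(1.0 - np.exp(-g), 1e-9, 1.0 - 1e-9)

# Three heating rates (5, 10 and 20 K/min converted to K/s)
curves = [simulate_tpr(b / 60.0) for b in (5.0, 10.0, 20.0)]
alphas, Ea_est, lnA_est = friedman_analysis(curves)   # Ea_est should recover ~90 kJ/mol
```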

