A Python-Based Pipeline for Preprocessing LC–MS Data for Untargeted Metabolomics Workflows

Metabolites ◽  
2020 ◽  
Vol 10 (10) ◽  
pp. 416
Author(s):  
Gabriel Riquelme ◽  
Nicolás Zabalegui ◽  
Pablo Marchi ◽  
Christina M. Jones ◽  
María Eugenia Monge

Preprocessing data in a reproducible and robust way is one of the current challenges in untargeted metabolomics workflows. Data curation in liquid chromatography–mass spectrometry (LC–MS) involves the removal of biologically non-relevant features (retention time, m/z pairs) to retain only high-quality data for subsequent analysis and interpretation. The present work introduces TidyMS, a package for the Python programming language for preprocessing LC–MS data for quality control (QC) procedures in untargeted metabolomics workflows. It is a versatile strategy that can be customized or fit for purpose according to the specific metabolomics application. It enables quality control procedures that ensure accuracy and reliability in LC–MS measurements, and it supports preprocessing of metabolomics data to obtain cleaned matrices for subsequent statistical analysis. The capabilities of the package are shown with pipelines for an LC–MS system suitability check, system conditioning, signal drift evaluation, and data curation. These applications were implemented to preprocess data corresponding to a new suite of candidate plasma reference materials developed by the National Institute of Standards and Technology (NIST; hypertriglyceridemic, diabetic, and African-American plasma pools) to be used in untargeted metabolomics studies in addition to NIST SRM 1950 Metabolites in Frozen Human Plasma. The package offers a rapid and reproducible workflow that can be used in an automated or semi-automated fashion, and it is an open and free tool available to all users.
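The curation step described above — removing features that are not reliably measured across QC injections — can be sketched as follows. This is a generic illustration, not the TidyMS API; the function name, the 20% CV cutoff, and the 80% QC detection rate are hypothetical defaults chosen for the example:

```python
import numpy as np
import pandas as pd

def curate_features(data: pd.DataFrame, qc_samples: list,
                    max_qc_cv: float = 0.2,
                    min_qc_detection: float = 0.8) -> pd.DataFrame:
    """Keep only features that are reliably detected in QC injections.

    data: samples x features intensity matrix.
    qc_samples: index labels of the QC rows in `data`.
    """
    qc = data.loc[qc_samples]
    detection_rate = (qc > 0).mean()   # fraction of QC injections where the feature is seen
    cv = qc.std() / qc.mean()          # relative standard deviation across QC injections
    keep = (detection_rate >= min_qc_detection) & (cv <= max_qc_cv)
    return data.loc[:, keep]
```

A feature matrix filtered this way retains only (retention time, m/z) features whose intensity is stable in repeated QC injections, which is the kind of cleaned matrix the abstract describes passing on to statistical analysis.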

2021 ◽  
Vol 13 (20) ◽  
pp. 4081
Author(s):  
Peter Weston ◽  
Patricia de Rosnay

Brightness temperature (Tb) observations from the European Space Agency (ESA) Soil Moisture Ocean Salinity (SMOS) instrument are passively monitored in the European Centre for Medium-range Weather Forecasts (ECMWF) Integrated Forecasting System (IFS). Several quality control procedures are performed to screen out poor quality data and/or data that cannot be accurately simulated from the numerical weather prediction (NWP) model output. In this paper, these quality control procedures are reviewed, and enhancements are proposed, tested, and evaluated. The enhancements presented include improved sea ice screening, coastal and ambiguous land-ocean screening, improved radio frequency interference (RFI) screening, and increased usage of observations at the edge of the satellite swath. Each of the screening changes results in improved agreement between the observations and model equivalent values. This is an important step in advance of future experiments to test the direct assimilation of SMOS Tbs into the ECMWF land data assimilation system.
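A screening pass of the kind reviewed above can be sketched as a sequence of boolean masks. The structure (sea-ice, coastal land-ocean, RFI, and swath-edge checks) follows the abstract; the threshold values and variable names are illustrative assumptions, not the operational ECMWF settings:

```python
import numpy as np

def screen_tb(tb, sea_ice_frac, land_frac, rfi_prob, incidence_angle,
              max_ice=0.01, max_ocean_mix=0.1, max_rfi=0.2, max_angle=55.0):
    """Return a boolean mask of SMOS Tb observations that pass screening.

    All inputs are 1-D arrays of equal length; thresholds are hypothetical.
    """
    ok = np.isfinite(tb)                     # reject missing/invalid Tb values
    ok &= sea_ice_frac <= max_ice            # sea-ice screening
    ok &= land_frac >= 1.0 - max_ocean_mix   # reject ambiguous coastal land-ocean pixels
    ok &= rfi_prob <= max_rfi                # radio frequency interference screening
    ok &= incidence_angle <= max_angle       # controls usage toward the swath edge
    return ok
```

Observations failing any single check are excluded from the comparison against model equivalent values; relaxing `max_angle` is one way to express the "increased usage of observations at the edge of the satellite swath" mentioned in the abstract.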


Author(s):  
Hua Younan

Abstract In wafer fabrication (Fab), Fluorine (F) based gases are used for the Al bondpad opening process. Thus, even a regular Al bondpad carries a low level of F contamination; however, the F level must be kept below the control/spec limits. If the F level exceeds these limits, it could cause F-induced corrosion and Al-F defects, resulting in pad discoloration and NSOP problems. In our previous studies [1-5], the theories, characteristics, chemical and physical failure mechanisms, and the root causes of F-induced corrosion and Al-F defects on Al bondpads have been studied. In this paper, we further study F-induced corrosion and propose establishing an Auger monitoring system to monitor the F contamination level on Al bondpads in wafer fabrication. Auger monitoring frequency, sample preparation, wafer life, Auger analysis points, control/spec limits, and OOC/OOS quality control procedures are also discussed.
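The OOC/OOS dispositioning mentioned above follows a standard two-tier limit scheme: a measurement above the control limit is out of control (OOC, investigate the process), and one above the spec limit is out of spec (OOS, disposition the material). A minimal sketch, with hypothetical limit values:

```python
def classify_f_level(f_atomic_pct: float,
                     control_limit: float,
                     spec_limit: float) -> str:
    """Classify an Auger-measured fluorine level on an Al bondpad.

    Returns 'in-control', 'OOC' (above the control limit: trigger a
    process investigation), or 'OOS' (above the spec limit: hold the
    material). Limit values are illustrative, not from the paper.
    """
    if f_atomic_pct > spec_limit:
        return "OOS"
    if f_atomic_pct > control_limit:
        return "OOC"
    return "in-control"
```

In practice the control limit sits below the spec limit, so drifting F contamination raises an OOC alarm before any wafer actually violates the specification.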


Metabolites ◽  
2020 ◽  
Vol 11 (1) ◽  
pp. 8
Author(s):  
Michiel Bongaerts ◽  
Ramon Bonte ◽  
Serwet Demirdas ◽  
Edwin H. Jacobs ◽  
Esmee Oussoren ◽  
...  

Untargeted metabolomics is an emerging technology in the laboratory diagnosis of inborn errors of metabolism (IEM). Analysis of a large number of reference samples is crucial for correcting variations in metabolite concentrations that result from factors such as diet, age, and gender in order to judge whether metabolite levels are abnormal. However, a large number of reference samples requires the use of out-of-batch samples, which is hampered by the semi-quantitative nature of untargeted metabolomics data, i.e., technical variations between batches. Methods to merge and accurately normalize data from multiple batches are urgently needed. Based on six metrics, we compared the existing normalization methods on their ability to reduce the batch effects from nine independently processed batches. Many of those showed marginal performance, which motivated us to develop Metchalizer, a normalization method that uses 10 stable isotope-labeled internal standards and a mixed effect model. In addition, we propose a regression model with age and sex as covariates fitted on reference samples that were obtained from all nine batches. Metchalizer applied on log-transformed data showed the most promising performance on batch effect removal, as well as in the detection of 195 known biomarkers across 49 IEM patient samples, and performed at least similarly to an approach utilizing 15 within-batch reference samples. Furthermore, our regression model indicates that 6.5–37% of the considered features showed significant age-dependent variations. Our comprehensive comparison of normalization methods showed that our Log-Metchalizer approach enables the use of out-of-batch reference samples to establish clinically relevant reference values for metabolite concentrations. These findings open the possibility of using large-scale out-of-batch reference samples in a clinical setting, increasing throughput and detection accuracy.
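The core idea — using internal standards measured in every batch to remove between-batch shifts from log-transformed intensities — can be sketched in a deliberately simplified form. This is not the Metchalizer mixed effect model; it is a per-batch offset correction based on the internal-standard channels, with hypothetical column names:

```python
import numpy as np
import pandas as pd

def is_normalize(data: pd.DataFrame, batch: pd.Series,
                 is_cols: list) -> pd.DataFrame:
    """Batch-correct log2 intensities using internal standards (IS).

    data: samples x features raw intensity matrix (IS columns included).
    batch: batch label per sample, aligned with data's index.
    Subtracts, per batch, the mean IS log-intensity (centered on the
    global mean) — a crude stand-in for a mixed effect model over
    10 labeled internal standards.
    """
    log_data = np.log2(data)
    per_sample_is = log_data[is_cols].mean(axis=1)            # IS level of each sample
    batch_shift = per_sample_is.groupby(batch).transform("mean")  # per-batch IS offset
    return log_data.sub(batch_shift - batch_shift.mean(), axis=0)
```

If one batch measures systematically higher than another, its internal standards carry the same offset, so subtracting the batch-wise IS mean brings out-of-batch samples onto a common scale — the property that makes out-of-batch reference samples usable.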


1974 ◽  
Vol 20 (4) ◽  
pp. 502-504 ◽  
Author(s):  
Daniel M Baer

Abstract Several technical difficulties diminish the usefulness of serum triglyceride estimation by the method of Stone and Thorp [Clin. Chim. Acta 14, 812 (1966)]. An artificial and somewhat unstable material is used in the standardization. Falsely elevated readings caused by scratched cuvettes are a frequent problem. Conventional quality-control procedures cannot be used because stable preparations are not available. Specimen stability is a greater problem than with conventional chemical methods. In spite of these difficulties, the method can be useful, if its limitations are recognized, in measurements made on nonfasting individuals.


2012 ◽  
Vol 500 ◽  
pp. 715-720
Author(s):  
Jian Guang Li ◽  
Jian Ding ◽  
Huai Jing Jing ◽  
Ying Xue Yao

Accuracy of the Stewart parallel manipulator is of utmost importance in assembly quality control procedures, yet it is difficult to quantitatively demonstrate the inner relations between pose errors and actuator errors. A novel methodology is proposed in this paper. First, an experiment area planning approach is proposed according to the pose and strut symmetries of the manipulator, and a uniform experiment design is conducted to investigate accuracy sensitivity indexes. Regression equations are then established to analyze the significance of various factors according to the experiment results.
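The regression step can be illustrated with an ordinary least-squares fit of a pose-error index against the six strut (actuator) errors. This is a generic sketch of fitting such regression equations, not the paper's specific experiment design or model form:

```python
import numpy as np

def pose_error_sensitivity(actuator_errors: np.ndarray,
                           pose_errors: np.ndarray):
    """Least-squares regression of a pose-error index on the six strut errors.

    actuator_errors: (n_experiments, 6) matrix of strut error settings.
    pose_errors: (n_experiments,) measured/simulated pose-error index.
    Returns (intercept, sensitivities), where the coefficients act as
    first-order sensitivities of pose error to each actuator error.
    """
    X = np.column_stack([np.ones(len(actuator_errors)), actuator_errors])
    coef, *_ = np.linalg.lstsq(X, pose_errors, rcond=None)
    return coef[0], coef[1:]
```

With a uniform experiment design supplying well-spread settings of the six strut errors, the fitted coefficients can then be tested for significance to identify which actuators dominate the pose error.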


2021 ◽  
pp. 147807712110121
Author(s):  
Adam Tamas Kovacs ◽  
Andras Micsik

This article discusses a BIM Quality Control Ecosystem that is based on Requirement Linked Data in order to create a framework where automated BIM compliance checking methods can be widely used. The meaning of requirements is analyzed in a building project context as a basis for data flow analysis: what the main types of requirements are, how they are handled, and what sources they originate from. A literature review has been conducted to find the current development directions in quality checking, alongside market research on existing, widely used solutions. Combining the conclusions of this research with modern data management theory, the principles of a holistic approach have been defined for quality checking in the Architecture, Engineering and Construction (AEC) industry. A comparative analysis has been made of current BIM compliance checking solutions according to our review principles. Based on current practice and ongoing research, a state-of-the-art BIM quality control ecosystem is proposed that is open, enables automation, promotes interoperability, and leaves the data governing responsibility at the sources of the requirements. In order to facilitate the flow of requirement and quality data, we propose a model for requirements as Linked Data and provide an example of quality checking using Shapes Constraint Language (SHACL). As a result, an opportunity is given for better-quality and cheaper BIM design methods to be implemented in the industry.
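A requirement expressed as Linked Data can be checked with a SHACL node shape of the following kind. The requirement itself (every door must declare a fire-resistance rating of at least 30 minutes), the `ex:`/`bim:` namespaces, and the property names are hypothetical, chosen only to show the shape structure; they are not taken from the article:

```turtle
@prefix sh:  <http://www.w3.org/ns/shacl#> .
@prefix ex:  <http://example.org/requirements#> .
@prefix bim: <http://example.org/bim#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .

# Hypothetical requirement: every door element must declare an integer
# fire-resistance rating of at least 30 minutes.
ex:DoorFireRatingShape
    a sh:NodeShape ;
    sh:targetClass bim:Door ;
    sh:property [
        sh:path bim:fireRating ;
        sh:datatype xsd:integer ;
        sh:minInclusive 30 ;
        sh:minCount 1 ;
        sh:message "Door must declare a fire rating of at least 30 minutes." ;
    ] .
```

Keeping such shapes at the requirement source, as the proposed ecosystem suggests, lets a SHACL validator report non-conforming model elements automatically while the requirement owner retains data governance.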

