Automatic 1D 1H NMR Metabolite Quantification for Bioreactor Monitoring

Metabolites ◽  
2021 ◽  
Vol 11 (3) ◽  
pp. 157
Author(s):  
Roy Chih Chung Wang ◽  
David A. Campbell ◽  
James R. Green ◽  
Miroslava Čuperlović-Culf

High-throughput metabolomics can be used to optimize cell growth for enhanced production or for monitoring cell health in bioreactors. It has applications in cell and gene therapies, vaccines, biologics, and bioprocessing. NMR metabolomics is a method that allows for fast and reliable experimentation, requires only minimal sample preparation, and can be set up to take online measurements of cell media for bioreactor monitoring. This type of application requires a fully automated metabolite quantification method that can be linked with high-throughput measurements. In this review, we discuss the quantifier requirements in this type of application, the existing methods for NMR metabolomics quantification, and the performance of three existing quantifiers in the context of NMR metabolomics for bioreactor monitoring.
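To make the quantification task concrete, here is a minimal sketch (not one of the three quantifiers evaluated in the review) of the common approach of fitting a measured 1H NMR spectrum as a non-negative linear combination of pure-metabolite reference spectra; all spectra and concentrations below are simulated placeholders.

```python
# Minimal sketch: estimate metabolite concentrations by fitting a mixture
# spectrum as a non-negative linear combination of reference spectra.
# All arrays are simulated placeholders, not data from the review.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
n_points, n_metabolites = 2048, 5

# Columns = reference spectra of individual metabolites on a shared ppm axis.
references = np.abs(rng.normal(size=(n_points, n_metabolites)))

# Simulate a mixture spectrum with known concentrations plus noise.
true_conc = np.array([1.0, 0.5, 0.0, 2.0, 0.25])
mixture = references @ true_conc + 0.01 * rng.normal(size=n_points)

# Non-negative least squares recovers the (relative) concentrations.
est_conc, _ = nnls(references, mixture)
print("estimated concentrations:", np.round(est_conc, 3))
```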

Plants ◽  
2021 ◽  
Vol 10 (3) ◽  
pp. 466
Author(s):  
Marie-Christine Carpentier ◽  
Cécile Bousquet-Antonelli ◽  
Rémy Merret

The recent development of high-throughput technologies based on RNA sequencing has allowed a better description of the role of post-transcriptional regulation in gene expression. In particular, the development of degradome approaches based on the capture of 5′-monophosphate decay intermediates has led to the discovery of a new decay pathway called co-translational mRNA decay. Thanks to these approaches, ribosome dynamics can now be revealed by analyzing the accumulation of 5′P reads. However, library preparation can be difficult to set up for non-specialists. Here, we present a fast and efficient 5′P degradome library preparation for Arabidopsis samples. Our protocol requires neither a commercial kit nor gel purification and can easily be completed in one working day. We demonstrate the robustness and reproducibility of our protocol. Finally, we present the bioinformatic read-outs necessary to assess library quality.
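As an illustration of the kind of bioinformatic read-out mentioned above, the sketch below tallies 5′P read ends per position of a transcript from an aligned degradome library. The BAM file name and transcript identifier are hypothetical, and the published protocol's own quality-control scripts may differ.

```python
# Minimal sketch: count 5'P read ends per transcript position from an aligned
# degradome library (a simple quality-control read-out). The BAM file and
# transcript ID below are hypothetical placeholders.
from collections import Counter
import pysam

bam = pysam.AlignmentFile("degradome_sample.bam", "rb")
five_prime_counts = Counter()

for read in bam.fetch("AT1G01010.1"):  # hypothetical Arabidopsis transcript
    if read.is_unmapped:
        continue
    # The 5' end of each read marks a 5'-monophosphate decay intermediate.
    pos = read.reference_end - 1 if read.is_reverse else read.reference_start
    five_prime_counts[pos] += 1

# Positions with strong 5'P accumulation reflect ribosome positioning.
for pos, n in sorted(five_prime_counts.items())[:10]:
    print(pos, n)
```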


1959 ◽  
Vol 37 (3) ◽  
pp. 471-478
Author(s):  
B. C. Das

Three synthetic growth-promoting substances were administered to cauliflower plants (Brassica oleracea) to determine their effect on flower weight and flower growth. A Youden square experimental design was employed to set up treatment groups for all possible combinations of three substances at four concentration levels and three frequencies of application, plus an untreated control group. The substances employed were 3-indolylbutyric acid (IBA), 2,3,5-tri-iodobenzoic acid (TIBA), and β-naphthoxyacetic acid (NOXA); the concentrations used were 0.01, 0.2, 4.0, and 80.0 parts per million; and the frequencies adopted were 2, 3, and 6 times in a 12-day period. Final flower weights were analyzed by analysis of variance, which showed that the treatments differed significantly (P < 0.01). Treatments with NOXA and IBA produced flowers weighing significantly more than those of the control group. Flower growth was characterized by a rapid initial rise, which subsequently levelled off in the control group; for treated groups, the initial rise continued throughout the flowering period. The results suggest that the use of certain synthetic growth-promoting substances may be commercially feasible for enhanced production of cauliflower crops.
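For readers unfamiliar with the significance test reported here, the following is a minimal one-way analysis-of-variance sketch on hypothetical flower-weight data; the original study used a Youden square design, which this simplified example does not reproduce.

```python
# Minimal sketch: one-way ANOVA on hypothetical flower weights, illustrating
# the kind of F-test behind a reported P < 0.01. The original study used a
# Youden square design; this simplification ignores blocking.
from scipy import stats

control = [310, 295, 322, 301, 315]   # grams, hypothetical
noxa    = [355, 362, 348, 371, 359]
iba     = [340, 333, 351, 345, 338]

f_stat, p_value = stats.f_oneway(control, noxa, iba)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```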


2021 ◽  
Author(s):  
Yan Chen ◽  
Nurgul Kaplan Lease ◽  
Jennifer Gin ◽  
Tad Ogorzalek ◽  
Paul D. Adams ◽  
...  

Manual proteomic sample preparation methods limit sample throughput and often lead to poor data quality when thousands of samples must be analyzed. Automated workflows are increasingly used to overcome these issues for some (or even all) of the sample preparation steps. Here, we detail three optimised step-by-step protocols to: (A) lyse Gram-negative bacteria and fungal cells; (B) quantify the amount of protein extracted; and (C) normalize the amount of protein and set up tryptic digestion. These protocols have been developed to facilitate rapid, low-variance sample preparation of hundreds of samples, to be easily implemented on widely available Beckman-Coulter Biomek automated liquid handlers, and to allow flexibility for future protocol development. Using this workflow, 50 micrograms of peptides for each of 96 samples can be prepared for tryptic digestion in under an hour. We validate these protocols by analyzing 47 E. coli and R. toruloides samples and show that this modular workflow provides robust, reproducible proteomic samples for high-throughput applications. The expected results from these protocols are 94 peptide samples from Gram-negative bacterial and fungal cells, prepared for bottom-up quantitative proteomic analysis without the need for desalting column cleanup and with peptide coefficients of variation (CVs) below 15%.
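The normalization arithmetic in step (C) can be sketched as follows: given measured protein concentrations, compute sample and diluent volumes that bring every well to the same protein amount before digestion. The target amount, final volume, and concentrations below are hypothetical and are not taken from the published Biomek protocol.

```python
# Minimal sketch: per-well sample and diluent volumes to normalize each sample
# to a fixed protein amount before tryptic digestion. All numbers are
# hypothetical, not values from the published Biomek protocol.
TARGET_UG = 50.0    # micrograms of protein per digestion
FINAL_UL = 100.0    # final volume per well, microliters

concentrations_ug_per_ul = {"sample_01": 1.8, "sample_02": 0.9, "sample_03": 2.4}

for name, conc in concentrations_ug_per_ul.items():
    sample_ul = TARGET_UG / conc
    if sample_ul > FINAL_UL:
        print(f"{name}: too dilute, flag for re-assay")
        continue
    diluent_ul = FINAL_UL - sample_ul
    print(f"{name}: transfer {sample_ul:.1f} uL sample + {diluent_ul:.1f} uL buffer")
```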


Author(s):  
Nicolás M. Morato ◽  
MyPhuong T. Le ◽  
Dylan T. Holden ◽  
R. Graham Cooks

The Purdue Make It system is a unique automated platform capable of small-scale in situ synthesis, screening small-molecule reactions, and performing direct label-free bioassays. The platform is based on desorption electrospray ionization (DESI), an ambient ionization method that allows for minimal sample workup and is capable of accelerating reactions in secondary droplets, thus conferring unique advantages compared with other high-throughput screening technologies. By combining DESI with liquid-handling robotics, the system achieves throughputs of more than 1 sample/s, handling up to 6144 samples in a single run. As little as 100 fmol/spot of analyte is required to perform both initial analysis by mass spectrometry (MS) and further MSn structural characterization. The data obtained are processed using custom software so that results are easily visualized as interactive heatmaps of reaction plates based on the peak intensities of m/z values of interest. In this paper, we review the system’s capabilities as described in previous publications and demonstrate its utilization in two new high-throughput campaigns: (1) the screening of 188 unique combinatorial reaction mixtures (spanning 24 reaction types) to determine reactivity trends and (2) label-free studies of the nicotinamide N-methyltransferase enzyme directly from the bioassay buffer. The system’s versatility holds promise for several future directions, including the collection of secondary droplets containing the products from successful reaction screening measurements, the development of machine learning algorithms using data collected from compound library screening, and the adaptation of a variety of relevant bioassays to high-throughput MS.
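The heatmap visualization described above can be sketched as follows, mapping the peak intensity of a single m/z of interest back to plate coordinates; the 96-well intensity matrix is simulated here, and the Make It system performs this step with its own custom software.

```python
# Minimal sketch: render a plate heatmap from the peak intensity of one m/z
# of interest per well. The 96-well intensities are simulated; the Make It
# system performs this step with its own custom software.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
intensities = rng.random((8, 12))        # 8 rows x 12 columns = 96 wells

fig, ax = plt.subplots(figsize=(6, 4))
im = ax.imshow(intensities, cmap="viridis", aspect="auto")
ax.set_xticks(range(12))
ax.set_xticklabels([str(i + 1) for i in range(12)])
ax.set_yticks(range(8))
ax.set_yticklabels(list("ABCDEFGH"))
ax.set_title("Product ion intensity at m/z of interest")
fig.colorbar(im, ax=ax, label="peak intensity (a.u.)")
plt.show()
```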


Molecules ◽  
2019 ◽  
Vol 24 (24) ◽  
pp. 4451 ◽  
Author(s):  
Patrick Weber ◽  
Cédric Pissis ◽  
Rafael Navaza ◽  
Ariel E. Mechaly ◽  
Frederick Saul ◽  
...  

The availability of whole-genome sequence data, made possible by significant advances in DNA sequencing technology, led to the emergence of structural genomics projects in the late 1990s. These projects not only significantly increased the number of 3D structures deposited in the Protein Data Bank in the last two decades, but also influenced present crystallographic strategies by introducing automation and high-throughput approaches in the structure-determination pipeline. Today, dedicated crystallization facilities, many of which are open to the general user community, routinely set up and track thousands of crystallization screening trials per day. Here, we review the current methods for high-throughput crystallization and procedures to obtain crystals suitable for X-ray diffraction studies, and we describe the crystallization pipeline implemented in the medium-scale crystallography platform at the Institut Pasteur (Paris) as an example.
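As a toy illustration of how screening trials are enumerated at such facilities, the sketch below builds a 96-condition coarse screen as a grid of precipitant concentration versus buffer pH, the kind of worklist a crystallization robot consumes; the conditions are illustrative and are not those of the Institut Pasteur pipeline.

```python
# Minimal sketch: enumerate a 96-condition coarse crystallization screen as a
# grid of precipitant concentration x buffer pH. Conditions are illustrative
# only, not the Institut Pasteur platform's actual screens.
import itertools

peg_percent = [5, 10, 15, 20, 25, 30, 35, 40]                               # 8 rows
ph_values = [4.5, 5.0, 5.5, 6.0, 6.5, 7.0, 7.5, 8.0, 8.5, 9.0, 9.5, 10.0]   # 12 columns
rows = "ABCDEFGH"

for (i, peg), (j, ph) in itertools.product(enumerate(peg_percent), enumerate(ph_values)):
    well = f"{rows[i]}{j + 1}"
    print(f"{well}: {peg}% PEG 3350, 0.1 M buffer pH {ph}")
```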


2020 ◽  
Vol 1148 ◽  
pp. 122134
Author(s):  
Ludwig Gerhard Bauer ◽  
Sina Hoelterhoff ◽  
Tobias Graf ◽  
Christian Bell ◽  
Anja Bathke

2015 ◽  
Vol 4 (3) ◽  
Author(s):  
Claudia Zani ◽  
Francesco Maria Restivo ◽  
Mauro Carcelli ◽  
Donatella Feretti ◽  
Giorgio Pelosi ◽  
...  

Background. In the Po Valley, aflatoxins play a relevant role: the local food economy is heavily based on cereal cultivation for animal feed and human nutrition. The aims of this project are the identification of new compounds that inhibit Aspergillus proliferation, the development of new inhibitors of aflatoxin production, and the set-up of a practical screening procedure to identify the most effective and safe compounds.
Design and Methods. New compounds will be synthesized as metal complexes, with molecules of natural origin as ligands and endogenous metal ions, to increase their bioavailability for the fungi. A biotechnological high-throughput screen will be set up to efficiently identify the most potent substances. The newly synthesized compounds with effective antifungal activity will be evaluated with a battery of tests with different end-points to assess the potential toxic risk for environmental and human health.
Expected impact of the study for public health. The fundamental step in the project will be the synthesis of new compounds and the study of their capability to inhibit aflatoxin biosynthesis. A new, simple, inexpensive, and high-throughput method to screen the antifungal and anti-mycotoxin activity of the newly synthesized compounds will be applied. The evaluation of possible risks to humans due to toxic and genotoxic activities of the molecules will be made with a new approach using different types of cells (bacteria, plants, and human cells).


2006 ◽  
Vol 11 (6) ◽  
pp. 606-616 ◽  
Author(s):  
Oliver Von Ahsen ◽  
Anne Schmidt ◽  
Monika Klotz ◽  
Karsten Parczyk

High-throughput screening (HTS) of large chemical libraries has become the main source of new lead compounds for drug development. Several specialized detection technologies have been developed to facilitate the cost- and time-efficient screening of millions of compounds. However, concerns have been raised that different HTS technologies may produce different hits, thus limiting trust in the reliability of HTS data. This study aimed to investigate the reliability of the authors' most frequently used assay techniques: the scintillation proximity assay (SPA) and homogeneous time-resolved fluorescence resonance energy transfer (TR-FRET). To investigate the data concordance between these 2 detection technologies, the authors screened a large subset of the Schering compound library, consisting of 300,000 compounds, for inhibitors of a nonreceptor tyrosine kinase. They chose to set up this study at realistic HTS scale to ensure statistical significance of the results. The findings clearly demonstrate that the choice of detection technology has no significant impact on hit finding, provided that the assays are biochemically equivalent; data concordance is up to 90%. The small differences in hit finding are caused by threshold setting, not by systematic differences between the technologies. The most significant difference between the compared techniques is that the SPA format yielded more false-positive primary hits.
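The concordance analysis described can be sketched in a few lines: call hits with each technology at a common threshold and compare the calls compound by compound. The data below are simulated, not the Schering screening results.

```python
# Minimal sketch: compare hit calls from two detection technologies on the
# same compound set and compute their concordance. Data are simulated, not
# the SPA / TR-FRET results from the Schering library screen.
import numpy as np

rng = np.random.default_rng(2)
n_compounds = 10_000
true_activity = rng.random(n_compounds)          # latent inhibition per compound

# Each technology measures the same activity with independent noise.
spa_signal = true_activity + 0.05 * rng.normal(size=n_compounds)
fret_signal = true_activity + 0.05 * rng.normal(size=n_compounds)

threshold = 0.9                                  # common hit threshold
spa_hits = spa_signal >= threshold
fret_hits = fret_signal >= threshold

concordance = (spa_hits == fret_hits).mean()
print(f"overall concordance: {concordance:.1%}")
print(f"hits in both: {(spa_hits & fret_hits).sum()}, "
      f"SPA only: {(spa_hits & ~fret_hits).sum()}, "
      f"TR-FRET only: {(~spa_hits & fret_hits).sum()}")
```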

