Contaminant Analysis Automation Demonstration Proposal

Author(s):  
Mike G. Dodson ◽  
Anne Schur ◽  
Janet G. Heubach

The nation-wide and global need for environmental restoration and waste remediation (ER & WR) presents significant challenges to the analytical chemistry laboratory. The expansion of ER & WR programs forces an increase in the volume of samples processed and the demand for analysis data. To handle this expanding volume, productivity must be increased. However, the need for significantly increased productivity runs up against a contaminant analysis process that is costly in time, labor, equipment, and safety protection. Laboratory automation offers a cost-effective approach to meeting current and future contaminant analytical laboratory needs. The proposed demonstration will present a proof-of-concept automated laboratory conducting varied sample preparations. The demonstration also highlights a graphical user interface that provides supervisory control and monitoring of the automated process. The demonstration provides affirming answers to the following questions about laboratory automation:
• Can preparation of contaminant samples be successfully automated?
• Can a full-scale working proof-of-concept automated laboratory be developed that is capable of preparing contaminant and hazardous chemical samples?
• Can the automated processes be seamlessly integrated and controlled?
• Can the automated laboratory be customized through readily convertible design?
• Can automated sample preparation concepts be extended to the other phases of the sample analysis process?
To fully reap the benefits of automation, four human factors areas should be studied and their outputs used to increase the efficiency of laboratory automation: 1) laboratory configuration, 2) procedures, 3) receptacles and fixtures, and 4) the human-computer interface for the fully automated system and complex laboratory information management systems.

Author(s):  
L.R. Kashapova ◽  
D.L. Pankratov ◽  
V.G. Shibakov

A procedure for automated evaluation of process reliability is developed to prevent recurrent defects in parts manufactured by die stamping. The procedure is based on the analysis of factors such as part design; material and its mechanical and physical properties; equipment parameters; and tool performance. The list of reliability factors may vary with the type of operation, since the deformation process differs for each group of operations. Adjusting the reliability performance of the stamping process prevents defects in critical parts as early as the work-preparation stage.


Author(s):  
Josep Sanchís ◽  
Maria José Farré ◽  
Mira Petrovic

N-Nitrosodimethylamine (NDMA) is a nitrogenous disinfection by-product (DBP) that has been included in drinking water regulations worldwide because of its carcinogenicity and hazardousness. Anticipating the NDMA formation potential (FP) of...


Author(s):  
Thomas E. Grissom ◽  
Andrew DuKatz ◽  
Hubert A. Kordylewski ◽  
Richard P. Dutton

Recent healthcare legislation, financial pressures, and regulatory oversight have increased the need for improved mechanisms for performance measurement, quality-management tracking, and outcomes-based research. The Anesthesia Quality Institute (AQI) has established the National Anesthesia Clinical Outcomes Registry (NACOR) to support these requirements for a wide range of customers, including individual anesthesiologists, anesthesia practices, hospitals, and credentialing agencies. Concurrently, the growing availability of digital sources of healthcare data makes it possible to capture massive quantities of data more efficiently and cost-effectively than ever before. With NACOR, AQI has established a user-friendly, automated process to effectively and efficiently collect a wide range of anesthesia-related data directly from anesthesia practices. This review examines the issues guiding the evolution of NACOR as well as some potential pitfalls in its growth and usage.


Author(s):  
Suganya Ramamoorthy ◽  
Rajaram Sivasubramaniam

Medical diagnosis has been gaining importance in everyday life. Diseases and their symptoms vary widely, so doctors must continually update their knowledge. Diseases fall into different categories, and a small variation in symptoms may point to a different category of disease. This is further supplemented by medical analysts during a continuous treatment process. Treatment generally starts with a diagnosis and proceeds through a set of procedures, including X-ray, CT scans, and ultrasound imaging, for qualitative analysis and diagnosis by doctors. Even a small error in disease identification adds overhead to diagnosis and complicates treatment. In such cases, an automated system that can retrieve medical images based on the user's interest is needed. This chapter deals with various techniques and methodologies that address the classification problem in the data analysis process and their methodological impact on big data.




Pharmaceutics ◽  
2019 ◽  
Vol 11 (7) ◽  
pp. 353 ◽  
Author(s):  
Adriana Bezerra-Souza ◽  
Raquel Fernandez-Garcia ◽  
Gabriela F. Rodrigues ◽  
Francisco Bolas-Fernandez ◽  
Marcia Dalastra Laurenti ◽  
...  

Leishmaniasis is a neglected tropical disease affecting more than 12 million people worldwide, which in its visceral clinical form (VL) is characterised by the accumulation of parasites in the liver and spleen and can lead to death if not treated. Available treatments are not well tolerated due to severe adverse effects, the need for parenteral administration and patient hospitalisation, and the long duration of expensive treatments. These treatment realities justify the search for new effective drugs, repurposing existing licensed drugs towards safer, non-invasive and cost-effective medicines for VL. In this work, we provide proof-of-concept studies of butenafine and butenafine self-nanoemulsifying drug delivery systems (B-SNEDDS) against Leishmania infantum. Liquid B-SNEDDS were optimised using design of experiments and then spray-dried onto porous colloidal silica carriers to produce solid B-SNEDDS with enhanced flow properties and drug stability. Optimal liquid B-SNEDDS consisted of butenafine:Capryol 90:Peceol:Labrasol (3:49.5:24.2:23.3 w/w), which were then spray-dried with Aerosil 200 at a final 1:2 (Aerosil:liquid B-SNEDDS w/w) ratio. Spray-dried particles exhibited near-maximal drug loading while maintaining excellent powder flow properties (angle of repose <10°) and sustained release in acidic gastrointestinal media. Solid B-SNEDDS demonstrated a greater selectivity index against promastigotes and L. infantum-infected amastigotes than butenafine alone. The developed oral solid nanomedicines enable the non-invasive and safe administration of butenafine as a cost-effective and readily scalable repurposed medicine for VL.


2020 ◽  
pp. 096228022095817
Author(s):  
Linchen He ◽  
Linqiu Du ◽  
Zoran Antonijevic ◽  
Martin Posch ◽  
Valeriy R Korostyshevskiy ◽  
...  

Previous work has shown that individual randomized “proof-of-concept” (PoC) studies may be designed to maximize cost-effectiveness, subject to an overall PoC budget constraint. Maximizing cost-effectiveness has also been considered for arrays of simultaneously executed PoC studies. Defining Type III error as the opportunity cost of not performing a PoC study, we evaluate the common pharmaceutical practice of allocating PoC study funds in two stages. Stage 1, the first wave of PoC studies, screens drugs to identify those to be permitted additional PoC studies in Stage 2. We investigate whether this strategy significantly improves efficiency, despite slowing development. We quantify the benefit, cost, benefit-cost ratio, and Type III error as functions of the number of Stage 1 PoC studies. Relative to a single-stage PoC strategy, significant cost-effectiveness gains are seen when at least one of the drugs has a low probability of success (10%), and especially when there are either few drugs (2) with a large number of indications allowed per drug (10) or a large portfolio of drugs (4). In these cases, the recommended number of Stage 1 PoC studies ranges from 2 to 4, tracking approximately with an inflection point in the minimization curve of Type III error.
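The intuition behind the staged allocation above can be sketched with a toy Monte Carlo simulation. Everything here is an illustrative assumption, not the paper's actual model: PoC success is a simple Bernoulli draw with a drug-specific probability, "benefit" is the count of successful studies, and the budget is counted in studies.

```python
import random

def simulate(drug_probs, budget, trials=20000, seed=0):
    """Compare benefit-cost ratios of one-stage vs two-stage PoC funding.

    Toy model: each PoC study on a drug succeeds independently with that
    drug's probability; benefit = number of successes, cost = studies run.
    """
    rng = random.Random(seed)
    b1 = c1 = b2 = c2 = 0
    n = len(drug_probs)
    for _ in range(trials):
        # One-stage: spend the whole budget evenly across all drugs.
        per_drug = budget // n
        b1 += sum(sum(rng.random() < p for _ in range(per_drug))
                  for p in drug_probs)
        c1 += per_drug * n
        # Two-stage: one screening PoC per drug, then split the remaining
        # budget among the drugs whose screening study succeeded.
        screens = [rng.random() < p for p in drug_probs]
        benefit, cost = sum(screens), n
        winners = [p for p, ok in zip(drug_probs, screens) if ok]
        if winners:
            per_winner = (budget - n) // len(winners)
            benefit += sum(sum(rng.random() < p for _ in range(per_winner))
                           for p in winners)
            cost += per_winner * len(winners)
        b2 += benefit
        c2 += cost
    return b1 / c1, b2 / c2

# Two hypothetical drugs, one with a low (10%) probability of success.
one_ratio, two_ratio = simulate([0.5, 0.1], budget=12)
```

In this toy setting the two-stage strategy spends fewer studies on the low-probability drug, so its benefit-cost ratio exceeds the single-stage ratio, loosely mirroring the cost-effectiveness gains the abstract reports for portfolios containing a low-probability drug.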


BMC Genomics ◽  
2020 ◽  
Vol 21 (1) ◽  
Author(s):  
Houriiyah Tegally ◽  
James Emmanuel San ◽  
Jennifer Giandhari ◽  
Tulio de Oliveira

In research and clinical genomics laboratories today, sample preparation is the bottleneck of experiments, particularly when it comes to high-throughput next-generation sequencing (NGS). More genomics laboratories are now considering liquid-handling automation to make the sequencing workflow more efficient and cost-effective. The question remains as to its suitability and return on investment. A number of points need to be carefully considered before introducing robots into biological laboratories. Here, we describe the state-of-the-art technology of both sophisticated and do-it-yourself (DIY) robotic liquid-handlers and provide a practical review of the motivation, implications and requirements of laboratory automation for genome sequencing experiments.


Author(s):  
Vlad Florea ◽  
Vishrut Shah ◽  
Stephen Roper ◽  
Garrett Vierhout ◽  
Il Yong Kim

Over the past decade there has been an increasing demand for lightweight components in the automotive and aerospace industries. This has led to significant advancement in topology optimization methods, especially in developing new algorithms that can consider multi-material design. While Multi-Material Topology Optimization (MMTO) can be used to determine the optimum material layout and choice for a given engineering design problem, it fails to consider practical manufacturing constraints. One such constraint is the practical joining of multi-component designs. In this paper, a new method is proposed for simultaneously performing MMTO and Joint Topology Optimization (JTO). The algorithm uses a serial approach, looping through the MMTO and JTO phases to obtain a truly optimal design that considers both aspects. A case study is performed on an automotive ladder-frame chassis component as a proof of concept for the proposed approach. Two loops of the proposed process resulted in a reduction in the number of components and in the number of joints between them, a tangible improvement in the manufacturability of the MMTO design. Ultimately, being able to consider additional manufacturing constraints in the topology optimization process can greatly benefit research and development efforts: a better design is reached with fewer iterations, driving down engineering costs. Topology optimization can help in determining a cost-effective and efficient design that addresses existing structural design challenges.


2019 ◽  
Vol 20 (21) ◽  
pp. 5486
Author(s):  
Anna Nykel ◽  
Marcin Kaszkowiak ◽  
Wojciech Fendler ◽  
Agnieszka Gach

In the prenatal period, copy number aberrations of chromosomes 13, 18, 21, X and Y account for over 80% of the clinically significant chromosome abnormalities. Classical cytogenetic analysis is the gold standard in invasive prenatal diagnostics, but its long turnaround time limits its clinical utility. Several molecular rapid tests have been developed and employed in clinical practice; however, all have substantial drawbacks. The aim of the study was to design and evaluate an optimized tool for rapid molecular detection of fetal aneuploidies. We established a novel single-day method using a chip-based platform, the QuantStudio 3D Digital PCR system. To assess the clinical usefulness of our screening test, we analyzed 133 prenatal samples. The difference in distributions between euploid and aneuploid samples identified the ploidy of each target chromosome with high precision. The distributions of the chromosome ratio for euploid and aneuploid samples differed significantly (p = 0.003 for trisomy 13, p = 0.001 for trisomies 18 and 21, Mann–Whitney U test). Our results suggest that this novel chip-based approach provides a rapid, technically simple, cost-effective screening tool for common fetal aneuploidies.
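The kind of comparison reported above can be illustrated with a stdlib-only sketch of the Mann–Whitney U test using the normal approximation. The ratio values, sample sizes, and no-ties assumption below are all invented for illustration; they are not the study's data, and a real analysis would use a library implementation such as scipy.stats.mannwhitneyu.

```python
import math
import random

def mann_whitney_u(x, y):
    """Mann-Whitney U with a normal-approximation two-sided p-value.

    Assumes continuous data (no tied values), so simple joint ranking
    suffices and no tie correction is applied.
    """
    pooled = sorted(x + y)
    rank = {v: i + 1 for i, v in enumerate(pooled)}
    n1, n2 = len(x), len(y)
    r1 = sum(rank[v] for v in x)          # rank sum of the first group
    u1 = r1 - n1 * (n1 + 1) / 2           # U statistic for the first group
    mu = n1 * n2 / 2                       # mean of U under H0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u1 - mu) / sigma
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return u1, p

rng = random.Random(1)
# Hypothetical chromosome-21 ratios: euploid near 1.0, trisomic elevated.
euploid = [rng.gauss(1.0, 0.03) for _ in range(30)]
trisomic = [rng.gauss(1.5, 0.05) for _ in range(10)]
u, p = mann_whitney_u(euploid, trisomic)
```

With these fully separated hypothetical groups, every euploid ratio ranks below every trisomic one, so U is 0 and the approximate p-value is very small, echoing the significant distributional separation the study reports.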

