Miniaturization of analytical systems

1998 ◽  
Vol 44 (9) ◽  
pp. 2008-2014 ◽  
Author(s):  
Larry J Kricka

Abstract Miniaturization has been a long-term trend in clinical diagnostics instrumentation. Now a range of new technologies, including micromachining and molecular self-assembly, is providing the means for further size reduction of analyzers to devices with micro- to nanometer dimensions and submicroliter volumes. Many analytical techniques (e.g., mass spectrometry and electrophoresis) have been successfully implemented on microchips made from silicon, glass, or plastic. The new impetus for miniaturization stems from the perceived benefits of faster, easier, less costly, and more convenient analyses and from the needs of the pharmaceutical industry for microscale, massively parallel drug discovery assays. Perfecting a user-friendly interface between a human and a microchip and determining the realistic lower limit for sample volume are key issues in the future implementation of these devices. Resolution of these issues will be important for the long-term success of microminiature analyzers; in the meantime, the scope, diversity, and rate of progress in the development of these devices promise products in the near future.

2019 ◽  
Vol 5 (Supplement_1) ◽  
Author(s):  
D Schmitz ◽  
S Nooij ◽  
T Janssens ◽  
J Cremer ◽  
H Vennema ◽  
...  

Abstract As research next-generation sequencing (NGS) metagenomic pipelines transition to clinical diagnostics, the user base changes from bioinformaticians to biologists, medical doctors, and lab technicians. Besides the obvious need for benchmarking and assessment of the diagnostic outcomes of the pipelines and tools, other focus points remain: reproducibility, data immutability, user-friendliness, portability/scalability, privacy, and a clear audit trail. We have a research metagenomics pipeline that takes raw fastq files and produces annotated contigs, but it is too complicated for non-bioinformaticians. Here, we present preliminary findings in adapting this pipeline for clinical diagnostics. We used information available on relevant fora (www.bioinfo-core.org) and the experiences and publications of fellow bioinformaticians in other institutes (COMPARE, UBC, and LUMC). From this information, a robust and user-friendly storage and analysis workflow was designed for non-bioinformaticians in a clinical setting. Via Conda [https://conda.io] and Docker containers [http://www.docker.com], we made our disparate pipeline processes self-contained and reproducible. Furthermore, we moved all pipeline settings into a separate JSON file. After every analysis, the pipeline settings and virtual-environment recipes will be archived (immutably) under a persistent unique identifier. This allows long-term, precise reproducibility. Likewise, after every run the raw data and final products will be automatically archived, complying with data retention laws and guidelines. All the disparate processes in the pipeline are parallelized and automated via Snakemake (i.e., end users need no coding skills). In addition, interactive web reports such as MultiQC [http://multiqc.info] and Krona are generated automatically. By combining Snakemake, Conda, and containers, our pipeline is highly portable and easily scaled up for outbreak situations, or scaled down to reduce costs. Since patient privacy is a concern, our pipeline automatically removes human genetic data. Moreover, all source code will be stored on an internal GitLab server, which, combined with the archived data, ensures a clear audit trail. Nevertheless, challenges remain: (1) reproducible reference databases, e.g., being able to revert to an older version to reproduce old analyses; (2) a user-friendly GUI; (3) connecting the pipeline and NGS data to the in-house LIMS; (4) efficient long-term storage, e.g., lossless compression algorithms. Even so, this work represents a step forward in making user-friendly clinical diagnostic workflows.
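To make the archiving step concrete, here is a minimal Python sketch (not the authors' code) of archiving a run's settings JSON and environment recipes under a persistent unique identifier, as described above. The file layout and the archive_run helper are assumptions for illustration.

```python
# Minimal sketch, assuming a settings JSON and Conda environment recipes on
# disk; archives them read-only under a unique run identifier (hypothetical
# layout, not the authors' actual code).
import json
import shutil
import stat
import uuid
from pathlib import Path

def archive_run(settings_file: Path, env_recipes: list[Path],
                archive_root: Path) -> str:
    """Copy the run's settings and environment recipes into a write-once
    archive directory keyed by a persistent unique identifier."""
    run_id = str(uuid.uuid4())
    run_dir = archive_root / run_id
    run_dir.mkdir(parents=True, exist_ok=False)

    for src in [settings_file, *env_recipes]:
        dst = run_dir / src.name
        shutil.copy2(src, dst)        # preserve timestamps for the audit trail
        dst.chmod(stat.S_IREAD)       # make the copy read-only ("immutable")

    # Manifest recording exactly what was archived for this run.
    manifest = {"run_id": run_id,
                "files": sorted(p.name for p in run_dir.iterdir())}
    (run_dir / "manifest.json").write_text(json.dumps(manifest, indent=2))
    return run_id
```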


The Condor ◽  
2000 ◽  
Vol 102 (3) ◽  
pp. 492-502 ◽  
Author(s):  
Walter D. Koenig ◽  
Philip N. Hooge ◽  
Mark T. Stanback ◽  
Joseph Haydock

Abstract Dispersal data are inevitably biased toward short-distance events, often highly so. We illustrate this problem using our long-term study of Acorn Woodpeckers (Melanerpes formicivorus) in central coastal California. Estimating the proportion of birds disappearing from the study area and correcting for detectability within the maximum observable distance are the first steps toward achieving a realistic estimate of dispersal distributions. Unfortunately, there is generally no objective way to determine the fates of birds not accounted for by these procedures, much less to estimate the distances they may have moved. Estimated mean and root-mean-square dispersal distances range from 0.22–2.90 km for males and 0.53–9.57 km for females, depending on what assumptions and corrections are made. Three field methods used to help correct for bias beyond the limits of normal study areas include surveying alternative study sites, expanding the study site (super study sites), and radio-tracking dispersers within a population. All of these methods have their limitations or can only be used in special cases. New technologies may help alleviate this problem in the near future. Until then, we urge caution in interpreting observed dispersal data from all but the most isolated of avian populations.
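For context, the mean and root-mean-square distances quoted above follow from simple arithmetic over a set of dispersal distances. Below is a minimal Python sketch, assuming a plain list of observed distances in km and none of the detectability or truncation corrections the authors apply; the values are illustrative, not from the study.

```python
# Mean and root-mean-square (RMS) dispersal distance from raw observations.
# Illustrative values only; realistic estimates require the corrections
# discussed in the abstract.
import math

def mean_distance(distances: list[float]) -> float:
    return sum(distances) / len(distances)

def rms_distance(distances: list[float]) -> float:
    return math.sqrt(sum(d * d for d in distances) / len(distances))

observed_km = [0.3, 0.5, 1.2, 2.9]   # hypothetical observed distances (km)
print(mean_distance(observed_km), rms_distance(observed_km))
```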


2013 ◽  
Vol 2013 ◽  
pp. 1-7 ◽  
Author(s):  
Matthieu J. Guitton

Given the important patient needs for support and treatment, telemedicine, defined as medical approaches supported by new information technologies, could provide an interesting alternative in tinnitus treatment. By analyzing the published tools and approaches that could be used in the context of telemedicine for tinnitus, whether by health professionals or self-administered by patients, this review summarizes, presents, and describes the principal telemedicine approaches available presently or in the near future to help assess or treat tinnitus or to offer support to tinnitus sufferers. Several pieces of evidence strongly support the feasibility of telemedicine approaches for tinnitus. Telemedicine can be used to help tinnitus sufferers at several points in the therapeutic process: for early screening, initial evaluation, and diagnosis; for optimizing therapeutic tools, particularly behavioural therapies and virtual reality-enhanced behavioural therapies; and for long-term monitoring of patients and provision of online support. Several limitations are, however, discussed in order to optimize the safe development of such approaches. Cost-effective and easy to implement, telemedicine is likely to represent an important part of the future of tinnitus therapies and should be progressively integrated by otolaryngologists.


Author(s):  
Albert E. Beaton ◽  
James R. Chromy

2017 ◽  
Vol 7 (2) ◽  
pp. 207-230 ◽  
Author(s):  
Mustafa Murat Yucesahin ◽  
Ibrahim Sirkeci

The Syrian crisis has resulted in at least 6.1 million externally displaced people, 983,876 of whom are in Europe while the rest are in neighbouring countries in the region. Turkey, due to its geographical proximity and substantial land borders with the country, has been the most popular destination for those fleeing Syria since April 2011. Especially after 2012, a sharp increase in the number of Syrian refugees arriving in Turkey was witnessed. This has triggered exponential growth in academic and public interest in the Syrian population. Numerous reports, mostly based on non-representative sample surveys, have been disseminated, whilst authoritative, robust analyses have remained absent. This study aims to fill this gap by offering a comprehensive demographic analysis of the Syrian population. We focus on demographic differences (from the 1950s to 2015) and demographic trends (from 2015 to 2100) in the medium to long term, based on data from the World Population Prospects (WPP). We offer a comparative picture to underline potential changes and convergences between the populations of Syria, Turkey, Germany, and the United Kingdom. We frame our discussion with reference to demographic transition theory to help understand the implications for movers and non-movers in receiving countries in the near future.


2019 ◽  
Author(s):  
Rumen Manolov

The lack of consensus regarding the most appropriate analytical techniques for single-case experimental design data requires justifying the choice of any specific analytical option. The current text mentions some of the arguments, provided by methodologists and statisticians, in favor of several analytical techniques. Additionally, a small-scale literature review is performed in order to explore if and how applied researchers justify the analytical choices that they make. The review suggests that certain practices are not sufficiently explained. In order to improve the reporting of data-analytical decisions, it is proposed to choose and justify the data-analytical approach prior to gathering the data. As a possible justification for the data analysis plan, we propose using as a basis the expected data pattern (specifically, the expectation about an improving baseline trend and about the immediate or progressive nature of the intervention effect). Although there are multiple alternatives for single-case data analysis, the current text focuses on visual analysis and multilevel models and illustrates an application of these analytical options with real data. User-friendly software is also developed.
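To make the multilevel option concrete, here is a minimal Python sketch (not the author's software) fitting a two-level model to synthetic single-case data: a time term captures baseline trend and a phase indicator captures an immediate intervention effect. The dataset, variable names, and effect sizes are illustrative assumptions.

```python
# Two-level model for single-case data: random intercept per case, fixed
# effects for baseline trend (time) and intervention (phase). Synthetic data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for case in range(1, 4):                  # three hypothetical cases
    for t in range(10):
        phase = int(t >= 5)               # intervention begins at session 5
        score = 2 + 0.1 * t + 3 * phase + rng.normal(scale=0.5)
        rows.append({"case": case, "time": t, "phase": phase, "score": score})
data = pd.DataFrame(rows)

model = smf.mixedlm("score ~ time + phase", data, groups=data["case"])
print(model.fit().summary())
```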


2020 ◽  
Vol 20 ◽  
Author(s):  
L. Hajba ◽  
A. Guttman

Adeno-associated virus (AAV) is one of the most promising viral gene delivery vectors, with long-term gene expression and disease correction featuring high efficiency and excellent safety in human clinical trials. During the production of AAV vectors, there are several quality control (QC) parameters that should be rigorously monitored to comply with clinical safety and efficacy requirements. This review gives a short summary of the most frequently used AAV production and purification methods, focusing on the analytical techniques applied to determine the full/empty capsid ratio and the integrity of the encapsidated therapeutic DNA of the products.
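As a side note on the full/empty capsid ratio mentioned above: once a genome titer and a total capsid titer have been measured (for example by qPCR and capsid ELISA, respectively), the ratio itself is simple arithmetic. A hedged Python sketch follows; the function name and titer values are assumptions, not from the review.

```python
# Full-capsid fraction from a genome titer (vector genomes/mL) and a total
# capsid titer (capsids/mL). Values below are illustrative only.
def full_capsid_fraction(genome_titer: float, capsid_titer: float) -> float:
    """Fraction of capsids that contain a vector genome (full / total)."""
    if capsid_titer <= 0:
        raise ValueError("capsid titer must be positive")
    return genome_titer / capsid_titer

genome_titer = 2.0e12    # vg/mL, e.g. from qPCR or ddPCR (assumed value)
capsid_titer = 5.0e12    # capsids/mL, e.g. from capsid ELISA (assumed value)
frac = full_capsid_fraction(genome_titer, capsid_titer)
print(f"full capsids: {frac:.0%}, empty: {1 - frac:.0%}")
```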


2020 ◽  
Vol 16 ◽  
Author(s):  
Muhammad Bilal Tahir ◽  
Aleena Shoukat ◽  
Tahir Iqbal ◽  
Asma Ayub ◽  
Saff-e Awal ◽  
...  

The field of nanosensors has been gaining considerable attention due to properties such as mechanical and electrical behaviour, ever since its first description by Dr. Wolter and the first mechanical sensor in 1994. The rapidly growing demand for nanosensors has encouraged a multidisciplinary approach to the design and fabrication of materials and strategies for potential applications. Stimulating advances have been proposed and established in recent years, leading towards multiple applications including food safety, healthcare, environmental monitoring, and biomedical research. Nanofabrication, being an efficient method, has been used in industries such as the medical and pharmaceutical sectors for complex functional geometries at small scales. Nanofabrication is applied through several methods; the five most commonly used are top-down lithography, molecular self-assembly, bottom-up assembly, the heat-and-pull method for fabricating biosensors, and etching for fabricating nanosensors. Nanofabrication enables the design of, and work with, small models at the nanoscale, but because such models are small and sensitive, they require more care in use as well as more training and experience to work with. All methods used for nanofabrication are useful, but molecular self-assembly is preferred because it lends itself to mass production. Nanofabrication has become an emerging and developing field, and it is assumed that in the near future our world will be shaped by new nanofabricated devices.

