Advancing Clinical Cohort Selection with Genomics Analysis on a Distributed Platform

2019 ◽  
Author(s):  
Jaclyn Marjorie Smith ◽  
Melvin Lathara ◽  
Hollis Wright ◽  
Brian Hill ◽  
Nalini Ganapati ◽  
...  

Background: The affordability of next-generation genomic sequencing and the improvement of medical data management have contributed largely to the evolution of biological analysis from both a clinical and research perspective. Precision medicine is a response to these advancements that places individuals into better-defined subsets based on shared clinical and genetic features. The identification of personalized diagnosis and treatment options is dependent on the ability to draw insights from large-scale, multi-modal analysis of biomedical datasets. Driven by a real use case, we premise that platforms that support precision medicine analysis should maintain data in their optimal data stores, should support distributed storage and query mechanisms, and should scale as more samples are added to the system.
Results: We extended a genomics-based columnar data store, GenomicsDB, for ease of use within a distributed analytics platform for clinical and genomic data integration, known as the ODA framework. The framework supports interaction from an i2b2 plugin as well as a notebook environment. We show that the ODA framework exhibits worst-case linear scaling for array size (storage), import time (data construction), and query time for an increasing number of samples. We go on to show worst-case linear time for both import of clinical data and aggregate query execution time within a distributed environment.
Conclusions: This work highlights the integration of a distributed genomic database with a distributed compute environment to support scalable and efficient precision medicine queries from a HIPAA-compliant cohort system in a real-world setting. The ODA framework is currently deployed in production to support precision medicine exploration and analysis by clinicians and researchers at the UCLA David Geffen School of Medicine.
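
The cohort queries themselves run inside GenomicsDB and the ODA services; purely as an illustration of the kind of clinical-plus-genomic cohort selection described above (and not the ODA or GenomicsDB API), a minimal pandas sketch with hypothetical file and column names might look like this:

```python
import pandas as pd

# Hypothetical inputs: a clinical table and a per-sample variant table.
# File names and columns are illustrative only, not the ODA schema.
clinical = pd.read_csv("clinical.csv")        # columns: sample_id, diagnosis, age
variants = pd.read_csv("variant_calls.csv")   # columns: sample_id, gene, impact

# Clinical filter: adult patients with a given diagnosis code.
clinical_cohort = clinical[(clinical["diagnosis"] == "C71.9") & (clinical["age"] >= 18)]

# Genomic filter: samples carrying a high-impact variant in a gene of interest.
genomic_cohort = variants[(variants["gene"] == "EGFR") & (variants["impact"] == "HIGH")]

# Precision-medicine cohort: intersection of the two sample sets.
cohort_ids = set(clinical_cohort["sample_id"]) & set(genomic_cohort["sample_id"])
print(f"{len(cohort_ids)} samples satisfy both clinical and genomic criteria")
```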


Quantum ◽  
2021 ◽  
Vol 5 ◽  
pp. 595
Author(s):  
Nicolas Delfosse ◽  
Naomi H. Nickerson

In order to build a large-scale quantum computer, one must be able to correct errors extremely fast. We design a fast decoding algorithm for topological codes that corrects Pauli errors, erasures, and combinations of the two. Our algorithm has a worst-case complexity of O(nα(n)), where n is the number of physical qubits and α is the inverse of Ackermann's function, which grows very slowly; for all practical purposes, α(n)≤3. We prove that our algorithm performs optimally for errors of weight up to (d−1)/2 and for loss of up to d−1 qubits, where d is the minimum distance of the code. Numerically, we obtain a threshold of 9.9% for the 2D toric code with perfect syndrome measurements and 2.6% with faulty measurements.
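
The O(nα(n)) bound is the hallmark of the union-find (disjoint-set) data structure with union by rank and path compression, which underlies the decoder's cluster-merging step. A minimal sketch of that data structure alone (not the decoder itself) is shown below:

```python
class DisjointSet:
    """Union-find with union by rank and path compression.

    A sequence of n operations runs in O(n * alpha(n)) time, where alpha is
    the inverse Ackermann function -- the same bound quoted for the decoder.
    This sketch shows only the underlying data structure, not the
    cluster-growth logic of the decoder.
    """

    def __init__(self, n: int):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x: int) -> int:
        root = x
        while self.parent[root] != root:
            root = self.parent[root]
        # Path compression: point every visited node directly at the root.
        while self.parent[x] != root:
            self.parent[x], x = root, self.parent[x]
        return root

    def union(self, a: int, b: int) -> None:
        # Union by rank: attach the shallower tree under the deeper one.
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return
        if self.rank[ra] < self.rank[rb]:
            ra, rb = rb, ra
        self.parent[rb] = ra
        if self.rank[ra] == self.rank[rb]:
            self.rank[ra] += 1


# Example: merge clusters of defects identified by their indices.
ds = DisjointSet(8)
ds.union(0, 1)
ds.union(1, 2)
print(ds.find(2) == ds.find(0))  # True: 0, 1, 2 now form one cluster
```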


2021 ◽  
Vol 14 (1) ◽  
pp. 51
Author(s):  
Brinda Balasubramanian ◽  
Simran Venkatraman ◽  
Kyaw Zwar Myint ◽  
Tavan Janvilisri ◽  
Kanokpan Wongprasert ◽  
...  

Cholangiocarcinoma (CCA), a group of malignancies that originate from the biliary tract, is associated with a high mortality rate and a concerning increase in worldwide incidence. In Thailand, where the incidence of CCA is the highest, the socioeconomic burden is severe. Yet treatment options are limited, with surgical resection being the only form of treatment with curative intent. The current standard of care remains adjuvant and palliative chemotherapy, which is ineffective in most patients. The overall survival rate is dismal even after surgical resection, and tumor heterogeneity further complicates treatment. Together, this makes CCA a significant burden in Southeast Asia. For effective management of CCA, treatment must be tailored to each patient individually, which requires an assortment of targeted therapies to be available. Despite the increasing number of clinical studies in CCA, targeted therapy drugs rarely gain approval for clinical use. In this review, we discuss the shortcomings of the conventional clinical trial process and propose the implementation of a novel concept, co-clinical trials, to expedite drug development for CCA patients. In co-clinical trials, preclinical studies and clinical trials are conducted simultaneously, enabling real-time data integration to accurately stratify patients and customize treatment for each individual. Co-clinical trials are therefore expected to improve the outcomes of clinical trials and, consequently, encourage the approval of targeted therapy drugs. The increased availability of targeted therapy drugs is expected to facilitate the application of precision medicine in CCA.


Sensors ◽  
2021 ◽  
Vol 21 (11) ◽  
pp. 3774
Author(s):  
Pavlos Topalidis ◽  
Cristina Florea ◽  
Esther-Sevil Eigl ◽  
Anton Kurapov ◽  
Carlos Alberto Beltran Leon ◽  
...  

The purpose of the present study was to evaluate the performance of a low-cost commercial smartwatch, the Xiaomi Mi Band (MB), in extracting physical activity and sleep-related measures, and to show its potential use in addressing questions that require large-scale, real-time data and/or intercultural data, including from low-income countries. We evaluated physical activity and sleep-related measures and discussed the potential application of such devices for large-scale step and sleep data acquisition. To that end, we conducted two separate studies. In Study 1, we evaluated the performance of the MB by comparing it to the GT3X (ActiGraph, wGT3X-BT), a scientific actigraph used in research, as well as to subjective sleep reports. In Study 2, we distributed the MB across four countries (Austria, Germany, Cuba, and Ukraine) and investigated physical activity and sleep in these countries. The results of Study 1 indicated that MB step counts correlated highly with the scientific GT3X device but did display biases. In addition, the MB-derived wake-up and total sleep times showed high agreement with subjective reports but partly deviated from GT3X predictions. Study 2 revealed similar MB step counts across countries, but significantly later wake-up times and bedtimes in Ukraine than in the other countries. We hope that our studies will stimulate future large-scale, sensor-based physical activity and sleep research across various cultures.
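
The device-agreement analysis described above can be outlined in a few lines of code. The sketch below computes a Pearson correlation and a mean bias between daily step counts from the two devices, using synthetic placeholder numbers rather than the study's data:

```python
import numpy as np

# Synthetic placeholder data -- NOT the study's measurements.
# Daily step counts from the smartwatch (MB) and the reference actigraph (GT3X).
mb_steps   = np.array([8200, 10450, 6300, 12100, 9050, 7600, 11200])
gt3x_steps = np.array([8000, 10900, 6100, 12600, 9400, 7300, 11800])

# Pearson correlation: how closely the two devices track each other.
r = np.corrcoef(mb_steps, gt3x_steps)[0, 1]

# Mean bias: systematic over- or under-counting of one device versus the other.
bias = np.mean(mb_steps - gt3x_steps)

print(f"Pearson r = {r:.3f}, mean bias = {bias:.0f} steps/day")
```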


2019 ◽  
Vol 214 ◽  
pp. 04033
Author(s):  
Hervé Rousseau ◽  
Belinda Chan Kwok Cheong ◽  
Cristian Contescu ◽  
Xavier Espinal Curull ◽  
Jan Iven ◽  
...  

The CERN IT Storage group operates multiple distributed storage systems and is responsible for supporting the infrastructure that accommodates all CERN storage requirements, from the physics data generated by LHC and non-LHC experiments to personnel users' files. EOS is now the key component of the CERN storage strategy. It can sustain high incoming throughput during experiment data-taking while running concurrent, complex production workloads. This high-performance distributed storage now provides more than 250 PB of raw disk and is the key component behind the success of CERNBox, the CERN cloud synchronisation service, which allows syncing and sharing files on all major mobile and desktop platforms to provide offline availability to any data stored in the EOS infrastructure. CERNBox has recorded exponential growth over the last couple of years in terms of files and data stored, thanks to its increasing popularity within the CERN user community and to its integration with a multitude of other CERN services (Batch, SWAN, Microsoft Office). In parallel, CASTOR is being simplified and is transitioning from an HSM into an archival system, focusing mainly on the long-term recording of primary data from the detectors and preparing the road to the next-generation tape archival system, CTA. The storage services at CERN also cover the needs of the rest of our community: Ceph as a data back-end for the CERN OpenStack infrastructure, NFS services and S3 functionality; AFS for legacy home-directory filesystem services and its ongoing phase-out; and CVMFS for software distribution. In this paper we summarise our experience in supporting all our distributed storage systems and the ongoing work in evolving our infrastructure, testing very dense storage building blocks (nodes with more than 1 PB of raw space) for the challenges ahead.


2002 ◽  
Vol 16 (1) ◽  
pp. 6-8 ◽  
Author(s):  
Sebastian Ciancio

Powered toothbrushes were first introduced on a large scale in the early 1960s. However, because of a clear lack of superiority compared with manual brushes and problems with mechanical breakdowns, their sales decreased significantly. Nevertheless, recommendation for their use continued in special populations with dexterity and cognition problems. The 1990s ushered in an era of new technology, and studies began to suggest superiority of some powered brushes, particularly those using oscillating-rotating or counter-rotational actions. Some studies have shown interproximal cleansing abilities superior to those of manual brushes, yielding results similar to those achieved with the combined use of a manual brush and floss. Both controlled and open-label studies have suggested that electric brushes improve gingival health in patients who routinely used manual brushes prior to using these new powered brushes, and safety has been clearly established. In recommending powered toothbrushes, practitioners should familiarize themselves with the products available, the clinical studies supporting their benefits compared with manual brushes, their safety and ease of use, and the patient's economic status.


Author(s):  
Sepehr Fathizadan ◽  
Feng Ju ◽  
Kyle Rowe ◽  
Alex Fiechter ◽  
Nils Hofmann

Production efficiency and product quality need to be addressed simultaneously to ensure the reliability of large-scale additive manufacturing. Specifically, print surface temperature plays a critical role in determining the quality characteristics of the product. Moreover, heat transfer via conduction, arising from the spatial correlation between locations on the surface of large and complex geometries, necessitates more robust methodologies to extract and monitor the data. In this paper, we propose a framework for real-time data extraction from thermal images as well as a novel method for controlling layer time during the printing process. A FLIR™ thermal camera captures and stores a stream of print-surface temperature images while the Thermwood Large Scale Additive Manufacturing (LSAM™) machine is printing components. A set of digital image processing tasks is performed to extract the thermal data. Separate regression models based on real-time thermal imaging data are built for each location on the surface to predict the associated temperatures. Subsequently, a control method is proposed to find the best time for printing the next layer given the predictions. Finally, several scenarios based on the cooling dynamics of the surface structure were defined and analyzed, and the results were compared to the current fixed-layer-time policy. It was concluded that the proposed method can significantly increase efficiency by reducing the overall printing time while preserving quality.
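
The paper's regression and control models are not reproduced here; as a rough sketch under simplifying assumptions (exponential cooling at each surface location toward ambient temperature, and a hypothetical target temperature for starting the next layer), per-location regression and a layer-time decision could look like this:

```python
import numpy as np

# Assumptions (illustrative only): each surface location cools exponentially
# toward ambient temperature, T(t) = T_amb + (T0 - T_amb) * exp(-k * t).
T_AMB = 25.0       # ambient temperature, deg C (assumed)
T_TARGET = 120.0   # hypothetical target temperature before printing the next layer

def fit_cooling_rate(times: np.ndarray, temps: np.ndarray) -> tuple[float, float]:
    """Fit T0 and k for one location by linear regression on log(T - T_amb)."""
    y = np.log(temps - T_AMB)
    slope, intercept = np.polyfit(times, y, 1)
    return np.exp(intercept) + T_AMB, -slope   # (T0, k)

def time_to_reach(t0: float, k: float, target: float = T_TARGET) -> float:
    """Time in seconds until this location cools to the target temperature."""
    return np.log((t0 - T_AMB) / (target - T_AMB)) / k

# Synthetic per-location temperature histories extracted from thermal frames.
times = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
locations = {
    "loc_a": np.array([210.0, 195.0, 182.0, 170.0, 160.0]),
    "loc_b": np.array([230.0, 210.0, 193.0, 178.0, 165.0]),
}

# Layer-time decision: wait until the slowest-cooling location reaches the target.
waits = [time_to_reach(*fit_cooling_rate(times, temps)) for temps in locations.values()]
print(f"print the next layer in about {max(waits):.0f} s")
```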


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Siti Nurdiyana Atikah Sulaiman ◽  
Mohammad Nabil Almunawar

Purpose: The purpose of this paper is to investigate factors that influence customers' adoption of biometric-based point-of-sale terminals in Brunei.
Design/methodology/approach: This paper extends the technology acceptance model constructs with trust and several other variables as the framework to investigate their influence on attitude toward the usage of a biometric point-of-sale terminal for payments in Brunei. Nine variables may influence users' perception toward usage: need, perceived ease of use, perceived usefulness, experience, innovativeness, privacy, security, trust, and attitude toward usage. Multiple regression analysis was conducted to test hypotheses related to these nine variables.
Findings: It is found that the innovativeness of an individual and similar experience correspond to trust, which is positively related to attitude toward usage. Perceived usefulness and trust significantly influence the intention of individuals to use biometrics as an authentication method for payment.
Research limitations/implications: The nature of this research is to gather as much of the public's opinion and perception as possible in order to get a bigger and clearer picture of the study. As the target respondents are citizens and residents of Brunei without any specification or exclusion, a large number of responses would be needed for a more reliable and accurate result; however, only 205 respondents could be gathered in this study. Had there been a longer time frame, it would have been preferable to gather many more responses.
Originality/value: This paper explores the adoption of biometric authentication in large-scale point-of-sale terminals and identifies factors that influence adoption. The results of this study could assist future researchers in deciding which direction to take to further explore biometrics as an authentication method for payment. In addition, it could provide banks and financial technology providers in Brunei a clearer picture of the Brunei market and of Bruneians' perspectives on biometric systems.
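
The hypothesis tests described above rest on ordinary multiple regression; a generic ordinary-least-squares sketch using statsmodels (not necessarily the tool used in the paper) follows, with a synthetic data frame standing in for the 205 survey responses. The construct names mirror the paper; the values do not.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic survey-style data standing in for the 205 responses; the column
# names mirror the paper's constructs, but the values are random placeholders.
rng = np.random.default_rng(0)
n = 205
df = pd.DataFrame({
    "perceived_usefulness": rng.normal(4.0, 1.0, n),
    "perceived_ease_of_use": rng.normal(4.0, 1.0, n),
    "trust": rng.normal(3.5, 1.0, n),
    "innovativeness": rng.normal(3.5, 1.0, n),
})
# Outcome: attitude toward usage (a synthetic linear combination plus noise).
df["attitude"] = (0.5 * df["perceived_usefulness"] + 0.4 * df["trust"]
                  + rng.normal(0.0, 0.5, n))

# Ordinary least squares: which constructs significantly predict attitude?
X = sm.add_constant(df[["perceived_usefulness", "perceived_ease_of_use",
                        "trust", "innovativeness"]])
model = sm.OLS(df["attitude"], X).fit()
print(model.summary())  # coefficients, t-statistics, and p-values per construct
```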

