Ubiquitous Computing in the Biology Laboratory

Author(s):  
Larry Arnstein ◽  
Stefan Sigurdsson ◽  
Bob Franza

Our objective is to eliminate the digital divide that persists between the physical and information spaces of wet-lab-based enterprises by embedding computational resources into the shared laboratory environment. Our first challenge is to enable individual lab workers to contribute to a fine-grained formal representation of ongoing lab activities, that is, to build the database by doing the work, without having to stop and write things down in a notebook or enter information into a computer. Eliminating the redundancy of doing the work and then recording it improves accuracy and completeness. And by capturing information at a finer level of detail than is practical for manual-entry systems, unanticipated applications can be supported.
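A fine-grained formal representation of this kind could, for instance, take the form of timestamped activity records emitted automatically by instrumented bench equipment. The sketch below is purely illustrative and is not taken from the system described in the abstract; the event fields and the `record_event` helper are hypothetical.

```python
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
import json

@dataclass
class LabEvent:
    """One fine-grained record of an action performed at the bench."""
    actor: str                 # who performed the step
    instrument: str            # which device or location observed it
    action: str                # e.g. "pipette", "centrifuge", "incubate"
    parameters: dict = field(default_factory=dict)
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

def record_event(event: LabEvent, log_path: str = "lab_events.jsonl") -> None:
    """Append the event to a JSON-lines log that downstream tools can query."""
    with open(log_path, "a") as fh:
        fh.write(json.dumps(asdict(event)) + "\n")

# Example: a pipetting step captured as it happens, with no manual note-taking.
record_event(LabEvent(actor="tech_01", instrument="bench_3_pipette",
                      action="pipette",
                      parameters={"volume_uL": 250, "reagent": "buffer_A"}))
```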

Metabolites ◽  
2018 ◽  
Vol 8 (4) ◽  
pp. 77
Author(s):  
Shreya Shaw ◽  
Robin Ghosh

The Kulka resorcinol assay (Kulka, R.G., Biochem. J. 1956, 63, 542–548) for ketoses has been widely used in the literature but suffers from two major disadvantages: (a) it employs large amounts of reagents that are potentially harmful in a general biology laboratory environment; and (b) in its original formulation, it is unsuited to modern high-throughput applications. Here, we have developed a modified Kulka assay with a safer formulation, employing approx. 5.4 M HCl in 250 µL aliquots, that is suitable for use in high-throughput systems biology or enzymatic applications. The modified assay has been tested extensively for the measurement of two ketoses: fructose (a common substrate in cell growth experiments) and 1-deoxy-d-xylulose-5-phosphate (DXP), the product of the DXP-synthase reaction, which until now has been assayable only by time-consuming chromatographic methods or radioactivity. The Kulka microassay has a working range of 0–250 nmol fructose or 0–500 nmol DXP. The assay is suitable for monitoring the consumption of fructose in bacterial growth experiments but is too insensitive to be used directly for the measurement of DXP in in vitro enzyme assays. However, we show that after concentration of the DXP-enzyme mix by butanol extraction, the Kulka resorcinol method can be used for enzyme assays.
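In practice, a colorimetric microassay of this kind is quantified against a standard curve within its stated range (here 0–250 nmol fructose). The snippet below is a generic illustration of that quantification step, not the authors' protocol; the absorbance values are invented placeholders.

```python
import numpy as np

# Hypothetical standard curve: known fructose amounts (nmol) vs. measured absorbance.
standards_nmol = np.array([0, 50, 100, 150, 200, 250])
standards_abs  = np.array([0.02, 0.11, 0.21, 0.30, 0.41, 0.50])

# Fit a straight line (absorbance = slope * nmol + intercept) over the linear range.
slope, intercept = np.polyfit(standards_nmol, standards_abs, 1)

def absorbance_to_nmol(absorbance: float) -> float:
    """Convert a sample absorbance reading back to nmol of ketose."""
    return (absorbance - intercept) / slope

print(absorbance_to_nmol(0.27))  # sample reading -> about 131 nmol fructose
```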


2013 ◽  
Vol 43 (3) ◽  
pp. 77-86
Author(s):  
M. Mironova ◽  
V. Naidenov

Abstract The main strength-deformation properties of fine-grained fibre-reinforced concretes with different types and quantities of fibres, used as repair overlays, are discussed. The mechanical properties of the experimental compositions are obtained and generalized for two test ages and a standard laboratory environment. The experimental results are processed mathematically using a MATLAB procedure. The basic characteristics, namely residual strength, toughness indexes, and residual strength factors, are obtained as functions of both the type and the quantity of the hybrid fibre reinforcement.
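Toughness indexes of this kind are typically derived from load-deflection data. As a rough illustration only (the original study used MATLAB, and the curve values below are invented), the Python sketch computes an ASTM C1018-style index I5 as the ratio of the energy absorbed up to three times the first-crack deflection to the energy absorbed up to first crack.

```python
import numpy as np

def toughness_index_I5(deflection, load, first_crack_deflection):
    """ASTM C1018-style I5: area under the load-deflection curve up to
    3 * first-crack deflection, divided by the area up to first crack."""
    deflection = np.asarray(deflection)
    load = np.asarray(load)

    def area_up_to(limit):
        mask = deflection <= limit
        d, p = deflection[mask], load[mask]
        return np.sum(0.5 * (p[1:] + p[:-1]) * np.diff(d))  # trapezoidal area

    return area_up_to(3 * first_crack_deflection) / area_up_to(first_crack_deflection)

# Invented load-deflection data for a fibre-reinforced beam test.
d = np.linspace(0.0, 1.0, 101)                       # deflection, mm
p = np.where(d < 0.2, 50 * d, 10 - 5 * (d - 0.2))    # load, kN: linear rise, then softening
print(toughness_index_I5(d, p, first_crack_deflection=0.2))  # -> 4.6
```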


2018 ◽  
Author(s):  
Tiago Lubiana-Alves ◽  
André A.N.A. Gonçalves ◽  
Helder I Nakaya

ABSTRACT Voice User Interfaces such as Amazon Alexa and Google Home are already widely available and used for personal purposes. These services could be used to improve the experimental biology laboratory routine, facilitate troubleshooting, and increase efficiency. To date, no applications tailored to enhance the laboratory routine have been made available. Here, we present a set of free-to-use, open-source tools adapted to Alexa for application in the laboratory environment, with prospects of enhancing productivity and reducing work-related stress. All skills, the 3D printer model, and the source code are freely available in the Alexa app store and on GitHub.
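As a sketch of what a laboratory-oriented Alexa skill handler can look like, the snippet below uses the ask-sdk-core Python SDK to answer a hypothetical centrifuge-speed question. The intent name, utterance, and spoken reply are invented for illustration and are not taken from the tools described in the abstract.

```python
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_intent_name
from ask_sdk_core.handler_input import HandlerInput
from ask_sdk_model import Response

class CentrifugeSpeedIntentHandler(AbstractRequestHandler):
    """Answers a hypothetical 'What speed should I spin my samples at?' question."""

    def can_handle(self, handler_input: HandlerInput) -> bool:
        return is_intent_name("CentrifugeSpeedIntent")(handler_input)

    def handle(self, handler_input: HandlerInput) -> Response:
        # Illustrative answer only; a real skill would look this up per protocol.
        speech = "For a standard miniprep, spin at thirteen thousand r p m for one minute."
        return handler_input.response_builder.speak(speech).response

sb = SkillBuilder()
sb.add_request_handler(CentrifugeSpeedIntentHandler())
lambda_handler = sb.lambda_handler()  # entry point when deployed as an AWS Lambda function
```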


Author(s):  
Martin R. Dowding

Since the advent of the Information Highway (Society/Economy), considerable policy-making has been undertaken by governments in Canada and the U.S. in response to the Digital Divide. While measuring the divide has largely been limited to neo-liberal economic analysis, the U.S. appears to be committing more resources and doing more fine-grained analyses of the situation. This paper compares statistical and political economic analyses already completed and provides alternative analyses useful to guarantors of access to information.


Author(s):  
Matteo Turilli ◽  
David Wallom ◽  
Chris Williams ◽  
Steve Gough ◽  
Neal Curran ◽  
...  

Cloud computing has been increasingly adopted by users and providers to promote flexible, scalable and tailored access to computing resources. Nonetheless, the consolidation of this paradigm has uncovered some of its limitations. Initially devised by corporations with direct control over large amounts of computational resources, cloud computing is now being endorsed by organizations with limited resources or with a more articulated, less direct control over these resources. The challenge for these organizations is to leverage the benefits of cloud computing while dealing with limited and often widely distributed computing resources. This study focuses on the adoption of cloud computing by higher education institutions and addresses two main issues: flexible and on-demand access to a large amount of storage resources, and scalability across a heterogeneous set of cloud infrastructures. The proposed solutions leverage a federated approach to cloud resources in which users access multiple, largely independent cloud infrastructures through a highly customizable broker layer. This approach allows for a uniform authentication and authorization infrastructure, fine-grained policy specification, and the aggregation of accounting and monitoring. Within a loosely coupled federation of cloud infrastructures, users can access vast amounts of data without copying them across cloud infrastructures and can scale their resource provision when local cloud resources become insufficient.
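A broker layer of this kind matches each user request against per-infrastructure policies before dispatching it. The sketch below illustrates the idea in generic Python; the policy fields and cloud names are hypothetical and are not taken from the deployment described in the abstract.

```python
from dataclasses import dataclass

@dataclass
class CloudPolicy:
    name: str
    allowed_groups: set        # which user groups may use this infrastructure
    max_cores: int             # per-request core ceiling
    data_resident: bool        # True if the data needed by the job lives here

POLICIES = [
    CloudPolicy("campus-openstack", {"bioinformatics", "physics"}, 64, True),
    CloudPolicy("partner-cloud",    {"bioinformatics"},            256, False),
]

def select_cloud(group: str, cores: int, needs_local_data: bool) -> str:
    """Return the first federated infrastructure whose policy admits the request."""
    for p in POLICIES:
        if group in p.allowed_groups and cores <= p.max_cores \
                and (not needs_local_data or p.data_resident):
            return p.name
    raise RuntimeError("no federated infrastructure satisfies the request")

print(select_cloud("bioinformatics", cores=128, needs_local_data=False))  # partner-cloud
```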


2021 ◽  
Vol 49 (4) ◽  
pp. 6-11
Author(s):  
Jonas Traub ◽  
Zoi Kaoudi ◽  
Jorge-Arnulfo Quiané-Ruiz ◽  
Volker Markl

Data science and artificial intelligence are driven by a plethora of diverse data-related assets, including datasets, data streams, algorithms, processing software, compute resources, and domain knowledge. Because providing all of these assets requires a huge investment, data science and artificial intelligence technologies are currently dominated by a small number of providers who can afford such investments. This leads to lock-in effects and hinders features that require a flexible exchange of assets among users. In this paper, we introduce Agora, our vision of a unified ecosystem that brings together data, algorithms, models, and computational resources and provides them to a broad audience. Agora (i) treats assets as first-class citizens and leverages a fine-grained exchange of assets, (ii) allows assets to be combined into novel applications, and (iii) flexibly executes such applications on available resources. As a result, it enables the easy creation and composition of data science pipelines as well as their scalable execution. In contrast to existing data management systems, Agora operates in a heavily decentralized and dynamic environment: data, algorithms, and even compute resources are dynamically created, modified, and removed by different stakeholders. Agora presents novel research directions for the data management community as a whole: it requires combining our traditional expertise in scalable data processing and management with infrastructure provisioning as well as the economic and application aspects of data, algorithms, and infrastructure.
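To make the idea of assets as first-class citizens more concrete, the sketch below shows a generic registry in which a dataset and an algorithm published by different parties are composed into a tiny pipeline. The class and method names are invented for illustration and do not reflect Agora's actual interfaces.

```python
class AssetRegistry:
    """Minimal registry where providers publish assets and users compose them."""

    def __init__(self):
        self._assets = {}

    def publish(self, name: str, kind: str, payload) -> None:
        # kind is e.g. "dataset", "algorithm", or "compute"
        self._assets[name] = {"kind": kind, "payload": payload}

    def get(self, name: str):
        return self._assets[name]["payload"]

registry = AssetRegistry()
registry.publish("iris-data", "dataset", [[5.1, 3.5], [6.2, 2.9], [4.7, 3.2]])
registry.publish("mean-width", "algorithm",
                 lambda rows: sum(r[1] for r in rows) / len(rows))

# Compose two independently published assets into a minimal "pipeline".
data = registry.get("iris-data")
algorithm = registry.get("mean-width")
print(algorithm(data))  # 3.2
```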


2021 ◽  
Author(s):  
Justin Liu

Abstract Background: In a worldwide health crisis as severe as COVID-19, there is a pressing need for rapid, reliable diagnostics. Currently, popular testing methods such as reverse-transcription polymerase chain reaction (RT-PCR) can have high false negative rates. Consequently, COVID-19 patients are not accurately identified or treated quickly enough to prevent transmission of the virus. However, the recent rise of medical CT data has presented promising avenues, since CT manifestations contain key characteristics indicative of COVID-19. Findings: This study aimed to take a novel approach to the machine learning-based detection of COVID-19 from chest CT scans. First, the dataset utilized in this study was derived from three major sources, comprising a total of 17,698 chest CT slices across 923 patient cases. Additionally, image preprocessing algorithms were developed to reduce noise by excluding irrelevant features. Transfer learning was also implemented with the EfficientNetB7 pre-trained model to provide a backbone architecture and save computational resources. Lastly, several explainability techniques were leveraged to qualitatively validate model performance by localizing infected regions and highlighting fine-grained pixel details. The proposed model attained an overall accuracy of 92.71% and a sensitivity of 95.79%. Explainability measures showed that the model correctly distinguished between relevant, critical features pertaining to COVID-19 chest CT images and normal controls. Conclusions: Deep learning frameworks provide efficient, human-interpretable COVID-19 diagnostics that could complement a radiologist's decision or serve as an alternative screening tool. Future endeavors could provide insight into infection severity, patient risk stratification, and more precise visualizations.
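As an illustration of the transfer-learning step described above, the sketch below builds a binary COVID/normal classifier on top of a frozen, ImageNet-pretrained EfficientNetB7 backbone with tf.keras. The input size, head layers, and optimizer settings are illustrative choices, not the study's actual configuration.

```python
import tensorflow as tf

# Load EfficientNetB7 pretrained on ImageNet, without its classification head.
backbone = tf.keras.applications.EfficientNetB7(
    include_top=False, weights="imagenet", input_shape=(600, 600, 3)
)
backbone.trainable = False  # freeze the backbone to reuse ImageNet features and save compute

model = tf.keras.Sequential([
    backbone,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # COVID-positive vs. normal
])

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
    loss="binary_crossentropy",
    metrics=["accuracy", tf.keras.metrics.Recall(name="sensitivity")],
)
model.summary()
# model.fit(train_ds, validation_data=val_ds, epochs=10)  # CT slice datasets not shown here
```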

