work flow
Recently Published Documents


TOTAL DOCUMENTS: 722 (FIVE YEARS 151)

H-INDEX: 31 (FIVE YEARS 6)

2021 ◽  
Author(s):  
Annika Faucon ◽  
Julian Samaroo ◽  
Tian Ge ◽  
Lea K Davis ◽  
Ran Tao ◽  
...  

To enable large-scale application of polygenic risk scores in a computationally efficient manner, we translate a widely used polygenic risk score construction method, Polygenic Risk Score – Continuous Shrinkage (PRS-CS), to the Julia programming language as PRS.jl. On nine traits with varying genetic architectures, we demonstrate that PRS.jl maintains prediction accuracy while decreasing the average run time by 5.5x. Additional programmatic modifications improve usability and robustness. This freely available software substantially improves the workflow and democratizes the use of polygenic risk scores by lowering the computational burden of the PRS-CS method.
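
The final computation behind any polygenic risk score method, including PRS-CS-style approaches, is a weighted sum of risk-allele dosages. The sketch below illustrates only that underlying formula in Python with NumPy; it is not the PRS.jl API, and all data are made-up placeholders.

```python
import numpy as np

# Placeholder data, NOT the PRS.jl API: just the formula that
# shrinkage-based PRS methods evaluate once effect sizes are estimated.
# genotypes: (n_individuals, n_variants) allele dosages in {0, 1, 2}
# weights:   (n_variants,) posterior effect sizes after shrinkage
rng = np.random.default_rng(0)
genotypes = rng.integers(0, 3, size=(1000, 5000)).astype(float)
weights = rng.normal(0.0, 0.01, size=5000)

# Polygenic risk score for individual i: PRS_i = sum_j beta_j * g_ij
prs = genotypes @ weights

# Standardize scores for downstream association analyses.
prs_z = (prs - prs.mean()) / prs.std()
print(prs_z[:5])
```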


2021 ◽  
Author(s):  
Paul Kalke ◽  
Conrad Helm
Keyword(s):  

How to perform data refinements such as repair, smoothing, and simplification in MeshLab and export the result as a PLY file.
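
The guide targets the MeshLab GUI; for readers who script this step, a rough equivalent exists in the PyMeshLab Python bindings. The filter names below follow recent PyMeshLab releases but change between versions, so treat them as assumptions and check them against `pymeshlab.print_filter_list()` on your installation; file names are placeholders.

```python
import pymeshlab  # pip install pymeshlab

# Sketch of the repair -> smooth -> simplify -> export chain from the
# guide. Filter names follow the 2022.2+ PyMeshLab naming and may
# differ in other releases; verify with pymeshlab.print_filter_list().
ms = pymeshlab.MeshSet()
ms.load_new_mesh("segmented_structure.ply")  # placeholder input file

ms.meshing_remove_duplicate_vertices()                             # repair
ms.meshing_repair_non_manifold_edges()                             # repair
ms.apply_coord_laplacian_smoothing(stepsmoothnum=3)                # smooth
ms.meshing_decimation_quadric_edge_collapse(targetfacenum=50000)   # simplify

ms.save_current_mesh("refined_structure.ply")  # export as PLY
```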


2021 ◽  
Vol 22 ◽  
Author(s):  
Sanoj Chacko ◽  
Yumna B. Haseeb ◽  
Sohaib Haseeb

Metabolomics is an omics approach of systems biology that involves the development and assessment of large-scale, comprehensive biochemical analysis tools for metabolites in biological systems. This review describes the metabolomics workflow and provides an overview of the analytic tools currently used for quantifying metabolic profiles. We explain analytic tools such as mass spectrometry (MS) and nuclear magnetic resonance (NMR) spectroscopy, ionization techniques, and approaches for data extraction and analysis.
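
As a concrete instance of the data-analysis stage the review surveys, the sketch below runs a standard unsupervised analysis (PCA) on a metabolite intensity matrix with scikit-learn; the matrix and its dimensions are made-up placeholders standing in for peak intensities extracted from MS or NMR spectra.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

# Placeholder data: rows = samples, columns = metabolite intensities
# (e.g. peak areas extracted from MS or NMR spectra).
rng = np.random.default_rng(1)
intensities = rng.lognormal(mean=2.0, sigma=0.5, size=(40, 200))

# Log-transform and autoscale, a common preprocessing step for
# metabolomics data before multivariate analysis.
X = StandardScaler().fit_transform(np.log(intensities))

# Unsupervised overview of the metabolic profiles.
pca = PCA(n_components=2)
scores = pca.fit_transform(X)
print("explained variance:", pca.explained_variance_ratio_)
```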


Author(s):  
Koto Hiramatsu ◽  
Shin-ichi Sakamoto ◽  
Yoshiaki Watanabe

Abstract: To improve energy conversion efficiency, a sound wave is superimposed onto the working fluid in the stack by means of a loudspeaker. This method enhances the work-flow generation of the stack. To analyze the enhancement mechanism, the thickness of the boundary layer and the heat-exchange area in the stack are calculated from the viewpoint of the heat-exchange conditions. The effect of these conditions on particle displacement and heat flow is investigated. As a result, it is confirmed that the superimposed sound wave improves the heat-exchange conditions and thereby enhances the thermoacoustic phenomenon.
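
The boundary-layer thickness referred to here is conventionally taken in thermoacoustics as the thermal penetration depth δ_κ = sqrt(2κ/ω), with κ the thermal diffusivity of the working fluid and ω the angular frequency. The sketch below evaluates it for air-like property values, which are illustrative assumptions rather than the paper's experimental conditions.

```python
import math

# Thermal penetration depth delta_kappa = sqrt(2 * kappa / omega),
# the standard thermoacoustic boundary-layer scale. Property values
# are rough figures for air near room temperature, for illustration only.
kappa = 2.2e-5          # thermal diffusivity of air [m^2/s]
frequency = 100.0       # drive frequency [Hz] (assumed)
omega = 2 * math.pi * frequency

delta_kappa = math.sqrt(2 * kappa / omega)
print(f"thermal penetration depth: {delta_kappa * 1e3:.3f} mm")
```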


2021 ◽  
Author(s):  
Paul Kalke ◽  
Conrad Helm
Keyword(s):  

A step-by-step guide on how to use Blender to analyse and visualize your segmented structures, for example by exporting them as an MP4 video.
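
The video-export step of the guide can also be driven from Blender's Python API (bpy); a minimal sketch follows, assuming a scene, camera, and animation are already set up. The output path and frame range are placeholders.

```python
import bpy  # available inside Blender's bundled Python

# Render the current scene's animation to an MP4 (H.264 in an MPEG-4
# container), mirroring the export step of the guide.
scene = bpy.context.scene
scene.render.image_settings.file_format = 'FFMPEG'
scene.render.ffmpeg.format = 'MPEG4'
scene.render.ffmpeg.codec = 'H264'
scene.render.filepath = '/tmp/segmented_structures.mp4'  # placeholder
scene.frame_start = 1
scene.frame_end = 250

bpy.ops.render.render(animation=True)
```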


2021 ◽  
Author(s):  
Paul Kalke ◽  
Conrad Helm
Keyword(s):  

How to use ImageJ in our 3D workflow, from data import and manipulation through segmentation to export as an OBJ file.
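
The guide itself uses ImageJ; as a scripted stand-in for the same segment-and-export steps, the sketch below thresholds a volume with scikit-image and writes the resulting surface as a minimal OBJ file. This is a generic replacement, not ImageJ's own API, and the input array is a placeholder.

```python
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import marching_cubes

# Placeholder volume standing in for an imported image stack.
rng = np.random.default_rng(2)
volume = rng.random((64, 64, 64))
volume[20:44, 20:44, 20:44] += 1.0   # a bright "structure"

# Segmentation: global Otsu threshold, analogous to ImageJ's
# Image > Adjust > Threshold step.
mask = volume > threshold_otsu(volume)

# Surface extraction and export as a minimal Wavefront OBJ file.
verts, faces, _, _ = marching_cubes(mask.astype(float), level=0.5)
with open("structure.obj", "w") as f:
    for v in verts:
        f.write(f"v {v[0]} {v[1]} {v[2]}\n")
    for tri in faces + 1:            # OBJ indices are 1-based
        f.write(f"f {tri[0]} {tri[1]} {tri[2]}\n")
```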


2021 ◽  
Vol 25 (4) ◽  
pp. 134-146
Author(s):  
A. V. Petraikin ◽  
I. A. Skripnikova

This review discusses the method of quantitative computed tomography (QCT). In QCT, X-ray density (in Hounsfield units, HU) is converted to bone mineral density (BMD, mg/ml) using linear relationships obtained by scanning calibration standards (phantoms). By comparison with normative age data, osteoporosis (OP) can be diagnosed. The review presents various QCT techniques and their diagnostic capabilities in accordance with the 2019 positions of the International Society for Clinical Densitometry (ISCD). The results of comparing QCT with conventional dual-energy X-ray absorptiometry (DXA) are considered. For the proximal femur (PF), the results of the two methods are well comparable, and both allow OP to be diagnosed by the T-score. In spinal QCT, however, the volumetric BMD of the trabecular bone of the vertebral bodies is assessed, whereas DXA assesses projected BMD. The approaches to interpreting the results also differ: in spinal DXA, OP is diagnosed from the T-score, whereas in QCT the criteria of the ACR (American College of Radiology) are used. We describe the phantoms used in QCT and provide data on radiation exposure during QCT and DXA. The article also describes an approach to opportunistic screening for osteoporosis with QCT based on previously performed CT scans, including an automated workflow using artificial intelligence technologies. These promising techniques are attractive because of the large number of CT examinations performed and because they require no additional examinations.
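
The phantom calibration described here reduces to a linear fit of known phantom insert densities against the HU measured in the same scan, which is then applied to patient voxels. The sketch below illustrates this with invented phantom values; only the ACR cut-offs in the final comment come from established QCT practice.

```python
import numpy as np

# Hypothetical calibration-phantom measurements: known equivalent
# densities of the phantom inserts (mg/ml) and the mean HU measured
# in each insert on the same scan. Values are illustrative only.
phantom_bmd = np.array([0.0, 50.0, 100.0, 200.0])   # mg/ml
phantom_hu  = np.array([-5.0, 40.0, 88.0, 180.0])   # measured HU

# Linear relationship BMD = a * HU + b, as used in QCT.
a, b = np.polyfit(phantom_hu, phantom_bmd, deg=1)

# Convert a patient measurement (mean HU in a vertebral trabecular ROI).
patient_hu = 95.0
patient_bmd = a * patient_hu + b
print(f"BMD = {patient_bmd:.1f} mg/ml")
# Per the ACR criteria cited in the review, spinal trabecular BMD
# below 80 mg/ml indicates osteoporosis (80-120 mg/ml: osteopenia).
```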


2021 ◽  
Vol 2021 ◽  
pp. 1-11 ◽  
Author(s):  
Zhiying Zhang ◽  
Huiju Zhu

In order to achieve significant improvements in key indicators such as speed, quality, cost, and service, this paper fundamentally rethinks and completely redesigns the business process to create a new one. The study combines the particularities of AMI with emergency nursing to construct an in-hospital AMI emergency nursing process that further standardizes AMI rescue work. Implementing the process helps to clarify the responsibilities and requirements of nurses in the AMI emergency process, reduce delays in AMI emergency care, and improve the efficiency and effectiveness of emergency treatment. In addition, after restructuring the business process, this paper builds an intelligent digital critical-illness monitoring system. The system builds on the original workflow of the ICU medical staff, optimizes that workflow through computer and information technology, and designs and implements digital intensive-care nursing software that has been deployed in the hospital with significant results.


2021 ◽  
Author(s):  
Hamideh Hajiabadi ◽  
Irina Mamontova ◽  
Roshan Prizak ◽  
Agnieszka Pancholi ◽  
Anne Koziolek ◽  
...  

Abstract: Fluorescence microscopy, a central tool of biological research, is subject to inherent trade-offs in experiment design. For instance, image acquisition speed can only be increased in exchange for lowered signal quality or an increased rate of photo-damage to the specimen. Computational denoising can recover some loss of signal, extending the trade-off margin for high-speed imaging. Recently proposed denoising based on neural networks shows exceptional performance but raises concerns about errors typical of neural networks. Here, we present a workflow that supports an empirically optimized reduction of exposure times, as well as per-image quality control to exclude images with reconstruction errors. We implement this workflow on the basis of the denoising tool Noise2Void and assess the molecular state and three-dimensional shape of RNA Polymerase II (Pol II) clusters in live zebrafish embryos. Image acquisition speed could be tripled, achieving 2-second time resolution and 350-nanometer lateral image resolution. The obtained data reveal stereotyped events of approximately 10 seconds' duration: initially, the molecular mark for initiated Pol II increases, then the mark for active Pol II increases, and finally Pol II clusters take on a stretched and unfolded shape. An independent analysis based on fixed-sample images reproduces this sequence of events and suggests that it is related to the transient association of genes with Pol II clusters. Our workflow consists of procedures that can be implemented on commercial fluorescence microscopes without any hardware or software modification, and should therefore be transferable to many other applications.
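
A minimal sketch of the denoise-then-quality-control idea follows, using the n2v Python package that implements Noise2Void. The model name, directory, input file, and the correlation-based rejection threshold are assumptions for illustration, not the authors' published criteria.

```python
import numpy as np
from tifffile import imread
from n2v.models import N2V

# Load a previously trained Noise2Void model (name/basedir are
# placeholders; config=None loads the stored configuration).
model = N2V(config=None, name='n2v_polII', basedir='models')

raw = imread('frame_0001.tif').astype(np.float32)  # placeholder image
denoised = model.predict(raw, axes='YX')

# Simple per-image quality control (an illustrative stand-in for the
# paper's criterion): reject frames whose denoised output correlates
# poorly with the raw input, which can flag reconstruction errors.
corr = np.corrcoef(raw.ravel(), denoised.ravel())[0, 1]
if corr < 0.5:                     # assumed threshold
    print("frame rejected by quality control")
else:
    print(f"frame accepted (corr = {corr:.3f})")
```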


2021 ◽  
Vol 3 ◽  
Author(s):  
Martin J. Jolley ◽  
Andrew J. Russell ◽  
Paul F. Quinn ◽  
Matthew T. Perks

Large-scale image velocimetry is a novel approach for non-contact remote sensing of flow in rivers. Research on this topic has largely focussed on developing specific aspects of the image velocimetry workflow or on testing specific tools or software in case studies. This has produced a multitude of techniques, with practice varying between groups and authorities. As such, those new to image velocimetry may find it hard to decipher which methods suit particular challenges. This research collates, synthesises, and presents current understanding of the application of particle image velocimetry (PIV) and particle tracking velocimetry (PTV) approaches in a fluvial setting. The image velocimetry workflow is compartmentalised into the sub-systems of capture optimisation, pre-processing, processing, and post-processing. Each section aims to provide examples of best practice from the wider literature or, where this is not possible, to outline the theoretical basis and give examples that can serve as precedent and inform decision-making. We draw on sources from across the hydrology and remote sensing literature to suggest the circumstances in which specific approaches are best applied. For most sub-systems, clear research or precedent indicates how best to perform the analysis. However, some stages of the process have no single established method and require user intuition or further research; the influence of external environmental conditions on image velocimetry performance, for example, is a key aspect that currently lacks research. Further understanding of such under-researched areas, including environmental challenges, is vital if image velocimetry is to be used to extract river flow information across the full range of hydro-geomorphic conditions.
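
At the processing stage, the core PIV operation is cross-correlating an interrogation window between consecutive frames and reading the displacement off the correlation peak. A self-contained NumPy sketch of that single step follows, with synthetic frames and a known shift chosen purely for illustration.

```python
import numpy as np

# Synthetic frame pair: random texture shifted by a known displacement,
# standing in for two consecutive video frames of a seeded river surface.
rng = np.random.default_rng(3)
frame_a = rng.random((64, 64))
true_shift = (3, 5)                       # (rows, cols), assumed
frame_b = np.roll(frame_a, true_shift, axis=(0, 1))

# Core PIV step: FFT-based cross-correlation of one interrogation
# window; the correlation peak location gives the displacement.
a = frame_a - frame_a.mean()
b = frame_b - frame_b.mean()
corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real

peak = np.unravel_index(np.argmax(corr), corr.shape)
# Unwrap peaks past the Nyquist point to signed displacements;
# dividing by the frame interval would convert these to velocities.
shift = [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]
print("estimated displacement (rows, cols):", shift)  # -> [3, 5]
```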

