INTEGRATING RAARR (ROBUST AUTOMATED ASSIGNMENT OF RIGID ROTORS) INTO AUTOFIT: PROGRESS AND PROSPECTS

Author(s): Arianna Rodriguez, Steven Shipman. Radiation, 2021, Vol 1 (2), pp. 79-94.
Author(s): Peter K. Rogan, Eliseos J. Mucaki, Ben C. Shirley, Yanxin Li, Ruth C. Wilkins, ...

The dicentric chromosome (DC) assay accurately quantifies exposure to radiation; however, manual and semi-automated assignment of DCs has limited its use for a potential large-scale radiation incident. The Automated Dicentric Chromosome Identifier and Dose Estimator (ADCI) software automates unattended DC detection and determines radiation exposures, fulfilling IAEA criteria for triage biodosimetry. This study evaluates the throughput of high-performance ADCI (ADCI-HT) to stratify exposures of populations in 15 simulated population-scale radiation exposures. ADCI-HT streamlines dose estimation on a supercomputer by optimal hierarchical scheduling of DC detection for varying numbers of samples and metaphase cell images in parallel on multiple processors. We evaluated processing times and the accuracy of estimated exposures across census-defined populations. Image processing of 1744 samples on 16,384 CPUs required 1 h 11 min 23 s, and radiation dose estimation based on DC frequencies required 32 s. Processing of 40,000 samples at 10 exposures from five laboratories required 25 h and met IAEA criteria (dose estimates were within 0.5 Gy; median = 0.07 Gy). Geostatistically interpolated radiation exposure contours of simulated nuclear incidents were defined by samples exposed to clinically relevant exposure levels (1 and 2 Gy). Analysis of all exposed individuals with ADCI-HT required 0.6–7.4 days, depending on the population density of the simulation.
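Dose estimation from DC frequencies, as described above, is conventionally done by inverting a linear-quadratic calibration curve fitted to laboratory reference data. The sketch below illustrates that standard inversion; the coefficient values are illustrative placeholders, not ADCI's laboratory-specific calibration.

```python
import math

def dose_from_dc_frequency(y, c=0.001, alpha=0.02, beta=0.06):
    """Estimate absorbed dose D (Gy) from dicentric yield y (DCs per cell)
    by inverting the standard linear-quadratic calibration curve
        y = c + alpha*D + beta*D**2.
    c, alpha, beta are illustrative placeholder coefficients."""
    if y <= c:
        return 0.0
    # Positive root of beta*D**2 + alpha*D + (c - y) = 0
    disc = alpha ** 2 + 4.0 * beta * (y - c)
    return (-alpha + math.sqrt(disc)) / (2.0 * beta)

# A yield generated at exactly D = 1 Gy inverts back to 1 Gy:
y_1gy = 0.001 + 0.02 * 1.0 + 0.06 * 1.0 ** 2
print(dose_from_dc_frequency(y_1gy))  # 1.0
```

Because the inversion is a closed-form quadratic root, this step is trivially cheap; as the abstract notes, the dominant cost in ADCI-HT is metaphase image processing, not dose calculation.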


2008, Vol 43 (6), pp. 527-541. Author(s): D.J. Rodrigues, A.R. Champneys, M.I. Friswell, R.E. Wilson.

2005, Vol 33 (4), pp. 261-279. Author(s): Jianyong Wang, Tianzhi Wang, Erik R. P. Zuiderweg, Gordon M. Crippen.

2019. Author(s): Felix Plasser.

<p>The advent of ever more powerful excited-state electronic structure methods has led to a tremendous increase in the predictive power of computation, but it has also rendered the analysis of these computations more and more challenging and time-consuming. TheoDORE tackles this problem by providing tools for post-processing excited-state computations, which automate repetitive tasks and provide rigorous and reproducible descriptors. Interfaces are available for ten different quantum chemistry codes and a range of excited-state methods implemented therein. This article provides an overview of three popular functionalities within TheoDORE: a fragment-based analysis for assigning state character, the computation of exciton sizes for measuring charge transfer, and the natural transition orbitals used not only for visualisation but also for quantifying multiconfigurational character. Using the examples of an organic push-pull chromophore and a transition metal complex, it is shown how these tools can be used for a rigorous and automated assignment of excited-state character. In the case of a conjugated polymer, we venture beyond the limits of the traditional molecular orbital picture to uncover spatial correlation effects using electron-hole correlation plots and conditional densities.</p>
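The fragment-based analysis mentioned above assigns state character by collapsing an atom-resolved electron-hole population matrix (often written Ω, with hole index as row and electron index as column) into fragment blocks; inter-fragment weight then quantifies charge transfer. The sketch below shows only that bookkeeping step under simplified assumptions: the atom-resolved matrix and the fragment partition are taken as given inputs, whereas TheoDORE itself derives Ω from the one-particle transition density matrix.

```python
import numpy as np

def fragment_ct_number(omega_atoms, fragments):
    """Collapse an atom-resolved Omega matrix (rows: hole position,
    columns: electron position) into fragment blocks and compute the
    charge-transfer descriptor
        CT = (sum of inter-fragment blocks) / (total Omega weight).
    `fragments` is a list of atom-index lists; inputs are illustrative."""
    n_frag = len(fragments)
    omega_frag = np.zeros((n_frag, n_frag))
    for a, atoms_a in enumerate(fragments):
        for b, atoms_b in enumerate(fragments):
            # Sum the block of Omega with hole on fragment a, electron on b.
            omega_frag[a, b] = omega_atoms[np.ix_(atoms_a, atoms_b)].sum()
    total = omega_frag.sum()
    ct = (total - np.trace(omega_frag)) / total
    return omega_frag, ct

# Toy 4-atom system, two fragments: half the weight is local (hole and
# electron on fragment 0), half is charge transfer (hole on 0, electron on 1).
om = np.zeros((4, 4))
om[0, 0] = 0.5   # local excitation on fragment 0
om[1, 3] = 0.5   # hole on fragment 0, electron on fragment 1
_, ct = fragment_ct_number(om, [[0, 1], [2, 3]])
print(ct)  # 0.5
```

For a push-pull chromophore, a CT value near 1 for the donor-to-acceptor block is what an automated assignment would label a charge-transfer state, while values near 0 indicate locally excited character.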


2021. Author(s): Linda Baldewein, Ulrike Kleeberg, Lars Möller.

<p>In the Earth and environmental sciences, data analyzed from field samples make up a significant portion of all research data, often collected at significant cost and non-reproducibly. If important metadata is not secured and stored immediately in the field, the quality and re-usability of the resulting data are diminished.</p><p>At the Helmholtz Coastal Data Center (HCDC), a metadata and data workflow for biogeochemical data has been developed over the last few years to ensure the quality and richness of metadata and to make the final data product FAIR. It automates and standardizes the data transfer from the campaign-planning stage, through sample collection in the field, analysis, and quality control, to storage in databases and publication in repositories.</p><p>Prior to any sampling campaign, the scientists are equipped with a customized app on a tablet that enables them to record relevant metadata, such as the date and time of sampling, the scientists involved, and the type of sample collected. Each sample and station receives a unique identifier already at this stage. The location is retrieved directly from a high-accuracy GNSS receiver connected to the tablet. This metadata is transmitted via mobile data transfer to the institution's cloud storage.</p><p>After the campaign, the metadata is quality-checked by the field scientists and the data curator and stored in a relational database. Once the samples are analyzed in the lab, the data is imported into the database and connected to the corresponding metadata using a template. Data DOIs are registered for finalized datasets in close collaboration with the World Data Center PANGAEA. The data sets are discoverable through their DOIs as well as through the HCDC data portal and the API of the metadata catalogue service.</p><p>This workflow is well established within the institute, but it is still being refined and made more sophisticated and FAIRer. For example, an automated assignment of International Geo Sample Numbers (IGSNs) for all samples is currently being planned.</p>
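The field-capture step described above (unique identifier assigned on site, timestamp, GNSS position, involved scientists) can be sketched as a minimal metadata record. The field names and ID scheme below are illustrative assumptions; the abstract does not publish the HCDC app's actual schema.

```python
import uuid
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class FieldSample:
    """Minimal sketch of per-sample metadata captured in the field.
    All field names are hypothetical, not the HCDC schema."""
    station: str
    sample_type: str
    latitude: float                  # from the connected GNSS receiver
    longitude: float
    scientists: list = field(default_factory=list)
    sampled_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())
    # Unique identifier assigned already at the sampling stage.
    sample_id: str = field(default_factory=lambda: uuid.uuid4().hex)

s = FieldSample("ST-01", "water", 53.9, 8.7, ["A. Example"])
print(s.sample_id)  # e.g. a 32-character hex string, unique per sample
```

Assigning the identifier at collection time, rather than in the lab, is what lets the later database import join analysis results back to the correct field metadata without manual matching.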

