Practical challenges for biomedical modeling using HPC

2018 ◽
Author(s):
David W Wright ◽  
Robin A Richardson ◽  
Peter V Coveney

The concept underlying precision medicine is that the prevention, diagnosis and treatment of pathologies such as cancer can be improved through an understanding of the influence of individual patient characteristics. Predictive medicine seeks to derive this understanding through mechanistic models of the causes and (potential) progression of disease within a given individual. This represents a grand challenge for computational biomedicine, as it requires the integration of highly varied (and potentially vast) quantitative experimental datasets into models of complex biological systems. It is becoming increasingly clear that this challenge can only be met through complex workflows that combine diverse analyses and whose design is informed by the understanding that predictions must be accompanied by estimates of uncertainty. Each stage in such a workflow can, in general, have very different computational requirements. If funding bodies and the HPC community are serious about supporting such approaches, they must consider the need for portable, persistent and stable tools designed to promote extensive long-term development and testing of these workflows. From the perspective of model developers (and with even greater relevance to potential clinical or experimental collaborators), the enormous diversity of interfaces and supercomputer policies, frequently designed with monolithic applications in mind, can represent a serious barrier to innovation. Here we use experience from work on two very different biomedical modeling scenarios, brain blood flow and small-molecule drug selection, to highlight issues with current programming and execution environments and to suggest potential solutions.
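
The shape of such a workflow can be made concrete. The sketch below is not the authors' software; it is a minimal Python illustration, under stated assumptions, of two of the points made above: each stage declares its own (here purely nominal) resource requirements, and uncertainty estimates come from running an ensemble of replicas rather than a single simulation. All names and numbers are hypothetical.

import random
from dataclasses import dataclass
from statistics import mean, stdev
from typing import Callable, List

@dataclass
class Stage:
    name: str                      # e.g. "equilibration", "production run", "analysis"
    cores: int                     # illustrative per-stage resource requirement
    run: Callable[[int], float]    # maps a replica index to a scalar result

def run_ensemble(stage: Stage, n_replicas: int) -> List[float]:
    # On a real machine each replica would be a separately scheduled batch
    # job; plain function calls keep this sketch self-contained.
    return [stage.run(i) for i in range(n_replicas)]

def toy_binding_energy(replica: int) -> float:
    # Stand-in for an expensive simulation; the numbers are purely illustrative.
    rng = random.Random(replica)
    return -10.0 + rng.gauss(0.0, 0.5)

stage = Stage(name="free-energy estimate", cores=48, run=toy_binding_energy)
results = run_ensemble(stage, n_replicas=25)
print(f"{stage.name}: {mean(results):.2f} +/- {stdev(results):.2f} (n={len(results)})")

Reporting the ensemble mean together with its spread, rather than a single value, is the sense in which predictions here are "accompanied by estimates of uncertainty".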


2016 ◽  
Vol 20 (17) ◽  
pp. 1827-1834
Author(s):  
Liqian Gao ◽  
Jun Chen ◽  
Yi Hu ◽  
Hongyan Sun ◽  
Yong Siang Ong ◽  
...  

2020 ◽  
Vol 7 (1) ◽  
pp. 4-16
Author(s):  
Daria Kotlarek ◽  
Agata Pawlik ◽  
Maria Sagan ◽  
Marta Sowała ◽  
Alina Zawiślak-Architek ◽  
...  

Targeted Protein Degradation (TPD) is an emerging modality of drug discovery that offers unprecedented therapeutic benefits over traditional protein inhibition. Most importantly, TPD unlocks the untapped pool of the proteome that has to date been considered undruggable. Captor Therapeutics (Captor) is the fourth global, and first European, company developing small-molecule drug candidates based on the principles of targeted protein degradation. Captor is located in Basel, Switzerland and Wroclaw, Poland, and exploits the best opportunities of the two sites: experience and non-dilutive European grants, and the talent pool, respectively. With over $38M of funding, Captor has been active in three areas of TPD: molecular glues, bispecific degraders and direct degraders, Obterons™.


2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Fabian Dusse ◽  
Johanna Pütz ◽  
Andreas Böhmer ◽  
Mark Schieren ◽  
Robin Joppich ◽  
...  

Abstract

Background: Handovers of post-anesthesia patients to the intensive care unit (ICU) are often unstructured and performed under time pressure. Hence, they bear a high risk of poor communication, loss of information and potential patient harm. The aim of this study was to investigate the completeness of information transfer and the quantity of information loss during post-anesthesia handovers of critical care patients.

Methods: Using a self-developed checklist of 55 peri-operative items, patient handovers from the operating room or post-anesthesia care unit to the ICU staff were observed and documented in real time. Observations were analyzed for the amount of correctly and completely transferred patient data relative to the written documentation in the anesthesia record and the patient's chart.

Results: During a ten-week study period, 97 handovers were included. The mean duration of a handover was 146 seconds; interruptions occurred in 34% of all cases. While some items were transferred frequently (basic patient characteristics [72%], surgical procedure [83%], intraoperative complications [93.8%]), others were commonly missed (underlying diseases [23%], long-term medication [6%]). The completeness of information transfer was associated with the handover's duration [B coefficient (95% CI): 0.118 (0.084-0.152), p<0.001] and increased significantly in handovers exceeding two minutes (24% ± 11.7 vs. 40% ± 18.04, p<0.001).

Conclusions: Handover completeness is impaired by time pressure, interruptions, and inappropriate surroundings, which increase the risk of information loss. To improve completeness and ensure patient safety, an adequate time span for the handover and the implementation of communication tools are required.
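
The reported B coefficient is the slope of a linear regression of handover completeness (in %) on handover duration (in seconds). The sketch below is illustrative only and does not reproduce the study's data or analysis code: it fits an ordinary least squares model to synthetic data of the same size (n = 97) using the statsmodels package.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
duration = rng.uniform(60, 300, size=97)                       # handover duration in seconds (synthetic)
completeness = 15 + 0.118 * duration + rng.normal(0, 8, 97)    # completeness in percent (synthetic)

X = sm.add_constant(duration)        # intercept column plus duration
fit = sm.OLS(completeness, X).fit()
print(fit.params[1])                 # slope: percentage points of completeness per extra second
print(fit.conf_int()[1])             # 95% confidence interval for the slope

A slope of 0.118 means roughly 0.12 percentage points of additional completeness per extra second of handover, i.e. about 7 points per additional minute, consistent with the two-minute threshold effect reported above.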


Author(s):  
Sam Ade Jacobs ◽  
Tim Moon ◽  
Kevin McLoughlin ◽  
Derek Jones ◽  
David Hysom ◽  
...  

We improved the quality of, and reduced the time to produce, machine-learned models for use in small-molecule antiviral design. Our globally asynchronous, multi-level parallel training approach strong-scales to the full Sierra system with up to 97.7% efficiency. We trained a novel character-based Wasserstein autoencoder that produces a higher-quality model, trained on 1.613 billion compounds in 23 minutes, whereas the previous state of the art required a day to train on 1 million compounds. Reducing training time from a day to minutes shifts the model-creation bottleneck from compute-job turnaround time to human innovation time. Our implementation achieves 318 PFLOPs, which is 17.1% of half-precision peak. We will incorporate this model into our molecular design loop, enabling the generation of more diverse compounds; this improves the search for novel candidate antiviral drugs and reduces the time to synthesize compounds for laboratory testing.
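
To make the model class concrete, here is a minimal single-process sketch of a character-based Wasserstein autoencoder (the MMD-penalty variant) for molecular strings, written in PyTorch. It is an illustration under assumptions, not the paper's implementation: the real vocabulary, architecture and the globally asynchronous multi-level parallelism that scaled to Sierra are all omitted.

import torch
import torch.nn as nn

VOCAB = sorted(set("CcNnOoSsFl()=#123456[]@+-H"))   # toy SMILES alphabet; the real one differs
PAD, MAX_LEN, EMB, LATENT = 0, 64, 64, 128          # illustrative sizes

class CharWAE(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(len(VOCAB) + 1, EMB, padding_idx=PAD)
        self.enc = nn.GRU(EMB, LATENT, batch_first=True)
        self.dec = nn.GRU(EMB, LATENT, batch_first=True)
        self.out = nn.Linear(LATENT, len(VOCAB) + 1)

    def forward(self, x):
        _, h = self.enc(self.emb(x))      # final hidden state acts as the latent code
        y, _ = self.dec(self.emb(x), h)   # teacher forcing; a real model would shift
        return self.out(y), h.squeeze(0)  # inputs and prepend a start token

def mmd(z, prior):
    # RBF-kernel maximum mean discrepancy pushing latent codes toward N(0, I)
    def k(a, b):
        return torch.exp(-((a[:, None] - b[None]) ** 2).sum(-1) / z.size(1))
    return k(z, z).mean() + k(prior, prior).mean() - 2 * k(z, prior).mean()

model = CharWAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randint(1, len(VOCAB) + 1, (32, MAX_LEN))   # stand-in batch of encoded strings
logits, z = model(x)
loss = nn.functional.cross_entropy(logits.transpose(1, 2), x) + 10.0 * mmd(z, torch.randn_like(z))
opt.zero_grad(); loss.backward(); opt.step()

At the scale reported in the abstract, the same kind of objective would be optimized across thousands of GPUs, which is where the asynchronous multi-level training scheme matters.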


2021 ◽  
Vol 50 (2) ◽  
pp. 702-734
Author(s):  
Luling Wu ◽  
Jihong Liu ◽  
Ping Li ◽  
Bo Tang ◽  
Tony D. James

In this tutorial review, we will explore recent advances in the design, construction and application of two-photon excited fluorescence (TPEF)-based small-molecule probes.


2021 ◽  
Vol 10 (1) ◽  
pp. 130
Author(s):  
Ertan Saridogan ◽  
Mona Salman ◽  
Lerzan Sinem Direk ◽  
Ali Alchami

A uterine septum can negatively affect reproductive outcomes in women. Based on evidence from retrospective observational studies, hysteroscopic incision has been considered a way to improve reproductive performance; however, there has been recent controversy over the need for surgery for uterine septum. High-quality evidence from prospective studies is still lacking, and until it is available, experts are encouraged to publish their data. We therefore present our data, comprising an analysis of the patient characteristics, surgical approach and long-term reproductive outcomes of women treated for uterine septum under the care of a single surgeon. The series includes all 99 women who underwent hysteroscopic surgery for uterine septum between January 2001 and December 2019. Of the 70 treated women who were trying to conceive, 91.4% (64/70) achieved pregnancy, 78.6% (55/70) had live births and 8.6% (6/70) had miscarriages. No statistically significant difference was found in live birth rates when the data were analyzed in subgroups based on age, reason for referral/aetiology and severity of pathology. Our results support the view that surgical treatment of uterine septa is beneficial in improving reproductive outcomes.
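
The abstract does not state which statistical test was used for the subgroup comparison; for small 2x2 tables of live births by subgroup, Fisher's exact test is a standard choice. The sketch below uses hypothetical subgroup counts, chosen only so that they sum to the reported 55 live births among the 70 women trying to conceive.

from scipy.stats import fisher_exact

# live births vs. no live birth in two hypothetical age subgroups
table = [[30, 8],    # e.g. age < 35: 30 live births, 8 without
         [25, 7]]    # e.g. age >= 35: 25 live births, 7 without
odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p_value:.3f}")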

