Predicting Tumor Growth and Ligand Dependence from mRNA by Combining Machine Learning with Mechanistic Modeling

2018 ◽  
Author(s):  
Helge Hass ◽  
Andreas Raue


2020 ◽  
Author(s):  
Pietro Mascheroni ◽  
Symeon Savvopoulos ◽  
Juan Carlos López Alfonso ◽  
Michael Meyer-Hermann ◽  
Haralampos Hatzikirou

Abstract
Biomedical problems are highly complex and multidimensional. Commonly, only a small subset of the relevant variables can be captured by mathematical modeling, due to limited knowledge of the phenomena involved. Although such models are effective in analyzing the approximate dynamics of the system, their predictive accuracy is generally limited. On the other hand, statistical learning methods are well suited for quantitative reproduction of data, but they do not provide mechanistic understanding of the investigated problem. Herein, we propose a novel method based on the Bayesian coupling of mathematical modeling and machine learning (BaM3). We evaluate the proposed BaM3 method on a synthetic dataset for brain tumor growth as a proof of concept and analyze its performance in predicting two major clinical outputs, namely tumor burden and infiltration. Combining the two approaches results in improved predictions in almost all simulated patients, especially those with a late clinical presentation. In addition, we test the proposed methodology on a set of patients suffering from chronic lymphocytic leukemia (CLL) and show excellent agreement with reported data.


2021 ◽  
Vol 1 (1) ◽  
Author(s):  
Pietro Mascheroni ◽  
Symeon Savvopoulos ◽  
Juan Carlos López Alfonso ◽  
Michael Meyer-Hermann ◽  
Haralampos Hatzikirou

Abstract
Background: In clinical practice, a plethora of medical examinations are conducted to assess the state of a patient's pathology, producing a variety of clinical data. Investigation of these data faces two major challenges. Firstly, we lack knowledge of the mechanisms involved in regulating these data variables; secondly, data collection is sparse in time, since it relies on the patient's clinical presentation. The former limits the predictive accuracy of clinical outcomes for any mechanistic model; the latter prevents any machine learning algorithm from accurately inferring the corresponding disease dynamics.
Methods: Here, we propose a novel method, based on the Bayesian coupling of mathematical modeling and machine learning, that aims at improving individualized predictions by addressing both challenges.
Results: We evaluate the proposed method on a synthetic dataset for brain tumor growth and analyze its performance in predicting two relevant clinical outputs. The method results in improved predictions in almost all simulated patients, especially those with a late clinical presentation (>95% of patients show improvements compared to standard mathematical modeling). In addition, we test the methodology in two settings dealing with real patient cohorts. In both cases, namely cancer growth in chronic lymphocytic leukemia and ovarian cancer, predictions show excellent agreement with reported clinical outcomes (around 60% reduction of mean squared error).
Conclusions: We show that the combination of machine learning and mathematical modeling can lead to accurate predictions of clinical outputs in the context of data sparsity and limited knowledge of disease mechanisms.
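
In the Gaussian special case, such a Bayesian coupling reduces to a precision-weighted fusion of the mechanistic forecast and the data-driven estimate. The sketch below is purely illustrative (the logistic growth law, all numbers, and the Gaussian assumption are stand-ins, not the paper's actual model or predictive densities):

```python
def logistic_step(v, r=0.3, k=100.0):
    """Coarse mechanistic forecast: one step of logistic tumor growth
    (an assumed toy model, not the published one)."""
    return v + r * v * (1.0 - v / k)

def gaussian_fusion(mu_m, var_m, mu_d, var_d):
    """Bayes' rule with a Gaussian mechanistic prior N(mu_m, var_m) and a
    Gaussian data-driven likelihood N(mu_d, var_d): the posterior is again
    Gaussian, with a precision-weighted mean."""
    var_post = 1.0 / (1.0 / var_m + 1.0 / var_d)
    mu_post = var_post * (mu_m / var_m + mu_d / var_d)
    return mu_post, var_post

v_now = 20.0                                     # current tumor burden
mu_mech, var_mech = logistic_step(v_now), 25.0   # wide: mechanism is coarse
mu_ml, var_ml = 30.0, 4.0    # hypothetical cohort-trained ML estimate
mu_post, var_post = gaussian_fusion(mu_mech, var_mech, mu_ml, var_ml)
```

The fused prediction lands between the two inputs, pulled toward the more confident one, and its variance is smaller than either source's, which is why the coupling helps most when each source alone is weak.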


Foods ◽  
2021 ◽  
Vol 10 (4) ◽  
pp. 763
Author(s):  
Ran Yang ◽  
Zhenbo Wang ◽  
Jiajia Chen

Mechanistic modeling has been a useful tool to help food scientists understand complicated microwave-food interactions, but it cannot be used directly by food developers for product design because of its resource-intensive nature. This study developed and validated an integrated approach that couples mechanistic modeling and machine learning to achieve efficient food product design (thickness optimization) with better heating uniformity. The mechanistic model, which incorporates electromagnetics and heat transfer, was previously developed and extensively validated, and was used directly in this study. A Bayesian optimization machine-learning algorithm was developed and integrated with the mechanistic model. The integrated approach was validated by comparing its optimization performance with that of a parametric sweep approach based solely on mechanistic modeling. The results showed that the integrated approach had the capability and robustness to optimize the thickness of differently shaped products using different initial training datasets, with higher efficiency (45.9% to 62.1% improvement) than the parametric sweep approach. Three rectangular trays with one optimized thickness (1.56 cm) and two non-optimized thicknesses (1.20 and 2.00 cm) were 3-D printed and used in microwave heating experiments, which confirmed the feasibility of the integrated approach for thickness optimization. The integrated approach can be further developed and extended as a platform to efficiently design complicated microwavable foods with multi-parameter optimization.
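
The advantage of a model-guided sequential search over a full parametric sweep can be caricatured in a few lines. The "simulation" below is a hypothetical quadratic cost with its minimum placed at the reported optimal thickness of 1.56 cm, and the nearest-neighbor surrogate with a distance-based exploration bonus is only a schematic stand-in for the Gaussian-process acquisition function used in real Bayesian optimization:

```python
def heating_nonuniformity(thickness):
    """Stand-in for an expensive mechanistic (electromagnetics + heat
    transfer) simulation: a hypothetical cost with a minimum near 1.56 cm."""
    return (thickness - 1.56) ** 2 + 0.05

def propose_next(samples, lo=1.0, hi=2.2, beta=0.4):
    """Pick the next thickness to simulate: predicted cost from the nearest
    evaluated point, minus an exploration bonus growing with distance to it
    (a crude schematic of a Bayesian-optimization acquisition function)."""
    candidates = [lo + (hi - lo) * i / 200 for i in range(201)]
    def acquisition(x):
        d, y = min((abs(x - s), c) for s, c in samples)
        return y - beta * d          # lower is better: exploit minus explore
    return min(candidates, key=acquisition)

samples = [(x, heating_nonuniformity(x)) for x in (1.0, 2.2)]  # initial design
for _ in range(10):                  # sequential, model-guided evaluations
    x = propose_next(samples)
    samples.append((x, heating_nonuniformity(x)))
best_thickness = min(samples, key=lambda s: s[1])[0]
```

Twelve targeted simulations concentrate near the optimum, whereas a sweep at the same resolution would spend most of its budget on uninformative thicknesses.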


2020 ◽  
Author(s):  
Cristian Axenie ◽  
Daria Kurz

Abstract
Mathematical and computational oncology has increased the pace of cancer research towards the advancement of personalized therapy. Serving the pressing need to exploit the large amounts of currently underutilized data, such approaches bring a significant clinical advantage in tailoring therapy. CHIMERA is a novel system that combines mechanistic modelling and machine learning for personalized chemotherapy and surgery sequencing in breast cancer. It optimizes decision-making in personalized breast cancer therapy by connecting tumor growth behaviour and chemotherapy effects through predictive modelling and learning. We demonstrate the capabilities of CHIMERA in simultaneously learning tumor growth patterns across several types of breast cancer and the pharmacokinetics of a typical breast cancer chemotoxic drug. The learnt functions are subsequently used to predict how to sequence the intervention. We demonstrate the versatility of CHIMERA in learning from tumor growth and pharmacokinetics data to provide robust predictions under two typically used chemotherapy protocol hypotheses.
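
The coupling of a growth law with drug pharmacokinetics can be sketched with Gompertz-type growth and a log-kill chemotherapy term compared under two dosing protocols. All equations and parameters below are illustrative assumptions; CHIMERA learns such functions from data rather than positing them:

```python
import math

def simulate(schedule, days=84, dt=0.1):
    """Gompertz-type tumor growth coupled to one-compartment drug kinetics
    with a log-kill effect (all parameter values are invented)."""
    v, drug = 1.0, 0.0                  # tumor volume, drug amount
    k_growth, k_cap = 0.08, 50.0        # growth rate, carrying capacity
    k_el, k_kill = 0.5, 0.04            # drug elimination and potency
    t = 0.0
    while t < days:
        if any(abs(t - d) < dt / 2 for d in schedule):
            drug += 1.0                 # bolus dose on scheduled days
        dv = k_growth * v * math.log(k_cap / v) - k_kill * drug * v
        v = max(v + dv * dt, 1e-6)      # Euler step, floored above zero
        drug -= k_el * drug * dt        # first-order drug elimination
        t += dt
    return v

dense = list(range(0, 84, 7))           # weekly protocol hypothesis
standard = list(range(0, 84, 21))       # every-3-weeks protocol hypothesis
```

Running both schedules through the same learnt growth/pharmacokinetics pair is exactly the kind of what-if comparison the system uses to rank intervention sequences.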


2021 ◽  
Author(s):  
Ranjan Anantharaman ◽  
Anas Abdelrehim ◽  
Anand Jain ◽  
Avik Pal ◽  
Danny Sharp ◽  
...  

Abstract
Quantitative systems pharmacology (QSP) may need to change in order to accommodate machine learning (ML), but ML may need to change to work for QSP. Here we investigate the use of neural network surrogates of stiff QSP models. This technique reduces and accelerates QSP models by training ML approximations on simulations. We describe how common neural network methodologies, such as residual neural networks, recurrent neural networks, and physics/biologically-informed neural networks, are fundamentally related to explicit solvers of ordinary differential equations (ODEs). Just as explicit ODE solvers are unstable on stiff QSP models, we demonstrate that these ML architectures exhibit similar training instabilities. To address this issue, we showcase methods from scientific machine learning (SciML) that combine techniques from mechanistic modeling with traditional deep learning. We describe the continuous-time echo state network (CTESN) as the implicit analogue of these ML architectures and showcase its ability to accurately train and predict on stiff models where other methods fail. We demonstrate the CTESN's ability to surrogatize a production QSP model, a >1,000-ODE chemical reaction system from the SBML BioModels repository, and a reaction-diffusion partial differential equation. We showcase the ability to accelerate QSP simulations by up to 56x against the optimized DifferentialEquations.jl solvers while achieving <5% relative error in all of the examples. This shows how incorporating the numerical properties of QSP methods into ML can improve the intersection, and thus presents a potential method for accelerating repeated calculations such as global sensitivity analysis and virtual populations.
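
The core numerical point, that explicit time stepping (and, by analogy, explicit-style ML architectures) blows up on stiff systems while implicit stepping stays stable, can be reproduced on the scalar test equation y' = λy. This is the standard textbook illustration, not code from the paper:

```python
def explicit_euler(lam, y0, h, steps):
    """Forward Euler: y_{n+1} = y_n + h*lam*y_n.
    Unstable whenever |1 + h*lam| > 1."""
    y = y0
    for _ in range(steps):
        y = y + h * lam * y
    return y

def implicit_euler(lam, y0, h, steps):
    """Backward Euler: y_{n+1} = y_n / (1 - h*lam). Unconditionally stable
    for lam < 0, mirroring the implicit construction behind the CTESN."""
    y = y0
    for _ in range(steps):
        y = y / (1.0 - h * lam)
    return y

lam, y0, h, steps = -50.0, 1.0, 0.1, 100  # stiff: stability needs h < 2/50
y_exp = explicit_euler(lam, y0, h, steps)  # diverges wildly
y_imp = implicit_euler(lam, y0, h, steps)  # decays to 0, like the true solution
```

With h ten times larger than the explicit stability limit, forward Euler amplifies by a factor of 4 per step while backward Euler damps by a factor of 6, which is the instability/stability gap the paper maps onto explicit-style versus implicit-style ML architectures.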


Mathematics ◽  
2020 ◽  
Vol 8 (5) ◽  
pp. 770
Author(s):  
Matteo Rucco ◽  
Giovanna Viticchi ◽  
Lorenzo Falsetti

Glioblastoma multiforme (GBM) is a fast-growing and highly invasive brain tumor that tends to occur in adults between the ages of 45 and 70 and accounts for 52 percent of all primary brain tumors. GBMs are usually detected by magnetic resonance imaging (MRI); among MRI sequences, fluid-attenuated inversion recovery (FLAIR) produces a high-quality digital representation of the tumor. Fast computer-aided detection and segmentation techniques are needed to overcome the subjectivity of medical doctors' (MDs) judgment. This study presents three main novelties demonstrating the role of topological features as a new set of radiomics features that can serve as pillars of a personalized diagnostic system for GBM analysis from FLAIR. For the first time, topological data analysis is used to analyze GBM from three complementary perspectives: tumor growth at the cell level, temporal evolution of GBM in the follow-up period, and GBM detection. The second novelty is the definition of a new Shannon-like topological entropy, the so-called Generator Entropy. The third novelty is the combination of topological and textural features for training automatic, interpretable machine learning. These novelties are demonstrated through three numerical experiments. Topological data analysis of a simplified 2D tumor growth mathematical model allowed us to understand the biochemical conditions that facilitate tumor growth: the higher the concentration of chemical nutrients, the more virulent the process. Topological data analysis was then used to evaluate GBM temporal progression on FLAIR recorded within 90 days following treatment completion and at progression. This experiment confirmed that persistent entropy is a viable statistic for monitoring GBM evolution during the follow-up period. In the third experiment, we developed a novel methodology based on topological and textural features and automatic, interpretable machine learning for GBM classification on FLAIR. The algorithm reached a classification accuracy of up to 97%.
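
The persistent entropy used for follow-up monitoring is, in essence, a Shannon entropy over the lifetimes of a persistence barcode. The sketch below shows that standard definition on toy barcodes (it is not the paper's novel Generator Entropy):

```python
import math

def persistent_entropy(barcode):
    """Shannon entropy of a persistence barcode: each (birth, death)
    interval contributes with probability proportional to its lifetime."""
    lifetimes = [death - birth for birth, death in barcode]
    total = sum(lifetimes)
    probs = [l / total for l in lifetimes if l > 0]
    return -sum(p * math.log(p) for p in probs)

# Toy barcodes: one dominant topological feature vs. four equal noisy ones
ordered = [(0.0, 5.0), (0.0, 0.2), (0.1, 0.3)]
noisy = [(0.0, 1.0), (0.2, 1.2), (0.4, 1.4), (0.6, 1.6)]
```

A barcode dominated by one long-lived feature yields low entropy, while evenly spread lifetimes drive the entropy to its maximum (log of the number of intervals), which is why the statistic tracks how "disordered" the imaged tissue becomes over the follow-up period.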


2020 ◽  
Author(s):  
Giuseppe Sciumè

Abstract
Existing continuum multiphase tumor growth models typically do not include microvasculature, or, if present, it is modeled as non-deformable. Vasculature behavior and blood flow are usually not coupled mechanically with the underlying tumor phenomenology; hence, phenomena such as vessel compression/occlusion, which modify microcirculation and oxygen supply, cannot be taken into account. The tumor tissue is here modeled as a reactive bi-compartment porous medium: the extracellular matrix constitutes the solid scaffold; blood occupies the vascular porosity, whereas the extra-vascular porous compartment is saturated by two cell phases and interstitial fluid (a mixture of water and nutrient species). The pressure difference between blood and the overall extra-vascular pressure is sustained by the vessel walls and drives shrinkage or dilatation of the vascular porosity. Model closure is achieved thanks to a consistent, non-conventional definition of the Biot effective stress tensor. Angiogenesis is modeled by introducing a vascularization state variable and accounting for tumor angiogenic factors and endothelial cells. Closure relationships and mass exchange terms related to vessel formation are detailed in a numerical example reproducing the principal features of angiogenesis. This example is preceded by a first, pedagogical numerical study of one-dimensional bio-consolidation. The results clearly show that the bi-compartment poromechanical model is fully coupled (external loads impact fluid flow in both porous compartments) and suggest further applications, for instance the modeling of drug delivery and tissue ulceration.
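
The pedagogical one-dimensional bio-consolidation study has a classic Terzaghi-type analogue: diffusion of excess pore pressure out of a drained surface, sketched below with explicit finite differences. The single-porosity setup and all parameters are illustrative, far simpler than the paper's bi-compartment formulation:

```python
def consolidate(n=20, c_v=1.0, dz=0.05, dt=0.001, steps=500):
    """Explicit finite differences for 1-D consolidation,
    dp/dt = c_v * d2p/dz2, with a drained (p = 0) top surface and an
    impermeable base; stable because c_v*dt/dz**2 = 0.4 <= 0.5."""
    p = [1.0] * n                 # uniform initial excess pore pressure
    for _ in range(steps):
        p_new = p[:]
        p_new[0] = 0.0            # drained boundary at the surface
        for i in range(1, n - 1):
            p_new[i] = p[i] + c_v * dt / dz**2 * (p[i+1] - 2*p[i] + p[i-1])
        p_new[-1] = p_new[-2]     # no-flow base (zero pressure gradient)
        p = p_new
    return p

p_final = consolidate()           # pressure profile after dissipation
```

The excess pressure drains fastest near the surface and lingers at depth, the qualitative behavior the full poromechanical model reproduces before the vascular compartment is added.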


2017 ◽  
Vol 13 (12) ◽  
pp. e1005874 ◽  
Author(s):  
Thomas D. Gaddy ◽  
Qianhui Wu ◽  
Alyssa D. Arnheim ◽  
Stacey D. Finley


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Victor Antontsev ◽  
Aditya Jagarapu ◽  
Yogesh Bundey ◽  
Hypatia Hou ◽  
Maksim Khotimchenko ◽  
...  

Abstract
Prediction of first-in-human dosing regimens is a critical step in drug development and requires accurate quantitation of drug distribution. Traditional in vivo studies used to characterize a clinical candidate's volume of distribution are error-prone, time- and cost-intensive, and lack reproducibility in clinical settings. This paper demonstrates how a computational platform integrating machine learning optimization with mechanistic modeling can be used to simulate a compound's plasma concentration profile and predict tissue-plasma partition coefficients with high accuracy by varying the lipophilicity descriptor logP. The approach, applied to chemically diverse small molecules, resulted in comparable geometric mean fold-errors of 1.50 and 1.63 in pharmacokinetic outputs for direct tissue:plasma partition and hybrid logP optimization, respectively, with the latter enabling prediction of tissue permeation that can be used to guide toxicity and efficacy dosing in human subjects. The optimization simulations required to achieve these results were parallelized on the AWS cloud and generated outputs in under 5 h. The accuracy, speed, and scalability of the framework indicate that it can be used to assess the relevance of other mechanistic relationships implicated in pharmacokinetic-pharmacodynamic phenomena with a lower risk of overfitting datasets, and to generate a large database of physiologically relevant drug disposition profiles for further integration with machine learning models.
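
The logP-optimization loop can be caricatured with a one-compartment model in which a hypothetical partition relation, Kp = 10**(0.3*logP), inflates the volume of distribution; a grid search then recovers a known logP from synthetic data. All equations and parameter values are invented for illustration and are not the platform's actual PBPK structure:

```python
def plasma_profile(logP, dose=100.0, cl=5.0, v_plasma=5.0, t_end=10.0, dt=0.01):
    """One-compartment sketch: an assumed tissue:plasma partition that grows
    with lipophilicity enlarges the effective volume of distribution and
    slows elimination."""
    kp = 10.0 ** (0.3 * logP)           # hypothetical partition coefficient
    v_d = v_plasma + 30.0 * kp          # plasma + partitioned tissue volume
    k_el = cl / v_d
    c, profile = dose / v_d, []
    for _ in range(int(t_end / dt)):
        profile.append(c)
        c -= k_el * c * dt              # first-order elimination (Euler step)
    return profile

def fit_logP(observed):
    """Grid-search optimization of the lipophilicity descriptor so the
    simulated profile matches observed plasma concentrations."""
    grid = [i / 20 for i in range(81)]  # logP candidates in [0, 4]
    def sse(lp):
        return sum((s - o) ** 2 for s, o in zip(plasma_profile(lp), observed))
    return min(grid, key=sse)

true_logP = 2.0
observed = plasma_profile(true_logP)    # synthetic "clinical" data
```

Because each simulation is independent, the candidate evaluations inside `fit_logP` are trivially parallelizable, which is the property the platform exploits on the cloud.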


2020 ◽  
Author(s):  
Cemal Erdem ◽  
Ethan M. Bensman ◽  
Arnab Mutsuddy ◽  
Michael M. Saint-Antoine ◽  
Mehdi Bouhaddou ◽  
...  

Abstract
The current era of big biomedical data accumulation and availability brings opportunities for integrating data in its totality to make new discoveries and/or build clinically predictive models. Black-box statistical and machine learning methods are powerful for such integration, but often cannot provide mechanistic reasoning, particularly at the single-cell level. While single-cell mechanistic models clearly enable such reasoning, they are predominantly "small-scale" and struggle with the scalability and reusability required for meaningful data integration. Here, we present an open-source pipeline for scalable, single-cell mechanistic modeling from simple, annotated input files that can serve as a foundation for mechanistic data integration. As a test case, we convert one of the largest existing single-cell mechanistic models to this format, demonstrating the robustness and reproducibility of the approach. We show that the model's cell line context can be changed by simple replacement of input file parameter values. We next use this new model to test alternative mechanistic hypotheses for the experimental observation that interferon-gamma (IFNG) inhibits epidermal growth factor (EGF)-induced cell proliferation. Model-based analysis suggested, and experiments support, that these observations are better explained by IFNG-induced SOCS1 expression sequestering activated EGF receptors, thereby downregulating AKT activity, rather than by direct IFNG-induced upregulation of p21 expression. Overall, this new pipeline enables large-scale, single-cell, and mechanistically transparent modeling as a data integration modality complementary to machine learning.
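
The idea of changing model context by swapping input-file parameter values can be illustrated with a toy pipeline that builds a model from a tab-separated parameter table. The file format, parameter names, and two-step cascade below are all invented for illustration; the real pipeline's annotated input files are far richer:

```python
import csv, io

# Two hypothetical "cell line" parameter files differing in one value
PARAMS_CONTEXT_A = "param\tvalue\nk_synth\t1.0\nk_deg\t0.1\nk_act\t0.5\n"
PARAMS_CONTEXT_B = PARAMS_CONTEXT_A.replace("k_act\t0.5", "k_act\t0.05")

def load_params(text):
    """Build the parameter set from an annotated, tab-separated input file
    (a toy stand-in for the pipeline's structured model inputs)."""
    reader = csv.DictReader(io.StringIO(text), delimiter="\t")
    return {row["param"]: float(row["value"]) for row in reader}

def steady_state_activity(p):
    """Tiny two-step cascade: protein synthesized at k_synth, degraded at
    k_deg, activated at k_act; returns the steady-state active amount."""
    total = p["k_synth"] / p["k_deg"]            # steady-state protein level
    return total * p["k_act"] / (p["k_act"] + p["k_deg"])

baseline = steady_state_activity(load_params(PARAMS_CONTEXT_A))
recontextualized = steady_state_activity(load_params(PARAMS_CONTEXT_B))
```

The model code never changes; only the input table does, which is the separation of mechanism from context that makes the pipeline reusable across cell lines.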

