Approximate Bayesian computation reveals the importance of repeated measurements for parameterising cell-based models of growing tissues

2017 ◽  
Author(s):  
Jochen Kursawe ◽  
Ruth E. Baker ◽  
Alexander G. Fletcher

Abstract
The growth and dynamics of epithelial tissues govern many morphogenetic processes in embryonic development. A recent quantitative transition in data acquisition, facilitated by advances in genetic and live-imaging techniques, is paving the way for new insights into these processes. Computational models can help us understand and interpret observations, and then make predictions for future experiments that can distinguish between hypothesised mechanisms. Increasingly, cell-based modelling approaches such as vertex models are being used to help understand the mechanics underlying epithelial morphogenesis. These models typically seek to reproduce qualitative phenomena, such as cell sorting or tissue buckling. However, it remains unclear to what extent quantitative data can be used to constrain these models so that they can then be used to make quantitative, experimentally testable predictions. To address this issue, we perform an in silico study to investigate whether vertex model parameters can be inferred from imaging data, and explore methods to quantify the uncertainty of such estimates. Our approach requires the use of summary statistics to estimate parameters. Here, we focus on summary statistics of cellular packing and of laser ablation experiments, as are commonly reported from imaging studies. We find that including data from repeated experiments is necessary to generate reliable parameter estimates that can facilitate quantitative model predictions.

2020 ◽  
Vol 17 (173) ◽  
pp. 20200886
Author(s):  
L. Mihaela Paun ◽  
Mitchel J. Colebank ◽  
Mette S. Olufsen ◽  
Nicholas A. Hill ◽  
Dirk Husmeier

This study uses Bayesian inference to quantify the uncertainty of model parameters and haemodynamic predictions in a one-dimensional pulmonary circulation model based on an integration of mouse haemodynamic and micro-computed tomography imaging data. We emphasize an often neglected, though important, source of uncertainty: in the mathematical model form, due to the discrepancy between the model and reality, and in the measurements, due to an incorrect noise model (jointly called ‘model mismatch’). We demonstrate that minimizing the mean squared error between the measured and the predicted data (the conventional method) in the presence of model mismatch leads to biased and overly confident parameter estimates and haemodynamic predictions. We show that our proposed method, which represents model mismatch with Gaussian processes, corrects this bias. Additionally, we compare a linear and a nonlinear wall model, as well as models with different vessel stiffness relations. We use formal model selection analysis based on the Watanabe–Akaike information criterion to select the model that best predicts the pulmonary haemodynamics. Results show that the nonlinear pressure–area relationship with stiffness dependent on the unstressed radius best predicts the data measured in a control mouse.
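As a hedged, generic illustration of the idea of representing model mismatch with a Gaussian process (a minimal sketch, not the authors' implementation; the toy data, RBF kernel, and hyperparameter values below are invented for illustration), one can fit a GP to the residuals between measurements and a deliberately misspecified model prediction and use its posterior mean as an additive discrepancy term:

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(x1, x2, amplitude=1.0, length_scale=0.2):
    # Squared-exponential covariance between two 1-D input arrays.
    d = x1[:, None] - x2[None, :]
    return amplitude**2 * np.exp(-0.5 * (d / length_scale) ** 2)

# Toy setting: "reality" is a sine wave, the fitted mechanistic
# model is deliberately misspecified (a flat line).
t = np.linspace(0.0, 1.0, 30)
truth = np.sin(2 * np.pi * t)
measured = truth + rng.normal(0.0, 0.05, t.size)
model_prediction = 0.0 * t

# GP regression on the residuals represents the model mismatch.
residuals = measured - model_prediction
K = rbf_kernel(t, t) + 0.05**2 * np.eye(t.size)   # kernel + noise variance
alpha = np.linalg.solve(K, residuals)
discrepancy_mean = rbf_kernel(t, t) @ alpha        # GP posterior mean at t

corrected = model_prediction + discrepancy_mean
print(np.abs(model_prediction - truth).mean(), np.abs(corrected - truth).mean())
```

The corrected prediction tracks the data far more closely than the misspecified model alone, which is the mechanism by which a discrepancy term can absorb bias that would otherwise contaminate the parameter estimates.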


2019 ◽  
Vol 36 (2) ◽  
pp. 586-593
Author(s):  
Boseung Choi ◽  
Yu-Yu Cheng ◽  
Selahattin Cinar ◽  
William Ott ◽  
Matthew R Bennett ◽  
...  

Abstract
Motivation
Advances in experimental and imaging techniques have allowed for unprecedented insights into the dynamical processes within individual cells. However, many facets of intracellular dynamics remain hidden, or can be measured only indirectly. This makes it challenging to reconstruct the regulatory networks that govern the biochemical processes underlying various cell functions. Current estimation techniques for inferring reaction rates frequently rely on marginalization over unobserved processes and states. Even in simple systems this approach can be computationally challenging, and can lead to large uncertainties and lack of robustness in parameter estimates. Therefore we will require alternative approaches to efficiently uncover the interactions in complex biochemical networks.
Results
We propose a Bayesian inference framework based on replacing uninteresting or unobserved reactions with time delays. Although the resulting models are non-Markovian, recent results on stochastic systems with random delays allow us to rigorously obtain expressions for the likelihoods of model parameters. In turn, this allows us to extend MCMC methods to efficiently estimate reaction rates, and delay distribution parameters, from single-cell assays. We illustrate the advantages, and potential pitfalls, of the approach using a birth–death model with both synthetic and experimental data, and show that we can robustly infer model parameters using a relatively small number of measurements. We demonstrate how to do so even when only the relative molecule count within the cell is measured, as in the case of fluorescence microscopy.
Availability and implementation
Accompanying code in R is available at https://github.com/cbskust/DDE_BD.
Supplementary information
Supplementary data are available at Bioinformatics online.
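The MCMC idea can be sketched in miniature (this is not the authors' delay framework or their R code; it is a plain Metropolis sampler for the birth rate of a simple birth–death process observed at stationarity, where the count distribution is Poisson with mean b/d — the rates, prior, and step size below are invented):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "single-cell" molecule counts: the stationary law of a
# birth-death process with birth rate b and per-molecule decay rate d
# is Poisson(b / d).
true_b, d = 20.0, 1.0
counts = rng.poisson(true_b / d, size=50)

def log_likelihood(b):
    if b <= 0:
        return -np.inf
    lam = b / d
    # Poisson log-likelihood up to an additive constant (log-factorials).
    return float(np.sum(counts * np.log(lam) - lam))

def metropolis(n_steps=5000, step=1.0):
    b = 10.0                                 # deliberately poor initial guess
    ll = log_likelihood(b)
    samples = []
    for _ in range(n_steps):
        prop = b + rng.normal(0.0, step)     # symmetric random-walk proposal
        ll_prop = log_likelihood(prop)
        if np.log(rng.uniform()) < ll_prop - ll:   # flat prior on b > 0
            b, ll = prop, ll_prop
        samples.append(b)
    return np.array(samples[1000:])          # discard burn-in

post = metropolis()
print(post.mean())  # posterior mean of the birth rate, near 20
```

The posterior concentrates near the true birth rate; the paper's contribution is to make this same machinery tractable when unobserved reactions are collapsed into random delays.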


2020 ◽  
Author(s):  
Q. Feltgen ◽  
J. Daunizeau

Abstract
Drift-diffusion models (DDMs) are becoming a standard in the field of computational neuroscience. They extend models from signal detection theory by proposing a simple mechanistic explanation for the observed relationship between decision outcomes and reaction times (RT). In brief, they assume that decisions are triggered once the accumulated evidence in favor of a particular alternative option has reached a predefined threshold. Fitting a DDM to empirical data then allows one to interpret observed group or condition differences in terms of a change in the underlying model parameters. However, current approaches do not provide reliable parameter estimates when, e.g., evidence strength is varied over trials. In this note, we propose a fast and efficient approach based on fitting a self-consistency equation that the DDM fulfills. Using numerical simulations, we show that this approach enables one to extract relevant information from trial-by-trial variations of RT data that would typically be buried in the empirical distribution. Finally, we demonstrate the added value of the approach when applied to a recent value-based decision-making experiment.
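The accumulation-to-threshold mechanism can be sketched as a toy Euler–Maruyama simulation (the drift, threshold, and noise values below are invented for illustration; this is not the authors' self-consistency fitting method):

```python
import random

random.seed(0)

def simulate_ddm_trial(drift, threshold=1.0, noise=1.0, dt=1e-3, max_t=10.0):
    """Accumulate noisy evidence until it crosses +threshold (option A)
    or -threshold (option B); return (choice, reaction_time)."""
    x, t = 0.0, 0.0
    sd = noise * dt ** 0.5          # per-step noise for the Euler scheme
    while t < max_t:
        x += drift * dt + random.gauss(0.0, sd)
        t += dt
        if x >= threshold:
            return "A", t
        if x <= -threshold:
            return "B", t
    return None, max_t              # no decision within the time limit

trials = [simulate_ddm_trial(drift=1.5) for _ in range(500)]
p_a = sum(1 for c, _ in trials if c == "A") / len(trials)
mean_rt = sum(rt for _, rt in trials) / len(trials)
print(p_a, mean_rt)
```

With a positive drift, the upper threshold is hit on most trials, and both the choice probabilities and the RT distribution emerge from the same two or three parameters — which is what makes the fitted parameters interpretable across groups or conditions.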


2018 ◽  
Vol 15 (149) ◽  
pp. 20180600 ◽  
Author(s):  
Sabrina Hross ◽  
Fabian J. Theis ◽  
Michael Sixt ◽  
Jan Hasenauer

Spatial patterns are ubiquitous on the subcellular, cellular and tissue level, and can be studied using imaging techniques such as light and fluorescence microscopy. Imaging data provide quantitative information about biological systems; however, mechanisms causing spatial patterning often remain elusive. In recent years, spatio-temporal mathematical modelling has helped to overcome this problem. Yet, outliers and structured noise limit modelling of whole imaging data, and models often consider spatial summary statistics. Here, we introduce an integrated data-driven modelling approach that can cope with measurement artefacts and whole imaging data. Our approach combines mechanistic models of the biological processes with robust statistical models of the measurement process. The parameters of the integrated model are calibrated using a maximum-likelihood approach. We used this integrated modelling approach to study in vivo gradients of the chemokine (C-C motif) ligand 21 (CCL21). CCL21 gradients guide dendritic cells and are important in the adaptive immune response. Using artificial data, we verified that the integrated modelling approach provides reliable parameter estimates in the presence of measurement noise and that bias and variance of these estimates are reduced compared to conventional approaches. The application to experimental data allowed the parametrization and subsequent refinement of the model using additional mechanisms. Among other results, model-based hypothesis testing predicted lymphatic vessel-dependent concentration of heparan sulfate, the binding partner of CCL21. The selected model provided an accurate description of the experimental data and was partially validated using published data. Our findings demonstrate that integrated statistical modelling of whole imaging data is computationally feasible and can provide novel biological insights.


2020 ◽  
Vol 5 ◽  
Author(s):  
Nikolai Bode

Simulation models for pedestrian crowds are a ubiquitous tool in research and industry. It is crucial that the parameters of these models are calibrated carefully and ultimately it will be of interest to compare competing models to decide which model is best suited for a particular purpose. In this contribution, I demonstrate how Approximate Bayesian Computation (ABC), which is already a popular tool in other areas of science, can be used for model fitting and model selection in a pedestrian dynamics context. I fit two different models for pedestrian dynamics to data on a crowd passing in one direction through a bottleneck. One model describes movement in continuous space; the other is a cellular automaton and thus describes movement in discrete space. In addition, I compare models to data using two metrics. The first is based on egress times and the second on the velocity of pedestrians in front of the bottleneck. My results show that while model fitting is successful, a substantial degree of uncertainty about the value of some model parameters remains after model fitting. Importantly, the choice of metric in model fitting can influence parameter estimates. Model selection is inconclusive for the egress time metric but supports the continuous-space model for the velocity-based metric. These findings show that ABC is a flexible approach and highlight the difficulties associated with model fitting and model selection for pedestrian dynamics. ABC requires many simulation runs, and choosing appropriate metrics for comparing data to simulations requires careful attention. Despite this, I suggest ABC is a promising tool, because it is versatile and easily implemented for the growing number of openly available crowd simulators and data sets.



Author(s):  
Anubhav Gupta ◽  
Owen Petchey

1) Food web models explain and predict the trophic interactions in a food web, and they can infer missing interactions among the organisms. The allometric diet breadth model (ADBM) is a food web model based on foraging theory. In the ADBM the foraging parameters are allometrically scaled to the body sizes of predators and prey. In Petchey et al. (2008), the parameterisation of the ADBM had two limitations: (a) the model parameters were point estimates, and (b) food web connectance was not estimated. 2) The novelty of our current approach is: (a) we consider multiple predictions from the ADBM by parameterising it with approximate Bayesian computation, to estimate parameter distributions rather than point estimates; (b) connectance emerges from the parameterisation, by measuring model fit using the true skill statistic, which takes into account prediction of both the presences and absences of links. 3) We fit the ADBM using approximate Bayesian computation to 16 observed food webs from a wide variety of ecosystems. Connectance was consistently overestimated in the new parameterisation method. In some of the food webs, considerable variation in estimated parameter distributions occurred, and resulted in considerable variation (i.e. uncertainty) in predicted food web structure. 4) We conclude that the observed food web data are likely missing some trophic links that do actually occur, and that the ADBM likely predicts some links that do not exist. The latter could be addressed by accounting in the ADBM for additional traits other than body size. Further work could also address the significance of uncertainty in parameter estimates for predicted food web responses to environmental change.
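The true skill statistic used here to measure model fit can be sketched directly from the link classifications (a minimal illustration; the observed and predicted link vectors below are invented):

```python
def true_skill_statistic(observed, predicted):
    """TSS = sensitivity + specificity - 1, computed over all potential
    predator-prey links; observed/predicted are flat lists of 0/1."""
    tp = sum(1 for o, p in zip(observed, predicted) if o == 1 and p == 1)
    fn = sum(1 for o, p in zip(observed, predicted) if o == 1 and p == 0)
    fp = sum(1 for o, p in zip(observed, predicted) if o == 0 and p == 1)
    tn = sum(1 for o, p in zip(observed, predicted) if o == 0 and p == 0)
    sensitivity = tp / (tp + fn)   # fraction of real links predicted
    specificity = tn / (tn + fp)   # fraction of absent links predicted absent
    return sensitivity + specificity - 1

obs  = [1, 1, 0, 0, 1, 0, 0, 0]
pred = [1, 0, 0, 0, 1, 1, 0, 0]
print(true_skill_statistic(obs, pred))  # 2/3 + 4/5 - 1
```

Because TSS rewards correctly predicted absences as well as presences, maximising it trades off link density against accuracy, which is how connectance can emerge from the fit rather than being imposed.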


2017 ◽  
Vol 75 (6) ◽  
pp. 1370-1389 ◽  
Author(s):  
Jamal Alikhani ◽  
Imre Takacs ◽  
Ahmed Al-Omari ◽  
Sudhir Murthy ◽  
Arash Massoudieh

A parameter estimation framework was used to evaluate the ability of observed data from a full-scale nitrification–denitrification bioreactor to reduce the uncertainty associated with the bio-kinetic and stoichiometric parameters of an activated sludge model (ASM). Samples collected over a period of 150 days from the effluent as well as from the reactor tanks were used. A hybrid genetic algorithm and Bayesian inference were used to perform deterministic and probabilistic parameter estimation, respectively. The main goal was to assess the ability of the data to provide reliable parameter estimates for a modified version of the ASM. The modified ASM model includes methylotrophic processes, which play the main role in methanol-fed denitrification. Sensitivity analysis was also used to explain the ability of the data to provide information about each of the parameters. The results showed that the uncertainty in the estimates of the most sensitive parameters (including growth rate, decay rate, and yield coefficients) decreased relative to the prior information.


2013 ◽  
Vol 2013 ◽  
pp. 1-10 ◽  
Author(s):  
Tom Burr ◽  
Alexei Skurikhin

Approximate Bayesian computation (ABC) is an approach for using measurement data to calibrate stochastic computer models, which are common in biology applications. ABC is becoming the “go-to” option when the data and/or parameter dimension is large because it relies on user-chosen summary statistics rather than the full data and is therefore computationally feasible. One technical challenge with ABC is that the quality of the approximation to the posterior distribution of model parameters depends on the user-chosen summary statistics. In this paper, the user requirement to choose effective summary statistics in order to accurately estimate the posterior distribution of model parameters is investigated and illustrated by example, using a model and corresponding real data of mitochondrial DNA population dynamics. We show that for some choices of summary statistics, the posterior distribution of model parameters is closely approximated and for other choices of summary statistics, the posterior distribution is not closely approximated. A strategy to choose effective summary statistics is suggested in cases where the stochastic computer model can be run at many trial parameter settings, as in the example.
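The rejection-ABC mechanics described here can be sketched in a few lines (a toy example with an invented exponential model, uniform prior, and tolerance, not the mitochondrial DNA model of the paper): draw parameters from the prior, simulate data, reduce them to a user-chosen summary statistic, and accept parameters whose simulated summary lies within a tolerance of the observed one.

```python
import random

random.seed(1)

def simulate(theta, n=200):
    # Toy stochastic model: n draws from an exponential with rate theta.
    return [random.expovariate(theta) for _ in range(n)]

def summary(data):
    # User-chosen summary statistic: the sample mean.
    return sum(data) / len(data)

def abc_rejection(observed_summary, prior_draw, n_samples=500, tol=0.1):
    accepted = []
    while len(accepted) < n_samples:
        theta = prior_draw()                 # draw from the prior
        s = summary(simulate(theta))         # simulate and summarise
        if abs(s - observed_summary) <= tol: # keep if close to the data
            accepted.append(theta)
    return accepted

# "Observed" data generated with a known rate, so the posterior can be checked.
true_theta = 2.0
obs = summary(simulate(true_theta))
posterior = abc_rejection(obs, prior_draw=lambda: random.uniform(0.1, 5.0))
print(sum(posterior) / len(posterior))  # posterior mean, near 2.0
```

The sensitivity the paper investigates is visible even here: replacing `summary` with a statistic that is uninformative about the rate (e.g. the sample size) would accept essentially every prior draw, and the "posterior" would collapse back to the prior.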
