Bayesian Inference and RJMCMC in Structural Dynamics: On Experimental Data

Author(s):  
D. Tiboaca, P. L. Green, R. J. Barthorpe, I. Antoniadou, K. Worden
2020, Vol 60, pp. 103025
Author(s):  
Chiara Pepi, Massimiliano Gioffrè, Mircea Grigoriu

2019
Author(s):  
C. Vaghi, A. Rodallec, R. Fanciullino, J. Ciccolini, J. Mochel, ...

Abstract
Tumor growth curves are classically modeled by ordinary differential equations. In analyzing the Gompertz model, several studies have reported a striking correlation between the two parameters of the model. We analyzed tumor growth kinetics within the statistical framework of nonlinear mixed-effects modeling (the population approach), which allows the simultaneous modeling of tumor dynamics and inter-animal variability. Experimental data comprised three animal models of breast and lung cancers, with 843 measurements in 94 animals. Candidate models of tumor growth included the Exponential, Logistic and Gompertz models. The Exponential and, more notably, the Logistic models failed to describe the experimental data, whereas the Gompertz model generated very good fits. The population-level correlation between the Gompertz parameters was further confirmed in our analysis (R² > 0.96 in all groups). Combining this structural correlation with rigorous population parameter estimation, we propose a novel reduced Gompertz function with a single individual parameter. Leveraging the population approach using Bayesian inference, we estimated the time of tumor initiation from three late measurement timepoints. The reduced Gompertz model exhibited the best results, with drastic improvements when using Bayesian inference compared with likelihood maximization alone, in both accuracy and precision. Specifically, for the breast cancer cell line, mean accuracy was 12.1% versus 74.1% and mean precision was 15.2 days versus 186 days. These results offer promising clinical perspectives for the personalized prediction of tumor age from limited data at diagnosis. In turn, such predictions could help assess the extent of invisible metastases at the time of diagnosis.

Author summary
Mathematical models of tumor growth kinetics have been widely used for several decades, but mostly fitted to individual or average growth curves. Here we compared three classical models (Exponential, Logistic and Gompertz) using a population approach, which accounts for inter-animal variability. The Exponential and Logistic models failed to fit the experimental data, while the Gompertz model showed excellent descriptive power. Moreover, the strong correlation between the two parameters of the Gompertz equation motivated a simplification of the model, the reduced Gompertz model, with a single individual parameter and equal descriptive power. Combining the mixed-effects approach with Bayesian inference, we predicted the age of individual tumors from only a few late measurements. Thanks to its simplicity, the reduced Gompertz model showed superior predictive power. Although our method remains to be extended to clinical data, these results are promising for the personalized estimation of the age of a tumor from limited measurements at diagnosis. Such predictions could contribute to the development of computational models for metastasis.
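The model reduction described in the abstract can be made concrete. Below is a minimal Python sketch: the standard Gompertz solution V(t) = V0·exp((α/β)(1 − e^(−βt))), with the population-level correlation α ≈ k·β used to eliminate one individual parameter, and the fitted curve then inverted to estimate the time of tumor initiation. All numerical values (k, the cell volume V_init, the measurement times) are illustrative assumptions rather than the paper's estimates, and plain least squares stands in here for the full mixed-effects/Bayesian machinery.

```python
import numpy as np
from scipy.optimize import curve_fit

# Gompertz growth from initial volume V0 (fixed to 1 mm^3 at t = 0):
#   V(t) = V0 * exp((alpha/beta) * (1 - exp(-beta * t)))
def gompertz(t, alpha, beta, V0=1.0):
    return V0 * np.exp((alpha / beta) * (1.0 - np.exp(-beta * t)))

# Three hypothetical "late" measurements for one animal (days, mm^3)
t_obs = np.array([15.0, 18.0, 21.0])
V_obs = gompertz(t_obs, alpha=0.8, beta=0.07)   # noise-free for clarity

# Reduced Gompertz: exploit the population-level correlation alpha ~ k * beta
# (k assumed known from a prior population fit), so only beta is individual.
k = 0.8 / 0.07
(beta_hat,), _ = curve_fit(lambda t, b: gompertz(t, k * b, b),
                           t_obs, V_obs, p0=[0.05])

# Time of tumor initiation: solve V(t) = V_init for the fitted curve
V_init = 1e-6   # order-of-magnitude volume of one cell, mm^3 (assumption)
t_init = -np.log(1.0 - np.log(V_init) / k) / beta_hat
print(beta_hat, t_init)   # t_init < 0: initiation precedes the reference time
```

With these placeholder numbers the estimated initiation time lands roughly 11 days before the reference time; the point of the reduction is that a single fitted parameter suffices once k is pinned down at the population level.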


2020
Author(s):  
Colin D. Kinz-Thompson, Korak Kumar Ray, Ruben L. Gonzalez

Abstract
Biophysics experiments performed at single-molecule resolution contain exceptional insight into the structural details and dynamic behavior of biological systems. However, extracting this information from the corresponding experimental data unequivocally requires applying a biophysical model. Here, we discuss how to use probability theory to apply these models to single-molecule data. Many current single-molecule data analysis methods apply parts of probability theory, sometimes unknowingly, and thus miss out on the full set of benefits provided by this self-consistent framework. The full application of probability theory involves a process called Bayesian inference, which fully accounts for the uncertainties inherent to single-molecule experiments. Additionally, Bayesian inference provides a scientifically rigorous way to incorporate information from multiple experiments into a single analysis and to find the best biophysical model for an experiment without the risk of overfitting the data. These benefits make the Bayesian approach ideal for analyzing any type of single-molecule experiment.
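As a concrete illustration of the workflow the abstract describes, the following Python snippet computes a grid-based posterior and model evidence for hypothetical single-molecule FRET efficiency data. The data, the one-state Gaussian emission model, and the fixed-mean null model are all invented for illustration; the point is that the marginal likelihood ("evidence") lets one compare models without overfitting, since extra parameter freedom is automatically penalized.

```python
import numpy as np
from scipy.stats import norm
from scipy.special import logsumexp

rng = np.random.default_rng(0)
# Hypothetical single-molecule FRET efficiencies from one state (invented)
data = rng.normal(0.6, 0.1, size=200)

# Model M1: one state with unknown mean mu (known width 0.10),
# uniform prior over [0, 1], evaluated on a grid
mu = np.linspace(0.0, 1.0, 501)
dmu = mu[1] - mu[0]
log_like = norm.logpdf(data[:, None], loc=mu[None, :], scale=0.1).sum(axis=0)

# Posterior over mu (Bayes' theorem; the flat prior cancels up to the evidence)
post = np.exp(log_like - logsumexp(log_like)) / dmu

# Evidence p(data | M1) = integral of likelihood * prior over mu,
# approximated by a Riemann sum in log space
log_ev_M1 = logsumexp(log_like) + np.log(dmu)

# Model M0: no free parameter, mean fixed at 0.5
log_ev_M0 = norm.logpdf(data, loc=0.5, scale=0.1).sum()

print(np.sum(mu * post) * dmu)   # posterior mean of mu under M1
print(log_ev_M1 - log_ev_M0)     # log Bayes factor: >0 favors M1
```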


2018, Vol 15, pp. 41-45
Author(s):  
Eliška Janouchová, Anna Kučerová

Modelling of heterogeneous materials based on the randomness of model input parameters involves parameter identification, which amounts to solving a stochastic inversion problem. This can be formulated as a search for a probabilistic description of the model parameters such that the distribution of the model response corresponds to the distribution of the observed data.

In this contribution, a numerical model of kinematic and isotropic hardening for a viscoplastic material is calibrated on the basis of experimental data from a cyclic loading test at high temperature. Five material model parameters are identified in a probabilistic setting. The core of the identification method is Bayesian inference of the uncertain statistical moments of a prescribed joint lognormal distribution of the parameters. Synthetic experimental data are first used to verify the identification procedure; the real experimental data are then processed to calibrate the material model of a copper alloy.
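A minimal sketch of this kind of inference, with heavy simplifications: a scalar toy forward model stands in for the viscoplastic cyclic-loading simulation, a single lognormal parameter replaces the five-parameter joint distribution, and flat priors are assumed on the lognormal moments. The Monte Carlo likelihood estimate inside the Metropolis loop is one common way to handle the "distribution of the response" formulation; the paper's actual procedure may differ.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy scalar forward model standing in for the viscoplastic simulation
def forward(theta):
    return 2.0 * np.sqrt(theta)

# "Observed" responses generated from a lognormal parameter distribution
obs = forward(rng.lognormal(0.5, 0.2, size=50)) + rng.normal(0.0, 0.05, 50)

def log_likelihood(m, s, n_mc=2000):
    # Monte Carlo estimate of p(obs | m, s): average the measurement-noise
    # density over draws of the uncertain parameter (pseudo-marginal style)
    theta = rng.lognormal(m, s, size=n_mc)
    pred = forward(theta)
    # unnormalized Gaussian noise density (sd 0.05); the constant factor
    # cancels in the Metropolis acceptance ratio
    dens = np.exp(-0.5 * ((obs[:, None] - pred[None, :]) / 0.05) ** 2)
    return np.log(dens.mean(axis=1)).sum()

# Random-walk Metropolis over the lognormal moments (m, s), flat priors
m, s = 0.0, 0.5
ll = log_likelihood(m, s)
chain = []
for _ in range(2000):
    m_new, s_new = m + rng.normal(0, 0.05), abs(s + rng.normal(0, 0.05))
    ll_new = log_likelihood(m_new, s_new)
    if np.log(rng.uniform()) < ll_new - ll:
        m, s, ll = m_new, s_new, ll_new
    chain.append((m, s))

print(np.mean(chain[1000:], axis=0))   # posterior mean of (m, s), post burn-in
```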


2020, Vol 110 (09), pp. 624-628
Author(s):  
Maximilian Busch, Thomas Semm, Michael Zäh

Industrial robots are increasingly used for milling large workpieces because of their large working envelope. However, dynamic instabilities during the process limit their productivity. Machine learning methods are therefore becoming increasingly popular for deriving structural models from experimental data. In this context, the Institute for Machine Tools and Industrial Management (iwb) at the Technical University of Munich is developing methods that fuse simulation and experimental data via machine learning in order to model the structural dynamics of milling robots.
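The fusion idea can be illustrated generically. The sketch below is not the iwb's actual method; it uses one common pattern, discrepancy ("delta") learning: a Gaussian process is trained on the residual between a cheap, biased simulation and a few experimental measurements, and the corrected prediction is simulation plus learned residual. The pose-to-eigenfrequency functions and all numbers are invented placeholders.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(2)

# Hypothetical stand-ins: pose-dependent dominant eigenfrequency of a milling
# robot, from a cheap biased simulation and from sparse experiments.
def simulation(q):            # robot axis position -> frequency, Hz
    return 12.0 + 2.0 * np.sin(q)

def experiment(q):            # "truth" = simulation + systematic deviation
    return simulation(q) + 0.8 * np.cos(2 * q) + rng.normal(0, 0.05, np.shape(q))

q_exp = np.linspace(0.2, 2.8, 8)[:, None]        # few experimental poses
residual = experiment(q_exp.ravel()) - simulation(q_exp.ravel())

# Learn only the simulation-to-experiment discrepancy (delta learning)
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=0.05**2)
gp.fit(q_exp, residual)

q_new = np.array([[1.5]])
fused = simulation(q_new.ravel()) + gp.predict(q_new)
print(fused)   # corrected frequency estimate at an unmeasured pose
```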


Author(s):  
Rachel Kenigsbuch, Yoram Halevi

Abstract
The paper considers the problem of updating an analytical model from experimental data. The approach taken is the reference basis, in which some of the parameters are considered completely accurate while the others are updated by solving a constrained optimization problem. The main results of this paper are closed-form solutions to these problems with general weighting matrices in the optimization criterion. These generalize several model-reference updating problems previously solved and reported in the literature. The importance of this generalization is the ability to incorporate into the method prior knowledge regarding the accuracy of the model in specified areas. Another aspect of this work is the investigation of the geometrical interpretation of the results, which provides insight into the mechanism of the updating process. The advantages of the new updating schemes are demonstrated by means of examples.
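To make the flavor of such closed-form solutions concrete, here is a small numpy sketch of the best-known special case, the Baruch–Bar Itzhack stiffness update, in which the mass matrix is the exact reference and the weighting is the mass matrix itself; the paper's contribution is the generalization to arbitrary weighting matrices, which this sketch does not reproduce. The 2-DOF matrices and the "measured" eigenvalue shift are invented for illustration.

```python
import numpy as np

# Reference-basis updating, classic Baruch form: M is the exact reference,
# and the analytical stiffness K is corrected in closed form so the updated
# model reproduces measured mass-normalized modes Phi and eigenvalues lam.
def baruch_update(K, M, Phi, lam):
    Lam = np.diag(lam)
    A = M @ Phi @ Phi.T
    return (K - K @ A.T - A @ K
            + A @ K @ A.T
            + M @ Phi @ Lam @ Phi.T @ M)

# Toy 2-DOF example with one "measured" mode (illustrative numbers)
M = np.diag([1.0, 2.0])
K = np.array([[3.0, -1.0], [-1.0, 2.0]])
w2, V = np.linalg.eig(np.linalg.solve(M, K))
i = np.argmin(w2)
phi = V[:, [i]] / np.sqrt(V[:, [i]].T @ M @ V[:, [i]])  # mass-normalize
lam_meas = np.array([0.9 * w2[i]])    # pretend the experiment shifted it

K_new = baruch_update(K, M, phi, lam_meas)
print(K_new @ phi - M @ phi * lam_meas)   # ~0: measured eigenpair reproduced
```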


2015
Author(s):  
D. Sam Schwarzkopf

The problems with classical frequentist statistics have recently received much attention, yet researchers' enthusiasm for adopting alternatives like Bayesian inference remains modest. Here I present the bootstrapped evidence test, an objective resampling procedure that takes into account the precision with which both the experimental and the null hypothesis can be estimated. Simulations and reanalysis of actual experimental data demonstrate that this test minimizes false positives while maintaining sensitivity. It is equally applicable to a wide range of situations and thus minimizes problems arising from analytical flexibility. Critically, it does not dichotomize the results based on an arbitrary significance level but instead quantifies how well the data support either the alternative or the null hypothesis. It is thus particularly useful in situations with considerable uncertainty about the expected effect size. Because it is non-parametric, it is also robust to severe violations of the assumptions made by classical statistics.
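The following Python sketch conveys the general idea of quantifying evidence by resampling; it is a generic illustration, not necessarily the author's exact procedure. The data, the minimal-effect threshold delta, and the evidence ratio are all illustrative choices: bootstrap replicates of the effect are classified as falling inside a null region or beyond it, yielding a graded measure of support rather than a binary significance decision.

```python
import numpy as np

rng = np.random.default_rng(3)
data = rng.normal(0.3, 1.0, size=40)    # hypothetical one-sample effect

delta = 0.1                             # smallest effect deemed meaningful
# Bootstrap the effect estimate by resampling the data with replacement
boot = np.array([rng.choice(data, size=data.size, replace=True).mean()
                 for _ in range(10000)])

p_h1 = np.mean(np.abs(boot) > delta)    # replicates beyond the null region
p_h0 = np.mean(np.abs(boot) <= delta)   # replicates inside the null region
print(p_h1 / max(p_h0, 1e-4))           # graded evidence ratio, no cutoff
```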

