Improved Parameter Estimation in Kinetic Models: Selection and Tuning of Regularization Methods

Author(s): Attila Gábor, Julio R. Banga

2018, Vol 35 (5), pp. 830-838
Author(s): Alejandro F. Villaverde, Fabian Fröhlich, Daniel Weindl, Jan Hasenauer, Julio R. Banga

2014, Vol 83, pp. 104-115
Author(s): Jimena Di Maggio, Cecilia Paulo, Vanina Estrada, Nora Perotti, Juan C. Diaz Ricci, ...

2019, Vol 116, pp. 131-146
Author(s): Muhammad Akmal Remli, Mohd Saberi Mohamad, Safaai Deris, Azurah A. Samah, Sigeru Omatu, ...

FEBS Letters, 1970, Vol 9 (5), pp. 245-251
Author(s): Jens G. Reich

2011, Vol 27 (14), pp. 1964-1970
Author(s): Gengjie Jia, Gregory N. Stephanopoulos, Rudiyanto Gunawan

2018
Author(s): Alejandro F. Villaverde, Fabian Fröhlich, Daniel Weindl, Jan Hasenauer, Julio R. Banga

Abstract

Motivation: Mechanistic kinetic models usually contain unknown parameters, which need to be estimated by optimizing the fit of the model to experimental data. This task can be computationally challenging due to the presence of local optima and ill-conditioning. While a variety of optimization methods have been suggested to surmount these issues, it is not obvious how to choose the best one for a given problem a priori, since many factors can influence their performance. A systematic comparison of methods suited to parameter estimation problems ranging from tens to hundreds of optimization variables is currently missing, and smaller studies have indeed provided contradictory findings.

Results: Here, we use a collection of benchmark problems to evaluate the performance of two families of optimization methods: (i) a multi-start of deterministic local searches; and (ii) a hybrid metaheuristic combining stochastic global search with deterministic local searches. A fair comparison is ensured through a collaborative evaluation, involving researchers who apply each method on a daily basis, and through multiple performance metrics capturing the trade-off between computational efficiency and robustness. Our results show that, thanks to recent advances in the calculation of parametric sensitivities, a multi-start of gradient-based local methods is often a successful strategy, but better performance can be obtained with a hybrid metaheuristic. The best performer is a combination of a global scatter search metaheuristic with an interior point local method, provided with gradients estimated with adjoint-based sensitivities. We provide an implementation of this novel method in an open-source software toolbox to render it available to the scientific community.

Availability and Implementation: The code to reproduce the results is available at Zenodo: https://doi.org/10.5281/…

Contact: [email protected], [email protected]
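The multi-start strategy in family (i) is conceptually simple: sample many starting points within the feasible parameter bounds, run a deterministic gradient-based local optimization from each, and keep the best fit. The sketch below illustrates the idea on a hypothetical two-parameter mass-action model using SciPy's L-BFGS-B solver with default finite-difference gradients; the model, parameter names, and solver choice are illustrative assumptions, and the study itself relies on much larger benchmark problems and on adjoint-based sensitivities, which this toy example does not reproduce.

```python
# Minimal multi-start parameter estimation sketch (illustrative only).
# The toy kinetic chain and parameter names below are hypothetical; the
# benchmarks in the paper are far larger and use adjoint sensitivities
# rather than finite-difference gradients.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def rhs(t, y, k1, k2):
    # Simple kinetic chain A -> B -> C with mass-action rate constants k1, k2.
    a, b, c = y
    return [-k1 * a, k1 * a - k2 * b, k2 * b]

def simulate(params, t_obs):
    # Integrate the ODE model and return the states at the observation times.
    sol = solve_ivp(rhs, (0.0, t_obs[-1]), [1.0, 0.0, 0.0],
                    t_eval=t_obs, args=tuple(params), rtol=1e-8, atol=1e-10)
    return sol.y

# Synthetic "experimental" data: simulate with known parameters and add noise.
t_obs = np.linspace(0.0, 10.0, 21)
true_params = np.array([0.7, 0.3])
data = simulate(true_params, t_obs) + 0.01 * rng.standard_normal((3, t_obs.size))

def objective(params):
    # Least-squares misfit between model prediction and data.
    residuals = simulate(params, t_obs) - data
    return float(np.sum(residuals ** 2))

bounds = [(1e-3, 10.0)] * 2

def multistart(n_starts=20):
    # Draw starting points log-uniformly within the bounds and run a
    # gradient-based local search (L-BFGS-B) from each; keep the best result.
    best = None
    for _ in range(n_starts):
        x0 = np.exp(rng.uniform(np.log(1e-3), np.log(10.0), size=2))
        res = minimize(objective, x0, method="L-BFGS-B", bounds=bounds)
        if best is None or res.fun < best.fun:
            best = res
    return best

if __name__ == "__main__":
    best = multistart()
    print("best objective:", best.fun)
    print("estimated parameters:", best.x)
```

Family (ii) differs only in how the starting points are produced: instead of independent random restarts, a global scatter search metaheuristic proposes them adaptively, and in the study this hybrid, paired with an interior point local solver and adjoint-based gradients, was the best performer.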

