Efficient gradient-based parameter estimation for dynamic models using qualitative data

2021 ◽  
Author(s):  
Leonard Schmiester ◽  
Daniel Weindl ◽  
Jan Hasenauer

Abstract

Motivation: Unknown parameters of dynamical models are commonly estimated from experimental data. However, while various efficient optimization and uncertainty analysis methods have been proposed for quantitative data, methods for qualitative data are rare and suffer from poor scaling and convergence.

Results: Here, we propose an efficient and reliable framework for estimating the parameters of ordinary differential equation models from qualitative data. In this framework, we derive a semi-analytical algorithm for gradient calculation of the optimal scaling method developed for qualitative data. This enables the use of efficient gradient-based optimization algorithms. We demonstrate that the use of gradient information improves performance of optimization and uncertainty quantification on several application examples. On average, we achieve a speedup of more than one order of magnitude compared to gradient-free optimization. Additionally, in some examples, the gradient-based approach yields substantially improved objective function values and quality of the fits. Accordingly, the proposed framework substantially improves the parameterization of models from qualitative data.

Availability: The proposed approach is implemented in the open-source Python Parameter EStimation TOolbox (pyPESTO). All application examples and code to reproduce this study are available at https://doi.org/10.5281/zenodo.4507613.
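The semi-analytical gradient derivation is specific to the paper, but the payoff it describes, gradient-based optimization converging far faster than gradient-free search, can be illustrated with a toy quantitative example. The sketch below is not the paper's algorithm: the model, data, and step size are all invented for illustration. It fits the decay rate of dx/dt = -k·x by plain gradient descent using an analytic gradient:

```python
import math

def model(k, t, x0=1.0):
    # Closed-form solution of dx/dt = -k * x, standing in for an ODE simulation
    return x0 * math.exp(-k * t)

def objective_and_gradient(k, data):
    # Sum-of-squares objective and its analytic gradient with respect to k
    obj, grad = 0.0, 0.0
    for t, y in data:
        x = model(k, t)
        r = x - y
        obj += r * r
        grad += 2.0 * r * (-t * x)  # d/dk [x0 * exp(-k t)] = -t * x
    return obj, grad

# Synthetic "measurements" generated from the true parameter k = 0.5
true_k = 0.5
data = [(t, model(true_k, t)) for t in range(6)]

# Plain gradient descent from a poor starting point; with the analytic
# gradient, each iteration costs a single pass over the model evaluations
k, lr = 2.0, 0.1
for _ in range(300):
    _, grad = objective_and_gradient(k, data)
    k -= lr * grad
```

A gradient-free method would need many objective evaluations per step to make comparable progress; that gap underlies the order-of-magnitude speedup the abstract reports, though the paper's setting (qualitative data via optimal scaling) is substantially more involved.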

SPE Journal ◽  
2009 ◽  
Vol 15 (01) ◽  
pp. 18-30 ◽  
Author(s):  
J.R. Rommelse ◽  
J.D. Jansen ◽  
A.W. Heemink

Summary The discrepancy between observed measurements and model predictions can be used to improve either the model output alone or both the model output and the parameters that underlie the model. In the case of parameter estimation, methods exist that can efficiently calculate the gradient of the discrepancy with respect to the parameters, assuming that there are no uncertainties in addition to the unknown parameters. In the case of general nonlinear parameter estimation, many different parameter sets exist that locally minimize the discrepancy. In this case, the gradient must be regularized before it can be used by gradient-based minimization algorithms. This article proposes a method for calculating a gradient in the presence of additional model errors through the use of representer expansions. The representers are data-driven basis functions that perform the regularization. All available data can be used during every iteration of the minimization scheme, as is the case in the classical representer method (RM). However, the method proposed here also allows adaptive selection of different portions of the data during different iterations to reduce computation time; the user now has the freedom to choose the number of basis functions and revise this choice at every iteration. The method also differs from the classic RM by the introduction of measurement representers in addition to state, adjoint, and parameter representers and by the fact that no correction terms are calculated. Unlike the classic RM, where the minimization scheme is prescribed, the RM proposed here provides a gradient that can be used in any minimization algorithm. The applicability of the modified method is illustrated with a synthetic example to estimate permeability values in an inverted five-spot waterflooding problem.
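As a minimal illustration of why the gradient of a data mismatch lies in the span of data-driven basis functions, consider a linear forward model g(m) = G·m (the matrix and data values below are invented). Each measurement contributes one "representer" (a column of Gᵀ) to the gradient, so restricting attention to a subset of the data simply drops terms from the sum, loosely mirroring the adaptive data-selection idea in the article:

```python
def matvec(A, x):
    # Dense matrix-vector product
    return [sum(a * b for a, b in zip(row, x)) for row in A]

G = [[1.0, 0.0, 2.0],
     [0.0, 1.0, 1.0]]    # 2 measurements, 3 parameters (underdetermined)
d = [3.0, 2.0]           # observed data
m = [0.0, 0.0, 0.0]      # current parameter estimate

r = [di - gi for di, gi in zip(d, matvec(G, m))]  # residual d - G m

# One basis function per measurement: row i of G is the i-th column of G^T
representers = G

# The gradient of ||d - G m||^2 is -2 * sum_i r_i * representer_i, so it
# always lies in the span of the representers; using only some of the data
# during an iteration just drops terms from this sum.
grad_full = [-2.0 * sum(r[i] * representers[i][j] for i in range(2))
             for j in range(3)]
grad_sub = [-2.0 * r[0] * representers[0][j] for j in range(3)]  # first datum only
```

This toy setting omits the model-error terms and correction-free construction that distinguish the article's method; it only shows the span property that makes representers natural regularizers.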


Author(s):  
Kourosh Danai ◽  
James R. McCusker

It is shown that output sensitivities of dynamic models can be better delineated in the time-scale domain. This enhanced delineation provides the capacity to isolate regions of the time-scale plane, coined as parameter signatures, wherein individual output sensitivities dominate the others. Due to this dominance, the prediction error can be attributed to the error of a single parameter at each parameter signature so as to enable estimation of each model parameter error separately. As a test of fidelity, the estimated parameter errors are evaluated in iterative parameter estimation in this paper. The proposed parameter signature isolation method (PARSIM) that uses the parameter error estimates for parameter estimation is shown to have an estimation precision comparable to that of the Gauss–Newton method. The transparency afforded by the parameter signatures, however, extends PARSIM’s features beyond rudimentary parameter estimation. One such potential feature is noise suppression by discounting the parameter error estimates obtained in the finer-scale (higher-frequency) regions of the time-scale plane. Another is the capacity to assess the observability of each output through the quality of parameter signatures it provides.
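The full method (PARSIM) works in the time-scale (wavelet) plane; the sketch below keeps only the core attribution idea, in the time domain and with an invented two-parameter model: in regions where one parameter's output sensitivity dominates, dividing the prediction error by that sensitivity estimates that parameter's error alone.

```python
import math

def output(t, a, b):
    # Invented model with two parameters and easily separable sensitivities
    return a * math.exp(-t) + b * math.sin(t)

def s_a(t):  # output sensitivity with respect to a
    return math.exp(-t)

def s_b(t):  # output sensitivity with respect to b
    return math.sin(t)

ts = [i * 0.1 for i in range(100)]
measured = [output(t, 1.0, 2.0) for t in ts]   # "data": true a = 1.0, b = 2.0
predicted = [output(t, 1.3, 2.0) for t in ts]  # model with an error in a only

# Crude "signature" region: samples where the a-sensitivity clearly dominates
region = [i for i, t in enumerate(ts) if abs(s_a(t)) > 5.0 * abs(s_b(t))]

# Attribute the prediction error in that region to parameter a alone
da = sum((predicted[i] - measured[i]) / s_a(ts[i]) for i in region) / len(region)
```

Here `da` recovers the injected parameter error of 0.3. The dominance threshold (factor 5) is an arbitrary stand-in for the wavelet-domain signature extraction the paper actually performs.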


2019 ◽  
Author(s):  
Leonard Schmiester ◽  
Daniel Weindl ◽  
Jan Hasenauer

Abstract

Quantitative dynamical models facilitate the understanding of biological processes and the prediction of their dynamics. These models usually comprise unknown parameters, which have to be inferred from experimental data. For quantitative experimental data, there are several methods and software tools available. However, for qualitative data the available approaches are limited and computationally demanding.

Here, we consider the optimal scaling method, which has been developed in statistics for categorical data and has been applied to dynamical systems. This approach turns qualitative variables into quantitative ones, accounting for constraints on their relation. We derive a reduced formulation for the optimization problem defining the optimal scaling. The reduced formulation possesses the same optimal points as the established formulation but requires fewer degrees of freedom. Parameter estimation for dynamical models of cellular pathways revealed that the reduced formulation improves the robustness and convergence of optimizers. This resulted in substantially reduced computation times.

We implemented the proposed approach in the open-source Python Parameter EStimation TOolbox (pyPESTO) to facilitate reuse and extension. The proposed approach enables efficient parameterization of quantitative dynamical models using qualitative data.
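The reduced formulation itself is derived in the preprint; as a loose illustration of the kind of subproblem optimal scaling solves, the sketch below computes order-preserving surrogate values for ordinal categories by pool-adjacent-violators (isotonic regression). The numbers, and the reduction to a plain isotonic fit, are illustrative assumptions, not the preprint's exact formulation:

```python
def pava(values, weights=None):
    # Pool-adjacent-violators: best nondecreasing fit in weighted least squares
    if weights is None:
        weights = [1.0] * len(values)
    blocks = []  # each block: [mean, total weight, number of pooled points]
    for v, w in zip(values, weights):
        blocks.append([v, w, 1])
        # Merge backwards while the monotonicity constraint is violated
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2, n2 = blocks.pop()
            m1, w1, n1 = blocks.pop()
            wt = w1 + w2
            blocks.append([(m1 * w1 + m2 * w2) / wt, wt, n1 + n2])
    out = []
    for m, _, n in blocks:
        out.extend([m] * n)
    return out

# Model simulations at three time points whose observations fall into ordinal
# categories 1 <= 2 <= 3; the middle simulation violates the ordering
surrogates = pava([0.9, 0.4, 1.5])
```

The violating pair is pooled to a common value (0.65 here), giving the quantitative surrogate data closest to the simulations that is still consistent with the ordinal constraints.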


2016 ◽  
pp. 141-149
Author(s):  
S.V. Yershov ◽  
R.M. Ponomarenko

Parallel tiered and dynamic models of fuzzy inference in expert-diagnostic software systems whose knowledge bases are built on fuzzy rules are considered. Tiered parallel and dynamic fuzzy inference procedures are developed that speed up computations in a software system for evaluating the quality of scientific papers. Estimates of the effectiveness of the parallel tiered and dynamic computation schemes are constructed for a complex dependency graph between blocks of fuzzy Takagi–Sugeno rules, and a comparative characterization of the efficiency of the two models is carried out.
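The abstract does not give the rule blocks themselves; as a minimal reminder of what a single zero-order Takagi–Sugeno inference step looks like (the membership functions, consequents, and input value below are all invented), a sketch:

```python
def tri(x, a, b, c):
    # Triangular membership function with support [a, c] and peak at b
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def takagi_sugeno(x, rules):
    # Zero-order Takagi-Sugeno inference: weighted average of rule consequents
    num = den = 0.0
    for membership, consequent in rules:
        mu = membership(x)
        num += mu * consequent
        den += mu
    return num / den if den else 0.0

# Two invented rules mapping a normalized input to a paper-quality score
rules = [
    (lambda x: tri(x, 0.0, 0.0, 0.6), 2.0),  # IF input is "low"  THEN score = 2
    (lambda x: tri(x, 0.4, 1.0, 1.0), 9.0),  # IF input is "high" THEN score = 9
]
score = takagi_sugeno(0.5, rules)
```

Each rule fires independently of the others, which is what makes blocks of such rules amenable to the tiered and parallel evaluation schemes the paper analyzes.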


Complexity ◽  
2017 ◽  
Vol 2017 ◽  
pp. 1-14
Author(s):  
Ahmed A. Mahmoud ◽  
Sarat C. Dass ◽  
Mohana S. Muthuvalu ◽  
Vijanth S. Asirvadam

This article presents statistical inference methodology based on maximum likelihood for delay differential equation models in the univariate setting. Maximum likelihood inference is obtained for single and multiple unknown delay parameters, as well as for other parameters of interest that govern the trajectories of the delay differential equation models. The maximum likelihood estimator is obtained using adaptive grid and Newton-Raphson algorithms. Our methodology correctly estimates the delay parameters as well as other unknown parameters (such as the initial starting values) of the dynamical system from simulation data. We also develop methodology to compute the information matrix and confidence intervals for all unknown parameters within the likelihood inferential framework. We present three illustrative examples related to biological systems. The computations were carried out with the help of the mathematical software MATLAB® 8.0 R2014b.
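The paper's estimator combines an adaptive grid with Newton-Raphson steps on the likelihood; the sketch below keeps only the coarse-to-fine grid idea, applied to an invented delayed-decay model with a least-squares criterion (model, rates, and grid spacings are all assumptions for illustration):

```python
def simulate(k, tau, t_end=5.0, dt=0.01):
    # Forward-Euler integration of the DDE x'(t) = -k * x(t - tau),
    # with constant history x(t) = 1 for t <= 0
    n = int(t_end / dt)
    lag = int(round(tau / dt))
    xs = [1.0] * (n + 1)
    for i in range(n):
        x_delayed = xs[i - lag] if i >= lag else 1.0
        xs[i + 1] = xs[i] + dt * (-k * x_delayed)
    return xs

# Synthetic observations generated with the "true" delay
true_tau = 0.5
data = simulate(0.8, true_tau)

def sse(tau):
    # Least-squares discrepancy between simulation and data
    return sum((a - b) ** 2 for a, b in zip(simulate(0.8, tau), data))

# Coarse grid over the delay, then a refined grid around the best point
coarse = min((sse(t / 10), t / 10) for t in range(1, 11))[1]
fine = min((sse(coarse + d / 100), coarse + d / 100) for d in range(-9, 10))[1]
```

The refinement step recovers the delay on noise-free data; the paper's actual method works with a likelihood, multiple delays, and confidence intervals from the information matrix.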


Agronomy ◽  
2021 ◽  
Vol 11 (5) ◽  
pp. 839
Author(s):  
Mitchell Kent ◽  
William Rooney

Interest in the use of popped sorghum in food products has resulted in a niche market for sorghum hybrids with high popping quality, but little work has been done to assess the relative effects of field processing methods on popping quality. This study evaluated the relative effects of harvest moisture and threshing method on the popping quality of sorghum grain. A grain sorghum hybrid with good popping quality was produced in two different years in Texas; it was harvested at two moisture levels (low and high), and grain was removed from panicles using five different threshing methods (hand, rubber belt, metal brushes and two metal concave-bar systems). Year, harvest moisture content and threshing method influenced all three popping quality measurements (popping efficiency, expansion ratio and flake size), but threshing method had an effect an order of magnitude larger than either moisture level or year. While many of the interactions were significant, they did not alter the general trends observed. The threshing methods with less direct impact force on the grain (hand and rubber belt) yielded higher popping quality than those with greater impact force (the metal-based systems). The differences in popping quality between threshing systems are likely due to a reduction in kernel integrity caused by impacts to the kernel during threshing. These results indicate that field processing of the grain, notably the threshing method, has significant impacts on popping quality and should be taken into consideration when grain sorghum is harvested for popping purposes.


2021 ◽  
Vol 19 (1) ◽  
Author(s):  
Isabella de Vere Hunt ◽  
James M. Kilgour ◽  
Robert Danby ◽  
Andy Peniket ◽  
Rubeta N. Matin

Abstract Background Graft-versus-host disease (GVHD) is a significant cause of morbidity and mortality following allogeneic stem cell transplantation. These patients face unique challenges due to the complexity of GVHD, which can affect multiple organ systems, and the toxicity of treatments. Despite the known impact on quality of life (QOL), qualitative data within the bone marrow transplantation (BMT) literature are rare, and there has been no qualitative work exploring patient experience of specialist healthcare provision for GVHD in the United Kingdom. Methods We conducted a primary explorative qualitative study of the experience of QOL issues and multidisciplinary care in patients with chronic GVHD following allogeneic stem cell transplantation. Eight patients were identified using convenience sampling from specialist BMT outpatient clinics. Following consent, patients were interviewed individually via telephone. Transcripts of interviews were analyzed using an inductive thematic approach. Results Mean participant age was 61-years-old (range 45–68), with a mean time post-transplant of 3 years at time of interview (range 3 months–15 years). Five key QOL themes were identified: (1) ‘Restricted as to what I can do’; (2) Troubling symptoms—‘you can sort of get GVHD anywhere’; (3) Confusion/uncertainty over GVHD symptoms—‘Is this the GVHD?’; (4) Unpredictable course and uncertainty about the future; and (5) Adapting to the sick role. In addition, four themes related to experience of service provision were identified: (1) personal care and close relationship with BMT nurses; (2) efficiency versus long waits—‘On the case straight away’; (3) information provision—‘went into it with a bit of a rosy view’; and (4) the role of support groups. Conclusions These qualitative data reflect the heterogeneity of experiences of the GVHD patient population, reflecting the need for a flexible and nuanced approach to patient care with emphasis on comprehensive information provision. We have identified the key role that BMT specialist nurses within the multidisciplinary team play in supporting patients. We advocate future research should focus on ways to meet the complex needs of this patient group and ensure that the personal care and close relationships are not lost in service redesigns embracing remote consultations.


2021 ◽  
Vol 4 (1) ◽  
pp. 12-31
Author(s):  
Alifia Intan Sekar Sari ◽  
Ihda A'yunil Khotimah

ABSTRACT

Libraries play an important role in education, serving research, the preservation of information resources and recreation. Libraries also contribute to the intelligence and empowerment of the nation, to individuals who continually seek to advance, and to the development of institutions facing increasingly intense competition. This study examines the role of the library in improving the quality of elementary schools at SDIT Salsabila 2 Klaseman, with regard to both student achievement and the progress of the institution. The research is descriptive and qualitative; data were collected through direct observation of the research object, interviews and documentation. The results indicate that the library plays an important role in improving school quality, reflected in the development of library facilities: a dedicated room, a growing book collection, a modern service system, air conditioning, computers and an LCD projector. The library also supports children's achievements, including wins in several competitions such as the PAI quiz competition and the MIPA Olympiad.

Keywords: Quality Improvement of Elementary Schools; The Role of Libraries


2016 ◽  
Vol 77 (2) ◽  
pp. 197-211 ◽  
Author(s):  
Phil Jones ◽  
Julia Bauder ◽  
Kevin Engel

Grinnell College participated in ACRL’s first cohort of Assessment in Action (AiA), undertaking a mixed-methods action research project to assess the effectiveness of librarian-led research literacy sessions in improving students’ research skills. The quantitative data showed that the quality of students’ sources did not markedly improve following a research literacy session, while the qualitative data indicated that many students were able to state and describe important research concepts they learned. This article profiles the development of Grinnell’s AiA project and discusses how Grinnell’s librarians responded when the initial results led to more questions rather than to satisfactory answers.


1981 ◽  
Vol 44 (1) ◽  
pp. 62-65 ◽  
Author(s):  
FATIMA S. ALI ◽  
FRANCES O. VANDUYNE

Six lots of ground meat, obtained at intervals from a local supermarket, were frozen, and later held with other frozen foods in the freezer compartment of a refrigerator-freezer where power failure was simulated by unplugging the unit. Mean values for the counts (log10) of the beef as purchased were as follows: aerobic and psychrotrophic plate counts 6.35 and 6.66, respectively; presumptive coliforms 4.48; coagulase-positive staphylococci 4.67; and presumptive Clostridium perfringens 1.43. Presumptive salmonellae were detected in three of the six lots. Counts of the same order of magnitude as above were obtained after 7 days in the freezers, complete defrost of the meat and 6 h thereafter. Between 6 and 24 h, aerobic and psychrotrophic plate counts and numbers of coliforms and coagulase-positive staphylococci increased approximately 10-fold. Forty-eight hours after complete defrost, further increases in counts occurred. The appearance and aroma of the meat were acceptable 24 h after defrost; after 48 h, it would have been discarded because of browning, slime and off-odors.

