Mechanistic Modeling of Biochemical Systems Without a Priori Parameter Values Using the Design Space Toolbox v.3.0

2020 ◽  
Author(s):  
Miguel Á. Valderrama-Gómez ◽  
Jason G. Lomnitz ◽  
Rick A. Fasani ◽  
Michael A. Savageau

Summary: Mechanistic models of biochemical systems provide a rigorous kinetics-based description of various biological phenomena. They are indispensable for elucidating biological design principles and for devising and engineering systems with novel functionalities. To date, mathematical analysis and characterization of these models remain a challenging endeavor, the main difficulty being the lack of information for most system parameters. Here, we introduce the Design Space Toolbox v.3.0 (DST3), a software implementation of the Design Space formalism that enables mechanistic modeling of complex biological processes without requiring prior knowledge of the parameter values involved. This is achieved by making use of a phenotype-centric modeling approach, in which the system is first decomposed into a series of biochemical phenotypes; parameter values realizing phenotypes of interest are then predicted in a second step. DST3 represents the most generally applicable implementation of the Design Space formalism to date and offers unique advantages over earlier versions. By expanding the capabilities of the Design Space formalism and streamlining its distribution, DST3 represents a valuable tool for elucidating biological design principles and guiding the design and optimization of novel synthetic circuits.
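As a rough illustration of the phenotype-centric idea described above (a hypothetical toy sketch, not the DST3 API): for a one-variable model with one production term and two degradation routes, each "phenotype" corresponds to a choice of dominant terms; the reduced balance can be solved in closed form, and the parameter conditions under which that phenotype is self-consistent follow without ever fixing parameter values.

```python
# Hypothetical toy sketch (not the DST3 API): enumerate the "phenotypes" of
# dX/dt = a - b1*X - b2*X^2 by pairing the production term with one dominant
# degradation term, solve each reduced steady-state balance, and derive the
# parameter condition under which that phenotype is self-consistent.
import sympy as sp

a, b1, b2, X = sp.symbols("a b1 b2 X", positive=True)
production = [a]                      # positive (synthesis) terms
degradation = [b1 * X, b2 * X**2]     # negative (degradation) terms

for i, pos in enumerate(production, start=1):
    for j, neg in enumerate(degradation, start=1):
        # Reduced (dominant) balance: pos = neg  ->  a single monomial equation
        Xss = sp.solve(sp.Eq(pos, neg), X)[0]
        # Validity: the chosen degradation term dominates the neglected one at Xss
        others = [t for k, t in enumerate(degradation, start=1) if k != j]
        conditions = [sp.simplify((neg - t).subs(X, Xss) > 0) for t in others]
        print(f"phenotype P{i}{j}: X_ss = {Xss}, valid when {conditions}")
```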


iScience ◽  
2020 ◽  
Vol 23 (6) ◽  
pp. 101200 ◽  
Author(s):  
Miguel Á. Valderrama-Gómez ◽  
Jason G. Lomnitz ◽  
Rick A. Fasani ◽  
Michael A. Savageau

2021 ◽  
Author(s):  
Miguel Ángel Valderrama-Gómez ◽  
Michael A. Savageau

Phenotype-centric modeling enables a paradigm shift in the analysis of kinetic models. It brings the focus to a network's biochemical phenotypes and their relationship with measurable traits (e.g., product yields, system dynamics, signal amplification factors) and away from computationally intensive parameter sampling and numerical simulation. Here, we explore applications of this new modeling strategy in the field of Rational Metabolic Engineering using the amorphadiene biosynthetic network as a case study. Our phenotype-centric approach not only identifies known beneficial intervention strategies for this network, but also provides an understanding of the mechanistic context for the validity of these predictions. Additionally, we propose a set of hypothetical strains with the potential to outperform reported production strains and enhance the mechanistic understanding of the amorphadiene biosynthetic network. We believe that phenotype-centric modeling can advance the field of Rational Metabolic Engineering by enabling the development of next-generation kinetics-based algorithms and methods that do not rely on a priori knowledge of kinetic parameters but allow a structured, global analysis of the design space of parameter values.


2021 ◽  
Author(s):  
Sebastian F. Riebl ◽  
Christian Wakelam ◽  
Reinhard Niehuis

Abstract: Turbine Vane Frames (TVF) are a way to realize more compact jet engine designs. Located between the high-pressure turbine (HPT) and the low-pressure turbine (LPT), they fulfill structural and aerodynamic tasks. When used as an integrated concept with splitters located between the structural load-bearing vanes, the TVF configuration contains more than one type of airfoil, sometimes with pronouncedly different properties. This system of multidisciplinary demands and mixed blading poses an interesting opportunity for optimization. Within the scope of the present work, a full geometric parameterization of a TVF with splitters is presented. The parameterization is chosen so as to minimize the number of parameters required to automatically and flexibly represent all blade types involved in a TVF row in all three dimensions. Typical blade design parameters are linked to a fourth-order Bézier-curve-controlled camber-line and thickness parameterization. Based on conventional design rules, a procedure is presented which sets the parameters within their permissible ranges according to the imposed constraints, using a proprietary, purpose-developed code. The presented workflow relies on subsequent three-dimensional geometry generation by transferring the proposed parameter set to a commercially available CAD package. The interdependencies of the parameters are discussed, and their respective significance for the adjustment process is detailed. Furthermore, the capability of the chosen parameterization and adjustment process to rebuild an exemplary reference TVF geometry is demonstrated. The results are verified by comparing not only geometrical profile data but also validated CFD simulation results between the rebuilt and original geometries. Measures taken to ensure the robustness of the method are highlighted and evaluated by exploring extremes of the permissible design space. Finally, the embedding of the proposed method within the framework of an automated, gradient-free numerical optimization is discussed. Here, implications of the proposed method for response-surface modeling in combination with the optimization method are highlighted. The method promises to be an option for improving the efficiency of gradient-free optimization of interdependent blade geometries by excluding unsuitable blade combinations a priori, while keeping restrictions on the design space as limited as possible.
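A minimal sketch of the kind of camber-line/thickness construction referred to above (the control points and thickness law here are illustrative assumptions, not the authors' parameterization):

```python
# Illustrative sketch only (assumed control points and thickness law): evaluate
# a fourth-order Bezier camber line and build a blade contour by offsetting a
# thickness distribution normal to the camber line.
import numpy as np
from math import comb

def bezier(ctrl, t):
    """Evaluate a Bezier curve of degree len(ctrl)-1 at parameters t; ctrl is (n+1, 2)."""
    n = len(ctrl) - 1
    pts = np.zeros((len(t), 2))
    for i, p in enumerate(ctrl):
        pts += comb(n, i) * (t ** i)[:, None] * ((1 - t) ** (n - i))[:, None] * p
    return pts

# Five control points -> fourth-order camber line (axial chord normalized to 1)
camber_ctrl = np.array([[0.0, 0.0], [0.2, 0.08], [0.5, 0.15], [0.8, 0.10], [1.0, 0.0]])
t = np.linspace(0.0, 1.0, 201)
camber = bezier(camber_ctrl, t)

# Simple assumed thickness law (max thickness 4% chord), applied normal to the camber line
thickness = 0.04 * np.sin(np.pi * t)
d = np.gradient(camber, axis=0)
normals = np.column_stack([-d[:, 1], d[:, 0]])
normals /= np.linalg.norm(normals, axis=1, keepdims=True)
suction = camber + 0.5 * thickness[:, None] * normals
pressure = camber - 0.5 * thickness[:, None] * normals
```

A degree-four Bézier camber line uses five control points per blade, which keeps the parameter count low while still allowing front- or rear-loaded camber distributions for the different blade types in the row.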


Author(s):  
Vijitashwa Pandey ◽  
Zissimos P. Mourelatos ◽  
Monica Majcher

Optimization is needed for effective decision-based design (DBD). However, a utility function assessed a priori in DBD does not usually capture the preferences of the decision maker over the entire design space. As a result, when the optimizer searches for the optimal design, it traverses (or ends up in) regions where the preference order among different solutions differs from the actual order. For a highly non-convex design space, this can lead to convergence to a grossly suboptimal design, depending on the initial design. In this article, we propose two approaches to alleviate this issue. First, we map the trajectory of the solution as generated by the optimizer and generate ranking questions that are presented to the designer to verify the correctness of the utility function. Second, we propose backtracking rules for cases in which a local utility function is very different from the initially assessed function. We demonstrate our methodology using a mathematical example and a welded beam design problem.
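One way to make the trajectory-checking step concrete (a hedged sketch with a hypothetical utility function and designs, not the authors' code): compare the order implied by the a priori utility with a designer-provided ranking along the trajectory and trigger backtracking when the two disagree strongly.

```python
# Hedged sketch (hypothetical utility and designs): along the optimizer
# trajectory, compare the ranking implied by the a-priori utility with a
# designer-provided ranking; a large rank disagreement flags the utility
# for local reassessment / backtracking.
from itertools import combinations

def kendall_disagreements(order_a, order_b):
    """Count pairwise rank disagreements between two orderings of the same items."""
    pos_a = {x: i for i, x in enumerate(order_a)}
    pos_b = {x: i for i, x in enumerate(order_b)}
    return sum(
        1
        for x, y in combinations(order_a, 2)
        if (pos_a[x] - pos_a[y]) * (pos_b[x] - pos_b[y]) < 0
    )

# Example: assumed scalar utility over two attributes (cost, deflection)
utility = lambda d: -0.7 * d["cost"] - 0.3 * d["deflection"]
designs = {
    "A": {"cost": 2.0, "deflection": 0.9},
    "B": {"cost": 2.4, "deflection": 0.5},
    "C": {"cost": 3.1, "deflection": 0.2},
}
utility_order = sorted(designs, key=lambda k: utility(designs[k]), reverse=True)
designer_order = ["B", "C", "A"]          # elicited via ranking questions

if kendall_disagreements(utility_order, designer_order) > 1:
    print("Local preferences differ from the a-priori utility: backtrack and reassess")
```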


2020 ◽  
Vol 14 (4) ◽  
pp. 640-652
Author(s):  
Abraham Gale ◽  
Amélie Marian

Ranking functions are commonly used to assist decision-making in a wide variety of applications. As the general public realizes the significant societal impact of the widespread use of algorithms in decision-making, there has been a push towards explainability and transparency in decision processes and results, as well as demands to justify the fairness of those processes. In this paper, we focus on providing metrics for the explainability and transparency of ranking functions, with the aim of making the ranking process understandable a priori, so that decision-makers can make informed choices when designing their ranking selection process. We propose transparent participation metrics to clarify the ranking process by assessing the contribution of each parameter used in the ranking function to the final ranked outcome, using information about the ranking functions themselves as well as observations of the underlying distributions of the parameter values involved in the ranking. To evaluate the outcome of the ranking process, we propose diversity and disparity metrics to measure how similar the selected objects are to each other and to the underlying data distribution. We evaluate the behavior of our metrics on synthetic data, as well as on data and ranking functions from two real-world scenarios: high school admissions and decathlon scoring.
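A plausible instantiation of such participation and diversity metrics for a simple weighted-sum ranking function (illustrative only; the paper's exact metric definitions may differ):

```python
# Hedged sketch (one plausible instantiation, not the paper's exact metrics):
# for a weighted-sum ranking function, estimate each attribute's "participation"
# as its weight times the observed spread of its values, then normalize, and
# measure the diversity of the selected set as the mean pairwise distance
# between selected candidates.
import numpy as np

def participation(weights, scores):
    """weights: (m,), scores: (n_candidates, m). Returns normalized contributions."""
    raw = np.abs(weights) * scores.std(axis=0)   # weight x observed spread
    return raw / raw.sum()

def diversity(selected_scores):
    """Mean pairwise Euclidean distance among the selected candidates."""
    diffs = selected_scores[:, None, :] - selected_scores[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    n = len(selected_scores)
    return dists.sum() / (n * (n - 1)) if n > 1 else 0.0

weights = np.array([0.5, 0.3, 0.2])              # assumed ranking weights
scores = np.random.default_rng(0).random((100, 3))
ranked = np.argsort(scores @ weights)[::-1]      # higher total score ranks first
print("participation:", participation(weights, scores))
print("diversity of top 10:", diversity(scores[ranked[:10]]))
```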


1997 ◽  
Vol 43 (143) ◽  
pp. 180-191 ◽  
Author(s):  
E. M. Morris ◽  
H. -P. Bader ◽  
P. Weilenmann

Abstract: A physics-based snow model has been calibrated using data collected at Halley Bay, Antarctica, during the International Geophysical Year. Variations in snow temperature and density are well simulated using values for the model parameters within the range reported from other polar field experiments. The effect of uncertainty in the parameter values on the accuracy of the predictions is no greater than the effect of instrumental error in the input data. Thus, this model can be used with parameters determined a priori rather than by optimization. The model has been validated using an independent data set from Halley Bay and then used to estimate 10 m temperatures on the Antarctic Peninsula plateau over the last half-century.


FEBS Letters ◽  
2009 ◽  
Vol 583 (24) ◽  
pp. 3914-3922 ◽  
Author(s):  
Michael A. Savageau ◽  
Rick A. Fasani

1987 ◽  
Vol 109 (3) ◽  
pp. 193-202 ◽  
Author(s):  
H. Seraji

The paper presents a new approach to adaptive control of manipulators to achieve trajectory tracking by the joint angles. The central concept in this approach is the utilization of the manipulator “inverse” as a feedforward controller. The desired trajectory is applied as an input to the feedforward controller, which “behaves” as the “inverse” of the manipulator at any operating point, and the controller output is used as the driving torque for the manipulator. The controller gains are then updated by an adaptation algorithm derived from MRAC (model reference adaptive control) theory to cope with variations in the manipulator inverse due to changes of the operating point. An adaptive feedback controller and an auxiliary signal are also used to enhance closed-loop stability and to achieve faster adaptation. The proposed control scheme is computationally fast and does not require a priori knowledge of the complex dynamic model or the parameter values of the manipulator or the payload. Simulation results are presented in support of the proposed adaptive control scheme. The results demonstrate that the adaptive controller performs remarkably well for different reference trajectories and despite gross variations in the payload.
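A much-simplified single-joint illustration of the idea (an MIT-rule-style gain update standing in for the full adaptation law; not Seraji's exact scheme): the feedforward term plays the role of the manipulator "inverse", its gains are adapted from the tracking error, and an auxiliary PD feedback term aids closed-loop stability.

```python
# Simplified illustrative sketch: one joint with inertia/damping unknown to the
# controller, an adaptive feedforward ("inverse") term, a gradient-type gain
# update driven by the tracking error, and stabilizing PD feedback.
import numpy as np

dt, T = 1e-3, 5.0
I_true, b_true = 2.0, 0.5            # plant parameters, unknown to the controller
I_hat, b_hat = 1.0, 0.0              # adaptive feedforward ("inverse") gains
gamma = 5.0                          # adaptation rate
kp, kd = 60.0, 15.0                  # auxiliary PD feedback gains

q, qd = 0.0, 0.0
for k in range(int(T / dt)):
    t = k * dt
    q_des, qd_des, qdd_des = np.sin(t), np.cos(t), -np.sin(t)
    e, ed = q_des - q, qd_des - qd
    # Feedforward torque from the estimated inverse model + stabilizing feedback
    tau = I_hat * qdd_des + b_hat * qd_des + kp * e + kd * ed
    # Gradient-type (MIT-rule-like) update of the feedforward gains
    s = ed + 5.0 * e                 # combined tracking-error signal
    I_hat += gamma * s * qdd_des * dt
    b_hat += gamma * s * qd_des * dt
    # Integrate the true joint dynamics: I*qdd + b*qd = tau
    qdd = (tau - b_true * qd) / I_true
    qd += qdd * dt
    q += qd * dt

print(f"final tracking error: {abs(np.sin(T) - q):.4f}, I_hat={I_hat:.2f}, b_hat={b_hat:.2f}")
```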


Author(s):  
Pek Ek Ong ◽  
Audrey Kah Ching Huong ◽  
Xavier Toh Ik Ngu ◽  
Farhanahani Mahmud ◽  
Sheena Punai Philimon

Noninvasive measurement of health parameters such as blood oxygen saturation and bilirubin concentration, predicted via an appropriate light reflectance model from measured optical signals, is of eminent interest in biomedical research, as it could replace the conventional invasive blood-sampling approach. This study aims to investigate the feasibility of using the Modified Lambert-Beer model (MLB) to predict bilirubin concentration and the blood oxygen saturation value, SO2. The quantification technique is based on a priori knowledge of the extinction coefficients of bilirubin and hemoglobin derivatives in the wavelength range of 440–500 nm. The validity of the prediction was evaluated using light reflectance data from the TracePro raytracing software for a single-layered skin model with varying bilirubin concentration. The results revealed some promising trends in the estimated bilirubin concentration, with a mean ± standard deviation (SD) error of 0.255 ± 0.025 g/l. Meanwhile, a remarkably low mean ± SD error of 9.11 ± 2.48 % was found for the predicted SO2 value. It was concluded that these errors are likely due to the insufficiency of the MLB in describing changes in light attenuation arising from the underlying light absorption processes. In addition, this study suggests the use of a linear regression model deduced from this work for an improved prediction of the required health parameter values.
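A small illustration of the estimation step under a modified Lambert-Beer model (synthetic extinction coefficients and attenuation values, purely for illustration; not the study's data): attenuation is modeled as a linear mix of chromophore contributions, so concentrations follow from least squares and SO2 from the oxy-/deoxyhemoglobin estimates.

```python
# Illustrative sketch only (synthetic extinction coefficients and attenuation):
# under a modified Lambert-Beer model, A(lambda) = sum_i eps_i(lambda)*c_i*L + G,
# so the chromophore concentrations can be recovered by least squares.
import numpy as np

wavelengths = np.array([440, 460, 480, 500])               # nm
# Columns: HbO2, Hb, bilirubin -- assumed relative extinction coefficients
E = np.array([
    [3.10, 2.60, 4.80],
    [2.70, 2.90, 4.20],
    [2.40, 2.70, 2.60],
    [2.10, 2.50, 1.10],
])
L, G = 0.1, 0.2                                             # path length (cm), geometry loss

c_true = np.array([0.8, 0.2, 0.3])                          # HbO2, Hb, bilirubin (arbitrary units)
A_measured = E @ c_true * L + G                             # synthetic attenuation spectrum

# Solve A - G = E * c * L for the concentrations
c_est, *_ = np.linalg.lstsq(E * L, A_measured - G, rcond=None)
so2 = c_est[0] / (c_est[0] + c_est[1])
print(f"estimated bilirubin: {c_est[2]:.3f}, SO2: {100 * so2:.1f} %")
```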

