Sensitivity analysis of an electrophysiology model for the left ventricle

2020 ◽  
Vol 17 (171) ◽  
pp. 20200532
Author(s):  
Giulio Del Corso ◽  
Roberto Verzicco ◽  
Francesco Viola

Modelling cardiac electrophysiology entails dealing with uncertainties in the input parameters, such as the heart geometry and the electrical conductivities of the tissues, thus calling for an uncertainty quantification (UQ) of the results. Since the chambers of the heart have different shapes and tissues, we make the problem affordable by focusing on the left ventricle, with the aim of identifying which of the uncertain inputs most affect its electrophysiology. In a first phase, the uncertainty of the input parameters is evaluated using data available from the literature and the output quantities of interest (QoIs) of the problem are defined. Following the polynomial chaos expansion approach, a training dataset is then created by sampling the parameter space with a quasi-Monte Carlo method, whereas a smaller independent dataset is used to validate the resulting metamodel. The latter is exploited to run a global sensitivity analysis with nonlinear variance-based indices and thus reduce the input parameter space accordingly. Thereafter, the probability distributions of the QoIs are evaluated using a direct UQ strategy on a larger dataset and the results are discussed in the light of medical knowledge.
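The variance-based step of this workflow can be illustrated with a minimal pick-and-freeze estimator of first-order Sobol' indices. The linear three-parameter model and its coefficients below are hypothetical stand-ins for the electrophysiology solver and its QoIs, chosen so the true indices are known (16/21, 4/21, 1/21):

```python
import random

random.seed(0)

# Hypothetical linear stand-in for the electrophysiology solver; the
# coefficients set the true first-order indices to 16/21, 4/21 and 1/21.
def model(x):
    return 4.0 * x[0] + 2.0 * x[1] + 1.0 * x[2]

N, d = 50000, 3
A = [[random.random() for _ in range(d)] for _ in range(N)]
B = [[random.random() for _ in range(d)] for _ in range(N)]

yA = [model(x) for x in A]
mean = sum(yA) / N
var = sum((y - mean) ** 2 for y in yA) / N

def first_order(i):
    """Pick-and-freeze estimate of the first-order Sobol' index S_i."""
    acc = 0.0
    for a, b in zip(A, B):
        ab = b[:]
        ab[i] = a[i]          # freeze coordinate i, resample the rest
        acc += model(a) * model(ab)
    return (acc / N - mean * mean) / var

S = [first_order(i) for i in range(d)]
```

The paper samples the space with a quasi-Monte Carlo sequence and evaluates the indices through a metamodel; here plain pseudo-random sampling and direct evaluation keep the sketch short.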

1991 ◽  
Vol 81 (3) ◽  
pp. 796-817
Author(s):  
Nitzan Rabinowitz ◽  
David M. Steinberg

Abstract We propose a novel multi-parameter approach for conducting seismic hazard sensitivity analysis. This approach allows one to assess the importance of each input parameter at a variety of settings of the other input parameters and thus provides a much richer picture than standard analyses, which assess each input parameter only at the default settings of the other parameters. We illustrate our method with a sensitivity analysis of seismic hazard for Jerusalem. In this example, we find several input parameters whose importance depends critically on the settings of other input parameters. This phenomenon, which cannot be detected by a standard sensitivity analysis, is easily diagnosed by our method. The multi-parameter approach can also be used in the context of a probabilistic assessment of seismic hazard that incorporates subjective probability distributions for the input parameters.
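The multi-parameter idea can be sketched as follows: the effect of each input is evaluated at every combination of the other inputs' levels, rather than only at their defaults. The three-parameter hazard function and its interaction are hypothetical, chosen only to show how a conditional importance pattern emerges:

```python
import itertools

# Hypothetical hazard stand-in with an interaction: the attenuation term
# only matters when the b-value weight is large.
def hazard(b, mmax, atten):
    return (mmax - 5.0) * (1.0 - b) + atten * b

levels = {
    "b":     [0.0, 0.5, 1.0],
    "mmax":  [6.0, 7.0, 8.0],
    "atten": [0.1, 0.3, 0.5],
}
names = list(levels)

def effect(target):
    """Output range as `target` varies, at each setting of the other inputs."""
    others = [n for n in names if n != target]
    rows = []
    for combo in itertools.product(*(levels[n] for n in others)):
        fixed = dict(zip(others, combo))
        ys = [hazard(**{**fixed, target: v}) for v in levels[target]]
        rows.append((fixed, max(ys) - min(ys)))
    return rows

# The effect of "atten" vanishes at b = 0 and peaks at b = 1; a standard
# analysis at default settings would report a single number instead.
for fixed, span in effect("atten"):
    print(fixed, round(span, 3))
```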


2019 ◽  
Vol 37 (4-6) ◽  
pp. 377-433
Author(s):  
Tatenda Nyazika ◽  
Maude Jimenez ◽  
Fabienne Samyn ◽  
Serge Bourbigot

Over recent years, pyrolysis models have moved from simple thermal models to comprehensive models with great flexibility, including multi-step decomposition reactions. The downside, however, is the need for a complete set of input data, such as the material properties and the parameters related to the decomposition kinetics. Some of these parameters are not directly measurable or are difficult to determine, and they carry a certain degree of uncertainty at high temperatures, especially for materials that can melt, shrink, or swell. One can obtain input parameters by searching the literature; however, materials with the same nomenclature may have properties that vary by manufacturer, thereby inducing uncertainties in the model. Modelers have therefore resorted to optimization techniques, such as gradient-based and direct search methods, to estimate input parameters from experimental bench-scale data. As an integral part of the model, a sensitivity study makes it possible to identify the role of each input parameter in the outputs. This work presents an overview of pyrolysis modeling, sensitivity analysis, and optimization techniques used to predict the fire behavior of combustible solids exposed to an external heat flux.
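A minimal sketch of such direct-search parameter estimation, assuming a single first-order decomposition reaction fitted to a synthetic, noise-free mass-loss curve; the rate constant and the simple compass-search optimizer are illustrative, not any specific published scheme:

```python
import math

# Synthetic "bench-scale" conversion curve from a first-order reaction
# with known rate k_true (hypothetical stand-in for TGA data).
k_true = 0.05                                   # 1/s
times = [t * 10.0 for t in range(21)]           # 0..200 s
data = [1.0 - math.exp(-k_true * t) for t in times]

def sse(k):
    """Sum of squared errors between the model curve and the data."""
    return sum((1.0 - math.exp(-k * t) - d) ** 2 for t, d in zip(times, data))

def direct_search(x0, step=0.1, tol=1e-8):
    """Minimal 1-D compass search: poll a step each way, shrink on failure."""
    x, fx = x0, sse(x0)
    while step > tol:
        improved = False
        for cand in (x + step, x - step):
            if cand > 0 and sse(cand) < fx:
                x, fx = cand, sse(cand)
                improved = True
        if not improved:
            step *= 0.5
    return x

k_est = direct_search(0.5)   # recovers k_true from the synthetic data
```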


2021 ◽  
Author(s):  
Séga Ndao

In the context of the Paris Agreement, and considering the importance of methane emissions from cattle in West Africa, application of a Tier 2 method to estimate enteric methane emission factors is clearly pertinent. The current study has two purposes. Firstly, it aims to determine how much each input parameter contributes to the overall uncertainty of enteric methane emission factors for cattle. Secondly, it aims to identify which input parameters require additional research effort to strengthen the evidence base and thus reduce the uncertainty of enteric methane emission factors. Uncertainty and sensitivity analysis methodologies were applied to the input parameters used in calculating enteric methane emission factors for lactating cows and adult male Senegalese native cattle with the IPCC Tier 2 model. The results show that the IPCC default input parameters, namely the coefficient for calculating net energy for maintenance (Cfi), digestible energy (DE) and the methane conversion factor (Ym), are the first, second and third most important input parameters, respectively, in terms of their contribution to the uncertainty of the enteric methane emission factor. The sensitivity analysis demonstrated that future research in Senegal should prioritize country-specific values for Ym, Cfi and DE in order to estimate enteric methane emission factors more accurately and to reduce the uncertainty of the national agricultural greenhouse gas inventory.
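The propagation of input uncertainty through a Tier 2 calculation can be sketched with a simplified Monte Carlo that keeps only the maintenance term of gross energy. The live weight, REM value and sampling ranges below are hypothetical placeholders, not the Senegalese data; the final conversion (55.65 MJ per kg CH4, annualized) follows the structure of IPCC equation 10.21:

```python
import random
import statistics

random.seed(1)

# Simplified Tier 2 sketch keeping only the maintenance term of gross
# energy. BW, REM and the sampling ranges are assumptions for illustration.
BW = 250.0   # live weight, kg (assumed)
REM = 0.5    # ratio of net energy available in the diet for maintenance (assumed)

def ef_ch4(cfi, de, ym):
    ne_m = cfi * BW ** 0.75                    # net energy for maintenance, MJ/day
    ge = (ne_m / REM) / (de / 100.0)           # gross energy intake, MJ/day
    return ge * (ym / 100.0) * 365.0 / 55.65   # kg CH4 per head per year

def draw():
    cfi = random.uniform(0.306, 0.338)   # +/- 5% around the 0.322 default
    de = random.uniform(55.0, 65.0)      # digestible energy, %
    ym = random.uniform(5.5, 7.5)        # methane conversion factor, %
    return ef_ch4(cfi, de, ym)

efs = [draw() for _ in range(10000)]
mean_ef = statistics.fmean(efs)
rel_unc = statistics.pstdev(efs) / mean_ef   # relative uncertainty of the EF
```

Freezing one input at its default while sampling the others, and comparing the resulting spreads, gives the per-parameter uncertainty contributions the study ranks.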


Author(s):  
Emmanuel Boafo ◽  
Emmanuel Numapau Gyamfi

Abstract Uncertainty and sensitivity analysis methods are often used in severe accident analysis to validate the complex physical models employed in the system codes that simulate such scenarios. This is necessitated by the large uncertainties associated with the physical models and boundary conditions used to simulate severe accident scenarios. The input parameters are sampled within defined ranges, based on assigned probability distribution functions (PDFs), for the required number of code runs/realizations using stochastic sampling techniques. Input parameters are selected according to their importance to the key figures of merit (FOMs), as determined by a phenomena identification and ranking table (PIRT). Sensitivity analysis investigates the contribution of each uncertain input parameter to the uncertainty of the selected FOM. In this study, the integrated severe accident analysis code MELCOR was coupled with DAKOTA, an optimization and uncertainty quantification tool, in order to investigate the effect of input parameter uncertainty on hydrogen generation. The methodology was applied to the Fukushima Daiichi Unit 1 NPP accident scenario, which was modelled in another study. The results show approximately 22.46% uncertainty in the amount of hydrogen generated, as estimated by a single MELCOR run, given the uncertainty in the selected input parameters. The sensitivity analysis results also reveal that the MELCOR input parameters COR_SC 1141 (melt flow rate per unit width at breakthrough candling), COR_ZP (porosity of fuel debris beds) and COR_EDR (characteristic debris size in the core region) contributed most significantly to the uncertainty in hydrogen generation.
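The sampling-and-propagation loop can be caricatured in a few lines. The response surface below is a hypothetical stand-in for a MELCOR run, and the parameter PDFs are invented for illustration; the real analysis draws realizations with DAKOTA and re-runs MELCOR for each one:

```python
import random
import statistics

random.seed(2)

# Stand-in response surface for the hydrogen-mass FOM; the real values come
# from coupled MELCOR/DAKOTA runs, and the PDFs below are hypothetical.
def hydrogen_kg(flow_rate, porosity, debris_size):
    return 300.0 + 400.0 * flow_rate + 150.0 * porosity - 200.0 * debris_size

def realization():
    flow = random.uniform(0.1, 0.3)           # COR_SC 1141 analogue
    por = random.triangular(0.3, 0.5, 0.4)    # COR_ZP analogue
    size = random.uniform(0.05, 0.15)         # COR_EDR analogue
    return hydrogen_kg(flow, por, size)

runs = [realization() for _ in range(5000)]
mean_h2 = statistics.fmean(runs)
rel_unc = statistics.pstdev(runs) / mean_h2              # relative uncertainty
s = sorted(runs)
lo, hi = s[int(0.025 * len(s))], s[int(0.975 * len(s))]  # ~95% interval
```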


2021 ◽  
Vol 12 (1) ◽  
Author(s):  
Ryan Roussel ◽  
Juan Pablo Gonzalez-Aguilera ◽  
Young-Kee Kim ◽  
Eric Wisniewski ◽  
Wanming Liu ◽  
...  

Abstract Particle accelerators are invaluable discovery engines in the chemical, biological and physical sciences. Characterizing the response of the accelerated beam to accelerator input parameters is often the first step in accelerator-based experiments. Currently used characterization techniques, such as grid-like parameter scans, become impractical when extended to higher-dimensional input spaces, when complicated measurement constraints are present, or when little prior information about the beam response is available. In this work, we describe an adaptation of the popular Bayesian optimization algorithm that enables turn-key exploration of input parameter spaces. Our algorithm replaces the need for parameter scans while minimizing the prior information needed about the measurement's behavior and associated measurement constraints. We experimentally demonstrate that our algorithm autonomously conducts an adaptive, multi-parameter exploration of input parameter space, potentially orders of magnitude faster than conventional grid-like parameter scans, while making highly constrained, single-shot beam phase-space measurements and accounting for costs associated with changing input parameters. In addition to applications in accelerator-based scientific experiments, this algorithm addresses challenges shared by many scientific disciplines and is thus applicable to autonomously conducting experiments over a broad range of research topics.
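The flavor of such an exploration loop can be sketched with a greatly simplified acquisition rule: the Gaussian-process posterior uncertainty is replaced by the distance to the nearest measured point, and the cost of moving the accelerator inputs by a proximal penalty. All numbers are illustrative, not the paper's algorithm:

```python
import math

def dist(p, q):
    return math.dist(p, q)

# Candidate grid over a 2-D input space (a stand-in for two accelerator
# settings normalized to [0, 1]).
grid = [(i / 20.0, j / 20.0) for i in range(21) for j in range(21)]

observed = [(0.5, 0.5)]   # start from the current machine setting
current = observed[-1]
LAM = 0.3                 # hypothetical weight on the cost of moving inputs

for _ in range(30):
    def acquisition(x):
        # Distance to the nearest measured point stands in for GP posterior
        # uncertainty; the proximal term penalizes large input changes.
        return min(dist(x, o) for o in observed) - LAM * dist(x, current)
    nxt = max(grid, key=acquisition)
    observed.append(nxt)
    current = nxt

# Fill distance: how far the worst grid point is from its nearest sample.
# After 30 steps the samples spread over the square instead of clustering
# at the start point, with no grid scan required.
coverage = max(min(dist(g, o) for o in observed) for g in grid)
```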


2016 ◽  
Vol 13 (2) ◽  
Author(s):  
Sheikh Tijan Tabban ◽  
Nelson Fumo

Energy models of buildings can be developed and used to analyze energy consumption. A model offers the opportunity to simulate a building under specific conditions for the analysis of energy efficiency measures or optimum design. Because a great amount of information is needed to develop an energy model of a building, the number of inputs can be reduced by treating the most relevant input parameters as variables and fixing the others at common or standard values. In this study, an analysis of the input parameters required by computational tools to estimate energy consumption in homes was carried out in two stages. In the first stage, common input parameters were identified for three software packages and three webtools, based on the criterion that an input parameter should be common to at least two of the software packages and at least one webtool. In the second stage, a sensitivity analysis was performed on the inputs identified in the first stage. The software BEopt, developed by the National Renewable Energy Laboratory, was used as the source of typical input parameters to be compared and to perform the simulations for the sensitivity analysis. The base or reference model for the sensitivity simulations was developed with information from a research house located on the campus of the University of Texas at Tyler and default inputs from the BEopt B-10 reference benchmark. Results show that, besides the location and consequently the weather, common parameters are building orientation, air leakage, space conditioning settings, space conditioning schedule, water heating equipment, and terrain. Among these parameters, the sensitivity analysis identified the largest variations in energy consumption for variations in the space conditioning schedule (heating and cooling setpoints), followed by the type of water heating equipment. KEYWORDS: Residential Buildings; Energy Consumption; Energy Analysis; Input Parameters; Building Simulation; Source Energy
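The second-stage screening can be sketched as a one-at-a-time sweep around a base model. The linear surrogate below is hypothetical, standing in for BEopt, and merely mimics the reported finding that setpoints dominate while orientation is weak:

```python
# One-at-a-time sensitivity sketch; parameter names and the surrogate
# response are assumptions for illustration, not BEopt itself.
base = {
    "heating_setpoint_C": 20.0,
    "cooling_setpoint_C": 24.0,
    "ach50_air_leakage": 7.0,
    "orientation_deg": 180.0,
}

def annual_source_energy(p):
    # Hypothetical response: setpoints dominate, orientation is weak.
    return (60.0 + 3.0 * (22.0 - p["cooling_setpoint_C"])
            + 2.5 * (p["heating_setpoint_C"] - 18.0)
            + 0.8 * p["ach50_air_leakage"]
            + 0.01 * abs(p["orientation_deg"] - 180.0))

def oat_sensitivity(param, rel=0.10):
    """Relative output change for a +/-10% change of one input."""
    lo, hi = dict(base), dict(base)
    lo[param] *= 1.0 - rel
    hi[param] *= 1.0 + rel
    e0 = annual_source_energy(base)
    return abs(annual_source_energy(hi) - annual_source_energy(lo)) / e0

ranking = sorted(base, key=oat_sensitivity, reverse=True)
```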


2011 ◽  
Vol 243-249 ◽  
pp. 4068-4074 ◽  
Author(s):  
Tammam Merhej ◽  
De Cheng Feng

The Federal Aviation Administration Rigid and Flexible Iterative Elastic Layered Design (FAARFIELD) software program became the exclusive approved method for airport pavement thickness design adopted by the Federal Aviation Administration (FAA) in the United States after Advisory Circular AC 150/5320-6E, "Airport Pavement Design and Evaluation", was issued in September 2009. In this paper, a sensitivity analysis was conducted to investigate the effect of FAARFIELD input parameters on the required thickness of rigid airport pavement. The input parameters studied are the concrete flexural strength (modulus of rupture, MOR), the modulus of subgrade reaction, K, the subbase layers and the air traffic mix. Each input parameter was varied within its recommended range to study its effect on the required pavement thickness. It was found that the concrete modulus of rupture is the parameter to which the required thickness is most sensitive.
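Such a sweep can be sketched in normalized form, where each input is perturbed around its design value and the sensitivity is expressed as d(ln t)/d(ln p). The thickness function below is a hypothetical power law, not the FAARFIELD layered-elastic computation, so the recovered sensitivities are simply the assumed exponents:

```python
# Hypothetical power-law stand-in for the design-thickness computation;
# the exponents and design values are assumptions for illustration.
def slab_thickness_mm(mor_mpa, k_mpa_per_m):
    return 900.0 * (4.5 / mor_mpa) ** 0.8 * (80.0 / k_mpa_per_m) ** 0.15

design = {"mor_mpa": 4.5, "k_mpa_per_m": 80.0}

def log_sensitivity(param, rel=0.05):
    """Normalized sensitivity d(ln t)/d(ln p) by central difference."""
    hi, lo = dict(design), dict(design)
    hi[param] *= 1.0 + rel
    lo[param] *= 1.0 - rel
    t_hi = slab_thickness_mm(**hi)
    t_lo = slab_thickness_mm(**lo)
    t0 = slab_thickness_mm(**design)
    return (t_hi - t_lo) / t0 / (2.0 * rel)

s_mor = log_sensitivity("mor_mpa")     # ~ -0.8: strength dominates
s_k = log_sensitivity("k_mpa_per_m")   # ~ -0.15: weaker influence
```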


Algorithms ◽  
2020 ◽  
Vol 13 (7) ◽  
pp. 162
Author(s):  
Marion Gödel ◽  
Rainer Fischer ◽  
Gerta Köster

Microscopic crowd simulation can help to enhance the safety of pedestrians in situations that range from museum visits to music festivals. To obtain a useful prediction, the input parameters must be chosen carefully. In many cases, a lack of knowledge or limited measurement accuracy adds uncertainty to the input. In addition, for meaningful parameter studies, we first need to identify the most influential parameters of our parametric computer models. The field of uncertainty quantification offers standardized and fully automated methods that we believe to be beneficial for pedestrian dynamics. In addition, many methods come at a comparatively low cost, even for computationally expensive problems, which allows for their application to larger scenarios. We aim to identify and adapt suitable methods for microscopic crowd simulation in order to explore their potential in pedestrian dynamics. In this work, we first perform a variance-based sensitivity analysis using Sobol' indices and then crosscheck the results with a derivative-based measure, the activity scores. We apply both methods to a typical scenario in crowd simulation, a bottleneck. Because constrictions can lead to high crowd densities and delays in evacuations, several experiments and simulation studies have been conducted for this setting. We show qualitative agreement between the results of both methods. Additionally, we identify a one-dimensional subspace in the input parameter space and discuss its impact on the simulation. Moreover, we analyze and interpret the sensitivity indices with respect to the bottleneck scenario.
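A lightweight cousin of the activity scores used for the crosscheck is the derivative-based global sensitivity measure ν_i = E[(∂f/∂x_i)²], which can be estimated with finite differences at random points. The quadratic test function below is a hypothetical stand-in for the crowd model; its derivative-based ranking matches its variance-based one:

```python
import random

random.seed(4)

# Hypothetical stand-in model: x0 dominates, x2 is nearly inert.
def f(x):
    return 5.0 * x[0] ** 2 + 2.0 * x[1] + 0.3 * x[2]

d, N, h = 3, 4000, 1e-4

def dgsm(i):
    """Monte Carlo estimate of nu_i = E[(df/dx_i)^2] via forward differences."""
    acc = 0.0
    for _ in range(N):
        x = [random.random() for _ in range(d)]
        xp = x[:]
        xp[i] += h
        acc += ((f(xp) - f(x)) / h) ** 2
    return acc / N

nu = [dgsm(i) for i in range(d)]   # expected ordering: x0 >> x1 >> x2
```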

