A Sensitivity Analysis of Two Mesoscale Models: COAMPS and WRF

2020 ◽  
Vol 148 (7) ◽  
pp. 2997-3014
Author(s):  
Caren Marzban ◽  
Robert Tardif ◽  
Scott Sandgathe

Abstract. A sensitivity analysis methodology recently developed by the authors is applied to COAMPS and WRF. The method involves varying model parameters according to Latin Hypercube Sampling and developing multivariate multiple regression models that map the model parameters to forecasts over a spatial domain. The regression coefficients, and the p values testing whether the coefficients are zero, serve as measures of sensitivity of the forecasts with respect to the model parameters. Nine model parameters are selected from COAMPS and WRF, and their impact is examined on nine forecast quantities (water vapor, convective and gridscale precipitation, and air temperature and wind speed at three altitudes). Although the conclusions depend on the model parameters and the specific forecast quantities, it is shown that sensitivity to model parameters is often accompanied by nontrivial spatial structure, which itself depends on the underlying forecast model (i.e., COAMPS vs. WRF). One specific difference between these models is in their sensitivity with respect to a parameter that controls temperature increments in the Kain–Fritsch trigger function: whereas this parameter has a distinct spatial structure in COAMPS, that structure is completely absent in WRF. The differences between COAMPS and WRF also extend to the quality of the statistical models used to assess sensitivity; specifically, the differences are largest over the waters off the southeastern coast of the United States. The implication of these findings is twofold: not only is the spatial structure of the sensitivities different between COAMPS and WRF, but the underlying relationship between the model parameters and the forecasts also differs between the two models.
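
As a rough illustration of the regression-based sensitivity machinery described in this abstract, the sketch below draws a Latin hypercube sample of parameters, runs a stand-in forecast function, and fits one regression per grid point, reading off slopes and p values as sensitivity measures. The run_forecast function, parameter count, and bounds are hypothetical placeholders, not the actual COAMPS/WRF setup.

```python
import numpy as np
from scipy.stats import qmc
import statsmodels.api as sm

# Hypothetical stand-in for a single COAMPS/WRF-style run: maps a parameter
# vector to a forecast field over a (small) spatial grid.
def run_forecast(theta, grid_size=25):
    rng = np.random.default_rng(abs(hash(tuple(np.round(theta, 6)))) % 2**32)
    weights = np.linspace(0.1, 1.0, grid_size)
    return np.outer(weights, theta).sum(axis=1) + 0.1 * rng.standard_normal(grid_size)

n_params, n_runs = 9, 99
l_bounds, u_bounds = np.zeros(n_params), np.ones(n_params)  # illustrative bounds

# Latin Hypercube Sample of the parameter space
sampler = qmc.LatinHypercube(d=n_params, seed=0)
theta = qmc.scale(sampler.random(n=n_runs), l_bounds, u_bounds)

# One forecast field per parameter set
forecasts = np.array([run_forecast(t) for t in theta])   # shape (n_runs, grid_size)

# Multivariate multiple regression: one OLS fit per grid point; the slopes and
# their p values serve as sensitivity measures, mapped back over the domain.
X = sm.add_constant(theta)
coefs = np.empty((forecasts.shape[1], n_params))
pvals = np.empty_like(coefs)
for j in range(forecasts.shape[1]):
    fit = sm.OLS(forecasts[:, j], X).fit()
    coefs[j], pvals[j] = fit.params[1:], fit.pvalues[1:]

print("grid points with p < 0.05 for parameter 0:", int((pvals[:, 0] < 0.05).sum()))
```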

2021 ◽  
Author(s):  
Sabine Bauer ◽  
Ivanna Kramer

Knowledge of the impact of structure-specific parameters on the biomechanical behavior of a computer model is essential for realistic modeling and for improving the system. In particular, the biomechanical parameters of the intervertebral discs, the ligamentous structures, and the facet joints are regarded in the literature as significant components of a spine model that determine its quality. It is therefore important to understand how variations of the input parameters for these components affect the entire model and its individual structures. Sensitivity analysis can be used to gain the required knowledge about the correlation between the input and output variables in a complex spinal model. The present study analyses the influence of the biomechanical parameters of the intervertebral disc using different sensitivity analysis methods in order to optimize the spine model parameters. The analysis is performed with a multi-body simulation model of the cervical functional spinal unit C6-C7.
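
A minimal sketch of one elementary sensitivity-analysis approach (one-at-a-time perturbation) in the spirit described above. The simulate_c6c7 function and the disc parameters (k_lin, k_rot, damping) are hypothetical placeholders, not the study's multi-body model or its actual parameter set.

```python
import numpy as np

# Hypothetical stand-in for the multi-body C6-C7 simulation: maps disc
# parameters to a scalar output (e.g., segmental rotation under a fixed moment).
def simulate_c6c7(params):
    k_lin, k_rot, damping = params["k_lin"], params["k_rot"], params["damping"]
    return 10.0 / (1.0 + 0.002 * k_lin + 0.05 * k_rot + 0.01 * damping)

baseline = {"k_lin": 500.0, "k_rot": 0.5, "damping": 1.0}  # illustrative values
y0 = simulate_c6c7(baseline)

# One-at-a-time sensitivity: relative output change per relative parameter change,
# averaged over a -10 % and a +10 % perturbation of each parameter.
for name in baseline:
    indices = []
    for factor in (0.9, 1.1):
        p = dict(baseline)
        p[name] = baseline[name] * factor
        indices.append((simulate_c6c7(p) - y0) / y0 / (factor - 1.0))
    print(f"{name}: normalised sensitivity = {np.mean(indices):+.3f}")
```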


2019 ◽  
Author(s):  
Mohsen Yaghoubi ◽  
Amin Adibi ◽  
Zafar Zafari ◽  
J Mark FitzGerald ◽  
Shawn D. Aaron ◽  
...  

Abstract. Background: Asthma diagnosis in the community is often made without objective testing. Objective: The aim of this study was to evaluate the cost-effectiveness of implementing a stepwise objective diagnostic verification algorithm among patients with community-diagnosed asthma in the United States (US). Methods: We developed a probabilistic time-in-state cohort model that compared a stepwise asthma verification algorithm based on spirometry and methacholine challenge test against the current standard of care over 20 years. Model input parameters were informed from the literature and with original data analyses when required. The target population was US adults (≥15 y/o) with physician-diagnosed asthma. The final outcomes were costs (in 2018 $) and quality-adjusted life years (QALYs), discounted at 3% annually. Deterministic and probabilistic analyses were undertaken to examine the effect of alternative assumptions and uncertainty in model parameters on the results. Results: In a simulated cohort of 10,000 adults with diagnosed asthma, the stepwise algorithm resulted in the removal of diagnosis in 3,366. This was projected to be associated with savings of $36.26 million in direct costs and a gain of 4,049.28 QALYs over 20 years. Extrapolating these results to the US population indicated an undiscounted potential savings of $56.48 billion over 20 years. Results were robust against alternative assumptions and plausible changes in values of input parameters. Conclusion: Implementation of a simple diagnostic testing algorithm to verify asthma diagnosis might result in substantial savings and improvement in patients' quality of life. Key messages: Compared with current standards of practice, the implementation of an asthma verification algorithm among US adults with diagnosed asthma can be associated with reduction in costs and gain in quality of life. There is substantial room for improving patient care and outcomes through promoting objective asthma diagnosis. Capsule summary: Asthma 'overdiagnosis' is common among US adults. An objective, stepwise verification algorithm for re-evaluation of asthma diagnosis can result in substantial savings in costs and improvements in quality of life.
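
A minimal sketch of the 3% annual discounting applied to costs and QALYs in a time-in-state cohort model of this kind. The annual cost and utility values are illustrative placeholders, not the study's inputs.

```python
# Discounted totals over a 20-year horizon at 3 % per year, as used for the
# cost and QALY outcomes above. Annual inputs are hypothetical placeholders.
ANNUAL_COST = 3100.0   # $ per patient-year (hypothetical)
ANNUAL_UTILITY = 0.85  # QALY weight per patient-year (hypothetical)
DISCOUNT_RATE = 0.03
HORIZON_YEARS = 20

def discounted_total(annual_value, rate=DISCOUNT_RATE, years=HORIZON_YEARS):
    # Value accrued in year t is divided by (1 + r)**t.
    return sum(annual_value / (1.0 + rate) ** t for t in range(years))

print(f"discounted costs per patient: ${discounted_total(ANNUAL_COST):,.0f}")
print(f"discounted QALYs per patient: {discounted_total(ANNUAL_UTILITY):.2f}")
```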


1998 ◽  
Vol 84 (6) ◽  
pp. 2070-2088 ◽  
Author(s):  
Thien D. Bui ◽  
Donald Dabdub ◽  
Steven C. George

The steady-state exchange of inert gases across an in situ canine trachea has recently been shown to be limited equally by diffusion and perfusion over a wide range (0.01–350) of blood solubilities (β_blood; ml·ml⁻¹·atm⁻¹). Hence, we hypothesize that the exchange of ethanol (β_blood = 1,756 at 37°C) in the airways depends on the blood flow rate from the bronchial circulation. To test this hypothesis, the dynamics of the bronchial circulation were incorporated into an existing model that describes the simultaneous exchange of heat, water, and a soluble gas in the airways. A detailed sensitivity analysis of key model parameters was performed by using the method of Latin hypercube sampling. The model accurately predicted a previously reported experimental exhalation profile of ethanol (R² = 0.991) as well as the end-exhalation airstream temperature (34.6°C). The model predicts that 27, 29, and 44% of exhaled ethanol in a single exhalation are derived from the tissues of the mucosa and submucosa, the bronchial circulation, and the tissue exterior to the submucosa (which would include the pulmonary circulation), respectively. Although the concentration of ethanol in the bronchial capillary decreased during inspiration, the three key model outputs (end-exhaled ethanol concentration, the slope of phase III, and end-exhaled temperature) were all statistically insensitive (P > 0.05) to the parameters describing the bronchial circulation. In contrast, the model outputs were all sensitive (P < 0.05) to the thickness of tissue separating the core body conditions from the bronchial smooth muscle. We conclude that both the bronchial circulation and the pulmonary circulation impact soluble gas exchange when the entire conducting airway tree is considered.


2020 ◽  
Vol 13 (10) ◽  
pp. 4691-4712
Author(s):  
Chia-Te Chien ◽  
Markus Pahlow ◽  
Markus Schartau ◽  
Andreas Oschlies

Abstract. We analyse 400 perturbed-parameter simulations for two configurations of an optimality-based plankton–ecosystem model (OPEM), implemented in the University of Victoria Earth System Climate Model (UVic-ESCM), using a Latin hypercube sampling method for setting up the parameter ensemble. A likelihood-based metric is introduced for model assessment and selection of the model solutions closest to observed distributions of NO₃⁻, PO₄³⁻, O₂, and surface chlorophyll a concentrations. The simulations closest to the data with respect to our metric exhibit very low rates of global N₂ fixation and denitrification, indicating that in order to achieve rates consistent with independent estimates, additional constraints have to be applied in the calibration process. For identifying the reference parameter sets, we therefore also consider the model's ability to represent current estimates of water-column denitrification. We employ our ensemble of model solutions in a sensitivity analysis to gain insights into the importance and role of individual model parameters as well as correlations between various biogeochemical processes and tracers, such as POC export and the NO₃⁻ inventory. Global O₂ varies by a factor of 2 and NO₃⁻ by more than a factor of 6 among all simulations. Remineralisation rate is the most important parameter for O₂, which is also affected by the subsistence N quota of ordinary phytoplankton (Q^N_0,phy) and zooplankton maximum specific ingestion rate. Q^N_0,phy is revealed as a major determinant of the oceanic NO₃⁻ pool. This indicates that unravelling the driving forces of variations in phytoplankton physiology and elemental stoichiometry, which are tightly linked via Q^N_0,phy, is a prerequisite for understanding the marine nitrogen inventory.
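
A rough sketch of a likelihood-based model-data metric of the kind described above, assuming independent Gaussian observational errors. The tracer arrays, error estimates, and ensemble values are random placeholders rather than UVic-ESCM output, and the paper's actual metric may differ in form.

```python
import numpy as np

def gaussian_log_likelihood(model, obs, sigma):
    # Independent-Gaussian log-likelihood of the observations given one model solution.
    resid = (model - obs) / sigma
    return -0.5 * np.sum(resid**2 + np.log(2.0 * np.pi * sigma**2))

rng = np.random.default_rng(1)
n_members, n_obs = 400, 1000
obs = {t: rng.normal(size=n_obs) for t in ("NO3", "PO4", "O2", "chl")}
sigma = {t: np.full(n_obs, 0.5) for t in obs}                      # observational error (illustrative)
ensemble = {t: rng.normal(size=(n_members, n_obs)) for t in obs}   # simulated tracer fields

# Total metric per ensemble member: sum of per-tracer log-likelihoods.
score = np.zeros(n_members)
for tracer in obs:
    for m in range(n_members):
        score[m] += gaussian_log_likelihood(ensemble[tracer][m], obs[tracer], sigma[tracer])

best = np.argsort(score)[::-1][:10]   # the ten members closest to the observations
print("highest-likelihood ensemble members:", best)
```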


2006 ◽  
Vol 8 (3) ◽  
pp. 223-234 ◽  
Author(s):  
Husam Baalousha

Groundwater modelling involves significant uncertainty because of estimation errors and other sources of uncertainty. Deterministic models do not account for uncertainties in model parameters and thus lead to questionable output. The main alternatives to deterministic models are probabilistic models and perturbation methods such as Monte Carlo Simulation (MCS). Unfortunately, these methods have many drawbacks when applied to risk analysis of groundwater pollution. In this paper, a modified Latin Hypercube Sampling method is presented and used for risk, uncertainty, and sensitivity analysis of groundwater pollution. The results are compared with those of other sampling methods. The proposed method predicts the groundwater contamination risk across the full range of probability better than the other methods, while maintaining the accuracy of the mean estimate. The sensitivity analysis reveals that the contaminant concentration is more sensitive to longitudinal dispersivity than to velocity.
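
A rough sketch of standard (not the paper's modified) Latin Hypercube Sampling applied to a groundwater risk question: estimating the probability that a contaminant concentration exceeds a limit and ranking the influence of dispersivity versus velocity. The concentration function, input distributions, and threshold are toy placeholders.

```python
import numpy as np
from scipy.stats import qmc, norm, spearmanr

# Hypothetical attenuation model: spreading controlled by longitudinal
# dispersivity aL, with a mild dependence on seepage velocity v.
def concentration(v, aL, c0=100.0, x=50.0):
    return c0 * np.exp(-x / (20.0 * aL)) * np.minimum(v, 1.5) ** 0.2

n = 2000
sampler = qmc.LatinHypercube(d=2, seed=0)
u = sampler.random(n)

# Map uniform LHS samples onto the input distributions (illustrative lognormals).
v  = np.exp(norm.ppf(u[:, 0], loc=np.log(0.5), scale=0.3))
aL = np.exp(norm.ppf(u[:, 1], loc=np.log(5.0), scale=0.5))

c = concentration(v, aL)
threshold = 60.0   # illustrative regulatory limit
print("estimated P(concentration > limit):", np.mean(c > threshold))

# Rank-based sensitivity, mirroring the dispersivity-vs-velocity comparison above.
print("Spearman rho, dispersivity:", round(spearmanr(aL, c)[0], 3))
print("Spearman rho, velocity:    ", round(spearmanr(v, c)[0], 3))
```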


Atmosphere ◽  
2018 ◽  
Vol 9 (8) ◽  
pp. 296 ◽  
Author(s):  
Adam Kochanski ◽  
Aimé Fournier ◽  
Jan Mandel

Observational data collected during experiments, such as the planned Fire and Smoke Model Evaluation Experiment (FASMEE), are critical for evaluating and transitioning coupled fire-atmosphere models like WRF-SFIRE and WRF-SFIRE-CHEM into operational use. Historical meteorological data, representing typical weather conditions for the anticipated burn locations and times, have been processed to initialize and run a set of simulations representing the planned experimental burns. Based on an analysis of these numerical simulations, this paper provides recommendations on the experimental setup, such as the size and duration of the burns and optimal sensor placement. New techniques are developed to initialize coupled fire-atmosphere simulations with weather conditions typical of the planned burn locations and times. A variation and sensitivity analysis of the simulation design with respect to the model parameters, performed by repeated Latin Hypercube Sampling, is used to assess candidate sensor locations. The simulations identify the measurement locations that maximize the expected variation of the sensor outputs as the model parameters are varied.
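
A rough sketch of the sensor-placement idea described above: rank candidate locations by the variance of a simulated signal across an LHS-perturbed parameter ensemble and keep the most informative ones. The ensemble here is a random placeholder rather than WRF-SFIRE output, and the response matrix is purely illustrative.

```python
import numpy as np
from scipy.stats import qmc

# Hypothetical ensemble: each LHS-perturbed run produces a field of a sensed
# quantity (e.g., plume concentration) at candidate sensor locations.
n_runs, n_locations = 40, 200
sampler = qmc.LatinHypercube(d=5, seed=0)
params = sampler.random(n_runs)                       # 5 illustrative model parameters

rng = np.random.default_rng(0)
response = rng.random((n_locations, 5))               # stand-in location/parameter response
ensemble = params @ response.T + 0.05 * rng.standard_normal((n_runs, n_locations))

# Rank candidate locations by the variance of the simulated signal across the
# parameter ensemble; high variance means the sensor is informative about the parameters.
variance = ensemble.var(axis=0)
best_locations = np.argsort(variance)[::-1][:10]
print("ten most informative sensor locations:", best_locations)
```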


2013 ◽  
Vol 17 (12) ◽  
pp. 5109-5125 ◽  
Author(s):  
J. D. Herman ◽  
J. B. Kollat ◽  
P. M. Reed ◽  
T. Wagener

Abstract. Distributed watershed models are now widely used in practice to simulate runoff responses at high spatial and temporal resolutions. Counter to this purpose, diagnostic analyses of distributed models currently aggregate performance measures in space and/or time and are thus disconnected from the models' operational and scientific goals. To address this disconnect, this study contributes a novel approach for computing and visualizing time-varying global sensitivity indices for spatially distributed model parameters. The high-resolution model diagnostics employ the method of Morris to identify evolving patterns in dominant model processes at sub-daily timescales over a six-month period. The method is demonstrated on the United States National Weather Service's Hydrology Laboratory Research Distributed Hydrologic Model (HL-RDHM) in the Blue River watershed, Oklahoma, USA. Three hydrologic events are selected from within the six-month period to investigate the patterns in spatiotemporal sensitivities that emerge as a function of forcing patterns as well as wet-to-dry transitions. Events with similar magnitudes and durations exhibit significantly different performance controls in space and time, indicating that the diagnostic inferences drawn from representative events will be heavily biased by the a priori selection of those events. By contrast, this study demonstrates high-resolution time-varying sensitivity analysis, requiring no assumptions regarding representative events and allowing modelers to identify transitions between sets of dominant parameters or processes a posteriori. The proposed approach details the dynamics of parameter sensitivity in nearly continuous time, providing critical diagnostic insights into the underlying model processes driving predictions. Furthermore, the approach offers the potential to identify transition points between dominant parameters and processes in the absence of observations, such as under nonstationarity.
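
A minimal hand-rolled sketch of the method of Morris (trajectory-based elementary effects) referred to above. The test function, number of trajectories, and grid settings are illustrative, and the sketch omits refinements such as optimized trajectories, grouped parameters, and the spatial/temporal aggregation used in the study.

```python
import numpy as np

# Hypothetical model standing in for one distributed-model output (e.g., simulated
# flow at one time step) as a function of k parameters scaled to [0, 1].
def model(x):
    return 3.0 * x[0] + x[1] ** 2 + 0.5 * np.sin(2 * np.pi * x[2]) + 0.1 * x[3] * x[4]

k, r, levels = 5, 20, 4
delta = levels / (2.0 * (levels - 1))     # standard Morris step on a p-level grid
rng = np.random.default_rng(0)

effects = np.zeros((r, k))
for t in range(r):
    # Random base point on the grid, chosen so that x + delta stays inside [0, 1].
    x = rng.integers(0, levels // 2, size=k) / (levels - 1)
    y = model(x)
    for i in rng.permutation(k):          # change one factor at a time along the trajectory
        x_new = x.copy()
        x_new[i] += delta
        y_new = model(x_new)
        effects[t, i] = (y_new - y) / delta
        x, y = x_new, y_new

mu_star = np.abs(effects).mean(axis=0)    # overall importance
sigma = effects.std(axis=0)               # interactions / non-linearity
for i in range(k):
    print(f"x{i}: mu* = {mu_star[i]:.2f}, sigma = {sigma[i]:.2f}")
```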


2013 ◽  
Vol 10 (86) ◽  
pp. 20121018 ◽  
Author(s):  
Jianyong Wu ◽  
Radhika Dhingra ◽  
Manoj Gambhir ◽  
Justin V. Remais

Sensitivity analysis (SA) can aid in identifying influential model parameters and optimizing model structure, yet infectious disease modelling has yet to adopt advanced SA techniques that are capable of providing considerable insight beyond traditional methods. We investigate five global SA methods (scatter plots, the Morris and Sobol' methods, Latin hypercube sampling-partial rank correlation coefficient, and the sensitivity heat map method) and detail their relative merits and pitfalls when applied to a microparasite (cholera) and a macroparasite (schistosomiasis) transmission model. The methods investigated yielded similar results with respect to identifying influential parameters, but offered specific insights that vary by method. The classical methods differed in their ability to provide information on the quantitative relationship between parameters and model output, particularly over time. The heat map approach provides information about the group sensitivity of all model state variables, and the parameter sensitivity spectrum obtained using this method reveals the sensitivity of all state variables to each parameter over the course of the simulation period, which is especially valuable for expressing the dynamic sensitivity of a microparasite epidemic model to its parameters. A summary comparison is presented to aid infectious disease modellers in selecting appropriate methods, with the goal of improving model performance and design.
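
A minimal sketch of the Latin hypercube sampling-partial rank correlation coefficient (LHS-PRCC) approach mentioned above. The transmission-model stand-in, parameter names, and ranges are illustrative placeholders, not the cholera or schistosomiasis models examined in the paper.

```python
import numpy as np
from scipy.stats import qmc, rankdata

# Toy stand-in for a transmission-model output (e.g., cumulative incidence)
# as a function of four illustrative parameters.
def model(p):
    beta, gamma, mu, kappa = p
    return beta / (gamma + mu) + 0.1 * kappa

n, names = 500, ["beta", "gamma", "mu", "kappa"]
sampler = qmc.LatinHypercube(d=4, seed=0)
X = qmc.scale(sampler.random(n), [0.1, 0.05, 0.01, 0.0], [1.0, 0.5, 0.1, 1.0])
y = np.array([model(p) for p in X])

def prcc(X, y, j):
    # Partial rank correlation of parameter j with the output: correlate the
    # residuals of rank(X_j) and rank(y) after regressing out the other parameters.
    R = np.column_stack([rankdata(X[:, i]) for i in range(X.shape[1])])
    ry = rankdata(y)
    others = np.delete(R, j, axis=1)
    A = np.column_stack([np.ones(len(ry)), others])
    res_j = R[:, j] - A @ np.linalg.lstsq(A, R[:, j], rcond=None)[0]
    res_y = ry - A @ np.linalg.lstsq(A, ry, rcond=None)[0]
    return np.corrcoef(res_j, res_y)[0, 1]

for j, name in enumerate(names):
    print(f"PRCC({name}) = {prcc(X, y, j):+.3f}")
```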


2015 ◽  
Vol 2015 ◽  
pp. 1-12 ◽  
Author(s):  
J. B. H. Njagarah ◽  
F. Nyabadza

A mathematical model for the dynamics of cholera transmission with permissible controls between two connected communities is developed and analysed. The dynamics of the disease in the adjacent communities are assumed to be similar, with the main differences reflected only in the transmission and disease-related parameters. This assumption is based on the fact that adjacent communities often have different living conditions, and movement is inclined toward the community with better living conditions. Community-specific reproduction numbers are given, assuming movement of susceptible, infected, and recovered individuals between communities. We carry out a sensitivity analysis of the model parameters using the Latin Hypercube Sampling scheme to ascertain the degree of effect the parameters and controls have on the progression of the infection. Using principles from optimal control theory, a temporal relationship between the distribution of controls and the severity of the infection is ascertained. Our results indicate that implementation of controls such as proper hygiene, sanitation, and vaccination across both affected communities is likely to annihilate the infection within half the time it would take through self-limitation. In addition, although an infection may still break out in the presence of controls, it may be up to 8 times less devastating than in the case where no controls are in place.
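
A rough sketch of how Latin Hypercube Sampling can be used to gauge the degree of effect of parameters on a reproduction number, as described above. The R0 expression, parameter names, and ranges are generic placeholders, not the two-community cholera model in this paper.

```python
import numpy as np
from scipy.stats import qmc, spearmanr

# Illustrative reproduction number for a cholera-type model with human-to-human
# (beta_h) and environment-to-human (beta_e) transmission; purely a placeholder.
def r0(beta_h, beta_e, gamma, delta):
    return beta_h / gamma + beta_e / (gamma * delta)

names = ["beta_h", "beta_e", "gamma", "delta"]
lower, upper = [0.05, 0.01, 0.05, 0.1], [0.5, 0.2, 0.25, 1.0]

# Latin Hypercube Sample of the parameter space, scaled to the assumed ranges.
sampler = qmc.LatinHypercube(d=4, seed=0)
samples = qmc.scale(sampler.random(1000), lower, upper)
R0 = np.array([r0(*row) for row in samples])

# Rank-based measure of the degree of effect of each parameter on R0.
for j, name in enumerate(names):
    rho = spearmanr(samples[:, j], R0)[0]
    print(f"{name}: Spearman rho = {rho:+.2f}")
```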

