Approximation and Uncertainty Quantification of Systems with Arbitrary Parameter Distributions Using Weighted Leja Interpolation

Algorithms ◽  
2020 ◽  
Vol 13 (3) ◽  
pp. 51
Author(s):  
Dimitrios Loukrezis ◽  
Herbert De Gersem

Approximation and uncertainty quantification methods based on Lagrange interpolation are typically abandoned in cases where the probability distributions of one or more system parameters are not normal, uniform, or closely related distributions, due to the computational issues that arise when one wishes to define interpolation nodes for general distributions. This paper examines the use of the recently introduced weighted Leja nodes for that purpose. Weighted Leja interpolation rules are presented, along with a dimension-adaptive sparse interpolation algorithm, to be employed in the case of high-dimensional input uncertainty. The performance and reliability of the suggested approach are verified by four numerical experiments, where the respective models feature extreme value and truncated normal parameter distributions. Furthermore, the suggested approach is compared with a well-established polynomial chaos method and found to be either comparable or superior in terms of approximation and statistics estimation accuracy.
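For illustration, here is a minimal sketch of how weighted Leja nodes can be computed greedily on a candidate grid, assuming the common formulation that weights the node-selection objective by the square root of the input PDF; the exact weighting and optimization used in the paper may differ.

```python
# Greedy weighted Leja sequence on a discrete candidate set (a sketch,
# assuming the sqrt-of-PDF weighting; not the paper's implementation).
import numpy as np
from scipy import stats

def weighted_leja_nodes(pdf, candidates, num_nodes):
    """Pick nodes maximizing sqrt(pdf(x)) * prod_j |x - x_j| over the grid."""
    weight = np.sqrt(pdf(candidates))
    nodes = [candidates[np.argmax(weight)]]      # start where the weight is largest
    for _ in range(num_nodes - 1):
        prod_term = np.prod(np.abs(candidates[:, None] - np.array(nodes)[None, :]), axis=1)
        nodes.append(candidates[np.argmax(weight * prod_term)])
    return np.array(nodes)

# Example: nodes for a truncated normal parameter (illustrative choice).
dist = stats.truncnorm(-2.0, 2.0)
grid = np.linspace(-2.0, 2.0, 2001)
print(weighted_leja_nodes(dist.pdf, grid, 7))
```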

Author(s):  
Djamalddine Boumezerane

Abstract In this study, we use possibility distributions as a basis for parameter uncertainty quantification in one-dimensional consolidation problems. A possibility distribution is the one-point coverage function of a random set and is viewed as containing both partial ignorance and uncertainty. The vagueness and scarcity of information available for characterizing the coefficient of consolidation in clay can be handled using possibility distributions. Possibility distributions can be constructed from existing data or based on transformations of probability distributions. An attempt is made to set out a systematic approach for estimating uncertainty propagation during the consolidation process. The measure of uncertainty is based on Klir's definition (1995). We compare the results with those obtained from other approaches (probabilistic…) and discuss the importance of using possibility distributions in this type of problem.
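As an illustration of one way a possibility distribution can be constructed from probabilistic data (not necessarily the construction used in the study), the sketch below applies the standard Dubois-Prade probability-to-possibility transformation to a discrete distribution.

```python
# Probability-to-possibility transformation for a discrete distribution
# (a standard construction; illustrative only).
import numpy as np

def probability_to_possibility(probabilities):
    """Map a discrete probability vector to a possibility vector."""
    p = np.asarray(probabilities, dtype=float)
    order = np.argsort(p)[::-1]                   # indices by decreasing probability
    tail_sums = np.cumsum(p[order][::-1])[::-1]   # sum of probabilities <= current one
    possibility = np.empty_like(p)
    possibility[order] = tail_sums                # pi_i = sum of p_j with p_j <= p_i
    return possibility

# Example: a coarse discretization of the coefficient of consolidation.
print(probability_to_possibility([0.5, 0.3, 0.15, 0.05]))
# -> [1.0, 0.5, 0.2, 0.05]
```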


1984 ◽  
Vol 1 (19) ◽  
pp. 164 ◽  
Author(s):  
A. Mol ◽  
R.L. Groeneveld ◽  
A.J. Waanders

This paper discusses the need to incorporate a reliability analysis in the design procedures for rubble mound breakwaters. Such an analysis is defined and a suggested approach is outlined. Failure mechanisms are analysed and categorized in Damage Event Trees. The probability of failure is computed using a level III simulation method in order to include time and cumulative effects and to account for skewed probability distributions. Typical outputs of the computer program are shown and compared with results from traditional design approaches. The paper concludes that there is a definite need to include reliability analysis in the design procedures for larger breakwaters and that such an analysis must consider the accuracy of design parameters and methods.
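A hypothetical sketch of the level III idea follows: estimate a failure probability by direct Monte Carlo simulation with a skewed load distribution. The limit state, distributions and parameter values are illustrative only and unrelated to the paper's breakwater data.

```python
# Level III (fully probabilistic) Monte Carlo estimate of a failure
# probability with a skewed load distribution (illustrative sketch).
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Skewed annual-maximum wave height (Gumbel) versus armour resistance (normal).
wave_height = rng.gumbel(loc=4.0, scale=0.6, size=n)   # m, illustrative assumption
resistance = rng.normal(loc=6.0, scale=0.5, size=n)    # m, illustrative assumption

failures = wave_height > resistance                    # limit state g = R - S < 0
print("estimated annual failure probability:", failures.mean())
```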


Author(s):  
D. Bigoni ◽  
A. P. Engsig-Karup ◽  
H. True

This paper describes the results of applying Uncertainty Quantification methods to a simple railroad vehicle dynamical example. Uncertainty Quantification methods take into account the probability distributions of the system parameters that stem from the parameter tolerances. In this paper the methods are applied to a low-dimensional vehicle dynamical model composed of a two-axle truck connected to a car body by a lateral spring, a lateral damper and a torsional spring, all with linear characteristics. These characteristics are not deterministically defined but are instead described by probability distributions. The model, with deterministically defined parameters, was studied in [1] and [2]; this article focuses on the calculation of the critical speed of the model when the distributions of the parameters are taken into account. Results of the traditional Monte Carlo sampling method are compared with those of advanced Uncertainty Quantification methods [3], and the computational performance and fast convergence of the advanced methods are highlighted. Generalized Polynomial Chaos is presented in its collocation form, with emphasis on the pros and cons of each approach.
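The contrast between the two approaches can be sketched on a toy response (not the vehicle model of the paper): plain Monte Carlo sampling versus stochastic collocation on Gauss quadrature nodes, which converges spectrally for smooth responses.

```python
# Monte Carlo versus stochastic collocation for the mean of a smooth
# response of a uniform parameter on [-1, 1] (illustrative sketch).
import numpy as np

def response(xi):
    return np.exp(0.5 * xi) * np.sin(3.0 * xi + 1.0)   # placeholder response, not a critical speed

# Monte Carlo: slow O(N^-1/2) convergence.
rng = np.random.default_rng(0)
mc_mean = response(rng.uniform(-1.0, 1.0, 100_000)).mean()

# Collocation on Gauss-Legendre nodes: spectral convergence for smooth responses.
nodes, weights = np.polynomial.legendre.leggauss(8)
colloc_mean = 0.5 * np.sum(weights * response(nodes))  # 0.5 = uniform density on [-1, 1]

print(mc_mean, colloc_mean)
```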


2020 ◽  
Vol 6 ◽  
pp. e298
Author(s):  
Fernando Rojas ◽  
Peter Wanke ◽  
Giuliani Coluccio ◽  
Juan Vega-Vargas ◽  
Gonzalo F. Huerta-Canepa

This paper proposes a slow-moving inventory management method for service enterprises, based on the intermittent demand per unit time and the lead-time demand of items. Our method uses a zero-inflated truncated normal distribution, which makes it possible to model intermittent demand per unit time with a mixed statistical distribution. We conducted numerical experiments, based on an algorithm that forecasts intermittent demand over a fixed lead time, to show that the proposed distribution improves the performance of the continuous review inventory model with shortages. We evaluated multi-criteria elements (total cost, fill rate, shortage quantity per cycle, and the adequacy of the statistical distribution of the lead-time demand) for decision analysis using the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS). We confirmed that our method improves the performance of the inventory model in comparison with other commonly used approaches such as simple exponential smoothing and Croston's method. We found an interesting association between the intermittency of demand per unit time, the square root of this same parameter, and reorder point decisions, which could be explained using a classical multiple linear regression model. We confirmed that the variability parameter of the zero-inflated truncated normal distribution used to model intermittent demand is positively related to the reorder point decisions. Our study illustrates the decision analysis with an example. The suggested approach is original, valuable, and, in the case of slow-moving item management for service companies, allows decision-making to be verified against multiple criteria.
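A minimal sketch of sampling intermittent demand from a zero-inflated truncated normal mixture is shown below; the function name and parameter values are illustrative, not those calibrated in the paper.

```python
# Zero-inflated truncated normal demand per unit time (illustrative sketch).
import numpy as np
from scipy import stats

def sample_zero_inflated_truncnorm(pi_zero, mu, sigma, lower, upper, size, rng):
    """With probability pi_zero demand is zero; otherwise it is truncated normal."""
    a, b = (lower - mu) / sigma, (upper - mu) / sigma   # standardized truncation bounds
    demand = stats.truncnorm.rvs(a, b, loc=mu, scale=sigma, size=size, random_state=rng)
    zeros = rng.random(size) < pi_zero
    demand[zeros] = 0.0
    return demand

rng = np.random.default_rng(42)
d = sample_zero_inflated_truncnorm(0.6, 10.0, 4.0, 0.0, np.inf, 10_000, rng)
print("share of zero-demand periods:", (d == 0).mean(), " mean demand:", d.mean())
```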


2020 ◽  
Vol 8 ◽  
Author(s):  
Brioch Hemmings ◽  
Matthew J. Knowling ◽  
Catherine R. Moore

Effective decision making for resource management is often supported by combining predictive models with uncertainty analyses. This combination allows quantitative assessment of management strategy effectiveness and risk. Typically, history matching is undertaken to increase the reliability of model forecasts. However, the question of whether the potential benefit of history matching will be realized, or outweigh its cost, is seldom asked. History matching adds complexity to the modeling effort, as information from historical system observations must be appropriately blended with the prior characterization of the system. Consequently, the cost of history matching is often significant. When it is not implemented appropriately, history matching can corrupt model forecasts. Additionally, the available data may offer little decision-relevant information, particularly where data and forecasts are of different types, or represent very different stress regimes. In this paper, we present a decision support modeling workflow where early quantification of model uncertainty guides ongoing model design and deployment decisions. This includes providing justification for undertaking (or forgoing) history matching, so that unnecessary modeling costs can be avoided and model performance can be improved. The workflow is demonstrated using a regional-scale modeling case study in the Wairarapa Valley (New Zealand), where assessments of stream depletion and nitrate-nitrogen contamination risks are used to support water-use and land-use management decisions. The probability of management success/failure is assessed by comparing the proximity of model forecast probability distributions to ecologically motivated decision thresholds. This study highlights several important insights that can be gained by undertaking early uncertainty quantification, including: i) validation of the prior numerical characterization of the system, in terms of its consistency with historical observations; ii) validation of model design or indication of areas of model shortcomings; iii) evaluation of the relative proximity of management decision thresholds to forecast probability distributions, providing a justifiable basis for stopping modeling.
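The early-quantification step can be sketched as follows: compare a prior forecast ensemble with a decision threshold before committing to history matching. The forecast quantity, distribution and threshold below are purely illustrative assumptions.

```python
# Prior (pre-history-matching) forecast ensemble versus a decision
# threshold (illustrative sketch, not the study's workflow code).
import numpy as np

rng = np.random.default_rng(7)

# Prior ensemble of a forecast, e.g. low-flow stream depletion in L/s (assumed values).
prior_forecast = rng.lognormal(mean=1.0, sigma=0.6, size=5_000)
decision_threshold = 12.0   # ecologically motivated limit (illustrative)

p_exceed = (prior_forecast > decision_threshold).mean()
print(f"prior probability of exceeding the threshold: {p_exceed:.3f}")

# If this probability is already far from any decision-relevant value
# (close to 0 or 1), the extra cost of history matching may not change
# the management decision.
```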


2011 ◽  
Vol 10 (1) ◽  
pp. 140-160 ◽  
Author(s):  
Akil Narayan ◽  
Dongbin Xiu

Abstract In this work we consider a general notion of distributional sensitivity, which measures the variation in solutions of a given physical/mathematical system with respect to variations in the probability distributions of the inputs. This is distinctively different from classical sensitivity analysis, which studies the changes of solutions with respect to the values of the inputs. The general idea of measuring the sensitivity of outputs with respect to probability distributions is a well-studied concept in related disciplines. We adapt these ideas to present a quantitative framework, in the context of uncertainty quantification, for measuring this kind of sensitivity, along with a set of efficient algorithms to approximate the distributional sensitivity numerically. A remarkable feature of the algorithms is that they incur no computational effort beyond a one-time stochastic solve. Therefore, an accurate stochastic computation with respect to a prior input distribution is needed only once, and the ensuing distributional sensitivity computation for different input distributions is a post-processing step. We prove that an accurate numerical model leads to accurate calculations of this sensitivity, which applies not just to slowly converging Monte Carlo estimates but also to exponentially convergent spectral approximations. We provide computational examples to demonstrate the ease of applicability and to verify the convergence claims.
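A simple illustration of the post-processing idea (not the authors' spectral algorithm): the stochastic problem is solved once under a prior input distribution, and the stored samples are then reweighted to see how an output statistic shifts under a perturbed input distribution.

```python
# Post-processing a one-time stochastic solve for a different input
# distribution via importance reweighting (illustrative sketch).
import numpy as np
from scipy import stats

def model(z):
    return np.sin(z) + 0.1 * z**2              # placeholder system response

rng = np.random.default_rng(3)
prior = stats.norm(0.0, 1.0)
z = prior.rvs(20_000, random_state=rng)        # one-time "stochastic solve"
y = model(z)

def mean_under(new_dist):
    w = new_dist.pdf(z) / prior.pdf(z)         # importance weights (post-processing only)
    return np.sum(w * y) / np.sum(w)

perturbed = stats.norm(0.2, 1.1)
print("prior mean:", y.mean(), " perturbed-input mean:", mean_under(perturbed))
```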


Author(s):  
Hesham K. Alfares ◽  
Salih O. Duffuaa

This paper presents a simulation study to assess the performance of five known methods for converting ranks of several criteria into weights in multi-criteria decision-making. The five methods assessed are: rank-sum (RS) weights, rank reciprocal (RR) weights, rank order centroid (ROC) weights, geometric weights (GW), and variable-slope linear (VSL) weights. The methods are compared in terms of weight estimation accuracy considering different numbers of criteria and decision makers' (DMs') preference structures. Alternative preference structures are represented by different probability distributions of randomly generated criteria weights, namely the uniform, normal, and exponential distributions. The results of the simulation experiments indicate that no single method is consistently superior to all others. On average, RS is best for uniform weights, VSL is best for normal weights, and ROC is best for exponential weights. However, for any multi-criteria decision-making (MCDM) problem, the best method for converting criteria ranks into weights depends on both the number of criteria and the weight distribution.
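Three of the rank-to-weight rules have simple closed forms; the sketch below implements the standard textbook formulas for RS, RR and ROC (GW and VSL are omitted because they require an additional shape parameter).

```python
# Standard rank-to-weight formulas for RS, RR and ROC weights.
import numpy as np

def rank_sum(ranks):            # RS: w_i proportional to (n - r_i + 1)
    r = np.asarray(ranks, dtype=float)
    w = len(r) - r + 1.0
    return w / w.sum()

def rank_reciprocal(ranks):     # RR: w_i proportional to 1 / r_i
    w = 1.0 / np.asarray(ranks, dtype=float)
    return w / w.sum()

def rank_order_centroid(ranks): # ROC: w_i = (1/n) * sum_{k >= r_i} 1/k
    r = np.asarray(ranks, dtype=int)
    n = len(r)
    return np.array([np.sum(1.0 / np.arange(ri, n + 1)) / n for ri in r])

ranks = [1, 2, 3, 4]            # criterion ranked 1 is most important
print(rank_sum(ranks), rank_reciprocal(ranks), rank_order_centroid(ranks))
```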


Author(s):  
Mark E. Ewing ◽  
Brian C. Liechty ◽  
David L. Black

Uncertainty quantification (UQ) is gaining in maturity and importance in engineering analysis. While historical engineering analysis and design methods have relied heavily on safety factors (SF) with built-in conservatism, modern approaches require a detailed assessment of reliability to provide optimized and balanced designs. This paper presents methodologies that support the transition toward this type of approach. Fundamental concepts are described for UQ in general engineering analysis. These include consideration of the sources of uncertainty and their categorization. Of particular importance are the categorization of aleatory and epistemic uncertainties and their separate propagation through a UQ analysis. This familiar concept is referred to here as a "two-dimensional" approach, and it provides for the assessment of both the probability of a predicted occurrence and the credibility in that prediction. Unique to the approach presented here is the adaptation of the concept of a bounding probability box to that of a credible probability box. This requires estimates of the probability distributions related to all uncertainties, both aleatory and epistemic. The propagation of these distributions through the uncertainty analysis provides for the assessment of the probability related to the system response, along with a quantification of the credibility in that prediction. Details of a generalized methodology for UQ in this framework are presented, and approaches for interpreting results are described. Illustrative examples are presented.
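A hypothetical sketch of the two-dimensional propagation idea: an outer loop samples epistemic parameters, an inner loop propagates aleatory variability, and the spread of the resulting probabilities plays the role of a probability box. All distributions, parameter ranges and limits below are illustrative assumptions.

```python
# Nested ("two-dimensional") sampling: epistemic outer loop, aleatory inner loop.
import numpy as np

rng = np.random.default_rng(0)

n_epistemic, n_aleatory = 50, 2_000
exceed_probs = []
for _ in range(n_epistemic):
    # Outer loop (epistemic): fixed-but-unknown parameters from credible ranges.
    bias = rng.uniform(-5.0, 5.0)      # model bias in degC (illustrative assumption)
    sigma = rng.uniform(8.0, 15.0)     # aleatory scatter in degC (illustrative assumption)
    # Inner loop (aleatory): inherent variability propagated by sampling.
    temperature = rng.normal(350.0 + bias, sigma, n_aleatory)
    exceed_probs.append((temperature > 380.0).mean())

# The spread of these probabilities across the epistemic samples is what a
# (credible) probability box summarizes for the full response CDF.
print("probability of exceeding the 380 degC limit ranges from",
      min(exceed_probs), "to", max(exceed_probs))
```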


2021 ◽  
Author(s):  
Srinivasa Murthy D ◽  
Aruna Jyothy S ◽  
Mallikarjuna P

Abstract The study aims at the probabilistic analysis of annual maximum daily streamflows at the gauging sites of the Godavari upper, Godavari middle, Pranahitha, Indravathi and Godavari lower sub-basins. The daily streamflow data at Chass, Ashwi and Pachegaon of Godavari upper; Manjalegaon, Dhalegaon, Zari, GR Bridge, Purna and Yelli of Godavari middle; Gandlapet, Mancherial, Somanpally and Perur of Pranahitha; Pathagudem, Chindnar, Sonarpal, Jagdalpur and Nowrangpur of Indravathi; and Sardaput, Injaram, Konta, Koida and Polavaram of Godavari lower sub-basins, for record periods varying between 1965 and 2011 and collected from the Central Water Commission (CWC), India, were used in the analysis. Statistics of the annual maximum daily streamflow series during the study period at the gauging sites of the sub-basins indicated moderately varied and positively skewed streamflows, with sharp-peaked flows at the upstream gauging sites. Probabilistic analysis of the streamflows showed that the lognormal or gamma distribution with conventional moments fitted the maximum daily streamflow data at the gauging sites of the Godavari sub-basins. Among 2-parameter distributions with L-moments, GPA2 followed by GAM2/LN2 fitted the annual maximum daily streamflow data at most of the gauging sites. At the downstream-most gauging sites of the Pranahitha, Indravathi and Godavari lower sub-basins, the data followed the W2 probability distribution. Among 3-parameter distributions with L-moments, GPA3 at seven gauging sites, W3 and P3 at five gauging sites each, GLOG at four gauging sites and GEV at two gauging sites fitted the data. Based on the performance evaluation, 2-parameter distributions using L-moments at the upstream gauging sites, 3-parameter distributions at the middle gauging sites and probability distributions using conventional moments at the downstream gauging sites performed better in the Godavari upper and middle sub-basins. Probability distributions based on conventional moments or 3-parameter distributions using L-moments fitted the annual maximum daily streamflow data at the gauging sites in the Pranahitha, Indravathi and Godavari lower sub-basins satisfactorily.
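As a sketch of the L-moment fitting step (not the authors' code), the example below estimates the two parameters of a GPA2 distribution with its lower bound fixed at zero from sample L-moments, using Hosking's standard relations lambda1 = alpha/(1+k) and lambda2 = alpha/((1+k)(2+k)); the streamflow series is synthetic.

```python
# Fit a two-parameter generalized Pareto (GPA2, location = 0) by L-moments.
import numpy as np

def sample_l_moments(x):
    """First two sample L-moments via probability-weighted moments."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    b0 = x.mean()
    b1 = np.sum((np.arange(1, n + 1) - 1) / (n - 1) * x) / n
    return b0, 2.0 * b1 - b0               # lambda1, lambda2

def fit_gpa2(x):
    """Shape k and scale alpha of GPA2 (location = 0) from L-moments."""
    lam1, lam2 = sample_l_moments(x)
    tau = lam2 / lam1                      # L-CV
    k = 1.0 / tau - 2.0
    alpha = lam1 * (1.0 + k)
    return k, alpha

# Illustrative annual-maximum daily streamflows (m^3/s), not observed data.
rng = np.random.default_rng(0)
flows = rng.gamma(shape=2.0, scale=800.0, size=47)
print(fit_gpa2(flows))
```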

