Non-linear Multipliers for Risk Interactions

Author(s):  
Yuri G. Raydugin

The purpose of this chapter is to mathematically describe three types of risk interactions (internal risk amplifications, knock-on, and compounding) associated with the static and dynamic PSS–PDS mismatches. This is required to factor all relevant instances of risk interactions into Monte Carlo models. It is shown that three corresponding types of non-linearity parameters should be introduced to form non-linear (quadratic) multipliers for interacting risks. In the linear case (non-interacting risks), all non-linearity parameters are equal to zero and all non-linear multipliers are equal to one. As a risk may take part in several interactions, it has several non-linear multipliers. Required non-linearity parameters and non-linear multipliers for opportunities are also developed. When all relevant instances of risk interactions are factored into a project risk register, they describe the aggregated impact of the affinity of interacting risks (dynamic risk pattern) on project schedule and cost objectives.
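
The following is a purely illustrative sketch of how per-interaction multipliers could be applied to a risk impact; the quadratic form M(α) = (1 + α)² is an assumption for illustration, not the chapter's formula, and reduces to 1 when the non-linearity parameter α is zero (the linear, non-interacting case).

```python
# Illustrative sketch only: the quadratic multiplier form below is an assumption,
# not the formula developed in the chapter.

def quadratic_multiplier(alpha: float) -> float:
    """Non-linear multiplier for one interaction; equals 1.0 when alpha == 0 (linear case)."""
    return (1.0 + alpha) ** 2

def adjusted_impact(base_impact: float, alphas: list[float]) -> float:
    """A risk that takes part in several interactions carries one multiplier per interaction."""
    m = 1.0
    for a in alphas:
        m *= quadratic_multiplier(a)
    return base_impact * m

# Example: a risk with a 10-day base schedule impact involved in two interactions.
print(adjusted_impact(10.0, [0.0, 0.0]))    # 10.0  -> linear (non-interacting) case
print(adjusted_impact(10.0, [0.15, 0.05]))  # ~14.6 -> impact amplified by interactions
```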

Author(s):  
Cristiana Tudor ◽  
Maria Tudor

This chapter covers the essentials of using the Monte Carlo simulation (MCS) technique for project schedule and cost risk analysis. It describes the steps involved in performing a Monte Carlo simulation and provides the basic probability and statistical concepts that MCS is based on. Further, a simple practical spreadsheet example works through the steps presented earlier to show how MCS can be used in practice to assess the cost and duration risk of a project and, ultimately, to enable decision makers to improve the quality of their judgments.
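
A minimal Python analogue of such a spreadsheet exercise is sketched below; the activity and cost figures and the triangular distributions are illustrative assumptions, not the chapter's worked example.

```python
# Minimal sketch of an MCS for project cost and duration risk.
import numpy as np

rng = np.random.default_rng(42)
n = 10_000  # number of iterations

# Three sequential activities: (low, most likely, high) duration in days (assumed values).
activities = [(10, 12, 18), (20, 25, 40), (5, 8, 15)]
# Two cost items: (low, most likely, high) in $k (assumed values).
cost_items = [(100, 120, 180), (50, 60, 95)]

duration = sum(rng.triangular(lo, ml, hi, n) for lo, ml, hi in activities)
cost = sum(rng.triangular(lo, ml, hi, n) for lo, ml, hi in cost_items)

for label, sample in (("Duration (days)", duration), ("Cost ($k)", cost)):
    p50, p70, p90 = np.percentile(sample, [50, 70, 90])
    print(f"{label}: P50={p50:.1f}  P70={p70:.1f}  P90={p90:.1f}")
```

Reading chosen percentiles (e.g., P70 or P90) from the simulated distributions is what supports the contingency and decision-making judgments described in the chapter.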


Author(s):  
Yuri G. Raydugin

The purpose of this chapter is to finalize the requirements for undertaking non-linear Monte Carlo modelling. Besides the mathematical aspects (Chapter 13), additional requirements for identifying and addressing risk interactions should be put forward. All relevant instances of risk interactions should be identified. Identification of ‘chronic’ project system issues that serve as additional causes of risks is required to pin down internal risk amplifications. Identification of cross-risk interactions can be undertaken by visualizing dynamic risk patterns (cross-risk interaction mapping). Principles of risk interaction calibration are introduced with two challenges in mind: first, calibration of aggregated risk interactions at the project level; second, evaluation of individual instances of risk interactions based on the overall calibration. Methods to address identified risk interactions are discussed: in the case of intra-risk interactions, the ‘chronic’ issues should be addressed; in the case of cross-risk interactions, the dynamic risk patterns should be disrupted.
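
As a loose illustration of cross-risk interaction mapping, interacting risks can be held as a graph and its connected components read as candidate affinities (dynamic risk patterns). The risk IDs and strengths below are hypothetical, and the chapter's mapping procedure may differ.

```python
# Illustrative sketch: cross-risk interactions as a graph; components = affinities.
from collections import defaultdict

# (source risk, target risk, interaction strength): hypothetical knock-on/compounding links.
interactions = [
    ("R1", "R4", 0.2),
    ("R4", "R7", 0.1),
    ("R2", "R3", 0.3),
]

# Undirected view for grouping risks that interact directly or indirectly.
adj = defaultdict(set)
for a, b, _ in interactions:
    adj[a].add(b)
    adj[b].add(a)

def affinities(adj):
    """Return connected components: each is a candidate dynamic risk pattern."""
    seen, groups = set(), []
    for start in adj:
        if start in seen:
            continue
        stack, group = [start], set()
        while stack:
            node = stack.pop()
            if node in group:
                continue
            group.add(node)
            stack.extend(adj[node] - group)
        seen |= group
        groups.append(sorted(group))
    return groups

print(affinities(adj))  # [['R1', 'R4', 'R7'], ['R2', 'R3']]
```

Disrupting a dynamic risk pattern then corresponds to removing or weakening the links that hold such a component together.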


Author(s):  
Yuri G. Raydugin

This chapter is devoted to the second business case of project Zemblanity. Based on the developed risk quantification principles for complex projects, two non-linear Monte Carlo schedule and cost risk analysis (N-SCRA) models are developed. These models factor in all relevant risk interactions before and after addressing. Modified ‘non-linear’ project risk registers that take the risk interactions into account are developed as inputs to the Monte Carlo models. It is shown that, before risk interaction addressing, the forecast project duration and cost are unacceptably high due to unaddressed risk interactions. Agreed risk interaction addressing measures, factored into the models, result in acceptable project duration and cost. A joint confidence level (JCL) concept is used to amend the N-SCRA results at the P70 confidence level, using JCL70 to distinguish stretched targets from management reserves. The two workable N-SCRA models are available on the book’s companion website.
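
The sketch below illustrates the general idea behind a joint confidence level check on simulated duration and cost outputs; the samples and the target-selection rule are assumptions for illustration and do not reproduce the book's JCL70 procedure.

```python
# Sketch of a joint confidence level (JCL) check on Monte Carlo outputs.
import numpy as np

rng = np.random.default_rng(7)
n = 20_000
# Correlated duration (months) and cost ($M) samples standing in for N-SCRA outputs (assumed).
z = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], n)
duration = 36 + 4 * z[:, 0]
cost = 500 + 60 * z[:, 1]

p70_dur, p70_cost = np.percentile(duration, 70), np.percentile(cost, 70)

# Joint confidence of the pair of marginal P70 targets (typically below 70% unless fully correlated).
joint = np.mean((duration <= p70_dur) & (cost <= p70_cost))
print(f"Marginal P70 targets: {p70_dur:.1f} months, {p70_cost:.0f} $M; joint confidence = {joint:.0%}")

# One simple way to reach a 70% joint level: scale both targets up until 70% of iterations meet both.
scale = 1.0
while np.mean((duration <= scale * p70_dur) & (cost <= scale * p70_cost)) < 0.70:
    scale += 0.001
print(f"Joint 70% confidence reached at roughly {scale:.3f} x the marginal P70 targets")
```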


Author(s):  
Yuri G. Raydugin

There are multiple complaints that existing project risk quantification methods, both parametric and Monte Carlo, fail to produce accurate project duration and cost risk contingencies in a majority of cases. It is shown that major components of project risk exposure in complex projects, namely non-linear risk interactions, are not taken into account. It is argued that a project system consists of two interacting subsystems: a project structure subsystem (PSS) and a project delivery subsystem (PDS). Any misalignments or imbalances between these two subsystems (PSS–PDS mismatches) are associated with the non-linear risk interactions. Principles of risk quantification are developed to take into account three types of non-linear risk interactions in complex projects: internal risk amplifications due to existing ‘chronic’ project system issues, knock-on interactions, and risk compounding. Modified bowtie diagrams for the three types of risk interactions are developed to identify and address interacting risks. A framework to visualize dynamic risk patterns in affinities of interacting risks is proposed. Required mathematical expressions and templates to factor relevant risk interactions into Monte Carlo models are developed. Business cases are discussed to demonstrate the power of the newly developed non-linear Monte Carlo methodology (non-linear integrated schedule and cost risk analysis, N-SCRA). A project system dynamics methodology based on rework cycles is adopted as a supporting risk quantification tool. Comparison of results yielded by the non-linear Monte Carlo and system dynamics models demonstrates a good alignment of the two methodologies. All developed Monte Carlo and system dynamics models are available on the book’s companion website.
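
As a toy illustration of one of the three interaction types, a knock-on interaction can be mimicked inside a Monte Carlo loop by letting the occurrence of one risk raise the probability of another. All probabilities, impacts, and the amplification value below are assumptions, not figures from the book.

```python
# Illustrative sketch of a knock-on interaction inside a Monte Carlo loop:
# if risk A occurs, risk B's probability is amplified.
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

p_a, impact_a = 0.30, 20.0        # risk A: probability and schedule impact (days), assumed
p_b_base, impact_b = 0.20, 30.0   # risk B: baseline probability and impact, assumed
knock_on = 0.25                   # increase in B's probability when A occurs, assumed

a_occurs = rng.random(n) < p_a
p_b = np.where(a_occurs, p_b_base + knock_on, p_b_base)
b_occurs = rng.random(n) < p_b

delay = a_occurs * impact_a + b_occurs * impact_b
print(f"Mean delay with knock-on:    {delay.mean():.1f} days")

# Reference case without the interaction (independent risks).
delay_lin = a_occurs * impact_a + (rng.random(n) < p_b_base) * impact_b
print(f"Mean delay without knock-on: {delay_lin.mean():.1f} days")
```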


Author(s):  
Yuri G. Raydugin

The purpose of this chapter is to develop project system dynamics (SD) models that mirror the non-linear Monte Carlo N-SCRA models of project Zemblanity. Only the schedule part of risk exposure is considered. Required recalculations of parameters are undertaken. As there are no one-to-one relations between the parameters of the SD and Monte Carlo models, required assumptions are applied. These can be used for mutual calibration of the two types of models. Two SD models are built that reflect the project risk exposure before and after risk interaction addressing. Limitations of project SD modelling are revealed. The SD modelling results demonstrate a good alignment of the corresponding non-linear schedule and cost risk analysis (N-SCRA) and SD models. One additional SD model is built to explicitly demonstrate the contribution of risk compounding to the overall project duration. The three workable SD models are available on the book’s companion website.
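
A minimal rework-cycle sketch of the stock-and-flow logic underlying such SD models is given below; the structure and every parameter value are illustrative assumptions and do not reproduce the chapter's models.

```python
# Minimal rework-cycle sketch (stocks: work to do, work done, undiscovered rework),
# integrated with a simple Euler scheme. All parameters are assumed values.

scope = 100.0          # total work packages
productivity = 2.0     # packages processed per week
error_fraction = 0.2   # share of processed work that is defective
discovery_rate = 0.1   # fraction of undiscovered rework found per week
dt = 0.25              # weeks per step

work_to_do, work_done, undiscovered_rework, t = scope, 0.0, 0.0, 0.0
while work_done < 0.99 * scope and t < 200:
    completion = min(productivity, work_to_do / dt)   # cannot process more than remains
    rework_found = discovery_rate * undiscovered_rework
    work_to_do += (rework_found - completion) * dt
    work_done += completion * (1 - error_fraction) * dt
    undiscovered_rework += (completion * error_fraction - rework_found) * dt
    t += dt

print(f"Projected duration: about {t:.0f} weeks")
```

The gap between this duration and the error-free value (scope / productivity) is the rework cycle's contribution, which is the kind of effect the additional compounding-focused SD model makes explicit.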


2021 ◽  
pp. 1-14
Author(s):  
Tiffany M. Shader ◽  
Theodore P. Beauchaine

Growth mixture modeling (GMM) and its variants, which group individuals based on similar longitudinal growth trajectories, are quite popular in developmental and clinical science. However, research addressing the validity of GMM-identified latent subgroupings is limited. This Monte Carlo simulation tests the efficiency of GMM in identifying known subgroups (k = 1–4) across various combinations of distributional characteristics, including skew, kurtosis, sample size, intercept effect size, patterns of growth (none, linear, quadratic, exponential), and proportions of observations within each group. In total, 1,955 combinations of distributional parameters were examined, each with 1,000 replications (1,955,000 simulations). Using standard fit indices, GMM often identified the wrong number of groups. When one group was simulated with varying skew and kurtosis, GMM often identified multiple groups. When two groups were simulated, GMM performed well only when one group had steep growth (whether linear, quadratic, or exponential). When three to four groups were simulated, GMM was effective primarily when intercept effect sizes and sample sizes were large, an uncommon state of affairs in real-world applications. When conditions were less ideal, GMM often underestimated the correct number of groups when the true number was between two and four. Results suggest caution in interpreting GMM results, which sometimes get reified in the literature.
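
Growth mixture modeling is normally fitted with dedicated latent-variable software, so the sketch below is only a loose analogue of the simulation idea: simulate linear growth trajectories for two known groups and let BIC pick the number of classes for a plain Gaussian mixture over the repeated-measures vectors. It is not the authors' pipeline.

```python
# Rough, simplified analogue of the class-enumeration step (not true GMM software).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
waves = np.arange(5)  # 5 measurement occasions

def simulate_group(n, intercept, slope):
    """Linear mean trajectory plus within-person noise (assumed parameters)."""
    traj = intercept + slope * waves
    return traj + rng.normal(0, 1.0, size=(n, waves.size))

# Two true groups: flat growth vs. steep linear growth, unequal group sizes.
data = np.vstack([simulate_group(300, 0.0, 0.1), simulate_group(100, 1.0, 1.0)])

bics = {k: GaussianMixture(n_components=k, n_init=5, random_state=0).fit(data).bic(data)
        for k in range(1, 5)}
best_k = min(bics, key=bics.get)
print(bics)
print(f"BIC-selected number of classes: {best_k} (true number: 2)")
```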


2015 ◽  
Vol 48 ◽  
pp. 420-446 ◽  
Author(s):  
Mireille Bossy ◽  
Nicolas Champagnat ◽  
Hélène Leman ◽  
Sylvain Maire ◽  
Laurent Violeau ◽  
...  

2008 ◽  
Vol 04 (02) ◽  
pp. 123-141 ◽  
Author(s):  
AREEG ABDALLA ◽  
JAMES BUCKLEY

We apply our new fuzzy Monte Carlo method to certain fuzzy non-linear regression problems to estimate the best solution. The best solution is a vector of triangular fuzzy numbers, for the fuzzy coefficients in the model, which minimizes an error measure. We use a quasi-random number generator to produce random sequences of these fuzzy vectors which uniformly fill the search space. We consider example problems to show that this Monte Carlo method obtains solutions comparable to those obtained by an evolutionary algorithm.
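
The sketch below illustrates the general quasi-random search idea on a toy non-linear model: a Sobol sequence fills the search space of candidate triangular fuzzy coefficients, and the vector minimizing an error measure is kept. The toy model, data, and error measure are simplified assumptions and not the paper's formulation.

```python
# Sketch of a quasi-random search over triangular fuzzy coefficients (left, peak, right).
import numpy as np
from scipy.stats import qmc

# Crisp data for a toy non-linear model y = a * exp(b * x), fitted with fuzzy (a, b).
x = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
y = np.array([1.0, 1.6, 2.8, 4.4, 7.5])

def error(params):
    """Simplified error: misfit at the fuzzy peaks plus a penalty for badly ordered supports."""
    a_l, a_m, a_r, b_l, b_m, b_r = params
    fit = np.sum((y - a_m * np.exp(b_m * x)) ** 2)
    penalty = max(0.0, a_l - a_m) + max(0.0, a_m - a_r) + max(0.0, b_l - b_m) + max(0.0, b_m - b_r)
    return fit + 100.0 * penalty

# Quasi-random (Sobol) candidates uniformly filling the 6-dimensional search space.
sampler = qmc.Sobol(d=6, scramble=True, seed=1)
candidates = qmc.scale(sampler.random_base2(m=12), [0, 0, 0, 0, 0, 0], [3, 3, 3, 2, 2, 2])

best = min(candidates, key=error)
print("Best fuzzy coefficients (a_l, a_m, a_r, b_l, b_m, b_r):", np.round(best, 2))
```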


2018 ◽  
Vol 25 (3) ◽  
pp. 565-587 ◽  
Author(s):  
Mohamed Jardak ◽  
Olivier Talagrand

Data assimilation is considered as a problem in Bayesian estimation, viz. determine the probability distribution for the state of the observed system, conditioned by the available data. In the linear and additive Gaussian case, a Monte Carlo sample of the Bayesian probability distribution (which is Gaussian and known explicitly) can be obtained by a simple procedure: perturb the data according to the probability distribution of their own errors, and perform an assimilation on the perturbed data. The performance of that approach, called here ensemble variational assimilation (EnsVAR), also known as ensemble of data assimilations (EDA), is studied in this two-part paper on the non-linear low-dimensional Lorenz-96 chaotic system, with the assimilation being performed by the standard variational procedure. In this first part, EnsVAR is implemented first, for reference, in a linear and Gaussian case, and then in a weakly non-linear case (assimilation over 5 days of the system). The performances of the algorithm, considered either as a probabilistic or a deterministic estimator, are very similar in the two cases. Additional comparison shows that the performance of EnsVAR is better, both in the assimilation and forecast phases, than that of standard algorithms for the ensemble Kalman filter (EnKF) and particle filter (PF), although at a higher cost. Globally similar results are obtained with the Kuramoto–Sivashinsky (K–S) equation.
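
The perturbed-data procedure described above can be illustrated in a small linear Gaussian case (no background term, for simplicity): each ensemble member perturbs the observations with a draw from their error law and minimizes the variational (least-squares) cost, and the spread of the resulting analyses reproduces the exact Gaussian posterior covariance. The setup below is a toy example, not the Lorenz-96 or K–S experiments of the paper.

```python
# Minimal linear-Gaussian illustration of the EnsVAR / EDA sampling procedure.
import numpy as np

rng = np.random.default_rng(3)
n, m = 2, 50                                  # state dimension, number of observations
x_true = np.array([1.0, -0.5])
H = rng.normal(size=(m, n))                   # linear observation operator (assumed)
sigma = 0.3                                   # observation-error standard deviation
y = H @ x_true + rng.normal(0, sigma, m)

def var_analysis(y_pert):
    """Variational analysis = least-squares minimizer of the observation-misfit cost."""
    return np.linalg.lstsq(H, y_pert, rcond=None)[0]

# Perturb the data according to the observation-error law and re-analyse, member by member.
ensemble = np.array([var_analysis(y + rng.normal(0, sigma, m)) for _ in range(2000)])

# Exact Gaussian posterior covariance (flat prior): sigma^2 * (H^T H)^-1.
exact_cov = sigma**2 * np.linalg.inv(H.T @ H)
print("Ensemble covariance:\n", np.cov(ensemble.T))
print("Exact posterior covariance:\n", exact_cov)
```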

