Complexity Risk and Modeling Disorder

Author(s):  
John K. Hollmann

Despite 50 years of cost accuracy research, companies are generally unable to quantify the worst outcomes. In the process industries, about 10 percent of large projects overrun their budgets by 70 percent or more, and the system behavior of these blowouts often reflects disorder. For complex projects, the blowout proportion rises to 15 to 30 percent. Many risk analysts dismiss the worst outcomes as "unknown-unknowns" or "black swans," but they are neither: we know the causes, and their impact is somewhat predictable. Cost disasters start with a mix of systemic weakness and risk events. Mundane projects may overrun their costs by 20 to 40 percent, which is bad but no disaster (financiers assume they will overrun by 25 percent). Add complexity and stress, however, and projects can cross a "tipping point" into disorder and chaos, with cost overruns of 50, 100, or 200 percent: true disasters. This chapter describes complexity risk and the disorder it can lead to, practical measures of complexity and stress, and how to incorporate those measures into non-linear risk quantification models.
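The tipping-point dynamic described above can be sketched as a toy Monte Carlo model. The distributions, the 3x disorder multiplier, and the complexity/stress scores below are illustrative assumptions for exposition, not Hollmann's calibrated parameters:

```python
import random

def simulate_overrun(complexity, stress, n=100_000, seed=1):
    """Toy non-linear contingency model: a 'mundane' systemic overrun
    plus a tipping point into disorder when complexity and stress are
    high. All parameters are illustrative, not calibrated values."""
    rng = random.Random(seed)
    outcomes = []
    for _ in range(n):
        base = rng.lognormvariate(-1.5, 0.6)         # mundane overrun, median ~22%
        tipped = rng.random() < complexity * stress  # chance of crossing the tipping point
        outcomes.append(base * (3.0 if tipped else 1.0))  # disorder multiplies the damage
    outcomes.sort()
    p50 = outcomes[n // 2]
    blowout_share = sum(o >= 0.70 for o in outcomes) / n  # overruns of 70%+
    return p50, blowout_share

p50, blowout_share = simulate_overrun(complexity=0.5, stress=0.4)
print(f"median overrun {p50:.0%}, share of 70%+ blowouts {blowout_share:.0%}")
```

With complexity and stress near zero, the blowout share collapses toward the lognormal tail alone; raising either pushes more iterations across the tipping point, reproducing the skewed, fat-tailed overrun distributions the chapter describes.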

Author(s):  
Yuri G. Raydugin

This chapter is devoted to the second business case, project Zemblanity. Based on the risk quantification principles developed for complex projects, two non-linear Monte Carlo schedule and cost risk analysis (N-SCRA) models are built. These models factor in all relevant risk interactions before and after addressing. Modified 'non-linear' project risk registers that take the risk interactions into account are developed as inputs to the Monte Carlo models. It is shown that, before the risk interactions are addressed, the forecast project duration and cost are unacceptably high. When the agreed risk-interaction addressing measures are factored into the models, they yield an acceptable project duration and cost. A joint confidence level (JCL) concept is used to amend the N-SCRA results at the P70 confidence level, distinguishing stretched targets from management reserves using JCL70. The two workable N-SCRA models are available on the book's companion website.
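The JCL concept can be illustrated with a minimal sketch (the duration/cost distributions below are made-up numbers, not the project Zemblanity data). Because duration and cost outcomes are correlated, targets set at the marginal P70 levels deliver a joint confidence below 70 percent, which is why results quoted at P70 need amending to reach JCL70:

```python
import random

def jcl(samples, d_target, c_target):
    """Joint confidence level: the fraction of Monte Carlo iterations in
    which BOTH duration and cost come in at or under their targets."""
    hits = sum(d <= d_target and c <= c_target for d, c in samples)
    return hits / len(samples)

# Illustrative correlated duration (months) / cost ($M) samples --
# invented numbers, not the book's case data.
rng = random.Random(7)
samples = []
for _ in range(50_000):
    d = rng.gauss(36, 4)                          # schedule outcome
    c = 100 + 2.5 * (d - 36) + rng.gauss(0, 8)    # cost partly driven by schedule
    samples.append((d, c))

# Marginal P70 targets, each computed ignoring the other dimension
ds = sorted(d for d, _ in samples)
cs = sorted(c for _, c in samples)
d70, c70 = ds[int(0.70 * len(ds))], cs[int(0.70 * len(cs))]
joint = jcl(samples, d70, c70)
print(f"P70 duration {d70:.1f} mo, P70 cost {c70:.1f} $M, joint confidence {joint:.0%}")
```

Raising the duration and/or cost targets until `jcl` returns 0.70 gives the JCL70 pair from which stretched targets and management reserves can be separated.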


Author(s):  
Yuri G. Raydugin

There are multiple complaints that existing project risk quantification methods—both parametric and Monte Carlo—fail to produce accurate project duration and cost-risk contingencies in the majority of cases. It is shown that major components of project risk exposure—non-linear risk interactions—pertaining to complex projects are not taken into account. It is argued that a project system consists of two interacting subsystems: a project structure subsystem (PSS) and a project delivery subsystem (PDS). Any misalignments or imbalances between these two subsystems (PSS–PDS mismatches) are associated with the non-linear risk interactions. Principles of risk quantification are developed to take into account three types of non-linear risk interactions in complex projects: internal risk amplifications due to existing 'chronic' project system issues, knock-on interactions, and risk compounding. Modified bowtie diagrams for the three types of risk interactions are developed to identify and address interacting risks. A framework to visualize dynamic risk patterns in affinities of interacting risks is proposed. Required mathematical expressions and templates to factor relevant risk interactions into Monte Carlo models are developed. Business cases are discussed to demonstrate the power of the newly developed non-linear Monte Carlo methodology (non-linear integrated schedule and cost risk analysis (N-SCRA)). A project system dynamics methodology based on rework cycles is adopted as a supporting risk quantification tool. Comparison of the results yielded by the non-linear Monte Carlo and system dynamics models demonstrates good alignment between the two methodologies. All developed Monte Carlo and system dynamics models are available on the book's companion website.
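The three interaction types can be sketched in a toy two-risk Monte Carlo register. All probabilities, impacts, and factors below are illustrative placeholders, not the book's mathematical expressions or templates:

```python
import random

def sample_exposure(rng, nonlinear=True):
    """One iteration of a toy risk register with two risks A and B.
    With nonlinear=True the three interaction types apply:
    amplification (chronic system issues scale every impact),
    knock-on (A occurring raises B's probability), and
    compounding (the joint impact exceeds the sum of the parts).
    All numbers are illustrative."""
    amp = 1.3 if nonlinear else 1.0                 # internal risk amplification
    cost = 0.0
    a = rng.random() < 0.30                         # risk A fires
    if a:
        cost += rng.uniform(2, 6) * amp
    b_prob = 0.40 if (a and nonlinear) else 0.15    # knock-on: A raises P(B)
    if rng.random() < b_prob:
        impact = rng.uniform(1, 4) * amp
        if a and nonlinear:
            impact *= 1.5                           # compounding of joint occurrence
        cost += impact
    return cost

def p70(xs):
    xs = sorted(xs)
    return xs[int(0.70 * len(xs))]

rng = random.Random(11)
lin = [sample_exposure(rng, nonlinear=False) for _ in range(50_000)]
non = [sample_exposure(rng, nonlinear=True) for _ in range(50_000)]
print(f"P70 contingency, linear model:     {p70(lin):.2f}")
print(f"P70 contingency, non-linear model: {p70(non):.2f}")
```

The gap between the two P70 values is the contingency that a conventional linear register silently omits, which is the pattern behind the complaint that linear Monte Carlo underpredicts outcomes for complex projects.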


2018 ◽  
Vol 7 (4) ◽  
pp. 1-20
Author(s):  
Yuri Raydugin

Literature on project management contains an abundance of references to abnormally high project cost overruns, as well as bitter complaints that modern risk quantification methods cannot accurately predict project cost outcomes in many cases. This article provides: a) an explanation of abnormally high project cost overruns; b) a link between risk quantification and 'project team quality' (a team's strengths and weaknesses) and bias; c) an explanation of why all currently used Monte Carlo risk quantification methodologies are relevant to 'strong teams' only; d) a new non-linear probabilistic (Monte Carlo) methodology for defining adequate cost contingencies for projects managed by 'weak teams'; e) a practical case of non-linear probabilistic (Monte Carlo) modeling; and f) a rough calibration of non-linear probabilistic (Monte Carlo) models that can be used for rule-of-thumb estimating of project cost outcomes.


2019 ◽  
pp. 87-96
Author(s):  
А. Л. Свящук

While the basic formulas and approaches of the aircraft industry rest on the principles of classical science, the nature of the observed phenomena appears non-linear. Phenomena such as turbulence, flutter, buffeting, and flow separation can be explained by means of synergetics and systems theory in the context of the post-non-classical paradigm. A certain contradiction can thus be observed: non-linear phenomena are explained by linear, traditional science, which is why many formulas of aerodynamics and strength have a large empirical part. It therefore becomes necessary to revise the philosophical foundations of most approaches and the overall picture of the world as a whole. The concepts of synergetics and systems theory allow us to describe certain phenomena in aviation more accurately, which will ultimately lead to the creation of more efficient and safer aircraft. For example, we can design an aircraft not only as a complex system but also as part of other complex systems, evaluating its effectiveness from more ambitious, higher system levels and predicting its operation and modernization under the changing conditions of the socio-political system. Moreover, engineering creativity based on synergetic approaches will become more efficient and effective by intensifying aviation thought. Finally, understanding the role of chance and the effect of emergence allows us to be prepared for many surprises and black swans, and to remain wary about our knowledge, assessing its probabilistic nature.


2009 ◽  
pp. 203-251
Author(s):  
Claudio Virno

Cost overruns are common in large and complex projects, especially high-speed rail projects. Budgeting for cost escalation is a major issue in the planning phase of these projects. This paper describes lessons learned on high-speed rail in Italy and focuses on problems such as initial poor design, tactical budgeting, inadequate cost estimation and risk assessment, etc. The paper discusses possible means of avoiding major flaws in the initial conceptual design of mega-projects. There is a growing understanding of the need to focus on the front-end phase in order to achieve more successful and cost-effective projects.


2021 ◽  
Author(s):  
Urs Schaltegger

Geoscientists tend to subdivide the Earth system into subsystems (geosphere, hydrosphere, atmosphere, biosphere) that interact with each other in a non-linear way. A quantitative understanding of this interaction is essential for reconstructing the geological past. This is mostly done by a linear approach: establishing time series of chemical and physical proxies, calibrating their contemporaneity through geochronology, and eventually invoking causality. A good example is the comparison of carbon or oxygen isotope time series with paleo-biodiversity in ancient sedimentary sections, temporally correlated using astrochronology or high-precision U-Pb dating of volcanic zircon in interlayered ash beds. While highly accurate and precise data are necessary to form the basis for linear and non-linear models, we have to be aware that any analysis is the result of an experiment (an isotope-chemical analysis in the U-Pb example) introducing random and non-random noise, which can mimic, disturb, distort, or mask non-linear system behavior. High-precision/high-accuracy U-Pb age determination using the mineral zircon (ZrSiO4), applying the techniques of isotope dilution and thermal ionization mass spectrometry, is a good example of such an experiment applied to the geological history of our planet.

Two examples where precise U-Pb dating methods are used to link disparate processes are: (1) using the duration and tempo of zircon growth in a magmatic system as a measure for modeling magma flux in space and time, and applying these models to infer the potential eruptibility and volcanic hazard of a plutonic-volcanic plumbing system; and (2) establishing the absolute age and duration of magma emplacement in large igneous provinces, feeding these data into models of the injection and residence of volatile species in the atmosphere, estimating their influence on the inherent parameters of Earth's climate, and inferring causality with climatic, environmental, and biotic crises. Both are outstanding scientific questions that attract and deserve significant attention from the general as well as the academic public. However, insufficient attention is paid to the nature and importance of the noise we add through isotopic age determination.

Two prominent issues are discussed in this context: (1) to what extent (at what precision) can we distinguish natural age variation among zircon grains from random scatter produced by analytical techniques and by the complexity of the U-Pb isotopic system in zircon; and (2) how can we correlate the U-Pb dates established for the crystallization of zircon in residual and/or assimilated melt portions of mafic magmatic rocks from large igneous provinces with the release and injection of magmatic and contact-metamorphic volatiles into the atmosphere? This contribution intends to demonstrate that analytical scatter and complex system behavior are often confounded with age variation (and vice versa), and outlines new approaches and insights into how to quantify their respective contributions.


2018 ◽  
Vol 5 ◽  
pp. 15-22
Author(s):  
Umesh Sukamani ◽  
Hari Mohan Shrestha

A fresh and present look at the performance and delivery of heritage projects is required, because few studies have explored the specific project management and participant issues that contributed to failed elements (time and cost) in heritage projects. The major contribution of this research is guidance for improvement to help avoid delays and cost overruns in future heritage renovation projects. Bhaktapur municipality is rich in heritage, and its tourism market is one of the sources of the local economy. Bhaktapur municipality was selected for this study because many heritage renovation projects are completed there each year, most of them carried out with the help of users' committees and amanat. Tourism has become an important economic factor for the region, so heritage renovation is studied together with the impact of delays on the works. The research design for this study is more qualitative than quantitative. The main causes of delay were found to be difficulties in financing projects and poor managerial skills; similarly, the main cause of cost overruns was found to be material cost increases due to inflation.


2020 ◽  
Vol 68 (8) ◽  
pp. 4905-4918 ◽  
Author(s):  
Antzela Kosta ◽  
Nikolaos Pappas ◽  
Anthony Ephremides ◽  
Vangelis Angelakis

Author(s):  
E. F. G. van Daalen ◽  
J. L. Cozijn ◽  
C. Loussouarn ◽  
P. W. Hemker

In this paper we present a generic optimization algorithm for the allocation of dynamic positioning actuators, such as azimuthing thrusters and fixed thrusters. The algorithm is based on the well-known method of Lagrange multipliers. In the present approach the Lagrangian functional represents not only the cost function (the total power delivered by all actuators), but also all constraints related to thruster saturation and forbidden zones for azimuthing thrusters. The application of the Lagrange multipliers method leads to a non-linear set of equations, because an exact expression for the total power is used and the actuator limitations are accounted for implicitly, by means of non-linear constraints. The set is solved iteratively with the Newton-Raphson method and a step-by-step implementation of the constraints related to the actuator limitations. In addition, the results from the non-linear solution method were compared with those from a simplified set of linear equations based on an approximate (quadratic) expression for the thruster power. The non-linear solution was more accurate, while requiring only a slightly higher computational effort. An example is shown for a configuration with eight azimuthing thrusters, typical for a DP semi-submersible. The results show that the optimization algorithm is very stable and efficient. Finally, some options for improvements and future enhancements, such as including thruster-thruster and thruster-hull interactions and the effects of current, are discussed.
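A stripped-down version of the allocation problem can be sketched as follows. Under the simplified quadratic power assumption, and ignoring the saturation and forbidden-zone constraints (which the paper handles iteratively), the Lagrange multipliers method reduces to the minimum-norm solution t = A^T (A A^T)^{-1} d for the force/moment demand d. The thruster positions below are illustrative, not the paper's configuration:

```python
def allocate(thrusters, demand):
    """Least-power allocation for thrusters via Lagrange multipliers.
    With quadratic power P ~ |t|^2 and only the equality constraint
    A t = [Fx, Fy, Mz], the optimum is the minimum-norm solution
    t = A^T (A A^T)^-1 demand. Saturation and forbidden zones are
    omitted in this sketch; positions are illustrative."""
    n = len(thrusters)
    # Allocation matrix A (3 x 2n): columns map per-thruster (tx, ty)
    # force components onto total surge, sway, and yaw moment.
    A = [[0.0] * (2 * n) for _ in range(3)]
    for i, (x, y) in enumerate(thrusters):
        A[0][2 * i] = 1.0          # Fx receives tx
        A[1][2 * i + 1] = 1.0      # Fy receives ty
        A[2][2 * i] = -y           # Mz = x*ty - y*tx
        A[2][2 * i + 1] = x
    # M = A A^T (3x3); solve M lam = demand, then t = A^T lam
    M = [[sum(A[r][k] * A[c][k] for k in range(2 * n)) for c in range(3)]
         for r in range(3)]
    lam = solve3(M, list(demand))
    return [sum(A[r][k] * lam[r] for r in range(3)) for k in range(2 * n)]

def solve3(M, b):
    """Solve a 3x3 linear system by naive Gauss-Jordan elimination."""
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(3):
            if r != col:
                f = M[r][col] / M[col][col]
                for c in range(3):
                    M[r][c] -= f * M[col][c]
                b[r] -= f * b[col]
    return [b[i] / M[i][i] for i in range(3)]

# Four corner thrusters of a semi-submersible (x, y positions in metres)
pos = [(30, 15), (30, -15), (-30, 15), (-30, -15)]
t = allocate(pos, demand=(100.0, 50.0, 200.0))
fx = sum(t[0::2]); fy = sum(t[1::2])
mz = sum(x * t[2*i + 1] - y * t[2*i] for i, (x, y) in enumerate(pos))
print(f"Fx={fx:.1f} Fy={fy:.1f} Mz={mz:.1f}")  # reproduces the demand exactly
```

Replacing the quadratic power with the exact expression, and adding the saturation and forbidden-zone constraints, is what turns this closed-form solution into the non-linear system that the paper solves with Newton-Raphson.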


2015 ◽  
Vol 8 (4) ◽  
pp. 732-754 ◽  
Author(s):  
Terence Ahern ◽  
P.J. Byrne ◽  
Brian Leavy

Purpose – The purpose of this paper is to extend the learning boundaries of traditional project capability, which follows the linear planning paradigm, to include non-linear complex projects that cannot be completely specified and planned in advance, and so require continuous learning over their life cycles.
Design/methodology/approach – Based on an earlier empirical investigation, in which complex-project capability (CPC) is developed through dynamic organizational learning based on non-linear problem solving, the paper examines some of the conceptual and practical implications of this process insight. The focus here is on incomplete pre-given knowledge and emergent knowledge creation during CPC development.
Findings – Using the three interrelated dimensions of project type, knowledge-creation method, and organizational learning approach, the paper reinterprets Karl Popper's linear problem-solving model for learning in traditional projects by introducing the concept of knowledge entropy (disorder) for learning in non-linear complex projects. The latter follows a path from "order to disorder to order" rather than from "order to order" under traditional assumptions.
Research limitations/implications – By identifying a common learning process at the individual, group, and organizational levels, developing CPC can be viewed as a "learning organization". This multi-level approach facilitates research into distributed organizing for emergent knowledge creation during CPC development.
Practical implications – In contrast to traditional planned projects with up-front prior knowledge, complex projects are characterized by incomplete knowledge. The challenge of dealing with knowledge uncertainty in complex projects through continuous learning has practical implications for project learning, planning, knowledge management, and leadership.
Originality/value – The concept of knowledge entropy (disorder) extends the learning boundaries of traditional projects, where little learning is anticipated, by including complex projects with knowledge uncertainty that require continuous learning.

