Decision-Making for Resilience-Enhancing Endowments in Complex Systems using Principles of Risk Measures

Author(s):  
Julian Salomon ◽  
Sebastian Kruse ◽  
Matteo Broggi ◽  
Stefan Weber ◽  
Michael Beer

Abstract Complex systems, such as gas turbines, industrial plants, and infrastructure networks, are of paramount importance to modern societies. However, these systems are subject to various threats. Recent research focuses not only on monitoring and improving the robustness and reliability of systems but also on their recovery from adverse events. The concept of resilience encompasses these developments. Appropriate quantitative measures of resilience can support decision-makers seeking to improve or to design complex systems. In this paper, we develop comprehensive and widely adaptable instruments for resilience-based decision-making. Integrating an appropriate resilience metric with a suitable systemic risk measure, we design numerically efficient tools that aid decision-makers in balancing different resilience-enhancing investments. The approach allows for a direct comparison between failure prevention arrangements and recovery improvement procedures, leading to optimal tradeoffs with respect to the resilience of a system. In addition, the method is capable of dealing with the monetary aspects involved in the decision-making process. Finally, a grid search algorithm for systemic risk measures significantly reduces the computational effort. To demonstrate its wide applicability, the suggested decision-making procedure is applied to a functional model of a multistage axial compressor and to the U-Bahn and S-Bahn system of Germany's capital Berlin.
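To make the flavor of such tooling concrete, the sketch below shows a naive grid search for the cheapest split of a budget between failure prevention and recovery improvement that keeps a Monte Carlo resilience estimate above an acceptability threshold, in the spirit of acceptance-set-based systemic risk measures. The toy system, resilience metric, cost structure, and threshold are all invented for illustration; this is not the authors' model or algorithm.

```python
# Illustrative sketch (not the authors' implementation): grid search over
# resilience-enhancing endowments, in the spirit of acceptance-set-based
# systemic risk measures. All model ingredients below are toy assumptions.
import numpy as np

rng = np.random.default_rng(0)

def resilience(prevention, recovery, n_sim=2000, horizon=100.0):
    """Monte Carlo estimate of a [0, 1] resilience metric: the expected
    fraction of the horizon during which the system performs.
    Prevention spending lowers the failure rate; recovery spending
    shortens the repair time (both purely illustrative)."""
    failure_rate = 0.05 / (1.0 + prevention)   # failures per unit time
    repair_time = 10.0 / (1.0 + recovery)      # mean downtime per failure
    n_failures = rng.poisson(failure_rate * horizon, n_sim)
    downtime = np.minimum(n_failures * repair_time, horizon)
    return np.mean(1.0 - downtime / horizon)

def cheapest_acceptable(budget_grid, threshold=0.95):
    """Grid search: cheapest (prevention, recovery) split whose resilience
    meets the acceptability threshold, a discrete analogue of evaluating
    a systemic risk measure as a minimal acceptable capital allocation."""
    best = None
    for p in budget_grid:
        for r in budget_grid:
            if resilience(p, r) >= threshold:
                cost = p + r
                if best is None or cost < best[0]:
                    best = (cost, p, r)
    return best

grid = np.linspace(0.0, 5.0, 11)
cost, p_star, r_star = cheapest_acceptable(grid)
print(f"minimal cost {cost:.1f}: prevention {p_star:.1f}, recovery {r_star:.1f}")
```

A real implementation would exploit structure in the acceptance region to prune the grid, which is presumably where the computational savings the abstract mentions would come from; the brute-force loop above only conveys the structure of the acceptability test.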


Risks ◽  
2021 ◽  
Vol 9 (6) ◽  
pp. 115
Author(s):  
Despoina Makariou ◽  
Pauline Barrieu ◽  
George Tzougas

The key purpose of this paper is to present an alternative viewpoint for combining expert opinions based on finite mixture models. Moreover, the components of the mixture are not necessarily assumed to belong to the same parametric family. This approach enables the agent to make informed decisions about the uncertain quantity of interest in a flexible manner that accounts for multiple sources of heterogeneity in the opinions expressed by the experts: the parametric family, the parameters of each component density, and the mixing weights. Finally, the proposed models are employed for numerically computing quantile-based risk measures in a collective decision-making context.
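As a concrete illustration of the setting (not taken from the paper), the sketch below combines two hypothetical expert opinions, one normal and one lognormal, into a two-component mixture with credibility weights, and computes a quantile-based risk measure (Value-at-Risk) by root-finding on the mixture CDF. All distributions, parameters, and weights are invented.

```python
# Illustrative sketch: combining two expert opinions as a finite mixture
# whose components come from *different* parametric families, then
# computing a quantile-based risk measure (Value-at-Risk) numerically.
# Experts, parameters and weights below are invented for illustration.
from scipy import stats, optimize

components = [stats.norm(loc=10.0, scale=2.0),    # expert 1: normal opinion
              stats.lognorm(s=0.5, scale=12.0)]   # expert 2: lognormal opinion
weights = [0.6, 0.4]                              # mixing weights (credibility)

def mixture_cdf(x):
    return sum(w * c.cdf(x) for w, c in zip(weights, components))

def value_at_risk(alpha):
    """alpha-quantile of the mixture, found by bracketing the CDF."""
    return optimize.brentq(lambda x: mixture_cdf(x) - alpha, -100.0, 1000.0)

print(f"VaR 95%: {value_at_risk(0.95):.2f}")
```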


2020 ◽  
Vol 26 (6) ◽  
pp. 2927-2955
Author(s):  
Mar Palmeros Parada ◽  
Lotte Asveld ◽  
Patricia Osseweijer ◽  
John Alexander Posada

Abstract Biobased production has been promoted as a sustainable alternative to fossil resources. However, controversies over its impact on sustainability highlight societal concerns, value tensions, and uncertainties that have not been taken into account during its development. In this work, the consideration of stakeholders’ values in a biorefinery design project is investigated. Value sensitive design (VSD) is a promising approach to the design of technologies with consideration of stakeholders’ values; however, it is not directly applicable to complex systems like biorefineries. Therefore, some elements of VSD, such as the identification of relevant values and their connection to a technology’s features, are brought into biorefinery design practice. Midstream modulation (MM), an approach to promoting the consideration of societal aspects during research and development activities, is applied to promote reflection and value considerations during design decision-making. As a result, it is shown that MM interventions during the design process led to new design alternatives in support of stakeholders’ values and made it possible to recognize and respond to emerging value tensions within the scope of the project. In this way, the present work demonstrates a novel approach to the technical investigation of VSD, especially for biorefineries. Also, based on this work, it is argued that not only reflection but also flexibility and openness are important for the application of VSD in the context of biorefinery design.


2014 ◽  
Vol 17 (03n04) ◽  
pp. 1450016 ◽  
Author(s):  
V. I. YUKALOV ◽  
D. SORNETTE

The idea is advanced that self-organization in complex systems can be treated as decision making (as it is performed by humans) and that, vice versa, decision making is nothing but a kind of self-organization in the decision maker's nervous system. A mathematical formulation is suggested based on the definition of probabilities of system states, whose particular cases characterize the probabilities of structures, patterns, scenarios, or prospects. In this general framework, it is shown that the mathematical structures of self-organization and of decision making are identical. This makes it clear how self-organization can be seen as an endogenous decision-making process and, reciprocally, how decision making occurs via an endogenous self-organization. The approach is illustrated by phase transitions in large statistical systems, crossovers in small statistical systems, evolutions and revolutions in social and biological systems, structural self-organization in dynamical systems, and the probabilistic formulation of classical and behavioral decision theories. In all these cases, self-organization is described as the process of evaluating the probabilities of macroscopic states or prospects in the search for the state with the largest probability. The general way of deriving the probability measure for classical systems is the principle of minimal information, that is, conditional entropy maximization under given constraints. Behavioral biases of decision makers can be characterized within the same framework, as analogs of quantum fluctuations in natural systems.
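For concreteness, the principle of minimal information invoked here is the standard constrained entropy maximization; in illustrative notation (states n with costs C_n, not the paper's symbols), the derivation reads:

```latex
% Standard minimal-information (maximum-entropy) derivation; the
% notation (states n, costs C_n) is chosen here for illustration.
\max_{\{p_n\}} \; S[p] = -\sum_n p_n \ln p_n
\quad \text{subject to} \quad
\sum_n p_n = 1, \qquad \sum_n p_n C_n = \bar{C}.
% Setting the variation of the Lagrangian to zero,
%   \mathcal{L}[p] = S[p] - \lambda\Big(\sum_n p_n - 1\Big)
%                    - \beta\Big(\sum_n p_n C_n - \bar{C}\Big),
% yields the Gibbs-type probability measure
p_n = \frac{e^{-\beta C_n}}{\sum_m e^{-\beta C_m}},
% with the multiplier \beta fixed by the cost constraint; the most
% probable macroscopic state is then the one with the largest p_n.
```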


Criminology ◽  
2021 ◽  
Author(s):  
James C. Oleson

The evidence-based practice (EBP) movement can be traced to a 1992 article in the Journal of the American Medical Association, although decision-making with empirical evidence (rather than tradition, anecdote, or intuition) is obviously much older. Nevertheless, for the last twenty-five years, EBP has played a pivotal role in criminal justice, particularly within community corrections. While the prediction of recidivism in parole or probation decisions has attracted relatively little attention, the use of risk measures by sentencing judges is controversial. This might be because sentencing typically involves both backward-looking decisions, related to the blameworthiness of the crime, and forward-looking decisions, about the offender’s prospective risk of recidivism. Evidence-based sentencing quantifies the predictive aspects of decision-making by incorporating an assessment of risk factors (which increase recidivism risk), protective factors (which reduce recidivism risk), criminogenic needs (impairments that, if addressed, will reduce recidivism risk), the measurement of recidivism risk, and the identification of optimal recidivism-reducing sentencing interventions. Proponents of evidence-based sentencing claim that it can allow judges to “sentence smarter” by using data to distinguish high-risk offenders (who might be imprisoned to mitigate their recidivism risk) from low-risk offenders (who might be released into the community with relatively little danger). This, proponents suggest, can reduce unnecessary incarceration, decrease costs, and enhance community safety. Critics, however, note that risk assessment typically looks beyond criminal conduct, incorporating demographic and socioeconomic variables. Even if a risk factor is facially neutral (e.g., criminal history), it might operate as a proxy for a constitutionally protected category (e.g., race). The same objectionable variables are used widely in presentence reports, but their incorporation into an actuarial risk score has greater potential to obfuscate facts and reify underlying disparities. The evidence-based sentencing literature is dynamic and rapidly evolving, but this bibliography identifies sources that might prove useful. It first outlines the theoretical foundations of traditional (non-evidence-based) sentencing, identifying resources and overviews. It then identifies sources related to decision-making and prediction, risk assessment logic, criminogenic needs, and responsivity. The bibliography then describes and defends evidence-based sentencing, and identifies works on sentencing variables and risk assessment instruments. It then relates evidence-based sentencing to big data and identifies data issues. Several works on constitutional problems are listed, the proxies problem is described, and sources on philosophical issues are identified. The bibliography concludes with a description of validation research, the politics of evidence-based sentencing, and the identification of several current initiatives.
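To make the mechanics of such instruments concrete, the sketch below shows the generic form of an actuarial risk score: weighted risk factors raise the score, protective factors lower it, and a logistic link converts the score to a recidivism probability. Every factor and weight here is hypothetical and drawn from no real or validated instrument.

```python
# Entirely hypothetical actuarial risk score, sketched only to make the
# mechanics concrete: risk factors add to a linear score, protective
# factors subtract from it, and a logistic link maps the score to a
# recidivism probability. No weight or factor comes from a real instrument.
import math

RISK_WEIGHTS = {"prior_convictions": 0.40, "age_under_25": 0.60}
PROTECTIVE_WEIGHTS = {"stable_employment": 0.50, "completed_program": 0.30}

def recidivism_probability(risk, protective, intercept=-1.5):
    score = intercept
    score += sum(RISK_WEIGHTS[f] * v for f, v in risk.items())
    score -= sum(PROTECTIVE_WEIGHTS[f] * v for f, v in protective.items())
    return 1.0 / (1.0 + math.exp(-score))  # logistic link

p = recidivism_probability({"prior_convictions": 3, "age_under_25": 1},
                           {"stable_employment": 1, "completed_program": 0})
print(f"predicted recidivism probability: {p:.2f}")
```

The critics' proxy concern is visible even in this toy: any facially neutral entry in the weight tables can smuggle in a protected characteristic if it correlates with one.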


2015 ◽  
Author(s):  
Γεώργιος Παπαγιάννης

The main aim of the present thesis is to investigate the effect of diverging priors concerning model uncertainty on decision making. One of the main issues in the thesis is to assess the effect of different notions of distance in the space of probability measures and their use as loss functionals in the process of identifying the best-suited model among a set of plausible priors. Another issue is that of addressing the problem of "inhomogeneous" sets of priors, i.e., sets of priors in which highly divergent opinions may occur, and the need to treat that case robustly. As high degrees of inhomogeneity may lead the decision maker to distrust the priors, it may be desirable to adopt a particular prior corresponding to the set which somehow minimizes the "variability" among the models in the set. This leads to the notion of the Fréchet risk measure. Finally, an important problem is the actual calculation of robust risk measures. On account of their variational definition, the problem of calculation leads to the numerical treatment of problems of the calculus of variations, for which reliable and effective algorithms are proposed. The contributions of the thesis are presented in the following three chapters.

In Chapter 2, a statistical learning scheme is introduced for constructing the best model compatible with a set of priors provided by different information sources of varying reliability. As various priors may model well different aspects of the phenomenon, the proposed scheme is a variational scheme based on the minimization of a weighted loss function in the space of probability measures, which in certain cases is shown to be equivalent to weighted quantile averaging schemes. Therefore, in contrast to approaches such as minimax decision theory, in which a particular element of the prior set is chosen, we construct for each prior set a probability measure which is not necessarily an element of it, a fact that, as shown, may lead to a better description of the phenomenon in question. While treating this problem, we also address the effect of the choice of distance functional in the space of measures on the problem of model selection. One of the key findings in this respect is that the class of Wasserstein distances seems to have the best performance compared to other distances such as the KL-divergence.

In Chapter 3, motivated by the results of Chapter 2, we treat the problem of specifying the risk measure for a particular loss when a set of highly divergent priors concerning the distribution of the loss is available. Starting from the principle that "variability" of opinions is not welcome, a fact for which a strong axiomatic framework is provided (see, e.g., Klibanoff (2005) and references therein), we introduce the concept of Fréchet risk measures, which correspond to minimal-variance risk measures. Here we view a set of priors as a discrete measure on the space of probability measures, and by variance we mean the variance of this discrete probability measure. This requires the use of the concept of the Fréchet mean. By different metrizations of the space of probability measures we define a variety of Fréchet risk measures, namely the Wasserstein, the Hellinger, and the weighted entropic risk measures, and illustrate their use and performance via an example related to the static hedging of derivatives under model uncertainty.

In Chapter 4, we consider the problem of the numerical calculation of convex risk measures, applying techniques from the calculus of variations. Regularization schemes are proposed, and the theoretical convergence of the algorithms is considered.
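The equivalence noted in Chapter 2 between model combination in Wasserstein space and weighted quantile averaging is easy to make concrete for one-dimensional measures, where the 2-Wasserstein barycenter (the Fréchet mean under that metric) is obtained by averaging quantile functions. The priors, weights, and grid below are invented for illustration; this is a sketch of the general idea, not code from the thesis.

```python
# Illustrative sketch of the weighted quantile averaging that, for
# one-dimensional measures, realizes the 2-Wasserstein barycenter
# (the Fréchet mean under that metric). Priors and weights are
# invented for illustration.
import numpy as np
from scipy import stats

priors = [stats.norm(0.0, 1.0), stats.norm(3.0, 2.0), stats.expon(scale=1.5)]
weights = np.array([0.5, 0.3, 0.2])  # reliability of each information source

u = np.linspace(0.001, 0.999, 999)                # quantile levels
quantiles = np.array([p.ppf(u) for p in priors])  # each prior's quantile fn
barycenter_q = weights @ quantiles                # weighted quantile average

# barycenter_q tabulates the quantile function of the combined model;
# e.g. its median and a 95% quantile-based risk measure:
print("median:", barycenter_q[u.searchsorted(0.5)])
print("VaR 95%:", barycenter_q[u.searchsorted(0.95)])
```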

