Resilience Decision-Making for Complex Systems

Author(s):  
Julian Salomon ◽  
Matteo Broggi ◽  
Sebastian Kruse ◽  
Stefan Weber ◽  
Michael Beer

Abstract Complex systems—such as gas turbines, industrial plants, and infrastructure networks—are of paramount importance to modern societies. However, these systems are subject to various threats. Novel research focuses not only on monitoring and improving the robustness and reliability of systems but also on their recovery from adverse events. The concept of resilience encompasses these developments. Appropriate quantitative measures of resilience can support decision-makers seeking to improve or to design complex systems. In this paper, we develop comprehensive and widely adaptable instruments for resilience-based decision-making. Integrating an appropriate resilience metric with a suitable systemic risk measure, we design numerically efficient tools that aid decision-makers in balancing different resilience-enhancing investments. The approach allows for a direct comparison between failure prevention arrangements and recovery improvement procedures, leading to optimal tradeoffs with respect to the resilience of a system. In addition, the method is capable of dealing with the monetary aspects involved in the decision-making process. Finally, a grid search algorithm for systemic risk measures significantly reduces the computational effort. To demonstrate its wide applicability, the suggested decision-making procedure is applied to a functional model of a multistage axial compressor and to the U-Bahn and S-Bahn system of Germany's capital Berlin.
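As an illustration of the tradeoff the abstract describes, the following sketch grid-searches a fixed budget split between failure prevention and recovery improvement against a simple time-averaged performance metric. The performance curve, parameters, and function names are illustrative assumptions for demonstration, not the authors' model.

```python
import numpy as np

# Hypothetical sketch: the paper's actual metric, component model, and grid
# search are not reproduced here; the dynamics below are illustrative only.

def resilience(prevention, recovery, horizon=100.0, n=1000):
    """Time-averaged performance Q(t) in [0, 1] after a disruption at t = 10.
    Higher `prevention` lowers the performance drop; higher `recovery`
    speeds up the exponential restoration."""
    t = np.linspace(0.0, horizon, n)
    drop = 0.6 * np.exp(-2.0 * prevention)       # damage mitigated by prevention
    rate = 0.05 + 0.5 * recovery                 # restoration speed
    q = np.where(t < 10.0, 1.0, 1.0 - drop * np.exp(-rate * (t - 10.0)))
    return np.trapz(q, t) / horizon              # resilience score in [0, 1]

budget = 1.0
best = max(
    ((p, budget - p, resilience(p, budget - p)) for p in np.linspace(0, budget, 101)),
    key=lambda triple: triple[2],
)
print(f"prevention={best[0]:.2f}, recovery={best[1]:.2f}, resilience={best[2]:.3f}")
```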

2015 ◽  
Author(s):  
Γεώργιος Παπαγιάννης

The main aim of the present thesis is to investigate the effect of diverging priors concerning model uncertainty on decision making. One of the main issues in the thesis is to assess the effect of different notions of distance in the space of probability measures and their use as loss functionals in the process of identifying the best suited model among a set of plausible priors. Another issue is that of addressing the problem of "inhomogeneous" sets of priors, i.e. sets of priors in which highly divergent opinions may occur, and the need to treat that case robustly. As high degrees of inhomogeneity may lead to distrust of the decision maker in the priors, it may be desirable to adopt a particular prior corresponding to the set which somehow minimizes the "variability" among the models in the set. This leads to the notion of the Fréchet risk measure. Finally, an important problem is the actual calculation of robust risk measures. On account of their variational definition, the problem of calculation leads to the numerical treatment of problems of the calculus of variations, for which reliable and effective algorithms are proposed. The contributions of the thesis are presented in the following three chapters.

In Chapter 2, a statistical learning scheme is introduced for constructing the best model compatible with a set of priors provided by different information sources of varying reliability. As various priors may model well different aspects of the phenomenon, the proposed scheme is a variational scheme based on the minimization of a weighted loss function in the space of probability measures, which in certain cases is shown to be equivalent to weighted quantile averaging schemes. Therefore, in contrast to approaches such as minimax decision theory, in which a particular element of the prior set is chosen, we construct for each prior set a probability measure which is not necessarily an element of it, a fact that, as shown, may lead to a better description of the phenomenon in question. While treating this problem we also address the issue of the effect of the choice of distance functional in the space of measures on the problem of model selection. One of the key findings in this respect is that the class of Wasserstein distances seems to have the best performance as compared to other distances such as the KL-divergence.

In Chapter 3, motivated by the results of Chapter 2, we treat the problem of specifying the risk measure for a particular loss when a set of highly divergent priors concerning the distribution of the loss is available. Starting from the principle that the "variability" of opinions is not welcome, a fact for which a strong axiomatic framework is provided (see e.g. Klibanoff (2005) and references therein), we introduce the concept of Fréchet risk measures, which correspond to minimal-variance risk measures. Here we view a set of priors as a discrete measure on the space of probability measures, and by variance we mean the variance of this discrete probability measure. This requires the use of the concept of the Fréchet mean. By different metrizations of the space of probability measures we define a variety of Fréchet risk measures (the Wasserstein, the Hellinger and the weighted entropic risk measure) and illustrate their use and performance via an example related to the static hedging of derivatives under model uncertainty.

In Chapter 4, we consider the problem of numerical calculation of convex risk measures, applying techniques from the calculus of variations. Regularization schemes are proposed and the theoretical convergence of the algorithms is considered.
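As a small illustration of the quantile-averaging equivalence mentioned for Chapter 2, the following sketch computes a weighted 2-Wasserstein barycenter of one-dimensional priors by averaging their quantile functions. The priors, weights, and sample sizes are illustrative assumptions, not the thesis's examples.

```python
import numpy as np

# For one-dimensional distributions, the 2-Wasserstein barycenter of a set
# of priors is obtained by averaging their quantile functions; this is the
# quantile-averaging scheme referred to above.

levels = np.linspace(0.01, 0.99, 99)               # quantile levels

# Three divergent "priors" for a loss, given as samples (assumed shapes).
rng = np.random.default_rng(0)
priors = [rng.normal(0.0, 1.0, 5000),
          rng.normal(1.5, 0.5, 5000),
          rng.standard_t(df=3, size=5000)]
weights = np.array([0.5, 0.3, 0.2])                # source reliabilities

# Quantile function of each prior, then their weighted average.
quantiles = np.array([np.quantile(s, levels) for s in priors])
barycenter_q = weights @ quantiles                 # barycenter's quantile function

# Note: the barycenter need not coincide with any prior in the set.
print("median of barycenter:", barycenter_q[np.searchsorted(levels, 0.5)])
```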


2020 ◽  
Vol 12 (22) ◽  
pp. 9306
Author(s):  
Nikolaos A. Skondras ◽  
Demetrios E. Tsesmelis ◽  
Constantina G. Vasilakou ◽  
Christos A. Karavitis

The terms ‘resilience’ and ‘vulnerability’ have been widely used, with multiple interpretations, in a plethora of disciplines. Such a variety may easily become confusing and could create misconceptions among different users. Policy makers, who are bound to make decisions at key spatial and temporal points, may especially suffer from these misconceptions. The need for decisions may become even more pressing in times of crisis, when the weaknesses of a system are exposed and immediate actions to enhance the systemic strengths must be taken. The analysis framework proposed in the current effort, and demonstrated in hypothetical forest fire cases, focuses on the combined use of simplified versions of the resilience and vulnerability concepts. Their relations and outcomes are also explored, in an effort to provide decision makers with an initial assessment of the information required to deal with complex systems. It is believed that the framework may offer some service towards the development of a more integrated and applicable tool, in order to further expand the concepts of resilience and vulnerability. Additionally, the results of the framework can be used as inputs in other decision-making techniques and approaches. This increases the added value of the framework as a tool.


2020 ◽  
Vol 30 (Supplement 5) ◽  
Author(s):  
C A Fergus ◽  
T Allen ◽  
M Parker ◽  
G Pearson ◽  
L Storer ◽  
...  

Abstract Background The linear theories of change which ground many interventions do not account for the complex processes and systems in which they are implemented. This reductionist approach prioritises statistical methods which do not accommodate the stochastic, non-linear, dynamic interactions between humans and their environment. The inclusion of practitioners in the process of evidence development, together with the use of complex systems methods, mitigates these issues and results in locally relevant, timely evidence for decision-making. Methods The aim of this work was to develop localised evidence for decision-making for schistosomiasis control in Uganda, Malawi, and Tanzania. Workshops were conducted with practitioners from the Ministries of Health (MoH) at various levels and partner organisations to identify evidence needs for their decision-making processes and perceptions of disease transmission and control activities. Participatory systems mapping was used to identify factors directly and indirectly related to transmission. The maps were synthesised into a master complex systems map, which served as the blueprint for a generalised spatial agent-based model (ABM) and specific ABMs tailored to the evidence needs of decision-makers. Results There was a gap in the evidence available for practitioners to advocate for resources within MoH and government budgets, as well as to assess intervention efficacy and resource allocation. The adaptable and data-inclusive characteristics of the ABMs made them well-suited to produce localised outputs. Converted to NetLogo with a tailored user interface, these models were appropriate and responsive to the needs of decision-makers from village to national levels and across country contexts. Conclusions Used together, participatory and agent-based modelling resulted in the development of responsive and relevant evidence for practitioner decision-making. This process is generalisable and transferable to other diseases and locations outside of those in this study. Key messages The use of participatory systems mapping to develop agent-based models resulted in relevant and timely evidence for practitioner decision-making. The approach used here is transferable and generalisable outside schistosomiasis control and the contexts in this study.
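For context, the following is a minimal sketch of an agent-based transmission model of the general kind described: a susceptible-infected-susceptible process on a random contact network. It does not reproduce the study's NetLogo models; all rates, the network structure, and the function name are assumptions.

```python
import random

# Hypothetical SIS agent-based model: agents recover back to susceptible,
# and infection spreads through a fixed random contact network.

def sis_abm(n_agents=500, n_contacts=8, beta=0.05, gamma=0.02, steps=365, seed=1):
    rng = random.Random(seed)
    # Each agent keeps a fixed set of randomly drawn contacts.
    contacts = [rng.sample(range(n_agents), n_contacts) for _ in range(n_agents)]
    infected = [False] * n_agents
    for i in rng.sample(range(n_agents), 10):        # seed initial infections
        infected[i] = True
    for _ in range(steps):
        nxt = infected[:]
        for i in range(n_agents):
            if infected[i]:
                if rng.random() < gamma:             # recovery (back to susceptible)
                    nxt[i] = False
            elif any(infected[j] for j in contacts[i]) and rng.random() < beta:
                nxt[i] = True                        # infection through a contact
        infected = nxt
    return sum(infected) / n_agents

print(f"endemic prevalence after one year: {sis_abm():.2%}")
```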


2020 ◽  
Author(s):  
Denisa Banulescu-Radu ◽  
Christophe Hurlin ◽  
Jérémy Leymarie ◽  
Olivier Scaillet

This paper proposes an original approach for backtesting systemic risk measures. This backtesting approach makes it possible to assess the systemic risk measure forecasts used to identify the financial institutions that contribute the most to the overall risk in the financial system. Our procedure is based on simple tests similar to those generally used to backtest standard market risk measures such as value-at-risk or expected shortfall. We introduce a concept of violation associated with the marginal expected shortfall (MES), and we define unconditional coverage and independence tests for these violations. We can generalize these tests to any MES-based systemic risk measures such as the systemic expected shortfall (SES), the systemic risk measure (SRISK), or the delta conditional value-at-risk (ΔCoVaR). We study their asymptotic properties in the presence of estimation risk and investigate their finite sample performance via Monte Carlo simulations. An empirical application to a panel of U.S. financial institutions is conducted to assess the validity of MES, SRISK, and ΔCoVaR forecasts issued from a bivariate GARCH model with a dynamic conditional correlation structure. Our results show that this model provides valid forecasts for MES and SRISK when considering a medium-term horizon. Finally, we propose an early warning system indicator for future systemic crises deduced from these backtests. Our indicator quantifies the measurement error of a systemic risk forecast at a given point in time, which can serve for the early detection of global market reversals. This paper was accepted by Kay Giesecke, finance.
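As a rough illustration of the backtesting logic, the sketch below applies a Kupiec-type unconditional coverage likelihood-ratio test to a 0/1 violation series, in the spirit of standard VaR backtests. The paper's precise definition of an MES violation is not reproduced; the data are simulated and the function name is hypothetical.

```python
import numpy as np
from scipy.stats import chi2

def kupiec_uc_test(violations, alpha):
    """Likelihood-ratio test that the violation frequency equals alpha."""
    n = len(violations)
    x = int(np.sum(violations))                     # number of violations
    pi_hat = x / n                                  # empirical violation rate

    def loglik(p):                                  # Bernoulli log-likelihood
        return x * np.log(p) + (n - x) * np.log(1 - p)

    # LR statistic: 2 * (logL(pi_hat) - logL(alpha)), asymptotically chi2(1).
    lr = 2.0 * (loglik(pi_hat) - loglik(alpha))
    return lr, 1.0 - chi2.cdf(lr, df=1)

rng = np.random.default_rng(42)
viol = rng.random(1000) < 0.07                      # simulated 7% violation rate
lr, pval = kupiec_uc_test(viol, alpha=0.05)         # test against nominal 5%
print(f"LR = {lr:.2f}, p-value = {pval:.4f}")       # small p-value -> reject coverage
```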


2016 ◽  
Vol 17 (4) ◽  
pp. 374-389 ◽  
Author(s):  
Sascha Strobl

Purpose – This study investigates the risk-taking behavior of financial institutions in the USA. Specifically, differences between taking risks that affect primarily the shareholders of the institution and risks contributing to the overall systemic risk of the financial sector are examined. Additionally, differences between risk-taking before, during and after the financial crisis of 2007/2008 are examined.

Design/methodology/approach – To analyze the determinants of stand-alone and systemic risk, a generalized linear model including size, governance, charter value, business cycle, competition and control variables is estimated. Furthermore, Granger causality tests are conducted.

Findings – The results show that systemic risk has a positive effect on valuation and that corporate governance has no significant effect on risk-taking. The influence of competition is conditional on the state of the economy and the risk measure used. Systemic risk Granger-causes idiosyncratic risk but not vice versa.

Research limitations/implications – The major limitations of this study are the analyzed subset of large financial institutions and the omission of important risk-culture variables.

Practical implications – The broad policy implication of this paper is that systemic risk cannot be lowered by market discipline due to the moral hazard problem. Therefore, regulatory measures are necessary to ensure that individual financial institutions are not endangering the financial system.

Originality/value – This study contributes to the empirical literature on bank risk-taking in several ways. First, the characteristics of systemic risk and idiosyncratic risk are jointly analyzed. Second, the direction of causality of these two risk measures is examined. Moreover, this paper contributes to the discussion of the effect of competition on risk-taking.
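As an illustration of the Granger-causality step, here is a minimal sketch with simulated risk series using statsmodels; the study's actual data, risk measures, and lag selection are not reproduced.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(7)
n = 500
systemic = 0.01 * rng.standard_normal(n)                  # simulated systemic risk
# Idiosyncratic risk partly driven by last period's systemic risk.
idiosyncratic = 0.5 * np.roll(systemic, 1) + 0.01 * rng.standard_normal(n)

# Drop the first row to discard np.roll's wrap-around value.
df = pd.DataFrame({"idiosyncratic": idiosyncratic, "systemic": systemic}).iloc[1:]

# Tests the null that the second column does NOT Granger-cause the first;
# with the simulated dependence above, rejection is expected at short lags.
grangercausalitytests(df[["idiosyncratic", "systemic"]], maxlag=4)
```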


2018 ◽  
Vol 5 (331) ◽  
pp. 153-167
Author(s):  
Dominik Krężołek ◽  
Grażyna Trzpiot

The decision-making process is an individual matter for each investor, and the strategy they choose reflects the level of accepted risk. Nevertheless, every investor wants to minimize large losses while maximizing profits. As far as the measurement of risk is concerned, the literature is full of examples of tools which help to evaluate risk. However, the level of risk usually differs depending on the circumstances. In this paper we present two non-classical risk measures: the Omega performance measure and the GlueVaR risk measure. Both of them require a threshold to be set, which reflects the starting point from which the investment is considered a loss. The effectiveness of the Omega and GlueVaR risk measures is compared using the example of investments in the metals market.
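A minimal sketch of both measures on simulated returns follows. The threshold, confidence levels, and weights are illustrative assumptions; the GlueVaR form used is the standard weighted combination of TVaR and VaR due to Belles-Sampera et al. (2014), which may differ in detail from the paper's parametrization.

```python
import numpy as np

def omega(returns, threshold):
    """Omega ratio: expected gains above the threshold over expected
    shortfalls below it."""
    gains = np.maximum(returns - threshold, 0.0).mean()
    losses = np.maximum(threshold - returns, 0.0).mean()
    return gains / losses

def glue_var(losses, alpha=0.95, beta=0.99, w=(0.3, 0.3, 0.4)):
    """GlueVaR as w1*TVaR_beta + w2*TVaR_alpha + w3*VaR_alpha
    (weights w must sum to one)."""
    var_a = np.quantile(losses, alpha)
    var_b = np.quantile(losses, beta)
    tvar_a = losses[losses >= var_a].mean()          # tail mean beyond VaR_alpha
    tvar_b = losses[losses >= var_b].mean()          # tail mean beyond VaR_beta
    return w[0] * tvar_b + w[1] * tvar_a + w[2] * var_a

rng = np.random.default_rng(3)
returns = 0.0005 + 0.02 * rng.standard_t(df=4, size=10_000)  # heavy-tailed returns
print(f"Omega(0) = {omega(returns, 0.0):.3f}")
print(f"GlueVaR  = {glue_var(-returns):.4f}")                # losses = -returns
```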


Complexity ◽  
2021 ◽  
Vol 2021 ◽  
pp. 1-30
Author(s):  
Abolfazl Taghavi ◽  
Sharif Khaleghparast ◽  
Kourosh Eshghi

Making proper decisions in today's complex world is a challenging task for decision makers. A promising approach that can support decision makers in gaining a better understanding of complex systems is agent-based modeling (ABM). ABM has developed over the last few decades into a methodology with many different applications and has enabled a better description of the dynamics of complex systems. However, the prescriptive facet of these applications is rarely portrayed. Adding a prescriptive decision-making (DM) aspect to ABM can support decision makers in making better or, in some cases, optimized decisions for complex problems, as well as in explaining the investigated phenomena. In this paper, first, the literature on DM with ABM is surveyed and classified based on the methods of integration. A scientometric analysis of the relevant literature shows that the number of publications attempting to integrate DM and ABM has not grown during the last two decades, while an analysis of the current methodologies for integrating DM and ABM indicates that they have serious drawbacks. In this regard, a novel nature-inspired model articulation called the optimal agent framework (OAF) is proposed to ameliorate these disadvantages and enhance the realization of proper decisions in ABM at a relatively low computational cost. The framework is examined with the Bass diffusion model. The results of the simulation for the customized model developed with the OAF verify the feasibility of the framework. Moreover, sensitivity analyses on different agent populations, network structures, and marketing strategies demonstrate the great potential of the OAF to find optimal strategies in various stochastic and unconventional conditions which had not been addressed prior to the implementation of the framework.
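To make the test case concrete, here is a minimal agent-based version of the Bass diffusion model (external influence p, social imitation q). The coefficients and population size are assumptions, and the OAF optimization layer itself is not sketched.

```python
import random

def bass_abm(n_agents=1000, p=0.03, q=0.38, steps=30, seed=11):
    """p: innovation coefficient (external influence),
    q: imitation coefficient (influence of prior adopters)."""
    rng = random.Random(seed)
    adopted = [False] * n_agents
    history = []
    for _ in range(steps):
        share = sum(adopted) / n_agents              # current adopter share
        for i in range(n_agents):
            # Each non-adopter adopts with probability p + q * share.
            if not adopted[i] and rng.random() < p + q * share:
                adopted[i] = True
        history.append(sum(adopted))
    return history

curve = bass_abm()
print("cumulative adopters per period:", curve[:10], "...")
```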


2018 ◽  
Vol 08 (04) ◽  
pp. 1840007 ◽  
Author(s):  
Fabrizio Cipollini ◽  
Alessandro Giannozzi ◽  
Fiammetta Menchetti ◽  
Oliviero Roggi

Following the 2007–2008 financial crisis, advanced risk measures were proposed with the specific aim of quantifying systemic risk, since the existing systematic (market) risk measures seemed inadequate to signal the collapse of an entire financial system. The paper compares the systemic risk measures and the earlier market risk measures with regard to their ability to predict the failure of financial companies. Focusing on the 2007–2008 period and considering 28 large US financial companies (among which nine defaulted in the period), four systematic and four systemic risk measures are used to rank the companies according to their risk and to estimate their relationship with the company's failure through a Cox survival model. We found that the two groups of risk measures achieve similar scores in the ranking exercise, and that both show a significant effect on the time-to-default of the financial institutions. This last result appears even stronger when the Cox model uses, as covariates, the risk measures evaluated one, three and six months before. In this last case, the risk measures most predictive of the default risk of financial institutions were the Expected Shortfall, the Value-at-Risk, the [Formula: see text] and the [Formula: see text]. We contribute to the literature in two ways. We provide a way to compare risk measures based on their ability to predict the company's failure, the most catastrophic event for a company. The survival model approach makes it possible to map each risk measure into a probability of default over a given time horizon. We note, finally, that although the analysis is focused on the Great Recession in the US, it can be applied to different periods and countries.
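As an illustration of the survival-model step, here is a sketch with the lifelines library on simulated data. The study's actual 28-institution sample and its risk-measure covariates are not reproduced; all column names and coefficients are placeholders.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 28
df = pd.DataFrame({
    "es_lag3m":  rng.normal(0.05, 0.02, n),   # Expected Shortfall, 3 months before
    "var_lag3m": rng.normal(0.03, 0.01, n),   # Value-at-Risk, 3 months before
})
# Synthetic link: riskier institutions fail sooner in this toy example.
risk = 20 * df["es_lag3m"] + 10 * df["var_lag3m"]
df["duration"] = rng.exponential(24 / np.exp(risk))        # months to default/censoring
df["default"] = (rng.random(n) < 0.32).astype(int)         # ~9 of 28 defaulted

cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="default")
cph.print_summary()                                        # hazard ratios per covariate
```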


2021 ◽  
Vol 0 (0) ◽  
Author(s):  
Zachary Feinstein ◽  
Birgit Rudloff

Abstract In this paper we present results on dynamic multivariate scalar risk measures, which arise in markets with transaction costs and systemic risk. Dual representations of such risk measures are presented. These are then used to obtain the main results of this paper on time consistency, namely an equivalence between a recursive formulation of multivariate scalar risk measures and multiportfolio time consistency. We are motivated to study time consistency of multivariate scalar risk measures because the superhedging risk measure in markets with transaction costs (with a single eligible asset) (Jouini and Kallal (1995), Löhne and Rudloff (2014), Roux and Zastawniak (2016)) does not satisfy the usual scalar concept of time consistency. In fact, as demonstrated in Feinstein and Rudloff (2021), scalar risk measures with the same scalarization weight at all times would not be time consistent in general. The recursive relation deduced in this paper for the scalarizations of multiportfolio time consistent set-valued risk measures requires consideration of the entire family of scalarizations. In this way we develop a direct notion of a "moving scalarization" for scalar time consistency that corroborates recent research on scalarizations of dynamic multi-objective problems (Karnam, Ma and Zhang (2017), Kováčová and Rudloff (2021)).
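For context, the usual scalar notion of time consistency that the superhedging example fails under a fixed scalarization weight can be written as the following standard recursion; this is textbook notation, not a formula quoted from the paper.

```latex
% Scalar time consistency: assessing X at time t directly, or assessing it
% at t+1 first and folding that value back, must give the same result.
\rho_t(X) = \rho_t\!\left(-\rho_{t+1}(X)\right), \qquad 0 \le t \le T-1 .
```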

