Stand-alone vs systemic risk-taking of financial institutions

2016 ◽  
Vol 17 (4) ◽  
pp. 374-389 ◽  
Author(s):  
Sascha Strobl

Purpose This study investigates the risk-taking behavior of financial institutions in the USA. Specifically, differences between taking risks that affect primarily the shareholders of the institution and risks contributing to the overall systemic risk of the financial sector are examined. Additionally, differences between risk-taking before, during and after the financial crisis of 2007/2008 are examined. Design/methodology/approach To analyze the determinants of stand-alone and systemic risk, a generalized linear model including size, governance, charter value, business cycle, competition and control variables is estimated. Furthermore, Granger causality tests are conducted. Findings The results show that systemic risk has a positive effect on valuation and that corporate governance has no significant effect on risk-taking. The influence of competition is conditional on the state of the economy and the risk measure used. Systemic risk Granger-causes idiosyncratic risk but not vice versa. Research limitations/implications The major limitations of this study are related to the analyzed subset of large financial institutions and important risk-culture variables being omitted. Practical implications The broad policy implication of this paper is that systemic risk cannot be lowered by market discipline due to the moral hazard problem. Therefore, regulatory measures are necessary to ensure that individual financial institutions are not endangering the financial system. Originality/value This study contributes to the empirical literature on bank risk-taking in several ways. First, the characteristics of systemic risk and idiosyncratic risk are jointly analyzed. Second, the direction of causality of these two risk measures is examined. Moreover, this paper contributes to the discussion of the effect of competition on risk-taking.
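The Granger causality tests mentioned above can be sketched with a minimal bivariate lag regression (an OLS F-test on simulated data; this is an illustration, not the paper's actual specification or dataset):

```python
import numpy as np

def granger_f_stat(y, x, lags=2):
    """F-statistic for the null that lagged values of x do not help
    predict y, beyond what lagged values of y already explain."""
    T = len(y)
    rows = T - lags
    Y = y[lags:]
    # Restricted design: constant plus lags of y only
    Z_r = np.column_stack([np.ones(rows)] +
                          [y[lags - k:T - k] for k in range(1, lags + 1)])
    # Unrestricted design: additionally include lags of x
    Z_u = np.column_stack([Z_r] +
                          [x[lags - k:T - k] for k in range(1, lags + 1)])
    rss = lambda Z: np.sum((Y - Z @ np.linalg.lstsq(Z, Y, rcond=None)[0]) ** 2)
    rss_r, rss_u = rss(Z_r), rss(Z_u)
    return ((rss_r - rss_u) / lags) / (rss_u / (rows - Z_u.shape[1]))

# Simulated one-way causality: systemic risk feeds into idiosyncratic risk
rng = np.random.default_rng(0)
T = 500
sys_risk = rng.normal(size=T)
idio_risk = np.zeros(T)
for t in range(1, T):
    idio_risk[t] = 0.3 * idio_risk[t - 1] + 0.5 * sys_risk[t - 1] + rng.normal()

f_sys_to_idio = granger_f_stat(idio_risk, sys_risk)  # large: reject non-causality
f_idio_to_sys = granger_f_stat(sys_risk, idio_risk)
```

Comparing each statistic with the F critical value (about 3.0 at the 5% level for two restrictions here) mirrors the paper's finding of one-way causality from systemic to idiosyncratic risk.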

Kybernetes ◽  
2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Ming Qi ◽  
Jiawei Zhang ◽  
Jing Xiao ◽  
Pei Wang ◽  
Danyang Shi ◽  
...  

Purpose In this paper, the interconnectedness among financial institutions and the level of systemic risk of four types of Chinese financial institutions are investigated. Design/methodology/approach By means of the RAS algorithm, the interconnections among financial institutions are illustrated. Different methods, including linear Granger causality, the systemic impact index (SII), the vulnerability index (VI), CoVaR and MES, are used to measure systemic risk exposures across different institutions. Findings The results illustrate that big banks are more interconnected and hold the largest scale of interbank transactions in the financial network. Institutions of larger size tend to have more connections with others. Insurance and securities companies contribute more to systemic risk, whereas other institutions, such as trusts and finance companies, may bring about severe losses and endanger the financial system as a whole. Practical implications Since other institutions with low levels of regulation may bring about higher extreme losses and cause the whole system to suffer, they deserve more attention from regulators, considering the contagion of potential risks in the financial system. Originality/value This study makes a valuable contribution by examining systemic risk from the perspectives of both interconnection and tail-risk measures. Furthermore, four types of financial institutions are investigated in this paper.
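The RAS step can be illustrated with a small sketch: given only each institution's total interbank assets and liabilities (hypothetical numbers below), iterative proportional fitting recovers a bilateral exposure matrix consistent with those totals. This is a generic textbook version, not the paper's calibrated network:

```python
import numpy as np

def ras(row_sums, col_sums, tol=1e-10, max_iter=1000):
    """Estimate a bilateral exposure matrix consistent with given row sums
    (total interbank assets) and column sums (total interbank liabilities)
    via iterative proportional fitting (the RAS algorithm). The diagonal is
    forced to zero: an institution does not lend to itself."""
    X = np.outer(row_sums, col_sums).astype(float)
    np.fill_diagonal(X, 0.0)
    for _ in range(max_iter):
        X *= (row_sums / X.sum(axis=1))[:, None]   # match row targets
        X *= (col_sums / X.sum(axis=0))[None, :]   # match column targets
        if np.abs(X.sum(axis=1) - row_sums).max() < tol:
            break
    return X

# Hypothetical totals for four institutions (grand totals must agree)
assets = np.array([40.0, 30.0, 20.0, 10.0])        # interbank lending
liabilities = np.array([25.0, 25.0, 25.0, 25.0])   # interbank borrowing
X = ras(assets, liabilities)
```

The resulting matrix is the maximum-entropy exposure pattern given the observed totals; network and contagion measures can then be computed on it.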


2020 ◽  
Author(s):  
Denisa Banulescu-Radu ◽  
Christophe Hurlin ◽  
Jérémy Leymarie ◽  
Olivier Scaillet

This paper proposes an original approach for backtesting systemic risk measures. This backtesting approach makes it possible to assess the systemic risk measure forecasts used to identify the financial institutions that contribute the most to the overall risk in the financial system. Our procedure is based on simple tests similar to those generally used to backtest standard market risk measures such as value-at-risk or expected shortfall. We introduce a concept of violation associated with the marginal expected shortfall (MES), and we define unconditional coverage and independence tests for these violations. We can generalize these tests to any MES-based systemic risk measure such as the systemic expected shortfall (SES), the systemic risk measure (SRISK), or the delta conditional value-at-risk (ΔCoVaR). We study their asymptotic properties in the presence of estimation risk and investigate their finite sample performance via Monte Carlo simulations. An empirical application to a panel of U.S. financial institutions is conducted to assess the validity of MES, SRISK, and ΔCoVaR forecasts issued from a bivariate GARCH model with a dynamic conditional correlation structure. Our results show that this model provides valid forecasts for MES and SRISK when considering a medium-term horizon. Finally, we propose an early warning system indicator for future systemic crises deduced from these backtests. Our indicator quantifies the measurement error of a systemic risk forecast at a given point in time, which can serve for the early detection of global market reversals. This paper was accepted by Kay Giesecke, finance.
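A stylized version of the violation idea can be sketched as follows. Here a "violation" is counted when the market sits in its lower β-tail and the firm's realized loss exceeds the MES forecast, and a Kupiec-style likelihood-ratio test checks unconditional coverage. This simplification (a fixed forecast and joint violation probability αβ) only gestures at the paper's formal construction:

```python
import numpy as np
from math import log

def mes_violation_backtest(firm_ret, mkt_ret, mes_forecast, alpha=0.05, beta=0.05):
    """Kupiec-style unconditional-coverage LR statistic for an MES forecast.
    A violation occurs when the market return is in its lower beta-tail AND
    the firm's loss exceeds the MES forecast; under correct coverage the
    violation probability is alpha * beta. Chi-squared(1) under the null."""
    distress = mkt_ret <= np.quantile(mkt_ret, beta)
    hits = distress & (-firm_ret > mes_forecast)
    n, n1 = len(hits), int(hits.sum())
    n0, p = n - n1, alpha * beta
    phat = min(max(n1 / n, 1e-12), 1 - 1e-12)   # clamp the MLE for the logs
    return -2 * (n0 * log(1 - p) + n1 * log(p)
                 - n0 * log(1 - phat) - n1 * log(phat))

# An absurdly conservative (far too high) MES forecast should be rejected
rng = np.random.default_rng(7)
mkt = rng.normal(0.0, 0.01, 2500)
firm = 0.8 * mkt + rng.normal(0.0, 0.008, 2500)
lr = mes_violation_backtest(firm, mkt, mes_forecast=0.05)
```

A statistic above the chi-squared(1) critical value of 3.84 rejects correct coverage at the 5% level; too few violations (an over-conservative forecast) is rejected just as too many would be.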


2018 ◽  
Vol 08 (04) ◽  
pp. 1840007 ◽  
Author(s):  
Fabrizio Cipollini ◽  
Alessandro Giannozzi ◽  
Fiammetta Menchetti ◽  
Oliviero Roggi

Following the 2007–2008 financial crisis, advanced risk measures were proposed with the specific aim of quantifying systemic risk, since the existing systematic (market) risk measures seemed inadequate to signal the collapse of an entire financial system. This paper compares the systemic risk measures and the earlier market risk measures regarding their predictive ability toward the failure of financial companies. Focusing on the 2007–2008 period and considering 28 large US financial companies (of which nine defaulted in the period), four systematic and four systemic risk measures are used to rank the companies according to their risk and to estimate their relationship with each company's failure through a survival Cox model. We found that the two groups of risk measures achieve similar scores in the ranking exercise, and that both show a significant effect on the time-to-default of the financial institutions. This last result appears even stronger when the Cox model uses, as covariates, the risk measures evaluated one, three and six months before. In this last case, the risk measures most predictive of the default risk of financial institutions were the Expected Shortfall, the Value-at-Risk, the [Formula: see text] and the [Formula: see text]. We contribute to the literature in two ways. We provide a way to compare risk measures based on their predictive ability toward the company's failure, the most catastrophic event for a company. The survival model approach makes it possible to map each risk measure into a probability of default over a given time horizon. We note, finally, that although focused on the Great Recession in the US, the analysis can be applied to different periods and countries.
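Two of the market risk measures named above, Value-at-Risk and Expected Shortfall, have simple historical estimators; a toy ranking of two simulated firms (not the paper's 28-company panel or its estimation method) might look like:

```python
import numpy as np

def hist_var_es(returns, alpha=0.05):
    """Historical Value-at-Risk and Expected Shortfall at level alpha,
    both expressed as positive loss figures."""
    q = np.quantile(returns, alpha)              # lower-tail return quantile
    return -q, -returns[returns <= q].mean()     # (VaR, ES)

rng = np.random.default_rng(1)
safe = rng.normal(0.0, 0.01, 2000)               # thin-tailed firm
risky = 0.01 * rng.standard_t(3, 2000)           # fat-tailed firm

var_safe, es_safe = hist_var_es(safe)
var_risky, es_risky = hist_var_es(risky)         # ranks as the riskier firm
```

In a survival setting such as the paper's, measures like these (lagged one, three and six months) would enter the Cox model as covariates, mapping each measure into a default probability over the horizon.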


2014 ◽  
Vol 22 (2) ◽  
pp. 159-172 ◽  
Author(s):  
Natalya A. Schenck

Purpose – This study aims to compare two distance-to-default methods, data-transformed maximum likelihood estimation and “naïve”, that are suitable for financial institutions. The links between these measures and asset size, Tier 1 and Tier 2 capital ratios, non-performing assets and operating efficiency have been examined and an alternative default risk measure has been introduced. Most of the market-based distance-to-default measures are not appropriate for banks due to their unique debt structure. Design/methodology/approach – The author has compared two distance-to-default measures and has identified their accounting determinants using Pearson’s correlation and regressions with clustered standard errors. The sample of the US-based systemically important financial institutions covers the period from 2000 to 2010. Findings – Non-performing assets and operating efficiency are found to be statistically and economically significant determinants of both distance-to-default measures. Tier 1 capital ratio is not a significant indicator of default risk. Practical implications – The results emphasize the importance of using a combination of market-based default risk measures and accounting ratios in default prediction models for the financial institutions. Originality/value – This paper identifies accounting determinants of two distance-to-default measures for large financial institutions, before and during the 2008 financial crisis. It introduces a spread between two measures as an alternative default risk indicator.
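One common "naïve" distance-to-default specification, in the spirit of Bharath and Shumway (2008), can be computed directly from balance-sheet and equity-market inputs; the numbers below are hypothetical, and this sketch may differ in detail from the paper's exact implementation:

```python
from math import log, sqrt

def naive_dd(E, F, sigma_E, r_prev, T=1.0):
    """'Naive' distance-to-default a la Bharath and Shumway (2008).

    E       : market value of equity
    F       : face value of debt
    sigma_E : annualized equity volatility
    r_prev  : firm's stock return over the previous year (drift proxy)
    T       : horizon in years
    """
    V = E + F                                  # naive firm value
    sigma_D = 0.05 + 0.25 * sigma_E            # naive debt volatility
    sigma_V = (E / V) * sigma_E + (F / V) * sigma_D
    return (log(V / F) + (r_prev - 0.5 * sigma_V ** 2) * T) / (sigma_V * sqrt(T))

dd_strong = naive_dd(E=80.0, F=20.0, sigma_E=0.40, r_prev=0.05)   # well capitalized
dd_weak = naive_dd(E=20.0, F=80.0, sigma_E=0.90, r_prev=-0.30)    # stressed
```

A larger distance-to-default means more standard deviations between firm value and the default point, i.e. lower default risk; the spread between such a measure and an MLE-based one is the alternative indicator the paper introduces.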


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Wioleta Kucharska

Purpose This study aims to understand and compare how the mechanism of innovative processes in the information technology (IT) industry – the most innovative industry worldwide – is shaped in Poland and the USA in terms of tacit knowledge awareness and sharing driven by a culture of knowledge and learning, composed of a learning climate and mistake acceptance. Design/methodology/approach Study samples were drawn from the IT industry in Poland (n = 350) and the USA (n = 370) and analyzed using the structural equation modeling method. Findings True learning derives from mistake acceptance. As a result of a risk-taking attitude and critical thinking, the IT industry in the USA is consistently innovation-oriented. Specifically, external innovations are highly correlated with internal innovations. Moreover, a knowledge culture supports a learning culture via a learning climate. A learning climate is an important facilitator for learning from mistakes. Originality/value This study revealed that a high level of mistake acceptance stimulates a risk-taking attitude that offers a high level of tacit knowledge awareness as a result of critical thinking, but critical thinking without readiness to take a risk is useless for tacit knowledge capturing.


2019 ◽  
Vol 45 (8) ◽  
pp. 1076-1091
Author(s):  
Tomoki Kitamura ◽  
Kozo Omori

Purpose The purpose of this paper is to theoretically examine the risk-taking decision of corporate defined benefit (DB) plans. The equity holders' investment problem, represented as a position in a vulnerable option, is solved. Design/methodology/approach The simple traditional contingent claim approach is applied, which considers only the distributions of corporate cash flow, without the model expansions, such as market imperfections, needed to explain firms' behavior for DB plans in previous studies. Findings The authors find that the optimal solution to the equity holders' DB investment problem is not an extreme corner solution, such as 100 percent investment in equity funds, as in the literature. Rather, the solution lies in the middle range, as is commonly observed in real-world economies. Originality/value The major value of this study is that it develops a clear mechanism for obtaining an interior solution to the equity holders' DB investment problem, and it shows that the analysis of corporate investment behavior for DB plans should incorporate the fact that in some cases the optimal solution lies in the middle range. Therefore, the corporate risk-taking behavior of DB plans is harder to identify than the results of the empirical literature have predicted.


2014 ◽  
Vol 22 (1) ◽  
pp. 43-48
Author(s):  
Adolph Neidermeyer ◽  
Naomi E. Boyd ◽  
Presha Neidermeyer

Purpose – The purpose of this paper is to provide a historical perspective and going-forward assessment of the importance of private mortgage insurance (PMI) entities in the residential-lending landscape in the USA. Design/methodology/approach – Financial data from the PMI entities and federal income tax data were analyzed to comment on the importance of the PMI entities in the historical and current mortgage-lending environment. Findings – PMI entities played a critical role in expanding the population of mortgage candidates for financial institutions. Through the guarantees offered by PMI entities, financial institutions granted loans to individuals who otherwise would not have qualified for mortgages. Originality/value – No prior research has assessed the overall historical role played by these primary PMI entities.


2017 ◽  
Vol 9 (2) ◽  
pp. 155-164 ◽  
Author(s):  
Megan R. Kopkin ◽  
Stanley L. Brodsky ◽  
David DeMatteo

Purpose The legal system’s use of risk assessment has grown exponentially over the past several decades. Empirically validated risk measures are commonly implemented in parole, bail, civil commitment, and presentence proceedings. Despite their growing popularity, both policy-makers and legal scholars question their moral and legal acceptability, particularly in presentence proceedings. The purpose of this paper is to assess the current role of risk assessment in sentencing through an examination of the instrument currently under construction in the state of Pennsylvania. Design/methodology/approach Drawing on the current state of the literature, this paper evaluates the current use of risk assessment in criminal sentencing and discusses its consequences, both positive and negative. Findings Four areas for improvement in the use of risk assessment in sentencing were identified. Recommendations for change are proposed. Practical implications While the use of risk assessment within the legal system has significantly increased over the past several decades, the incorporation of risk assessment in presentence proceedings is a relatively new practice. This paper provides readers with insight on the appropriateness of using risk assessment in this context and provides suggestions for reducing ethical concerns. Recommendations for increasing the validity and clinical utility of these instruments are also discussed. Originality/value Although the literature on the use of risk assessment in legal proceedings is dense, relatively little is written about their use in criminal sentencing. This paper introduces readers to this concept by examining a risk measure proposed for use in the state of Pennsylvania’s presentence proceedings. The authors discuss concerns and propose recommendations for the future use of risk assessment in this setting.


2015 ◽  
Vol 10 (4) ◽  
pp. 711-725 ◽  
Author(s):  
Mohamed Bilel Triki ◽  
Samir Maktouf

Purpose – The purpose of this paper is to examine whether the deviations from the cointegrating relationship possess long memory and whether fractional cointegration analyses may capture a wider range of mean-reversion behaviour than standard cointegration analyses. Design/methodology/approach – This paper uses a fractional cointegration technique to test purchasing power parity (PPP). Findings – The authors found that PPP held, but very weakly, in the long run between the exchange rates of Argentina, Brazil, Chile, Colombia, Indonesia, Korea, Mexico, Thailand and Venezuela against the US dollar during the floating exchange rate period, but that the deviations from it did not follow a stationary process. Nevertheless, it is also found that deviations from PPP exist and can be characterized by a fractionally integrated process in nine of the 13 countries studied. Originality/value – The findings are consistent with the consensus of the empirical literature, reviewed earlier in the paper, on PPP between Argentina, Brazil, Chile, Colombia, Indonesia, Korea, Mexico, Thailand and Venezuela and the USA.
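The fractional integration order d of the PPP deviations can be estimated, for example, with a Geweke-Porter-Hudak (GPH) log-periodogram regression; a minimal sketch on simulated series (not the paper's exchange-rate data, and not necessarily its exact estimator) is:

```python
import numpy as np

def gph_estimate(x, power=0.5):
    """Geweke-Porter-Hudak log-periodogram estimate of the fractional
    integration order d (0 < d < 0.5: stationary long memory;
    0.5 <= d < 1: nonstationary but mean-reverting; d = 1: unit root)."""
    n = len(x)
    m = int(n ** power)                               # low frequencies used
    lam = 2.0 * np.pi * np.arange(1, m + 1) / n       # Fourier frequencies
    I = np.abs(np.fft.fft(x - x.mean())[1:m + 1]) ** 2 / (2.0 * np.pi * n)
    # Regress log I(lam_j) on -log(4 sin^2(lam_j / 2)); the slope estimates d
    reg = -np.log(4.0 * np.sin(lam / 2.0) ** 2)
    X = np.column_stack([np.ones(m), reg])
    return np.linalg.lstsq(X, np.log(I), rcond=None)[0][1]

rng = np.random.default_rng(3)
d_noise = gph_estimate(rng.normal(size=4096))             # near 0
d_walk = gph_estimate(np.cumsum(rng.normal(size=4096)))   # near 1
```

An estimated d strictly between 0 and 1 for the PPP deviations is exactly the "mean-reverting but not covariance-stationary" middle ground that the fractional cointegration framework is designed to detect.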


Author(s):  
Xue Dong He ◽  
Steven Kou ◽  
Xianhua Peng

Risk measures are used not only for financial institutions’ internal risk management but also for external regulation (e.g., in the Basel Accord for calculating the regulatory capital requirements for financial institutions). Though fundamental in risk management, how to select a good risk measure is a controversial issue. We review the literature on risk measures, particularly on issues such as subadditivity, robustness, elicitability, and backtesting. We also aim to clarify some misconceptions and confusions in the literature. In particular, we argue that, despite lacking some mathematical convenience, the median shortfall—that is, the median of the tail loss distribution—is a better option than the expected shortfall for setting the Basel Accord’s capital requirements, due to statistical and economic considerations such as capturing tail risk, robustness, elicitability, backtesting, and surplus invariance. Expected final online publication date for the Annual Review of Statistics, Volume 9 is March 2022. Please see http://www.annualreviews.org/page/journal/pubdates for revised estimates.
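The median shortfall the authors advocate is straightforward to compute alongside the expected shortfall; a small simulation (hypothetical loss distributions) illustrates why it is less sensitive to a handful of extreme tail observations:

```python
import numpy as np

def var_es_ms(losses, alpha=0.99):
    """VaR plus two tail summaries at level alpha: expected shortfall
    (mean of the tail loss distribution) and median shortfall (median
    of the tail loss distribution)."""
    var = np.quantile(losses, alpha)
    tail = losses[losses > var]
    return var, tail.mean(), np.median(tail)

rng = np.random.default_rng(42)
light = rng.normal(0.0, 1.0, 100_000)      # thin-tailed losses
heavy = rng.standard_t(2.5, 100_000)       # heavy-tailed losses

_, es_light, ms_light = var_es_ms(light)
_, es_heavy, ms_heavy = var_es_ms(heavy)
# The ES-MS gap widens sharply with tail heaviness: a few extreme draws
# move the tail mean (ES) far more than the tail median (MS).
```

Note that the median shortfall at level α coincides with VaR at level (1 + α)/2, which is one reason it inherits VaR's elicitability and statistical robustness while still looking into the tail.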

