Risk measures computation by Fourier inversion

2017 ◽  
Vol 18 (1) ◽  
pp. 76-87 ◽  
Author(s):  
Ngoc Quynh Anh Nguyen ◽  
Thi Ngoc Trang Nguyen

Purpose The purpose of this paper is to present a method for the efficient computation of risk measures using the Fourier transform technique. Another objective is to demonstrate that this technique enables efficient computation of risk measures beyond value-at-risk and expected shortfall. Finally, the paper highlights the importance of validating the assumptions behind the risk model and describes its application in the affine model framework. Design/methodology/approach The method proposed is based on Fourier transform methods for computing risk measures. The authors obtain the loss distribution by fitting a cubic spline through the points at which Fourier inversion of the characteristic function is applied. From the loss distribution, the authors calculate value-at-risk and expected shortfall. The calculation of the entropic value-at-risk involves the moment-generating function, which is closely related to the characteristic function. The expectile risk measure is calculated from call and put option prices, which are available in semi-closed form by Fourier inversion of the characteristic function. The authors also consider the mean loss, standard deviation and semivariance, which are calculated in a similar manner. Findings The study offers practical insights into the efficient computation of risk measures as well as the validation of risk models. It also provides a detailed description of algorithms to compute each of the risk measures considered. While the main focus of the paper is on portfolio-level risk metrics, all algorithms are also applicable to single instruments. Practical implications The algorithms presented in this paper require little computational effort, which makes them very suitable for real-world applications. In addition, the mathematical setup adopted in this paper provides a natural framework for risk model validation, which makes the approach particularly appealing in practice.
Originality/value This is the first study to consider the computation of the entropic value-at-risk, semivariance and expectile risk measures using the Fourier transform method.
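As an illustration of the inversion step the authors describe, the sketch below recovers a loss CDF from a characteristic function via the standard Gil-Pelaez formula, fits a cubic spline through the inverted points, and reads off value-at-risk and expected shortfall. The Gaussian characteristic function, grid and parameters are illustrative assumptions, not the paper's affine specification.

```python
import numpy as np
from scipy.integrate import quad
from scipy.interpolate import CubicSpline
from scipy.optimize import brentq
from scipy.stats import norm

mu, sigma = 0.0, 1.0  # assumed Gaussian loss model, for illustration only

def cf(t):
    # characteristic function of N(mu, sigma^2)
    return np.exp(1j * mu * t - 0.5 * sigma**2 * t**2)

def cdf_gil_pelaez(x):
    # Gil-Pelaez: F(x) = 1/2 - (1/pi) * int_0^inf Im(e^{-itx} phi(t)) / t dt
    integrand = lambda t: (np.exp(-1j * t * x) * cf(t)).imag / t
    val, _ = quad(integrand, 1e-8, 50.0, limit=200)
    return 0.5 - val / np.pi

# fit a cubic spline through the inverted CDF points
xs = np.linspace(-5, 5, 101)
spline = CubicSpline(xs, [cdf_gil_pelaez(x) for x in xs])

alpha = 0.99
var_99 = brentq(lambda x: spline(x) - alpha, -5, 5)  # VaR = F^{-1}(alpha)

# expected shortfall: average loss beyond VaR, via the spline's density
grid = np.linspace(var_99, 5, 400)
pdf = spline.derivative()(grid)
es_99 = np.trapz(grid * pdf, grid) / (1 - alpha)

print(var_99, es_99)
```

The same routine applies to any model whose characteristic function is known in closed form; only `cf` changes.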

2021 ◽  
Vol 14 (11) ◽  
pp. 540
Author(s):  
Eyden Samunderu ◽  
Yvonne T. Murahwa

Developments in the world of finance have led the authors to assess the adequacy of using normal distribution assumptions alone in measuring risk. Cushioning against risk has always created a plethora of complexities and challenges; hence, this paper analyses the statistical properties of various risk measures under non-normal distributions and provides a financial blueprint for managing risk. The premise is that relying on the old assumption of normality alone is inaccurate, which has led to the use of models that do not give accurate risk measures. Our empirical design first examined an overview of the use of returns in measuring risk and an assessment of the current financial environment. As an alternative to conventional measures, our paper employs a mosaic of risk techniques in order to establish that there is no one universal risk measure. The next step involved looking at the current risk proxy measures adopted, such as the Gaussian-based value at risk (VaR) measure. Furthermore, the authors analysed multiple alternative approaches that do not rely on the normality assumption, such as other variations of VaR, as well as econometric models that can be used in risk measurement and forecasting. Value at risk (VaR) is a widely used measure of financial risk, which provides a way of quantifying and managing the risk of a portfolio. Arguably, VaR represents the most important tool for evaluating market risk as one of several threats to the global financial system. After an extensive literature review, a data set composed of three main asset classes was applied: bonds, equities and hedge funds. The first part was to determine the extent to which returns are not normally distributed. After testing the hypothesis, it was found that the majority of returns are not normally distributed and instead exhibit skewness and kurtosis greater or less than three.
The study then applied various VaR methods to measure risk in order to determine the most efficient ones. Different timelines were used to compute stressed value at risk, and it was seen that during periods of crisis the volatility of asset returns was higher. The steps that followed examined the relationships among the variables; correlation tests and time series analysis were conducted and led to forecasting of the returns. It was noted that these methods could not be used in isolation. We adopted a mosaic of all the VaR methods, which included studying the behaviour of assets and their relations to one another. Furthermore, we also examined the environment as a whole, then applied forecasting models to value returns accurately; this gave a much more accurate and relevant risk measure than the initial assumption of normality.
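The normality check and the Gaussian-versus-historical VaR comparison described above can be sketched as follows; the simulated fat-tailed Student-t returns are a stand-in for the paper's bond, equity and hedge fund data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# illustrative fat-tailed daily returns (Student-t, df=5), not real data
returns = rng.standard_t(df=5, size=2500) * 0.01

# Jarque-Bera rejects normality when skewness / excess kurtosis deviate
res = stats.jarque_bera(returns)
print(stats.skew(returns), stats.kurtosis(returns, fisher=False), res.pvalue)

alpha = 0.99
# Gaussian (parametric) VaR tends to understate risk for fat-tailed returns
var_gauss = -(returns.mean() + returns.std(ddof=1) * stats.norm.ppf(1 - alpha))
# historical VaR: empirical quantile of the loss distribution
var_hist = -np.quantile(returns, 1 - alpha)
print(var_gauss, var_hist)
```

On real data the same two numbers, computed per asset class, quantify how far the normality assumption distorts the risk figure.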


2018 ◽  
Vol 7 (3.7) ◽  
pp. 25
Author(s):  
Abdul Talib Bon ◽  
Muhammad Iqbal Al-Banna Ismail ◽  
Sukono ◽  
Adhitya Ronnie Effendie

Analysis of the risk in life insurance claims is an important task for the insurance company's actuary. The risk in life insurance claims is generally measured using the standard deviation or variance. The problem is that the standard deviation or variance used as a measure of claim risk cannot accommodate all claim risk events. Therefore, this study develops a risk measure called the Collective Modified Value-at-Risk. The model is developed for several distributions of the number of claims and of the claim values. The resulting Collective Modified Value-at-Risk model is expected to accommodate claim risk events at any given significance level.
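A minimal collective risk model of the kind described, with an assumed Poisson claim count and lognormal claim severities, can be simulated as follows; the paper's exact Modified Value-at-Risk construction is not reproduced here, only a plain quantile-based collective VaR.

```python
import numpy as np

rng = np.random.default_rng(0)
# illustrative collective risk model (assumed, not the paper's calibration):
# claim count N ~ Poisson(lam), claim sizes X_i ~ LogNormal(mu, sigma)
lam, mu, sigma = 20.0, 8.0, 0.5
n_sims = 20_000

counts = rng.poisson(lam, size=n_sims)
# aggregate claim amount S = sum of N iid claim severities per scenario
agg = np.array([rng.lognormal(mu, sigma, size=n).sum() for n in counts])

alpha = 0.95
var_95 = np.quantile(agg, alpha)       # collective VaR at the 95% level
tvar_95 = agg[agg > var_95].mean()     # mean claim amount beyond the VaR
print(agg.mean(), var_95, tvar_95)
```

Swapping in other frequency or severity distributions only changes the two sampling lines.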


2020 ◽  
Vol 21 (5) ◽  
pp. 543-557
Author(s):  
Modisane Bennett Seitshiro ◽  
Hopolang Phillip Mashele

Purpose The purpose of this paper is to propose the parametric bootstrap method for the valuation of over-the-counter derivative (OTCD) initial margin (IM) in a financial market with low outstanding notional amounts, that is, an aggregate outstanding gross notional amount of OTC derivative instruments not exceeding R20bn. Design/methodology/approach The OTCD market is assumed to follow a Gaussian probability distribution with mean and standard deviation parameters. The bootstrap value at risk model is applied as a risk measure that generates bootstrap initial margins (BIM). Findings The proposed parametric bootstrap method favours the BIM amounts for both the simulated and real data sets. These BIM amounts reasonably exceed the IM amounts as the significance level increases. Research limitations/implications This paper assumed only that OTCD returns come from a normal probability distribution. Practical implications The OTCD IM requirement for transactions between counterparties may affect all financial market participants in uncleared OTCD, while reducing systemic risk. Spillover effects are thus reduced by ensuring that collateral (IM) is available to offset losses caused by the default of an OTCD counterparty. Originality/value This paper contributes to the literature by presenting a valuation of IM for a financial market with low outstanding notional amounts using the parametric bootstrap method.
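A parametric bootstrap of Gaussian VaR along the lines described might look like the sketch below; the P&L series, bootstrap size and the 95% margin quantile are illustrative assumptions, not the paper's calibration.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
# illustrative OTC derivative daily P&L, assumed Gaussian (as in the paper)
pnl = rng.normal(0.0, 1.5, size=750)

mu_hat, sd_hat = pnl.mean(), pnl.std(ddof=1)
alpha, B, n = 0.99, 2000, len(pnl)

boot_var = np.empty(B)
for b in range(B):
    # parametric bootstrap: resample from the *fitted* normal,
    # refit the parameters, re-estimate the Gaussian VaR
    sample = rng.normal(mu_hat, sd_hat, size=n)
    boot_var[b] = -(sample.mean() + sample.std(ddof=1) * norm.ppf(1 - alpha))

# bootstrap initial margin: a conservative quantile of the bootstrap VaR
bim = np.quantile(boot_var, 0.95)
print(boot_var.mean(), bim)
```

The spread of `boot_var` also gives a confidence band for the IM estimate, which is the practical appeal of the bootstrap here.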


2018 ◽  
Vol 21 (03) ◽  
pp. 1850010 ◽  
Author(s):  
LAKSHITHE WAGALATH ◽  
JORGE P. ZUBELLI

This paper proposes an intuitive and flexible framework to quantify liquidation risk for financial institutions. We develop a model where the “fundamental” dynamics of assets are modified by price impacts from fund liquidations. We characterize mathematically the liquidation schedule of financial institutions and study in detail the fire sales resulting endogenously from margin constraints when a financial institution trades through an exchange. Our study enables us to obtain tractable formulas for the value at risk and expected shortfall of a financial institution in the presence of fund liquidation. In particular, we find an additive decomposition for liquidation-adjusted risk measures: such a measure can be expressed as a “fundamental” risk measure plus a liquidation risk adjustment that is proportional to the size of fund positions as a fraction of asset market depths. Our results can be used by risk managers in financial institutions to better tackle liquidity events arising from fund liquidations and to adjust their portfolio allocations to liquidation risk more accurately.
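The additive decomposition can be illustrated with a stylised sketch; the linear price-impact form and every number below are assumptions for illustration, not the paper's formulas.

```python
import numpy as np
from scipy.stats import norm

# stylised sketch of the additive decomposition:
# adjusted risk = fundamental risk + adjustment proportional to q / D
sigma = 0.02       # assumed daily "fundamental" volatility of the asset
position = 5e6     # assumed fund position in the asset (units)
depth = 2e8        # assumed market depth (units absorbing a unit price move)
alpha = 0.99

var_fund = sigma * norm.ppf(alpha)               # fundamental VaR (zero mean)
liq_adjustment = var_fund * (position / depth)   # proportional to q / D
var_adjusted = var_fund + liq_adjustment

print(var_fund, liq_adjustment, var_adjusted)
```

The point of the decomposition is that the second term scales linearly with position size relative to depth, so doubling the fund's footprint doubles the liquidation add-on.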


Mathematics ◽  
2020 ◽  
Vol 8 (11) ◽  
pp. 2080
Author(s):  
Maria-Teresa Bosch-Badia ◽  
Joan Montllor-Serrats ◽  
Maria-Antonia Tarrazon-Rodon

We study the applicability of the half-normal distribution to the probability–severity risk analysis traditionally performed through risk matrices and continuous probability–consequence diagrams (CPCDs). To this end, we develop a model that adapts the financial risk measures Value-at-Risk (VaR) and Conditional Value-at-Risk (CVaR) to risky scenarios that face only negative impacts. This model leads to three risk indicators: the Hazards Index-at-Risk (HIaR), the Expected Hazards Damage (EHD), and the Conditional HIaR (CHIaR). HIaR measures the expected highest hazards impact under a certain probability, while EHD consists of the expected impact that stems from truncating the half-normal distribution at the HIaR point. CHIaR, in turn, measures the expected damage in the case it exceeds the HIaR. Therefore, the Truncated Risk Model that we develop generates a measure for hazards expectations (EHD) and another measure for hazards surprises (CHIaR). Our analysis includes deduction of the mathematical functions that relate HIaR, EHD, and CHIaR to one another as well as the expected loss estimated by risk matrices. By extending the model to the generalised half-normal distribution, we incorporate a shape parameter into the model that can be interpreted as a hazard aversion coefficient.
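The three indicators can be computed directly from scipy's half-normal distribution; the scale and probability level below are illustrative, and the truncated means are evaluated numerically rather than with the paper's closed-form relations.

```python
import numpy as np
from scipy.stats import halfnorm

sigma = 1.0
p = 0.95

# HIaR: the highest impact not exceeded with probability p (a VaR analogue)
hiar = halfnorm.ppf(p, scale=sigma)

# EHD: mean of the half-normal truncated above at the HIaR point
xs = np.linspace(0, hiar, 2000)
ehd = np.trapz(xs * halfnorm.pdf(xs, scale=sigma), xs) / p

# CHIaR: expected damage given the impact exceeds HIaR (a CVaR analogue)
xt = np.linspace(hiar, 10 * sigma, 2000)
chiar = np.trapz(xt * halfnorm.pdf(xt, scale=sigma), xt) / (1 - p)

print(hiar, ehd, chiar)
```

By construction EHD < HIaR < CHIaR: expectations sit below the threshold, surprises above it.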


2018 ◽  
Vol 19 (2) ◽  
pp. 127-136 ◽  
Author(s):  
Stavros Stavroyiannis

Purpose The purpose of this paper is to examine value-at-risk and related measures for Bitcoin and to compare the findings with the Standard & Poor's 500 (S&P 500) index and the gold spot price time series. Design/methodology/approach A GJR-GARCH model has been implemented, in which the residuals follow the standardized Pearson type-IV distribution. A large variety of value-at-risk measures and backtesting criteria are implemented. Findings Bitcoin is a highly volatile currency, violating the value-at-risk measures more often than the other assets. With respect to the Basel Committee on Banking Supervision Accords, a Bitcoin investor is subject to higher capital requirements and a higher capital allocation ratio. Practical implications The risk of an investor holding Bitcoins is measured and quantified via regulatory framework practices. Originality/value This paper is the first comprehensive approach to the risk properties of Bitcoin.
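A plain simulation of a GJR-GARCH(1,1) process with its leverage term sketches how the one-day-ahead VaR is obtained; Student-t innovations stand in for the paper's standardized Pearson type-IV distribution (which has no standard scipy implementation), and all parameter values are illustrative.

```python
import numpy as np
from scipy.stats import t as student_t

rng = np.random.default_rng(7)
# illustrative GJR-GARCH(1,1) parameters (not fitted to Bitcoin data)
omega, a, gamma, beta, nu = 1e-6, 0.05, 0.10, 0.85, 6.0
n = 5000

# unit-variance Student-t innovations as a stand-in for Pearson type-IV
z = student_t.rvs(nu, size=n, random_state=rng) / np.sqrt(nu / (nu - 2))

h = np.empty(n)
r = np.empty(n)
h[0] = omega / (1 - a - gamma / 2 - beta)  # unconditional variance
r[0] = np.sqrt(h[0]) * z[0]
for i in range(1, n):
    # leverage: the squared shock gets extra weight gamma after a negative return
    lev = gamma * (r[i - 1] < 0)
    h[i] = omega + (a + lev) * r[i - 1] ** 2 + beta * h[i - 1]
    r[i] = np.sqrt(h[i]) * z[i]

# one-day-ahead 99% VaR from the conditional variance forecast
h_next = omega + (a + gamma * (r[-1] < 0)) * r[-1] ** 2 + beta * h[-1]
var_99 = -np.sqrt(h_next) * student_t.ppf(0.01, nu) / np.sqrt(nu / (nu - 2))
print(var_99)
```

Backtesting then counts how often realised losses exceed this rolling VaR, which is the basis of the Basel traffic-light criteria the paper applies.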


2021 ◽  
Vol 17 (3) ◽  
pp. 370-380
Author(s):  
Ervin Indarwati ◽  
Rosita Kusumawati

Portfolio risk reflects large deviations of portfolio returns from their expected values. Value at Risk (VaR) is one method for determining the maximum loss of a portfolio or an asset at a given probability and horizon. There are three methods to estimate VaR: variance-covariance, historical, and Monte Carlo simulation. One disadvantage of VaR is that it is not coherent, because it lacks the sub-additivity property. Conditional Value at Risk (CVaR) is a coherent risk measure and is sub-additive, which means that the loss on the portfolio is smaller than or equal to the sum of the losses on the individual assets. CVaR can also provide information about losses beyond the VaR level. This study estimates portfolio risk via the CVaR using Monte Carlo simulation and applies it to PT. Bank Negara Indonesia (Persero) Tbk (BBNI.JK) and PT. Bank Tabungan Negara (Persero) Tbk (BBTN.JK). The daily closing price of each BBNI and BBTN share from 6 January 2019 to 30 December 2019 is used to measure the CVaR of the two banks' stock portfolio with the Monte Carlo simulation. The steps taken are: determining asset returns, testing the normality of the returns, finding the risk measures of the normally distributed asset returns that form the portfolio, simulating asset returns with Monte Carlo, calculating portfolio weights, computing portfolio returns, taking the quantile of the portfolio return as the VaR, and taking the average loss beyond the VaR as the CVaR. The estimated portfolio CVaR for the two banks at confidence levels of 90%, 95%, and 99% is 5.82%, 6.39%, and 7.1%, with standard errors of 0.58%, 0.59%, and 0.59%.
If the initial funds invested in this portfolio are, for illustration, Rp 100,000,000, the maximum risk investors can expect to bear will not exceed Rp 5,820,000, Rp 6,390,000 and Rp 7,100,000 at the 90%, 95%, and 99% confidence levels, respectively.
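The Monte Carlo steps listed above reduce to a few lines; the means and covariance below are placeholder assumptions standing in for the moments fitted to the BBNI and BBTN returns.

```python
import numpy as np

rng = np.random.default_rng(2019)
# placeholder daily return moments for the two stocks (not the fitted values)
mu = np.array([0.0005, 0.0004])
cov = np.array([[4.0e-4, 1.5e-4],
                [1.5e-4, 3.0e-4]])
w = np.array([0.5, 0.5])      # portfolio weights
n_sims = 100_000

sims = rng.multivariate_normal(mu, cov, size=n_sims)  # simulated asset returns
port = sims @ w                                       # portfolio returns
loss = -port

for conf in (0.90, 0.95, 0.99):
    var = np.quantile(loss, conf)       # VaR: loss quantile
    tail = loss[loss > var]
    cvar = tail.mean()                  # CVaR: mean loss beyond the VaR
    se = tail.std(ddof=1) / np.sqrt(tail.size)  # Monte Carlo standard error
    print(conf, var, cvar, se)
```

The standard error shrinks with the square root of the number of simulations, which is why the study can report it alongside each CVaR estimate.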


Author(s):  
Khreshna Syuhada

In the financial and insurance industries, risks may come from several sources. It is therefore important to predict future risk using the concept of aggregate risk. Risk measure prediction plays an important role in allocating capital as well as in controlling (and avoiding) worse risk. In this paper, we consider several risk measures such as Value-at-Risk (VaR), Tail VaR (TVaR) and its extension, the Adjusted TVaR (Adj-TVaR). Specifically, we derive an upper bound for such risk measures applied to aggregate risk models. The concepts and properties of comonotonicity and convex order are utilized to obtain this upper bound.
Keywords: Coherent property, comonotonic rv, convex order, tail property, Value-at-Risk (VaR).
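The comonotonic upper bound can be checked numerically: TVaR is additive for a comonotonic sum and, by convex order, dominates the TVaR of the aggregate under other dependence structures such as independence. The lognormal marginals below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
p = 0.99

# two illustrative loss sources (lognormal marginals)
x1 = rng.lognormal(0.0, 0.8, n)
x2 = rng.lognormal(0.2, 0.6, n)

def tvar(loss, p):
    # Tail VaR: mean loss beyond the p-quantile
    q = np.quantile(loss, p)
    return loss[loss > q].mean()

# comonotonic aggregate: both risks driven by the same uniform (rank-matched)
comono = np.sort(x1) + np.sort(x2)

indep_tvar = tvar(x1 + x2, p)             # aggregate under independence
comono_tvar = tvar(comono, p)             # comonotonic upper bound
additive = tvar(x1, p) + tvar(x2, p)      # TVaR is comonotonic-additive
print(indep_tvar, comono_tvar, additive)
```

The last two numbers agree up to Monte Carlo error, while the independent aggregate's TVaR sits below both, illustrating the bound.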


2018 ◽  
Vol 15 (4) ◽  
pp. 17-34 ◽  
Author(s):  
Tom Burdorf ◽  
Gary van Vuuren

As a risk measure, Value at Risk (VaR) is neither sub-additive nor coherent. These drawbacks have coerced regulatory authorities to introduce and mandate Expected Shortfall (ES) as a mainstream regulatory risk management metric. VaR is, however, still needed to estimate the tail conditional expectation (the ES): the average of losses that are greater than the VaR at a given significance level. These two risk measures behave quite differently during growth and recession periods in developed and emerging economies. Using equity portfolios assembled from securities of the banking and retail sectors in the UK and South Africa, historical, variance-covariance and Monte Carlo approaches are used to determine VaR (and hence ES). The results are back-tested and compared, and normality assumptions are tested. Key findings are that the results of the variance-covariance and Monte Carlo approaches are more consistent across all environments than the historical outcomes, regardless of the equity portfolio considered. The industries and periods analysed influenced the accuracy of the risk measures; the different economies did not.
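The three approaches can be compared side by side on simulated data; the Gaussian returns below are placeholders for the banking and retail portfolios used in the study.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(10)
# illustrative equity portfolio returns (placeholder for the study's data)
returns = rng.normal(0.0003, 0.012, size=1500)
alpha = 0.975
z = norm.ppf(alpha)

mu_h, sd_h = returns.mean(), returns.std(ddof=1)
loss = -returns

# 1. historical: empirical loss quantile, then the average of larger losses
var_hist = np.quantile(loss, alpha)
es_hist = loss[loss > var_hist].mean()

# 2. variance-covariance: closed-form Gaussian VaR and ES
var_vc = -mu_h + sd_h * z
es_vc = -mu_h + sd_h * norm.pdf(z) / (1 - alpha)

# 3. Monte Carlo: simulate from the fitted normal, then proceed as in (1)
sim_loss = -rng.normal(mu_h, sd_h, size=100_000)
var_mc = np.quantile(sim_loss, alpha)
es_mc = sim_loss[sim_loss > var_mc].mean()

print((var_hist, es_hist), (var_vc, es_vc), (var_mc, es_mc))
```

On Gaussian data all three agree; on real fat-tailed data the historical figures drift away from the other two, which is the consistency pattern the study reports.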


2020 ◽  
Vol 21 (2) ◽  
pp. 111-126 ◽  
Author(s):  
Athanasios Kokoris ◽  
Fragiskos Archontakis ◽  
Christos Grose

Purpose This study aims to examine whether the methodology proposed by the European Supervisory Authorities (ESAs) within Delegated Regulation (European Union) 2017/653 for the calculation of the market risk of certain packaged retail and insurance-based investment products (PRIIPs) is the most appropriate. Design/methodology/approach Risk models are put into effect to validate the appropriateness of the methodology announced by the ESAs. The ESAs have announced that unit-linked (UL) products, labeled Category II PRIIPs, will be subject to the Cornish-Fisher value-at-risk (CFVaR) methodology for their market risk assessment. We test the CFVaR at the 97.5% confidence level on 70 UL products, and at the same confidence level we test the Cornish-Fisher expected shortfall (CFES), which acts as a counter-methodology to the CFVaR. Findings The paper provides empirical insights into the Cornish-Fisher (CF) expansion as a method that incorporates the possibility of financial instability. When the CFVaR prescribed by the ESAs is calculated, CF is shown to be in general a more robust risk model than the simpler historical ones. However, when the CFES is applied, important points emerge. First, in only half of the cases can the CF expansion be considered a reliable method. Second, the CFES is a more coherent risk measure than the CFVaR. We conclude that the CF expansion is unable to accurately estimate the market risk of UL products when excessively fat-tailed or non-symmetrical distributions are present. Hence, we suggest that a different methodology, one that captures the extreme values of products in financial distress, could also be considered by the regulatory bodies. Originality/value The literature on PRIIPs, both theoretical and applied, is not extensive. Although research by business and regulators has begun to intensify in the last two years, to our knowledge this is one of the first studies to use the CFES methodology for the market risk assessment of Category II PRIIPs.
In addition, we use a unique data set from a country in the headwinds of the recent financial crisis. This research contributes to both the academic and business communities by enriching the existing literature and aiding risk managers in assessing the market risk of certain Category II PRIIPs. Considering the recent efforts of the regulatory authorities at the beginning of 2020 to implement certain amendments to the PRIIPs, we indicate the relative risks related to the calculation of the market risk of the aforementioned products. Our findings could contribute to the regulatory authorities' persistent efforts in wrapping up this ongoing project.
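For reference, the Cornish-Fisher quantile expansion behind the CFVaR, with a simple CFES obtained by averaging CF quantiles over the tail, can be sketched as follows; the return series and the tail-averaging scheme are illustrative assumptions, not the regulation's exact recipe.

```python
import numpy as np
from scipy.stats import norm, skew, kurtosis

rng = np.random.default_rng(653)
# illustrative skewed, fat-tailed fund returns (placeholder for UL data)
r = rng.standard_t(5, size=1000) * 0.01 - 0.002 * rng.exponential(size=1000)

mu_h, sd_h = r.mean(), r.std(ddof=1)
S, K = skew(r), kurtosis(r)  # K is the *excess* kurtosis

def cf_quantile(q):
    # Cornish-Fisher adjustment of the standard-normal quantile
    z = norm.ppf(q)
    return (z + (z**2 - 1) * S / 6
              + (z**3 - 3 * z) * K / 24
              - (2 * z**3 - 5 * z) * S**2 / 36)

alpha = 0.975
cfvar = -(mu_h + sd_h * cf_quantile(1 - alpha))

# CFES: average the CF quantiles over the tail beyond 1 - alpha
qs = np.linspace(1e-4, 1 - alpha, 500)
cfes = -(mu_h + sd_h * np.mean([cf_quantile(q) for q in qs]))

print(cfvar, cfes)
```

The expansion is only reliable while the adjusted quantile remains monotone in q; for extreme skewness or kurtosis it breaks down, which is precisely the failure mode the study documents.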

