Efficient estimation of financial risk by regressing the quantiles of parametric distributions: An application to CARR models

2018 ◽  
Vol 23 (2) ◽  
Author(s):  
Jennifer So Kuen Chan ◽  
Kok-Haur Ng ◽  
Thanakorn Nitithumbundit ◽  
Shelton Peiris

Abstract Risk measures such as value-at-risk (VaR) and expected shortfall (ES) may require the calculation of quantile functions from quantile regression models. In a parametric set-up, we propose to regress directly on the quantiles of a distribution and demonstrate the method through the conditional autoregressive range (CARR) model, which has gained popularity in recent years. Two flexible distribution families are adopted for the range data: the generalised beta type two on positive support and the generalised-t on real support (which requires a log transformation). The models are then extended to allow for volatility dynamics and compared in terms of goodness-of-fit. They are implemented using the function fmincon in Matlab under the classical likelihood approach and applied to the intra-day high-low price ranges of the All Ordinaries index for the Australian stock market. Quantiles and upper-tail conditional expectations, evaluated via VaR and ES respectively, are forecast using the proposed models.
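To make the quantile-based definitions concrete, here is a minimal Python sketch (not the authors' Matlab code): once any parametric distribution has been fitted to the range data, VaR is its alpha-quantile and ES is the conditional mean beyond that quantile. A Student-t stands in for the paper's GB2 and generalised-t families, which are not available in scipy.stats, and the fitted parameters are purely illustrative.

```python
# Minimal sketch: upper-tail VaR and ES from a fitted parametric distribution.
# scipy's Student-t is a stand-in for the paper's GB2 / generalised-t families.
import numpy as np
from scipy import stats, integrate

def var_es_upper(dist, alpha=0.95):
    """Upper-tail VaR (alpha-quantile) and ES (tail conditional expectation)."""
    var = dist.ppf(alpha)
    # ES_alpha = E[X | X > VaR_alpha] = (1/(1-alpha)) * int_{VaR}^{inf} x f(x) dx
    tail_mean, _ = integrate.quad(lambda x: x * dist.pdf(x), var, np.inf)
    return var, tail_mean / (1.0 - alpha)

# hypothetical fitted parameters, for illustration only
fitted = stats.t(df=5, loc=0.012, scale=0.004)
print(var_es_upper(fitted, alpha=0.95))
```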

2021 ◽  
Vol 14 (11) ◽  
pp. 540
Author(s):  
Eyden Samunderu ◽  
Yvonne T. Murahwa

Developments in the world of finance have led the authors to assess the adequacy of relying on normal distribution assumptions alone when measuring risk. Cushioning against risk has always created a plethora of complexities and challenges; hence, this paper analyses the statistical properties of various risk measures under non-normal distributions and provides a financial blueprint for managing risk. The premise is that relying solely on the old assumption of normality is inaccurate and has led to models that do not give accurate risk measures. Our empirical design first examines the use of returns in measuring risk and assesses the current financial environment. As an alternative to conventional measures, the paper employs a mosaic of risk techniques, reflecting the fact that there is no single universal risk measure. The next step examines the current risk proxy measures in use, such as the Gaussian-based value at risk (VaR) measure. The authors then analyse multiple alternative approaches that do not rely on the normality assumption, such as other variations of VaR, as well as econometric models that can be used for risk measurement and forecasting. VaR is a widely used measure of financial risk that provides a way of quantifying and managing the risk of a portfolio; arguably, it represents the most important tool for evaluating market risk, one of the several threats to the global financial system. Following an extensive literature review, a data set comprising three main asset classes (bonds, equities and hedge funds) was analysed. The first step was to determine to what extent returns are not normally distributed. After testing the hypothesis, it was found that the majority of returns are not normally distributed, instead exhibiting skewness and kurtosis greater or less than three. The study then applied various VaR methods to determine which measure risk most efficiently. Stressed value at risk was computed over different timelines, showing that the volatility of asset returns was higher during periods of crisis. The subsequent steps examined the relationships among the variables through correlation tests and time series analysis, leading to forecasts of the returns. It was noted that these methods could not be used in isolation; we therefore adopted a mosaic of all the VaR-based methods, including studying the behaviour of assets and their relations to each other, examined the environment as a whole, and then applied forecasting models to value returns accurately. This gave a much more accurate and relevant risk measure compared with the initial assumption of normality.
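A hedged sketch of two of the VaR variants discussed above, the Gaussian (variance-covariance) VaR and the historical-simulation VaR, together with a Jarque-Bera check of the normality assumption. The return series is simulated here purely for illustration; it is not the bond, equity or hedge-fund data used in the paper.

```python
# Gaussian VaR vs historical VaR, plus a normality test of the return series.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# fat-tailed stand-in for an asset return series
returns = stats.t.rvs(df=4, scale=0.01, size=2500, random_state=rng)

alpha = 0.99
gaussian_var = -(returns.mean() + returns.std(ddof=1) * stats.norm.ppf(1 - alpha))
historical_var = -np.quantile(returns, 1 - alpha)

jb_stat, jb_pvalue = stats.jarque_bera(returns)
print(f"Gaussian VaR:        {gaussian_var:.4f}")
print(f"Historical VaR:      {historical_var:.4f}")
print(f"Jarque-Bera p-value: {jb_pvalue:.4g}  (small p => reject normality)")
```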


2015 ◽  
Vol 4 (1and2) ◽  
pp. 28
Author(s):  
Marcelo Brutti Righi ◽  
Paulo Sergio Ceretta

We investigate whether an optimal estimation window exists for financial risk measures. Accordingly, we propose a procedure that selects the optimal estimation window by minimizing estimation bias. Using results from a Monte Carlo simulation for Value at Risk and Expected Shortfall in distinct scenarios, we conclude that the optimal length of the estimation window is not random but follows very clear patterns. Our findings contribute to the literature, as studies have typically neglected the choice of estimation window or relied on arbitrary choices.
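A minimal Monte Carlo sketch of the idea above (not the authors' exact procedure): for several candidate window lengths, the bias of the historical VaR estimator is measured against the known true quantile of the simulated distribution.

```python
# Bias of the historical VaR estimator as a function of the estimation window.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
alpha, n_rep = 0.99, 500
true_var = -stats.norm.ppf(1 - alpha, loc=0.0, scale=0.01)

for window in (250, 500, 1000, 2000):
    estimates = []
    for _ in range(n_rep):
        sample = rng.normal(0.0, 0.01, size=window)
        estimates.append(-np.quantile(sample, 1 - alpha))
    bias = np.mean(estimates) - true_var
    print(f"window={window:5d}  mean bias={bias:+.5f}")
```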


2017 ◽  
Vol 11 (1-2) ◽  
pp. 127-136 ◽  
Author(s):  
Panna Miskolczi

In this paper we describe and clarify the definitions and usage of simple and logarithmic returns for financial assets such as stocks or portfolios. It can be shown that the distributions of simple and logarithmic returns are very close to each other. Because of this, we investigate whether the calculated financial risk depends on whether simple or log returns are used. To show the effect of the return type on the calculations, we consider and compare the riskiness order of stocks and portfolios. For this purpose, the empirical study uses seven Hungarian daily stock price series, and the risk calculation focuses on the following risk measures: standard deviation, semivariance, Value at Risk and Expected Shortfall. The results clearly show that the riskiness order can depend on the return type used (i.e. log or simple return). In practice, one often has to use approximations, due to missing data or the nature of the analysis. We also examine the effect of these approximations on the riskiness order of stocks and portfolios, and we find differences in the riskiness order when using exact versus approximated values. Therefore, we believe that exact values rather than approximations should be used for calculations whenever possible. Additionally, it is important to use the same type of return within one study and to be aware of possible instabilities when comparing return results. JEL Code: C18
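A small sketch contrasting simple and logarithmic returns and the risk numbers they give. The price path is synthetic (not the Hungarian stock data of the paper), and the measures follow the standard definitions of standard deviation, historical VaR and historical ES.

```python
# Simple vs logarithmic returns and the risk figures computed from each.
import numpy as np

rng = np.random.default_rng(2)
prices = 100 * np.exp(np.cumsum(rng.normal(0.0003, 0.015, size=1000)))

simple = prices[1:] / prices[:-1] - 1.0
logret = np.log(prices[1:] / prices[:-1])     # = log(1 + simple return)

def risk_report(r, alpha=0.95):
    var = -np.quantile(r, 1 - alpha)          # historical VaR
    es = -r[r <= -var].mean()                 # historical ES (mean loss beyond VaR)
    return r.std(ddof=1), var, es

for name, r in (("simple", simple), ("log", logret)):
    sd, var, es = risk_report(r)
    print(f"{name:>6}: sd={sd:.5f}  VaR95={var:.5f}  ES95={es:.5f}")
```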


Risks ◽  
2019 ◽  
Vol 7 (2) ◽  
pp. 55
Author(s):  
Vytaras Brazauskas ◽  
Sahadeb Upretee

Quantiles of probability distributions play a central role in the definition of risk measures (e.g., value-at-risk, conditional tail expectation), which in turn are used to capture the riskiness of the distribution tail. Estimates of risk measures are needed in many practical situations, such as pricing extreme events, developing reserve estimates, designing risk transfer strategies, and allocating capital. In this paper, we present the empirical nonparametric estimator and two types of parametric estimators of quantiles at various levels. For parametric estimation, we employ the maximum likelihood and percentile-matching approaches. Asymptotic distributions of all the estimators under consideration are derived when data are left-truncated and right-censored, which is a typical loss variable modification in insurance. We then construct relative efficiency curves (RECs) for all the parametric estimators. Specific examples of such curves are provided for exponential and single-parameter Pareto distributions for a few data truncation and censoring cases. Additionally, using simulated data, we examine how wrong quantile estimates can be when one makes incorrect modeling assumptions. The numerical analysis is supplemented with standard model diagnostics and validation (e.g., quantile-quantile plots, goodness-of-fit tests, information criteria) and presents an example of when those methods can mislead the decision maker. These findings pave the way for further work on RECs, with the potential for them to be developed into an effective diagnostic tool in this context.
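A hedged illustration of the empirical-versus-parametric comparison in the simplest complete-data case: the empirical quantile against the maximum-likelihood parametric quantile for an exponential loss model. The paper's left-truncated, right-censored setting and the percentile-matching estimator are not reproduced here.

```python
# Empirical vs MLE-based parametric quantile for exponential losses.
import numpy as np

rng = np.random.default_rng(3)
theta_true, p = 1000.0, 0.95
losses = rng.exponential(theta_true, size=200)

empirical_q = np.quantile(losses, p)
theta_mle = losses.mean()                      # MLE of the exponential mean
parametric_q = -theta_mle * np.log(1.0 - p)    # F^{-1}(p) = -theta * log(1 - p)

print(f"true quantile:       {-theta_true * np.log(1 - p):.1f}")
print(f"empirical estimate:  {empirical_q:.1f}")
print(f"parametric (MLE):    {parametric_q:.1f}")
```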


Mathematics ◽  
2020 ◽  
Vol 8 (11) ◽  
pp. 2080
Author(s):  
Maria-Teresa Bosch-Badia ◽  
Joan Montllor-Serrats ◽  
Maria-Antonia Tarrazon-Rodon

We study the applicability of the half-normal distribution to the probability–severity risk analysis traditionally performed through risk matrices and continuous probability–consequence diagrams (CPCDs). To this end, we develop a model that adapts the financial risk measures Value-at-Risk (VaR) and Conditional Value at Risk (CVaR) to risky scenarios that face only negative impacts. This model leads to three risk indicators: The Hazards Index-at-Risk (HIaR), the Expected Hazards Damage (EHD), and the Conditional HIaR (CHIaR). HIaR measures the expected highest hazards impact under a certain probability, while EHD consists of the expected impact that stems from truncating the half-normal distribution at the HIaR point. CHIaR, in turn, measures the expected damage in the case it exceeds the HIaR. Therefore, the Truncated Risk Model that we develop generates a measure for hazards expectations (EHD) and another measure for hazards surprises (CHIaR). Our analysis includes deduction of the mathematical functions that relate HIaR, EHD, and CHIaR to one another as well as the expected loss estimated by risk matrices. By extending the model to the generalised half-normal distribution, we incorporate a shape parameter into the model that can be interpreted as a hazard aversion coefficient.
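A sketch of the three indicators in the plain half-normal case. The readings of EHD (mean impact of the distribution truncated at the HIaR point) and CHIaR (conditional mean beyond HIaR) are working assumptions for illustration rather than the authors' exact formulas.

```python
# HIaR, EHD and CHIaR for a half-normal hazard-impact model (illustrative).
import numpy as np
from scipy import stats, integrate

sigma, alpha = 2.0, 0.95
hn = stats.halfnorm(scale=sigma)

hiar = hn.ppf(alpha)                                           # HIaR: alpha-quantile
below, _ = integrate.quad(lambda x: x * hn.pdf(x), 0.0, hiar)
ehd = below / alpha                                            # mean impact up to HIaR (assumed reading)
above, _ = integrate.quad(lambda x: x * hn.pdf(x), hiar, np.inf)
chiar = above / (1.0 - alpha)                                  # conditional mean beyond HIaR

print(f"HIaR={hiar:.3f}  EHD={ehd:.3f}  CHIaR={chiar:.3f}")
```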


Author(s):  
Omer Hadzic ◽  
Smajo Bisanovic

Power trading and the provision of ancillary services entail both technical and financial risks and therefore require structured risk management. The focus of this paper is on the financial risk management that the system operator faces when providing and using ancillary services for balancing the power system. Risk on the ancillary services portfolio is modelled through the value at risk and conditional value at risk measures. The application of these risk measures in the power system is described in detail to show how to use the risk concept in practice. Conditional value at risk optimisation is analysed in the context of portfolio selection, showing how to apply this optimisation to hedge a portfolio consisting of different types of ancillary services.
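A minimal scenario-based sketch of CVaR portfolio optimisation in the spirit of the Rockafellar-Uryasev linear-programming formulation. The "assets" here are generic return scenarios rather than the paper's ancillary-service products, and the data are simulated for illustration only.

```python
# Scenario-based CVaR minimisation as a linear program.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(4)
n_assets, n_scen, alpha = 4, 1000, 0.95
returns = rng.normal(0.001, 0.02, size=(n_scen, n_assets))   # simulated scenario returns

# decision variables: [w (n_assets), zeta (1), u (n_scen)]
c = np.concatenate([np.zeros(n_assets), [1.0],
                    np.full(n_scen, 1.0 / ((1 - alpha) * n_scen))])

# per-scenario constraint: loss_s - zeta - u_s <= 0, with loss_s = -returns[s] @ w
A_ub = np.hstack([-returns, -np.ones((n_scen, 1)), -np.eye(n_scen)])
b_ub = np.zeros(n_scen)

A_eq = np.concatenate([np.ones(n_assets), [0.0], np.zeros(n_scen)]).reshape(1, -1)
b_eq = np.array([1.0])                                        # fully invested portfolio

bounds = [(0, None)] * n_assets + [(None, None)] + [(0, None)] * n_scen
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=bounds, method="highs")

weights, cvar = res.x[:n_assets], res.fun
print("optimal weights:", np.round(weights, 3), " minimised CVaR:", round(cvar, 5))
```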


2011 ◽  
Vol 204-210 ◽  
pp. 537-540
Author(s):  
Yu Ling Wang ◽  
Jun Hai Ma ◽  
Yu Hua Xu

The mean-variance model, value at risk (VaR) and conditional value at risk (CVaR) are currently three principal methods for measuring financial risk. The analysis shows that the three optimisation problems are equivalent when the security returns follow a multivariate normal distribution and the given confidence level exceeds a particular threshold. Applications to real data provide empirical support for this methodology. This result provides new methods for further research on risk portfolios.
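A hedged numerical illustration of the equivalence claim: under a multivariate normal model, both portfolio VaR and CVaR take the form -mu_p + k(alpha) * sigma_p, so for a sufficiently high confidence level all three criteria rank portfolios by the same mean-variance trade-off. The means and covariance below are invented for illustration.

```python
# VaR and CVaR of normal portfolios reduce to mean / standard-deviation formulas.
import numpy as np
from scipy import stats

alpha = 0.99
mu = np.array([0.0008, 0.0005])
cov = np.array([[0.00020, 0.00005],
                [0.00005, 0.00010]])

def normal_var_cvar(w):
    m, s = w @ mu, np.sqrt(w @ cov @ w)
    z = stats.norm.ppf(alpha)
    var = -m + z * s                                 # Gaussian VaR
    cvar = -m + s * stats.norm.pdf(z) / (1 - alpha)  # Gaussian CVaR / expected shortfall
    return var, cvar

for w in (np.array([0.5, 0.5]), np.array([0.2, 0.8])):
    print(w, [round(v, 5) for v in normal_var_cvar(w)])
```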


2014 ◽  
Vol 32 (1) ◽  
pp. 122-153 ◽  
Author(s):  
Ke-Li Xu

Understanding uncertainty in estimating risk measures is important in modern financial risk management. In this paper we consider a nonparametric framework that incorporates auxiliary information available in covariates and propose a family of inferential methods for value at risk, expected shortfall, and related risk measures. A two-step generalized empirical likelihood test statistic is constructed and is shown to be asymptotically pivotal without requiring variance estimation. We also show its validity when applied to a semiparametric index model. Asymptotic theories are established allowing for serially dependent data. Simulations and an empirical application to a Canadian stock return index illustrate the finite-sample behavior of the proposed methodologies.
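A greatly simplified stand-in for the covariate-based idea above (a kernel-weighted empirical estimator, not the paper's two-step generalized empirical likelihood procedure): VaR and ES of returns conditional on a covariate value x0 are computed from a locally weighted sample. All data and the bandwidth are illustrative assumptions.

```python
# Kernel-weighted conditional VaR and ES given a covariate value x0 (illustrative only).
import numpy as np

rng = np.random.default_rng(5)
n = 3000
x = rng.normal(size=n)                                   # covariate, e.g. a volatility proxy
returns = 0.01 * (1 + 0.5 * np.abs(x)) * rng.standard_t(df=5, size=n)

def conditional_var_es(x0, alpha=0.95, bandwidth=0.3):
    w = np.exp(-0.5 * ((x - x0) / bandwidth) ** 2)       # Gaussian kernel weights
    order = np.argsort(returns)
    cum = np.cumsum(w[order]) / w.sum()
    idx = np.searchsorted(cum, 1 - alpha)                # weighted (1-alpha) quantile
    var = -returns[order][idx]
    tail = returns[order][: idx + 1]
    es = -np.average(tail, weights=w[order][: idx + 1])  # weighted mean loss beyond VaR
    return var, es

print(conditional_var_es(x0=0.0), conditional_var_es(x0=2.0))
```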


2014 ◽  
Vol 12 (21) ◽  
pp. 105
Author(s):  
Јулија Церовић

The concept of value at risk (VaR) is a measure that is increasingly used for assessing the level of risk exposure of financial market participants. The aim of this concept, which has come to prevail in the world of risk management since 1994, is to estimate the maximum loss of a financial position over a given time period for a given probability. A number of measures have been developed to quantify risk, and the aim of this paper is to present these measures, with special emphasis on VaR. When measuring financial risk, the characteristics of financial time series should also be taken into account, and these are therefore highlighted in the paper. The second part of the paper explains how these risk measures are covered by the regulations governing risk control. The task of the paper is to analyse risk control in Montenegro and the importance of the standards in force in contributing to the improvement of risk control. The idea of the paper is motivated by the desire to approach the quantification and management of risk in Montenegro more seriously. In the coming period, within the framework of the measures of the Central Bank of Montenegro to strengthen the financial system, the situation in the banking system will be continuously monitored and analysed, with timely corrective measures taken in bank risk management, along with further implementation of internationally accepted standards and principles of business in this field.


2021 ◽  
pp. 1-30
Author(s):  
Hansjörg Albrecher ◽  
José Carlos Araujo-Acuna ◽  
Jan Beirlant

Abstract In various applications of heavy-tail modelling, the assumed Pareto behaviour is ultimately tempered in the range of the largest data. In insurance applications, claim payments are influenced by claim management, and claims may, for instance, be subject to a higher level of inspection at the highest damage levels, leading to weaker tails than apparent from modal claims. Generalizing earlier results of Meerschaert et al. (2012) and Raschke (2020), in this paper we consider tempering of a Pareto-type distribution with a general Weibull distribution in a peaks-over-threshold approach. This requires modulating the tempering parameters as a function of the chosen threshold. Modelling such a tempering effect is important in order to avoid overestimation of risk measures such as the value-at-risk at high quantiles. We use a pseudo maximum likelihood approach to estimate the model parameters and consider the estimation of extreme quantiles. We derive basic asymptotic results for the estimators, give illustrations with simulation experiments, and apply the developed techniques to fire and liability insurance data, providing insight into the relevance of the tempering component in heavy-tail modelling.
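A stylised sketch of the tempering effect on a high quantile. The survival function below, a Pareto tail multiplied by a Weibull-type tempering factor above a threshold u, is an assumed illustrative form rather than the paper's exact specification; it is only meant to show how tempering pulls extreme quantiles (and hence VaR) below the pure-Pareto values.

```python
# How Weibull-type tempering of a Pareto tail lowers extreme quantiles (illustrative form).
import numpy as np
from scipy.optimize import brentq

u, alpha_tail, lam, beta = 1.0, 1.5, 0.05, 1.0   # beta = 1 recovers exponential tempering

def survival_tempered(x):
    return (x / u) ** (-alpha_tail) * np.exp(-lam * (x ** beta - u ** beta))

def survival_pareto(x):
    return (x / u) ** (-alpha_tail)

def quantile(surv, p):
    # numerically invert P(X > x) = 1 - p on a wide bracket above the threshold
    return brentq(lambda x: surv(x) - (1 - p), u, 1e8)

for p in (0.99, 0.999):
    print(p, "pareto:", round(quantile(survival_pareto, p), 2),
             "tempered:", round(quantile(survival_tempered, p), 2))
```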

