GENERALIZED PARETO DISTRIBUTION FOR VALUE AT RISK MEASUREMENT OF SHARIA STOCK PORTFOLIOS AND ITS APPLICATION USING A MATLAB GUI

2018 ◽  
Vol 7 (3) ◽  
pp. 224-235
Author(s):  
Desi Nur Rahma ◽  
Di Asih I Maruddani ◽  
Tarno Tarno

The capital market is one of the long-term investment alternatives, and one of its traded products is stock, including sharia stock. Risk measurement is important for investors so that investment losses can be reduced. One of the most popular methods today is Value at Risk (VaR). Many financial data series have heavy tails because of extreme values, so VaR based on the Generalized Pareto Distribution (GPD) is used in this case. This research also produces a MATLAB GUI application that helps users measure VaR. The purpose of this research is to analyze VaR with the GPD approach, supported by the MATLAB GUI, for sharia stocks. The data used are the stock prices of PT XL Axiata Tbk (EXCL), PT Waskita Karya (Persero) Tbk (WSKT), and PT Charoen Pokphand Indonesia Tbk (CPIN) from January 2nd, 2017 to May 31st, 2017. The resulting VaR-GPD estimates are: EXCL single stock, 8.76% of investment; WSKT single stock, 4% of investment; CPIN single stock, 5.86% of investment; two-asset portfolio (EXCL and WSKT), 4.09% of investment; two-asset portfolio (EXCL and CPIN), 5.28% of investment; two-asset portfolio (WSKT and CPIN), 3.68% of investment; and three-asset portfolio (EXCL, WSKT, and CPIN), 3.75% of investment. It can be concluded that the more assets a portfolio contains, the smaller its risk, because the probability that all of the companies' stocks drop together is small.

Keywords: Generalized Pareto Distribution, Value at Risk, Graphical User Interface, sharia stock
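For readers who want to reproduce this kind of figure, the sketch below shows a minimal peaks-over-threshold VaR computation in Python, assuming daily returns are already loaded. The synthetic Student-t returns and the 95% empirical threshold are illustrative assumptions, not the paper's MATLAB implementation.

```python
# A minimal sketch of single-asset VaR via the peaks-over-threshold (POT)
# approach: fit a GPD to losses exceeding a high threshold, then invert the
# fitted tail. Data and threshold choice are assumptions, not the authors'.
import numpy as np
from scipy.stats import genpareto

def pot_var(returns, threshold_q=0.95, confidence=0.95):
    """Estimate VaR (as a positive loss fraction of the investment)."""
    losses = -np.asarray(returns)          # work with losses, not gains
    u = np.quantile(losses, threshold_q)   # high empirical threshold
    excesses = losses[losses > u] - u      # exceedances over u
    # Fit the GPD to the excesses; location is 0 by construction.
    xi, _, sigma = genpareto.fit(excesses, floc=0)
    n, n_u = len(losses), len(excesses)
    p = confidence
    # Standard POT quantile formula (for xi != 0):
    return u + (sigma / xi) * ((n / n_u * (1 - p)) ** (-xi) - 1)

# Illustration on synthetic heavy-tailed (Student-t) daily returns:
rng = np.random.default_rng(0)
returns = 0.01 * rng.standard_t(df=4, size=1000)
print(f"95% one-day VaR: {pot_var(returns):.2%} of investment")
```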

2012 ◽  
Vol 22 (2) ◽  
pp. 297-311 ◽  
Author(s):  
Jelena Jockovic

Generalized Pareto distributions (GPD) are widely used for modeling excesses over high thresholds (within the framework of the POT approach to modeling extremes). The aim of the paper is to give a review of the classical techniques for estimating GPD quantiles, to apply these methods in finance to estimate the Value-at-Risk (VaR) parameter, and to discuss certain difficulties related to this subject.
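For orientation, the workhorse quantile estimator in such reviews takes a standard closed form. Given a threshold $u$, $N_u$ exceedances out of $n$ observations, and fitted GPD shape $\hat{\xi}$ and scale $\hat{\sigma}$, the VaR at level $p$ is estimated as

$$\widehat{\mathrm{VaR}}_p \;=\; u + \frac{\hat{\sigma}}{\hat{\xi}}\left[\left(\frac{n}{N_u}\,(1-p)\right)^{-\hat{\xi}} - 1\right], \qquad \hat{\xi} \neq 0,$$

with the limiting case $\widehat{\mathrm{VaR}}_p = u - \hat{\sigma}\,\log\!\big(\tfrac{n}{N_u}(1-p)\big)$ as $\hat{\xi} \to 0$. This is the standard POT formula, given here as a reference point rather than in the paper's own notation; the difficulties the paper discusses concern estimating $\hat{\xi}$ and $\hat{\sigma}$ and choosing $u$.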


2019 ◽  
Vol 17 (4) ◽  
pp. 56
Author(s):  
Jaime Enrique Lincovil ◽  
Chang Chiann

Evaluating forecasts of risk measures, such as value-at-risk (VaR) and expected shortfall (ES), is an important process for financial institutions. Backtesting procedures were introduced to assess the efficiency of these forecasts. In this paper, we compare the empirical power of new classes of backtesting procedures for VaR and ES from the statistical literature. Further, we employ these procedures to evaluate the efficiency of the forecasts generated by both the Historical Simulation method and two methods based on the Generalized Pareto Distribution. For evaluating VaR forecasts, the empirical power of the Geometric-VaR class of backtests was, in general, higher than that of the other tests in the simulated scenarios, which supports the advantages of using defined time periods and covariates in the test procedures. For evaluating ES forecasts, on the other hand, backtesting methods based on the conditional distribution of returns relative to the VaR performed well with large sample sizes. Additionally, we show that the method based on the generalized Pareto distribution using durations and covariates has optimal performance in forecasts of VaR and ES, according to backtesting.
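The Geometric-VaR and conditional-distribution backtests compared in the paper are not reproduced here; as a simpler illustration of what a VaR backtest checks, the sketch below implements Kupiec's classical unconditional-coverage (proportion-of-failures) test, which only verifies that the observed violation rate matches the nominal level.

```python
# Kupiec's proportion-of-failures (POF) backtest: a likelihood-ratio test
# that the probability of a VaR violation equals the nominal level p.
# This is a classical, simpler relative of the tests compared in the paper.
import numpy as np
from scipy.stats import chi2

def kupiec_pof(violations, p=0.05):
    """`violations` is a boolean array: True where the loss exceeded VaR."""
    n = len(violations)
    x = int(np.sum(violations))
    if x in (0, n):                        # degenerate MLE; LR still defined
        log_l1 = 0.0
    else:
        pi_hat = x / n
        log_l1 = x * np.log(pi_hat) + (n - x) * np.log(1 - pi_hat)
    log_l0 = x * np.log(p) + (n - x) * np.log(1 - p)
    lr = -2.0 * (log_l0 - log_l1)
    return lr, chi2.sf(lr, df=1)           # statistic and p-value (1 d.o.f.)

# Usage: reject the VaR model if the p-value falls below the test level.
rng = np.random.default_rng(1)
hits = rng.random(500) < 0.08              # a model that violates too often
print(kupiec_pof(hits, p=0.05))
```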


Author(s):  
Ngozi J. Amachukwu ◽  
Happiness O. Obiora-Ilouno ◽  
Edwin I. Obisue

Background and objective: Crude oil is an essential commodity in many countries of the world. This work studies the risk involved in extreme crude oil prices, using daily crude oil prices for the Brent and West Texas benchmarks from 1990 to 2019. Materials and methods: The Peaks Over Threshold (POT) approach of the Generalized Pareto Distribution (GPD) was used to model extreme crude oil prices, while Value at Risk and Expected Shortfall were used to quantify the risk involved in extreme prices. Based on a Q-Q plot, the GPD was found to be a good model for the extreme values of the crude oil price. Results: The Value at Risk (VaR) and Expected Shortfall (ES), calculated at the 90%, 95%, and 99% levels with the maximum likelihood estimates of the GPD parameters and the threshold values, were found to decrease with increasing quantile for both benchmarks. This shows that the risk involved in extreme crude oil prices is borne by investors and the public. Conclusion: The VaR and ES of Brent were found to be higher than those of West Texas, which implies that it is safer to invest in West Texas crude oil.
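Once the GPD parameters are estimated, both risk measures follow in closed form. Below is a minimal sketch using the standard POT formulas (valid for shape ξ < 1); the numeric inputs are placeholders, not the Brent or West Texas estimates from the study.

```python
# Closed-form VaR and ES from fitted GPD parameters under the POT model.
# Standard formulas: ES_p = VaR_p / (1 - xi) + (sigma - xi * u) / (1 - xi),
# valid for xi < 1. All parameter values below are illustrative placeholders.
def gpd_var_es(u, xi, sigma, n, n_u, p):
    var_p = u + (sigma / xi) * ((n / n_u * (1 - p)) ** (-xi) - 1)
    es_p = var_p / (1 - xi) + (sigma - xi * u) / (1 - xi)
    return var_p, es_p

for p in (0.90, 0.95, 0.99):
    var_p, es_p = gpd_var_es(u=40.0, xi=0.2, sigma=8.0, n=7500, n_u=375, p=p)
    print(f"p={p:.2f}: VaR={var_p:.2f}, ES={es_p:.2f}")
```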


2019 ◽  
Vol 13 (1) ◽  
pp. 63-72
Author(s):  
Yanur Akhmadi ◽  
Iqbal Mustofa ◽  
Hotmauly Media Rika ◽  
Dewi Hanggraeni

The banking sector in Indonesia has grown aggressively over the past 10 years, while still maintaining risk-based capital ratios in accordance with regulatory requirements. State-owned banks in Indonesia control approximately 45% of total banking-sector assets and hold minimum capital adequacy ratios higher than the authority requires. This study discusses the calculation of VaR using the GEV and GPD methods and compares the results with the banks' minimum capital adequacy ratios. The GPD-based results come closest to the banks' capital ratios; in addition, both the GEV and GPD results are higher than VaR calculated with other methods or the levels set by the regulator.
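A minimal sketch of the two EVT routes the study compares: GEV fitted to block maxima versus GPD fitted to threshold exceedances. The simulated losses, 21-day blocks, and 95th-percentile threshold are illustrative assumptions, not the banks' data.

```python
# The two classical EVT routes: (1) GEV fitted to block maxima,
# (2) GPD fitted to exceedances over a high threshold (POT).
import numpy as np
from scipy.stats import genextreme, genpareto

rng = np.random.default_rng(2)
losses = rng.lognormal(mean=-5, sigma=1, size=2520)   # ~10 years of daily losses

# GEV route: fit to monthly (21-day) block maxima.
block_max = losses[: len(losses) // 21 * 21].reshape(-1, 21).max(axis=1)
c_gev, loc, scale = genextreme.fit(block_max)
var_gev = genextreme.ppf(0.99, c_gev, loc, scale)     # 99% block-maximum quantile

# GPD route: fit to exceedances over the 95th percentile.
u = np.quantile(losses, 0.95)
xi, _, sigma = genpareto.fit(losses[losses > u] - u, floc=0)
n, n_u = len(losses), int((losses > u).sum())
var_gpd = u + (sigma / xi) * ((n / n_u * 0.01) ** (-xi) - 1)  # 99% VaR

print(f"GEV 99% quantile: {var_gev:.4f}")
print(f"GPD 99% VaR:      {var_gpd:.4f}")
```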


Mathematics ◽  
2019 ◽  
Vol 7 (5) ◽  
pp. 406 ◽  
Author(s):  
Xu Zhao ◽  
Zhongxian Zhang ◽  
Weihu Cheng ◽  
Pengyue Zhang

Techniques used to analyze exceedances over a high threshold are in great demand for research in economics, environmental science, and other fields. The generalized Pareto distribution (GPD) has been widely used to fit observations exceeding the tail threshold in the peaks over threshold (POT) framework. Parameter estimation and threshold selection are two critical issues for threshold-based GPD inference. In this work, we propose a new GPD-based estimation approach that combines the method of moments and likelihood moment techniques based on the least squares concept, in which the shape and scale parameters of the GPD can be estimated simultaneously. To analyze extreme data, the proposed approach estimates the parameters by minimizing the sum of squared deviations between the theoretical GPD function and its expectation. Additionally, we introduce a recently developed stopping rule to choose a suitable threshold above which the GPD asymptotically fits the exceedances. Simulation studies show that the proposed approach performs better than or similarly to existing approaches, in terms of bias and mean square error, in estimating the shape parameter. In addition, the performance of three threshold selection procedures is assessed by estimating the value-at-risk (VaR) of the GPD. Finally, we illustrate the use of the proposed method by analyzing air pollution data; in this analysis, we also provide a detailed guide regarding threshold selection.
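The authors' exact objective (combining moment and likelihood-moment equations) and their stopping rule are not reproduced here; the sketch below illustrates only the underlying least-squares idea, fitting GPD parameters by minimizing the squared distance between the empirical and theoretical CDFs of the excesses.

```python
# A generic least-squares GPD fit: minimize the squared distance between the
# empirical CDF of the excesses and the theoretical GPD CDF. An illustration
# of the least-squares concept only, not the authors' exact estimator.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import genpareto

def ls_gpd_fit(excesses):
    x = np.sort(np.asarray(excesses))
    n = len(x)
    ecdf = (np.arange(1, n + 1) - 0.5) / n          # plotting positions

    def objective(theta):
        xi, log_sigma = theta                        # log-scale keeps sigma > 0
        return np.sum((genpareto.cdf(x, xi, scale=np.exp(log_sigma)) - ecdf) ** 2)

    res = minimize(objective, x0=np.array([0.1, np.log(x.mean())]),
                   method="Nelder-Mead")
    xi, log_sigma = res.x
    return xi, np.exp(log_sigma)

# Sanity check on simulated GPD data with xi = 0.25, sigma = 1:
rng = np.random.default_rng(3)
print(ls_gpd_fit(genpareto.rvs(0.25, scale=1.0, size=2000, random_state=rng)))
```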


2012 ◽  
Vol 1 (33) ◽  
pp. 42
Author(s):  
Pietro Bernardara ◽  
Franck Mazas ◽  
Jérôme Weiss ◽  
Marc Andreewsky ◽  
Xavier Kergadallan ◽  
...  

In the general framework of over-threshold modelling (OTM) for estimating extreme values of met-ocean variables, such as waves, surges or water levels, threshold selection logically requires two steps: first, the physical declustering of the time series of the variable, in order to obtain samples of independent and identically distributed data; then, the application of extreme value theory, which predicts the convergence of the upper part of the sample toward the Generalized Pareto Distribution. These two steps have often been merged and confused in the past. A clear framework for distinguishing them is presented here, together with a review of the methods available in the literature for carrying out each step and two simple, practical examples.
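A minimal sketch of the first step, using simple runs declustering: exceedances separated by fewer than a fixed number of time steps are grouped into one cluster, and only each cluster's peak is retained. The gap length is a physical modelling choice (e.g. a typical storm duration) and is an assumption here, as are the simulated surge data.

```python
# Runs declustering: group nearby exceedances into clusters and keep only
# each cluster's peak, yielding approximately independent extremes for the
# subsequent GPD fit (the second step of the OTM framework).
import numpy as np

def decluster_peaks(series, threshold, gap=3):
    idx = np.flatnonzero(series > threshold)
    if idx.size == 0:
        return np.array([], dtype=int)
    # Start a new cluster wherever consecutive exceedances are > `gap` apart.
    breaks = np.flatnonzero(np.diff(idx) > gap)
    clusters = np.split(idx, breaks + 1)
    # Keep the index of the maximum within each cluster.
    return np.array([c[np.argmax(series[c])] for c in clusters])

rng = np.random.default_rng(4)
surge = rng.gumbel(loc=0.0, scale=0.5, size=1000)     # synthetic surge series
peaks = decluster_peaks(surge, threshold=np.quantile(surge, 0.97), gap=3)
print(len(peaks), "independent cluster peaks retained")
```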


2019 ◽  
Author(s):  
Riccardo Zucca ◽  
Xerxes D. Arsiwalla ◽  
Hoang Le ◽  
Mikail Rubinov ◽  
Antoni Gurguí ◽  
...  

Are degree distributions of human brain functional connectivity networks heavy-tailed? Initial claims based on least-squares fitting suggested that brain functional connectivity networks obey power law scaling in their degree distributions. This interpretation has been challenged on methodological grounds. Subsequently, estimators based on maximum likelihood and non-parametric tests involving surrogate data have been proposed. No clear consensus has emerged, as results especially depended on data resolution. Identifying the underlying topological distribution of brain functional connectivity calls for a closer examination of the relationship between resolution and the statistics of model fitting.

In this study, we analyze high-resolution functional magnetic resonance imaging (fMRI) data from the Human Connectome Project to assess its degree distribution across resolutions. We consider resolutions from one thousand to eighty thousand regions of interest (ROIs) and test whether they follow a heavy- or short-tailed distribution. We analyze power law, exponential, truncated power law, log-normal, Weibull, and generalized Pareto probability distributions. Notably, the generalized Pareto distribution is of particular interest, since it interpolates between heavy-tailed and short-tailed distributions and provides a handle on estimating the tail's heaviness or shortness directly from the data. Our results show that the statistics support the short-tailed limit of the generalized Pareto distribution, rather than a power law or any other heavy-tailed distribution. Working across resolutions of the data and performing cross-model comparisons, we further establish the overall robustness of the generalized Pareto model in explaining the data.

Moreover, we account for earlier ambiguities by showing that down-sampling the data systematically affects statistical results. At lower resolutions, models cannot easily be differentiated on statistical grounds, while their plausibility consistently increases up to an upper bound. Indeed, more power law distributions are reported at low resolutions (5K) than at higher ones (50K or 80K). However, we show that these positive identifications at low resolutions fail cross-model comparisons, and that down-sampling data introduces the risk of detecting spurious heavy-tailed distributions. This dependence of the statistics of degree distributions on sampling resolution has broader implications for neuroinformatic methodology, especially when analyses rely on down-sampled data, for instance due to a choice of anatomical parcellation or measurement technique.

Our finding that node degrees of human brain functional networks follow a short-tailed distribution has important implications for claims about brain organization and function. It does not support common simplistic representations of the brain as a generic complex system with optimally efficient architecture and function, modeled with simple growth mechanisms. Instead, it reflects a more nuanced picture of a biological system that has been shaped by longstanding and pervasive developmental and architectural constraints, including wiring-cost constraints on the centrality architecture of individual nodes.
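A hedged sketch of the tail diagnostic this approach rests on: fit a generalized Pareto distribution to the upper tail of a degree sequence and read off the sign of the shape parameter (positive: heavy-tailed regime; negative: short-tailed regime with a finite endpoint). The simulated degrees and the 90% threshold are illustrative assumptions, not the Human Connectome Project pipeline.

```python
# Tail diagnostic via the GPD shape parameter: xi > 0 indicates a heavy tail,
# xi < 0 a short tail with a finite endpoint. Degrees below are simulated
# (a light-tailed stand-in), not actual brain-network data.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(5)
degrees = rng.gamma(shape=3.0, scale=20.0, size=80_000)

u = np.quantile(degrees, 0.90)                       # tail threshold
xi, _, sigma = genpareto.fit(degrees[degrees > u] - u, floc=0)
regime = "heavy-tailed" if xi > 0 else "short-tailed"
print(f"estimated shape xi = {xi:.3f} -> {regime} tail")
```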

