An Optimal Tail Selection in Risk Measurement

Risks, 2021, Vol 9 (4), pp. 70
Author(s): Małgorzata Just, Krzysztof Echaust

The appropriate choice of a threshold level, which separates the tails of the probability distribution of a random variable from its middle part, is considered a very complex and challenging task. This paper provides an empirical study of various methods of optimal tail selection in risk measurement. The results indicate which methods may be useful in practice for investors and for financial and regulatory institutions. Some methods that perform well in simulation studies based on theoretical distributions may not perform well when real data are in use. We analyze twelve methods with different parameters for forty-eight world indices, using returns from the period 2000–Q1 2020 and four sub-periods. The research objective is to compare the methods and to identify those which can be recognized as useful in risk measurement. The results suggest that only four tail selection methods, i.e., the Path Stability algorithm, the minimization of the Asymptotic Mean Squared Error approach, the automated Eyeball method with carefully selected tuning parameters and the Hall single bootstrap procedure, may be useful in practical applications.
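The quantity all of these selection rules operate on is the number k of top order statistics fed to an extreme value estimator such as the Hill estimator. As a rough illustration, the Python sketch below computes Hill estimates across candidate k and picks the most stable region; the stability rule is a simplified stand-in of our own, not the paper's Path Stability algorithm.

```python
import numpy as np

def hill_estimator(returns, k):
    """Hill estimator of the tail index from the k largest losses."""
    losses = np.sort(-returns)[::-1]        # losses in descending order
    return np.mean(np.log(losses[:k] / losses[k]))

def stable_k(returns, k_min=10, window=30):
    """Pick the k whose neighbouring Hill estimates vary least; a
    simplified stand-in for the Path Stability idea, not the paper's
    exact algorithm."""
    ks = np.arange(k_min, len(returns) // 10)
    hills = np.array([hill_estimator(returns, k) for k in ks])
    sds = [hills[i:i + window].std() for i in range(len(ks) - window)]
    return int(ks[np.argmin(sds)])

rng = np.random.default_rng(0)
r = rng.standard_t(df=4, size=5000)         # heavy-tailed simulated returns
k = stable_k(r)
print(k, hill_estimator(r, k))              # chosen k and tail-index estimate
```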

Mathematics, 2020, Vol 8 (1), pp. 114
Author(s): Krzysztof Echaust, Małgorzata Just

A conditional Extreme Value Theory (GARCH-EVT) approach is a two-stage hybrid method that combines a Generalized Autoregressive Conditional Heteroskedasticity (GARCH) filter with Extreme Value Theory (EVT). The approach requires pre-specification of a threshold separating the tails of the distribution from its middle part. The appropriate choice of a threshold level is a demanding task. In this paper we use four different optimal tail selection algorithms, i.e., the path stability method, the automated Eye-Ball method, the minimization of asymptotic mean squared error method and the distance metric method with a mean absolute penalty function, to estimate out-of-sample Value at Risk (VaR) forecasts and compare them to the fixed threshold approach. Unlike other studies, we update the optimal fraction of the tail for each rolling window of the returns. The research objective is to verify to what extent optimization procedures can improve VaR estimates compared to the fixed threshold approach. Results are presented for a long and a short position, for 10 world stock indices over the period from 2000 to June 2019. Although each approach generates different threshold levels, the GARCH-EVT model produces similar Value at Risk estimates. Therefore, no improvement in VaR accuracy is observed relative to the conservative approach taking the 95th quantile of returns as the threshold.
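For concreteness, here is a minimal sketch of the two-stage GARCH-EVT VaR computation with the fixed 95th-percentile threshold (the conservative benchmark mentioned above), assuming the `arch` and `scipy` packages; the function name and defaults are ours, and the GPD quantile formula assumes a nonzero shape parameter.

```python
import numpy as np
from arch import arch_model                 # pip install arch
from scipy.stats import genpareto

def garch_evt_var(returns, alpha=0.01, tail_frac=0.05):
    """One-day-ahead VaR for a long position via the two-stage
    GARCH(1,1)-EVT approach with a fixed 95th-percentile threshold."""
    # Stage 1: GARCH filter; extract standardized residuals.
    res = arch_model(returns * 100, vol="Garch", p=1, q=1).fit(disp="off")
    z = res.resid / res.conditional_volatility

    # Stage 2: fit a GPD to losses on z above the fixed threshold.
    losses = -z
    u = np.quantile(losses, 1 - tail_frac)
    excess = losses[losses > u] - u
    xi, _, beta = genpareto.fit(excess, floc=0)

    # GPD tail quantile of standardized losses (assumes xi != 0).
    n, nu = len(losses), len(excess)
    z_q = u + (beta / xi) * ((alpha * n / nu) ** (-xi) - 1)

    # Recombine with the one-step-ahead conditional mean and volatility.
    f = res.forecast(horizon=1)
    mu, sigma = f.mean.values[-1, 0], np.sqrt(f.variance.values[-1, 0])
    return (sigma * z_q - mu) / 100         # VaR as a positive loss
```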


2021, Vol 2021, pp. 1-16
Author(s): Najma Salahuddin, Alamgir Khalil, Wali Khan Mashwani, Sharifah Alrajhi, Sanaa Al-Marzouki, ...

In this paper, a new generalization of the Generalized Pareto distribution is proposed using the generator suggested in [1], named the Khalil Extended Generalized Pareto (KEGP) distribution. Various shapes of the suggested model are investigated, along with important mathematical properties, including moments, the quantile function, the moment-generating function, measures of entropy, and order statistics. Parametric estimation of the model is discussed using maximum likelihood. A simulation study assesses the maximum likelihood estimates in terms of their bias and mean squared error. Practical applications are illustrated via two real data sets from survival and reliability theory. The suggested model provided better fits than the other considered models.
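The abstract does not reproduce the KEGP density, so the simulation-study template below uses the plain Generalized Pareto component as a stand-in to show the bias and mean-squared-error assessment pattern; the true parameter values are illustrative only.

```python
import numpy as np
from scipy.stats import genpareto

# Simulation-study template: bias and MSE of maximum likelihood
# estimates. The plain Generalized Pareto stands in for the KEGP model,
# whose density is not given in the abstract.
rng = np.random.default_rng(1)
xi_true, beta_true = 0.3, 1.0
n, reps = 200, 1000

est = np.empty((reps, 2))
for i in range(reps):
    x = genpareto.rvs(xi_true, scale=beta_true, size=n, random_state=rng)
    xi_hat, _, beta_hat = genpareto.fit(x, floc=0)   # ML fit
    est[i] = (xi_hat, beta_hat)

bias = est.mean(axis=0) - (xi_true, beta_true)
mse = ((est - (xi_true, beta_true)) ** 2).mean(axis=0)
print("bias:", bias, "MSE:", mse)
```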


Author(s): Parisa Torkaman

The generalized inverted exponential distribution has been introduced as a lifetime model with good statistical properties. In this paper, estimation of the probability density function and the cumulative distribution function of this distribution is considered using five different estimation methods: the uniformly minimum variance unbiased (UMVU), maximum likelihood (ML), least squares (LS), weighted least squares (WLS) and percentile (PC) estimators. The performance of these estimation procedures is compared by numerical simulations based on the mean squared error (MSE). The simulation studies show that the UMVU estimator performs better than the others, and that when the sample size is large enough the ML and UMVU estimators are almost equivalent and more efficient than the LS, WLS and PC estimators. Finally, results using a real data set are analyzed.
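As a sketch of the ML route for this model, the code below assumes the standard parameterization F(x) = 1 − (1 − e^(−λ/x))^α for x > 0; the helper names are ours, and the UMVU, LS, WLS and PC estimators are not shown.

```python
import numpy as np
from scipy.optimize import minimize

# ML estimation for the generalized inverted exponential distribution,
# F(x) = 1 - (1 - exp(-lam/x))**alpha,  x > 0, alpha, lam > 0.
def neg_loglik(theta, x):
    alpha, lam = np.exp(theta)            # optimize on the log scale
    e = np.exp(-lam / x)
    return -np.sum(np.log(alpha) + np.log(lam) - 2 * np.log(x)
                   - lam / x + (alpha - 1) * np.log1p(-e))

def fit_gie(x):
    res = minimize(neg_loglik, x0=np.zeros(2), args=(x,), method="Nelder-Mead")
    return np.exp(res.x)                  # (alpha_hat, lambda_hat)

# Plug-in estimates of the pdf and cdf, whose MSE the paper compares
# across the UMVU, ML, LS, WLS and PC estimators.
def gie_pdf(x, alpha, lam):
    e = np.exp(-lam / x)
    return alpha * lam / x**2 * e * (1 - e) ** (alpha - 1)

def gie_cdf(x, alpha, lam):
    return 1 - (1 - np.exp(-lam / x)) ** alpha
```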


Extremes, 2021
Author(s): Laura Fee Schneider, Andrea Krajina, Tatyana Krivobokova

Abstract Threshold selection plays a key role in various aspects of statistical inference of rare events. In this work, two new threshold selection methods are introduced. The first approach measures the fit of the exponential approximation above a threshold and achieves good performance in small samples. The second method smoothly estimates the asymptotic mean squared error of the Hill estimator and performs consistently well over a wide range of processes. Both methods are analyzed theoretically, compared to existing procedures in an extensive simulation study and applied to a dataset of financial losses, where the underlying extreme value index is assumed to vary over time.
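The first method rests on the fact that, for Pareto-type tails, log-spacings of order statistics above a well-chosen threshold are approximately exponential. The sketch below selects the number of top order statistics k by minimizing a Kolmogorov–Smirnov distance to that exponential approximation; this is a simplified stand-in, not the authors' exact statistic.

```python
import numpy as np
from scipy.stats import kstest

def choose_k_expfit(x, k_min=20):
    """Pick k by how well the log-spacings above the threshold fit an
    exponential law; a simplified stand-in for the paper's statistic."""
    xs = np.sort(x)[::-1]                  # descending order statistics
    best_k, best_d = k_min, np.inf
    for k in range(k_min, len(xs) // 10):
        y = np.log(xs[:k] / xs[k])         # approx. Exp(gamma) for Pareto tails
        gamma = y.mean()                   # Hill estimator
        d = kstest(y, "expon", args=(0, gamma)).statistic
        if d < best_d:
            best_k, best_d = k, d
    return best_k

rng = np.random.default_rng(2)
x = rng.pareto(2.0, size=3000) + 1         # exact Pareto tail, gamma = 0.5
print(choose_k_expfit(x))
```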


2021, Vol 0 (0)
Author(s): Tianyi Wang, Chengxiang Wang, Kequan Zhao, Wei Yu, Min Huang

Abstract The limited-angle computed tomography (CT) reconstruction problem arises in some practical applications due to restrictions in the scanning environment or the CT imaging device. Artifacts appear in images reconstructed by conventional analytical algorithms. Although some regularization strategies, such as total variation (TV) minimization, have been proposed to suppress these artifacts, distortion remains in some edge portions of the image. Guided image filtering (GIF) has the advantage of smoothing the image while preserving edges. To further improve image quality and protect image edges, we propose a coupling method that combines ℓ0 gradient minimization and GIF. An intermediate result obtained by ℓ0 gradient minimization is regarded as the guidance image for GIF, and GIF is then used to filter the result reconstructed by the simultaneous algebraic reconstruction technique (SART) with a nonnegativity constraint. It should be stressed that the guidance image is dynamically updated as the iteration proceeds, which transfers edge information to the filtered image. Simulation and real-data experiments are used to evaluate the proposed method. Experimental results show that our method has advantages in suppressing the artifacts of limited-angle CT and in preserving image edges.
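A toy sketch of the coupling loop follows, intended to show the structure only: `l0_like_smooth` is a crude placeholder for true ℓ0 gradient minimization (Xu et al. 2011), and a real setup would use a limited-angle CT projection matrix for A rather than a random system.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(I, p, r=4, eps=1e-3):
    """Edge-preserving guided image filter (He et al.): filter p using
    guidance image I over (2r+1) x (2r+1) windows."""
    w = 2 * r + 1
    mI, mp = uniform_filter(I, w), uniform_filter(p, w)
    cov = uniform_filter(I * p, w) - mI * mp
    var = uniform_filter(I * I, w) - mI * mI
    a = cov / (var + eps)
    b = mp - a * mI
    return uniform_filter(a, w) * I + uniform_filter(b, w)

def sart_step(x, A, b, lam=0.8):
    """One SART sweep with a nonnegativity constraint for A x = b."""
    row, col = A.sum(axis=1), A.sum(axis=0)
    x = x + lam * (A.T @ ((b - A @ x) / row)) / col
    return np.clip(x, 0, None)

def l0_like_smooth(x, shape):
    """Placeholder for l0 gradient minimization: approximated here by
    an aggressive self-guided filter."""
    img = x.reshape(shape)
    return guided_filter(img, img, r=8, eps=1e-2).ravel()

# Toy coupling loop on a random system.
rng = np.random.default_rng(3)
shape = (16, 16)
x_true = np.zeros(shape); x_true[4:12, 4:12] = 1.0
A = rng.random((400, x_true.size))
b = A @ x_true.ravel()
x = np.zeros(x_true.size)
for _ in range(50):
    x = sart_step(x, A, b)
    guide = l0_like_smooth(x, shape)       # dynamically updated guidance
    x = guided_filter(guide.reshape(shape), x.reshape(shape)).ravel()
print(np.abs(x - x_true.ravel()).mean())
```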


1994, Vol 21 (6), pp. 1074-1080
Author(s): J. Llamas, C. Diaz Delgado, M.-L. Lavertu

In this paper, an improved probabilistic method for flood analysis using the probable maximum flood, the beta function, and orthogonal Jacobi polynomials is proposed. The shape of the beta function depends on the sample's characteristics and the bounds of the phenomenon. In addition, a series of Jacobi polynomials is used to improve the beta function and increase its rate of convergence toward the real flood probability density function. This mathematical model has been tested using a sample of 1000 generated beta random data. Finally, some practical applications with real data series from important rivers in Quebec have been performed; the model solutions for these rivers show the accuracy of this new method in flood frequency estimation. Key words: probable maximum flood, beta function, orthogonal polynomials, distribution function, flood frequency estimation, data generation, convergence.
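A sketch of the construction, under the assumption that the data have already been rescaled to (0, 1) (e.g., by the probable maximum flood bound): fit a beta density, then refine it with a Jacobi series whose orthogonality weight matches the fitted beta. The truncation degree and helper names are ours.

```python
import numpy as np
from scipy.stats import beta as beta_dist
from scipy.special import jacobi, gammaln

def jacobi_beta_pdf(data, x, degree=4):
    """Beta density refined by an orthogonal Jacobi series; data are
    assumed rescaled to (0, 1). Truncation noise can make the estimate
    slightly negative near the edges."""
    a_shape, b_shape, _, _ = beta_dist.fit(data, floc=0, fscale=1)
    a, b = b_shape - 1.0, a_shape - 1.0     # matching Jacobi weight params
    t, td = 2 * data - 1, 2 * x - 1         # map (0,1) -> (-1,1)
    est = np.zeros_like(x)
    for n in range(degree + 1):
        P = jacobi(n, a, b)
        # log of the Jacobi normalizing constant h_n
        log_h = ((a + b + 1) * np.log(2) - np.log(2 * n + a + b + 1)
                 + gammaln(n + a + 1) + gammaln(n + b + 1)
                 - gammaln(n + a + b + 1) - gammaln(n + 1))
        c = P(t).mean() / np.exp(log_h)     # moment-based coefficient
        est += c * P(td)
    return 2 * (1 - td) ** a * (1 + td) ** b * est

rng = np.random.default_rng(4)
sample = rng.beta(2.0, 5.0, size=1000)
grid = np.linspace(0.01, 0.99, 5)
print(jacobi_beta_pdf(sample, grid))
print(beta_dist.pdf(grid, 2.0, 5.0))        # reference density
```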


2013, Vol 2013, pp. 1-11
Author(s): Jia-Rou Liu, Po-Hsiu Kuo, Hung Hung

Large-p-small-n datasets are commonly encountered in modern biomedical studies. To detect the difference between two groups, conventional methods fail to apply due to the instability of variance estimates in the t-test and a high proportion of tied values in AUC (area under the receiver operating characteristic curve) estimates. The significance analysis of microarrays (SAM) may also not be satisfactory, since its performance is sensitive to the tuning parameter, whose selection is not straightforward. In this work, we propose a robust rerank approach to overcome the above-mentioned difficulties. In particular, we obtain a rank-based statistic for each feature based on the concept of “rank-over-variable.” Techniques of “random subset” and “rerank” are then iteratively applied to rank features, and the leading features are selected for further study. The proposed rerank approach is especially applicable to large-p-small-n datasets. Moreover, it is insensitive to the selection of tuning parameters, which is an appealing property for practical implementation. Simulation studies and real data analysis of pooling-based genome-wide association (GWA) studies demonstrate the usefulness of our method.
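The abstract does not spell out the statistic, so the sketch below is a simplified stand-in that illustrates the random-subset/rerank loop: each sample's values are ranked across a random subset of features ("rank-over-variable"), and each feature is scored by the group difference of its mean ranks.

```python
import numpy as np

def rerank_scores(X, y, n_iter=100, subset_frac=0.5, seed=0):
    """Simplified stand-in for the rerank approach, not the authors'
    exact statistic: rank within samples over random feature subsets,
    then score features by the between-group rank difference."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    scores, counts = np.zeros(p), np.zeros(p)
    for _ in range(n_iter):
        idx = rng.choice(p, size=int(subset_frac * p), replace=False)
        R = X[:, idx].argsort(axis=1).argsort(axis=1)   # ranks within sample
        diff = np.abs(R[y == 1].mean(axis=0) - R[y == 0].mean(axis=0))
        scores[idx] += diff
        counts[idx] += 1
    return scores / np.maximum(counts, 1)

rng = np.random.default_rng(5)
X = rng.normal(size=(20, 500))
y = np.repeat([0, 1], 10)
X[y == 1, :5] += 2.0                       # five truly differential features
print(np.argsort(rerank_scores(X, y))[-5:])  # leading features
```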


Filomat, 2018, Vol 32 (17), pp. 5931-5947
Author(s): Hatami Mojtaba, Alamatsaz Hossein

In this paper, we propose a new transformation of circular random variables based on circular distribution functions, which we call the inverse distribution function (idf) transformation. We show that the Möbius transformation is a special case of our idf transformation. Very general results are provided for the properties of the proposed family of idf transformations, including their trigonometric moments, maximum entropy, random variate generation, finite mixture and modality properties. In particular, we focus our attention on a subfamily of the general family where the idf transformation is based on the cardioid circular distribution function. Modality and shape properties are investigated for this subfamily. In addition, we obtain further statistical properties for the resulting distribution by applying the idf transformation to a random variable following a von Mises distribution. In fact, we introduce the Cardioid-von Mises (CvM) distribution and estimate its parameters by the maximum likelihood method. Finally, an application of the CvM family and its inferential methods is illustrated using a real data set containing times of gun crimes in Pittsburgh, Pennsylvania.
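The sketch below shows one plausible reading of the construction: push a von Mises draw through its CDF and then through the inverse cardioid CDF. The composition order and parameter choices are assumptions; the paper defines the transformation precisely.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import vonmises

def cardioid_cdf(theta, mu=0.0, rho=0.3):
    """Cardioid CDF on [0, 2*pi); density (1 + 2*rho*cos(t - mu)) / (2*pi)."""
    return (theta + 2 * rho * (np.sin(theta - mu) + np.sin(mu))) / (2 * np.pi)

def cardioid_idf(u, mu=0.0, rho=0.3):
    """Inverse cardioid CDF by root finding (the idf transformation)."""
    return brentq(lambda t: cardioid_cdf(t, mu, rho) - u, 0.0, 2 * np.pi)

# Assumed composition: von Mises draw -> its CDF -> inverse cardioid CDF.
rng = np.random.default_rng(6)
theta = rng.vonmises(0.0, 2.0, size=5)          # von Mises on [-pi, pi)
u = vonmises.cdf(theta, 2.0)                    # probability integral transform
cvm = np.array([cardioid_idf(ui) for ui in u])  # Cardioid-von Mises draws
print(cvm)
```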


2019, Vol 35 (3), pp. 1373-1392
Author(s): Dong Ding, Axel Gandy, Georg Hahn

Abstract We consider a statistical test whose p value can only be approximated using Monte Carlo simulations. We are interested in deciding whether the p value for an observed data set lies above or below a given threshold such as 5%. We want to ensure that the resampling risk, the probability of the (Monte Carlo) decision being different from the true decision, is uniformly bounded. This article introduces a simple open-ended method with this property, the confidence sequence method (CSM). We compare our approach to another algorithm, SIMCTEST, which also guarantees an (asymptotic) uniform bound on the resampling risk, as well as to other Monte Carlo procedures without a uniform bound. CSM is free of tuning parameters and conservative. It has the same theoretical guarantee as SIMCTEST and, in many settings, similar stopping boundaries. As it is much simpler than other methods, CSM is a useful method for practical applications.
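A compact sketch of CSM as described: draw Monte Carlo replicates one at a time and stop once the threshold lies outside Robbins' confidence sequence for the true p value. The function name and the user-supplied `sample_exceed` callable are ours; the boundary shown is the standard Robbins sequence this method is built on.

```python
import numpy as np
from scipy.stats import binom

def csm(sample_exceed, alpha=0.05, eps=0.001, max_n=100000):
    """Confidence sequence method (CSM): decide whether the true p value
    lies above or below alpha, with resampling risk at most eps.
    `sample_exceed()` draws one Monte Carlo replicate and returns True
    if its statistic is at least as extreme as the observed one."""
    s = 0
    for n in range(1, max_n + 1):
        s += sample_exceed()
        # Robbins' confidence sequence: alpha is excluded once
        # (n + 1) * P(Binom(n, alpha) = s) <= eps.
        if (n + 1) * binom.pmf(s, n, alpha) <= eps:
            return ("p > alpha" if s / n > alpha else "p < alpha", n)
    return ("undecided", max_n)

# Toy usage: the true p value is 0.02, below the 5% threshold.
rng = np.random.default_rng(7)
print(csm(lambda: rng.random() < 0.02))
```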

