ESTIMATING THE VOLATILITY OCCUPATION TIME VIA REGULARIZED LAPLACE INVERSION

2015 ◽  
Vol 32 (5) ◽  
pp. 1253-1288 ◽  
Author(s):  
Jia Li ◽  
Viktor Todorov ◽  
George Tauchen

We propose a consistent functional estimator for the occupation time of the spot variance of an asset price observed at discrete times on a finite interval, with the mesh of the observation grid shrinking to zero. The asset price is modeled nonparametrically as a continuous-time Itô semimartingale with a nonvanishing diffusion coefficient. The estimation procedure has two steps: in the first step we estimate the Laplace transform of the volatility occupation time, and in the second step we conduct a regularized Laplace inversion. Monte Carlo evidence suggests that the proposed estimator has good small-sample performance; in particular, it is far better at estimating lower volatility quantiles and the volatility median than a direct estimator formed from the empirical cumulative distribution function of local spot volatility estimates. An empirical application illustrates the use of the developed techniques for nonparametric analysis of the variation of volatility.
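
The first step of the procedure — estimating the Laplace transform of the volatility occupation time from local spot-variance estimates on the observation grid — reduces to a Riemann sum. A minimal sketch (function and variable names are illustrative, not from the paper; the second step, the regularized inversion, is not sketched here):

```python
import numpy as np

def laplace_occupation(spot_var, dt, u):
    """Step-1 estimator: L(u) ~ sum_i dt * exp(-u * c_i), where c_i are
    local spot-variance estimates on a grid with mesh dt."""
    return dt * np.sum(np.exp(-u * spot_var))

# toy check: constant spot variance c on [0, 1] gives L(u) = exp(-u * c)
n, dt, c = 1000, 1e-3, 0.04
L_hat = laplace_occupation(np.full(n, c), dt, u=2.0)
```

For a constant-variance path the Riemann sum recovers exp(-u c) exactly; in practice the c_i would themselves be estimated from high-frequency returns by local averaging of squared increments.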

Atmosphere ◽  
2021 ◽  
Vol 12 (4) ◽  
pp. 475
Author(s):  
Hassen Babaousmail ◽  
Rongtao Hou ◽  
Brian Ayugi ◽  
Moses Ojara ◽  
Hamida Ngoma ◽  
...  

This study assesses the performance of historical rainfall data from the Coupled Model Intercomparison Project phase 6 (CMIP6) in reproducing the spatial and temporal rainfall variability over North Africa. Datasets from the Climatic Research Unit (CRU) and the Global Precipitation Climatology Centre (GPCC) are used as proxies for observational datasets to examine the capability of 15 CMIP6 models and their ensemble in simulating rainfall during 1951–2014. In addition, robust statistical metrics, the empirical cumulative distribution function (ECDF), the Taylor diagram (TD), and the Taylor skill score (TSS) are utilized to assess the models' performance in reproducing annual, seasonal, and monthly rainfall over the study domain. Results show that the CMIP6 models satisfactorily reproduce the mean annual climatology of dry/wet months, although some models slightly over- or underestimate rainfall across dry/wet months. The models' overall top ranking across all performance analyses (mean cycle simulation, trend analysis, inter-annual variability, ECDFs, and statistical metrics) is as follows: EC-Earth3-Veg, UKESM1-0-LL, GFDL-CM4, NorESM2-LM, IPSL-CM6A-LR, and GFDL-ESM4. The mean model ensemble outperformed the individual CMIP6 models, with a TSS ratio of 0.79. For future impact studies over the study domain, it is advisable to employ a multi-model ensemble of the best-performing models.
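
The Taylor skill score combines a model's pattern correlation with its variance ratio against the reference. The sketch below uses one common form from Taylor (2001), with the attainable correlation r0 as a parameter; it is an illustration, not necessarily the exact variant used in the study:

```python
import numpy as np

def taylor_skill_score(model, obs, r0=1.0):
    """Skill score from pattern correlation and normalized standard deviation
    (one of the forms in Taylor, 2001; r0 is the attainable correlation)."""
    r = np.corrcoef(model, obs)[0, 1]
    sigma_hat = np.std(model) / np.std(obs)   # model std normalized by reference std
    return 4.0 * (1.0 + r) / ((sigma_hat + 1.0 / sigma_hat) ** 2 * (1.0 + r0))

series = np.linspace(0.0, 1.0, 50)
s_perfect = taylor_skill_score(series, series)         # -> 1.0
s_inflated = taylor_skill_score(2.0 * series, series)  # penalized variance ratio
```

A perfect model (correlation 1, variance ratio 1) scores 1; inflating or deflating the modeled variance lowers the score even when the correlation stays perfect.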


Entropy ◽  
2018 ◽  
Vol 20 (9) ◽  
pp. 642 ◽  
Author(s):  
Erlandson Saraiva ◽  
Adriano Suzuki ◽  
Luis Milan

In this paper, we study the performance of Bayesian computational methods for estimating the parameters of a bivariate survival model based on the Ali–Mikhail–Haq copula with Weibull marginal distributions. The estimation procedure is based on Markov chain Monte Carlo (MCMC) algorithms. We present three versions of the Metropolis–Hastings algorithm: independent Metropolis–Hastings (IMH), random walk Metropolis (RWM), and Metropolis–Hastings with a natural candidate-generating density (MH). Since constructing a good candidate-generating density for IMH and RWM may be difficult, we also describe how to update a parameter of interest using the slice sampling (SS) method. A simulation study was carried out to compare the performance of IMH, RWM, and SS, using the sample root mean square error as the performance indicator. The results show that the SS algorithm is an effective alternative to the IMH and RWM methods when simulating values from the posterior distribution, especially for small sample sizes. We also apply these methods to a real data set.
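
Of the samplers compared, slice sampling is the one that needs no tuned proposal density. A minimal univariate version with Neal's (2003) stepping-out and shrinkage procedures, shown here as an illustrative sketch rather than the authors' implementation:

```python
import numpy as np

def slice_sample(logpdf, x0, n, w=1.0, m=50, rng=None):
    """Univariate slice sampler with stepping-out and shrinkage (Neal, 2003)."""
    rng = rng or np.random.default_rng()
    x = x0
    draws = []
    for _ in range(n):
        logy = logpdf(x) + np.log(rng.uniform())   # slice level under the density
        left = x - w * rng.uniform()               # random initial bracket of width w
        right = left + w
        j = int(m * rng.uniform())                 # split the step-out budget
        k = m - 1 - j
        while j > 0 and logpdf(left) > logy:       # step out to the left
            left -= w
            j -= 1
        while k > 0 and logpdf(right) > logy:      # step out to the right
            right += w
            k -= 1
        while True:                                # sample, shrinking on rejection
            x_new = rng.uniform(left, right)
            if logpdf(x_new) > logy:
                x = x_new
                break
            if x_new < x:
                left = x_new
            else:
                right = x_new
        draws.append(x)
    return np.array(draws)

# draw from a standard normal (log-density up to an additive constant)
draws = slice_sample(lambda t: -0.5 * t * t, x0=0.0, n=5000,
                     rng=np.random.default_rng(1))
```

Only a log-density evaluable up to a constant is needed, which is why slice sampling is attractive when a good candidate-generating density is hard to construct.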


2021 ◽  
Vol 13 (21) ◽  
pp. 4243
Author(s):  
Mona Morsy ◽  
Ruhollah Taghizadeh-Mehrjardi ◽  
Silas Michaelides ◽  
Thomas Scholten ◽  
Peter Dietrich ◽  
...  

Water depletion is a growing problem in the world's arid and semi-arid areas, where groundwater is the primary source of fresh water. Accurate climatic data must be obtained to protect municipal water budgets. Unfortunately, most of these arid regions have sparsely distributed rain gauges, which reduces the reliability of the spatio-temporal fields generated from them. The current research proposes a series of measures to address the problem of data scarcity, in particular regarding in-situ measurements of precipitation. Improving the network of ground precipitation measurements may pave the way for much-needed hydrological research on topics such as the spatio-temporal distribution of precipitation, flash flood prevention, and soil erosion reduction. In this study, a k-means cluster analysis is used to determine new locations for the rain gauge network on the eastern side of the Gulf of Suez in Sinai. The clustering procedure integrates a digital elevation model from the Shuttle Radar Topography Mission (SRTM, 90 × 90 m) with Integrated Multi-Satellite Retrievals for GPM (IMERG) data for four rainy events. This procedure yielded candidate centroids for three cluster sizes (3, 6, and 9), each of which was tested with the Empirical Cumulative Distribution Function (ECDF) to determine the optimal one. However, all the tested centroid sets exhibited gaps in covering the full range of elevations and precipitation at the test site. The nine centroids, together with the five existing rain gauges, were therefore used as the basis for an error-kriging analysis, which reduced the error by increasing the number of proposed gauges. The resulting points were tested again with the ECDF, confirming that the thirty-one suggested additional gauges cover the whole range of elevations and precipitation records at the study site.
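
The clustering step can be illustrated with a small self-contained k-means on two-dimensional feature vectors (in the study these would be standardized elevation and IMERG precipitation per pixel). The farthest-point initialization and the toy data below are illustrative assumptions, not the study's exact setup:

```python
import numpy as np

def kmeans(X, k, iters=100, rng=None):
    """Plain Lloyd's algorithm; centroids seeded by farthest-point traversal."""
    rng = rng or np.random.default_rng(0)
    idx = [int(rng.integers(len(X)))]
    for _ in range(k - 1):  # farthest-point init avoids duplicate seeds
        d = np.min(np.linalg.norm(X[:, None] - X[idx][None], axis=2), axis=1)
        idx.append(int(d.argmax()))
    centroids = X[idx].astype(float)
    for _ in range(iters):
        labels = np.linalg.norm(X[:, None] - centroids[None], axis=2).argmin(axis=1)
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return centroids, labels

# toy standardized (elevation, precipitation) features with two regimes
X = np.vstack([np.zeros((10, 2)), np.ones((10, 2)) * 5.0])
cents, labels = kmeans(X, k=2)
```

The returned centroids play the role of candidate gauge locations; repeating the run with k = 3, 6, and 9 and checking each set's ECDF coverage mirrors the selection procedure described above.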


Author(s):  
Jan Ditzen

In this article, I introduce a new command, xtdcce2, that fits a dynamic common-correlated effects model with heterogeneous coefficients in a panel with a large number of observations over cross-sectional units and time periods. The estimation procedure mainly follows Chudik and Pesaran (2015b, Journal of Econometrics 188: 393–420) but additionally supports the common correlated effects estimator (Pesaran, 2006, Econometrica 74: 967–1012), the mean group estimator (Pesaran and Smith, 1995, Journal of Econometrics 68: 79–113), and the pooled mean group estimator (Pesaran, Shin, and Smith, 1999, Journal of the American Statistical Association 94: 621–634). xtdcce2 allows heterogeneous or homogeneous coefficients and supports instrumental-variable regressions and unbalanced panels. A cross-sectional dependence test is automatically calculated and presented in the estimation output. Small-sample time-series bias can be corrected by "half-panel" jackknife correction or recursive mean adjustment. I carry out a simulation to demonstrate the estimator's consistency.
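
xtdcce2 itself is a Stata command; as a language-neutral illustration of one of the estimators it supports, the mean-group idea (Pesaran and Smith, 1995) — unit-by-unit OLS followed by cross-sectional averaging of the slopes — can be sketched as follows (names and data are hypothetical, and this is not the xtdcce2 implementation):

```python
import numpy as np

def mean_group_slope(y_units, x_units):
    """Mean-group estimator: average the unit-specific OLS slopes."""
    slopes = []
    for y, x in zip(y_units, x_units):
        X = np.column_stack([np.ones_like(x), x])     # intercept + regressor
        beta = np.linalg.lstsq(X, y, rcond=None)[0]   # per-unit OLS fit
        slopes.append(beta[1])
    return float(np.mean(slopes))

# three units with heterogeneous slopes 1, 2, 3 -> mean-group slope 2
x = np.linspace(0.0, 1.0, 50)
y_units = [b * x for b in (1.0, 2.0, 3.0)]
mg = mean_group_slope(y_units, [x, x, x])
```

Averaging heterogeneous unit-level coefficients, rather than pooling, is what allows the slopes to differ across cross-sectional units.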


2016 ◽  
Vol 61 (3) ◽  
pp. 489-496
Author(s):  
Aleksander Cianciara

The paper presents the results of research aimed at verifying the hypothesis that the Weibull distribution is an appropriate statistical model of microseismic emission characteristics, namely the energy of phenomena and the inter-event time. The emission under consideration is understood to be induced by natural rock mass fracturing, and because the recorded emission contains noise, it is subjected to appropriate filtering. The study was conducted by statistical verification of the null hypothesis that the Weibull distribution fits the empirical cumulative distribution function. Since the model of the cumulative distribution function is given in analytical form, its verification can be performed with the Kolmogorov-Smirnov goodness-of-fit test. Correctly specifying the statistical distribution of the data matters because probabilistic interpretation methods, such as hazard analysis or methods based on maximum value statistics, use not the measurement data directly but their statistical distributions.
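
The Kolmogorov-Smirnov verification step amounts to computing the maximal distance between the empirical CDF and the hypothesized Weibull CDF. A numpy-only sketch of the one-sample statistic (parameter values and the asymptotic 5% critical value 1.36/√n are illustrative):

```python
import numpy as np

def weibull_cdf(x, shape, scale):
    """Analytical two-parameter Weibull CDF."""
    return 1.0 - np.exp(-(x / scale) ** shape)

def ks_statistic(data, cdf):
    """One-sample Kolmogorov-Smirnov distance max |F_n - F|."""
    x = np.sort(data)
    n = len(x)
    F = cdf(x)
    d_plus = np.max(np.arange(1, n + 1) / n - F)   # ECDF above the model CDF
    d_minus = np.max(F - np.arange(0, n) / n)      # ECDF below the model CDF
    return max(d_plus, d_minus)

# inverse-transform sample from Weibull(shape=1.3, scale=2.0)
rng = np.random.default_rng(0)
u = rng.uniform(size=2000)
sample = 2.0 * (-np.log(1.0 - u)) ** (1.0 / 1.3)
D = ks_statistic(sample, lambda t: weibull_cdf(t, 1.3, 2.0))
# do not reject H0 at the 5% level if D < 1.36 / sqrt(n) (asymptotic)
```

Note that if the Weibull parameters are estimated from the same data being tested, the standard K-S critical values are conservative and a Lilliefors-type correction is needed.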


2013 ◽  
Vol 860-863 ◽  
pp. 2083-2087
Author(s):  
Xian Jun Qi ◽  
Jia Yi Shi ◽  
Xiang Tian Peng

Probability boxes (p-boxes) and interval probabilities (IPs) were used to express both the variability and the imprecision of wind speed and of the output power of wind turbine generators (WTGs). The p-box of a WTG's output power was constructed from the empirical cumulative distribution function and Kolmogorov-Smirnov (K-S) confidence limits, and a discrete IP distribution of the WTG's output power was elicited from the p-box. An optimization model for imprecise generating capacity adequacy assessment incorporating wind power was established and solved by a genetic algorithm (GA). A case study on the RBTSW system shows the rationality of the presented method.
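
A p-box of the kind described can be built by widening the ECDF with a distribution-free confidence band. The sketch below uses the Dvoretzky-Kiefer-Wolfowitz (DKW) band, a close relative of the K-S confidence limits the paper cites; names are illustrative:

```python
import numpy as np

def ecdf_pbox(data, alpha=0.05):
    """ECDF with a distribution-free confidence band (DKW inequality),
    forming the lower/upper bounding CDFs of a p-box."""
    x = np.sort(data)
    n = len(x)
    F = np.arange(1, n + 1) / n                      # empirical CDF at the order stats
    eps = np.sqrt(np.log(2.0 / alpha) / (2.0 * n))   # DKW half-width
    return x, np.clip(F - eps, 0.0, 1.0), np.clip(F + eps, 0.0, 1.0)

# toy data: the band brackets the true CDF with probability >= 1 - alpha
xs, lower, upper = ecdf_pbox(np.arange(100.0), alpha=0.05)
```

Discretizing the gap between the lower and upper bounding CDFs is one way to elicit an interval-probability distribution from the p-box, as the abstract describes.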


Methodology ◽  
2011 ◽  
Vol 7 (3) ◽  
pp. 111-120 ◽  
Author(s):  
Omar Paccagnella

In a multilevel framework, several studies have investigated the behavior of estimates in finite samples, particularly for continuous dependent variables. Some findings show imprecise estimates of the variance components. Discrete-response multilevel models, on the other hand, have been investigated less widely. In this paper, we analyze the influence of different factors on the accuracy of estimates and their standard errors in a binary-response 2-level model through a Monte Carlo simulation study. We investigate the effects of: (a) small sample sizes; (b) different intraclass correlation coefficients; (c) different numbers of quadrature points in the estimation procedure. Standard errors of estimates are studied through a noncoverage indicator. In all instances we considered, the point estimates are unbiased (even with very small sample sizes), while the variance components are underestimated. Obtaining accurate standard errors of the variance estimates requires a very large number of groups.
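
The data-generating process for such a simulation study can be sketched directly: a random-intercept logit model, plus the latent-scale intraclass correlation formula σ²_u / (σ²_u + π²/3) used to set condition (b). Names and parameter values are illustrative, not the paper's design:

```python
import numpy as np

def simulate_binary_2level(n_groups, group_size, beta0, beta1, sigma_u, rng):
    """Generate data from a 2-level random-intercept logit model."""
    u = rng.normal(0.0, sigma_u, n_groups)           # group-level random effects
    g = np.repeat(np.arange(n_groups), group_size)   # group index per observation
    x = rng.normal(size=n_groups * group_size)       # level-1 covariate
    eta = beta0 + beta1 * x + u[g]                   # linear predictor
    p = 1.0 / (1.0 + np.exp(-eta))                   # logit link
    y = (rng.uniform(size=p.size) < p).astype(int)
    return y, x, g

def icc_logit(sigma_u):
    """Latent-scale intraclass correlation for the logit link."""
    return sigma_u ** 2 / (sigma_u ** 2 + np.pi ** 2 / 3.0)

y, x, g = simulate_binary_2level(100, 50, beta0=0.0, beta1=0.5,
                                 sigma_u=0.5, rng=np.random.default_rng(0))
```

Varying n_groups and group_size gives condition (a); varying sigma_u via the ICC formula gives condition (b); condition (c) concerns the estimator (adaptive quadrature), not the data generation.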


2021 ◽  
Author(s):  
Bruno Majone ◽  
Diego Avesani ◽  
Patrick Zulian ◽  
Aldo Fiori ◽  
Alberto Bellin

Climate change impact studies on hydrological extremes often rely on hydrological models whose parameters are inferred from observations of daily streamflow. In this work we show that this is an error-prone procedure when the goal is to develop reliable empirical cumulative distribution function (ECDF) curves of annual maximum streamflow. As an alternative, we introduce a methodology, coined Hydrological Calibration of eXtremes (HyCoX), in which the hydrological model is calibrated by directly targeting the probability distribution of high-flow extremes. In particular, hydrological simulations over a reference period, driven by climate model output, are constrained to maximize the probability that the modeled and observed high-flow extremes belong to the same population. An application to the Adige river catchment (southeastern Alps, Italy) with HYPERstreamHS, a distributed hydrological model, shows that this procedure preserves statistical coherence and produces reliable quantiles of the annual maximum streamflow for use in assessment studies.
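
The calibration objective — making modeled and observed annual maxima statistically indistinguishable — can be illustrated with a two-sample ECDF distance that a calibrator would minimize over hydrological parameters. This is a hedged sketch: HyCoX's actual criterion is a probability-based statistic, not necessarily this one, and the data are hypothetical:

```python
import numpy as np

def ks_two_sample(modeled, observed):
    """Two-sample KS distance between the ECDFs of annual maxima."""
    grid = np.sort(np.concatenate([modeled, observed]))
    Fm = np.searchsorted(np.sort(modeled), grid, side='right') / len(modeled)
    Fo = np.searchsorted(np.sort(observed), grid, side='right') / len(observed)
    return float(np.max(np.abs(Fm - Fo)))

obs = np.array([310.0, 295.0, 420.0, 380.0, 505.0])  # hypothetical maxima, m^3/s
good = obs + 5.0                                     # near-matching model run
bad = obs * 2.0                                      # strongly biased model run
```

A parameter set whose simulated maxima lower this distance is preferred, which is the sense in which the calibration targets the distribution of extremes rather than the daily hydrograph.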

