block maxima
Recently Published Documents

TOTAL DOCUMENTS: 52 (FIVE YEARS: 27)
H-INDEX: 7 (FIVE YEARS: 1)

Atmosphere ◽  
2021 ◽  
Vol 12 (9) ◽  
pp. 1176
Author(s):  
Yan-Qing Chen ◽  
Sheng Zheng ◽  
Yan-Shan Xiao ◽  
Shu-Guang Zeng ◽  
Tuan-Hui Zhou ◽  
...  

Based on the daily sunspot number (SN) data (1954–2011) from the Purple Mountain Observatory, extreme value theory (EVT) is employed to study long-term solar activity. This is the first time EVT has been applied to the Chinese SN. Two methods are used to study the extreme events. One is the block maxima (BM) approach, which picks the maximum SN value of each block. The other is the peaks-over-threshold (POT) approach, in which, after a declustering process, a threshold (here 300) is set to select the extreme values. Both methods yield negative shape parameters, indicating that there is an upper bound for the extreme SN value. Only one value of the N-year return level (RL) is estimated, for N = 19 years; the RL values obtained by the two methods are similar to each other, found to be 420 for both the POT and the BM method. The 25th solar cycle is predicted to be stronger, indicating that the duration of meridional forms of atmospheric circulation will increase.
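The two approaches described above can be sketched in a few lines. The threshold of 300 and the 19-year return level follow the abstract; the synthetic daily series standing in for the SN data, the gamma distribution, and the yearly block length are illustrative assumptions, and the declustering step is omitted for brevity:

```python
import numpy as np
from scipy.stats import genextreme, genpareto

rng = np.random.default_rng(42)
daily = rng.gamma(shape=2.0, scale=60.0, size=58 * 365)  # stand-in for daily SN, 1954-2011

# Block maxima (BM): one maximum per yearly block, fitted with a GEV.
bm = daily.reshape(58, 365).max(axis=1)
c, loc, scale = genextreme.fit(bm)                # scipy's c = -xi: c > 0 means a bounded tail
rl_bm = genextreme.isf(1.0 / 19.0, c, loc, scale)  # 19-year return level

# Peaks over threshold (POT): exceedances over u, fitted with a GPD.
u = 300.0
excess = daily[daily > u] - u                     # declustering omitted in this sketch
xi, _, beta = genpareto.fit(excess, floc=0.0)
rate = excess.size / 58.0                         # mean exceedances per year
rl_pot = u + genpareto.isf(1.0 / (19.0 * rate), xi, 0.0, beta)

print(rl_bm, rl_pot)
```

With real SN data the two return levels should agree closely, as the abstract reports; note that scipy's GEV shape parameter has the opposite sign of the usual ξ, so a bounded tail appears as c > 0 rather than ξ < 0.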


2021 ◽  
Vol 2 (2) ◽  
pp. 06-15
Author(s):  
Mamadou Cisse ◽  
Aliou Diop ◽  
Souleymane Bognini ◽  
Nonvikan Karl-Augustt ALAHASSA

In extreme value theory, there are two approaches to data treatment: the block maxima and peaks-over-threshold (POT) methods, the latter taking into account data above a fixed value. Those approaches are limited, however. We show that if a certain geometry is modeled with stochastic graphs, the probabilities computed with the Generalized Extreme Value (GEV) distribution can be deflated; in other words, taking data geometry into account changes the distribution of extremes. Moreover, it appears that if the density characterizing the state space of the data system is uniform, and if the quantile studied is positive, then the Weibull distribution is insensitive to data geometry when it is the domain of attraction, and the Fréchet distribution becomes the least inflationary.


2021 ◽  
Vol 5 (2) ◽  
pp. 405-414
Author(s):  
Hasna Afifah Rusyda ◽  
Fajar Indrayatna ◽  
Lienda Noviyanti

This paper discusses the risk estimation of a portfolio based on value at risk (VaR) using a copula-based asymmetric Glosten–Jagannathan–Runkle generalized autoregressive conditional heteroskedasticity (GJR-GARCH) model. Non-linear correlation in the dependence structure among the variables leads to inaccurate VaR estimation, so we use copula functions to model the joint probability of large market movements. The data are GEV distributed; therefore, we use the block maxima method, which consists of fitting an extreme value distribution as a tail distribution, to compute VaR. The results show that VaR can estimate the risk of the portfolio return reasonably well because the model captures the properties of the data: GJR-GARCH accommodates the data's volatility, the copula captures the dependence between stocks, and block maxima accommodates the extreme tail behavior of the data.
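The block maxima step of this VaR construction can be sketched on its own. The GJR-GARCH filtering and the copula coupling are not reproduced here; the synthetic loss series, the quarterly block length, and the 99% level are illustrative assumptions:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(11)
# Stand-in for daily portfolio losses over ~10 trading years (heavy-tailed).
losses = -rng.standard_t(df=4, size=2520) * 0.01

# Block maxima of losses: one maximum per quarterly block of 63 trading days.
bm = losses.reshape(40, 63).max(axis=1)
c, loc, scale = genextreme.fit(bm)

# 99% VaR over the block horizon, read from the fitted GEV tail.
var_99 = genextreme.isf(0.01, c, loc, scale)
print(var_99)
```

In the paper's full pipeline the GEV/block-maxima step would be applied to GJR-GARCH-filtered residuals joined by a copula, not to raw losses as in this sketch.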


Author(s):  
Nurulkamal Masseran ◽  
Muhammad Aslam Mohd Safari

This article proposes a novel data selection technique called the mixed peaks-over-threshold–block-maxima (POT-BM) approach for modeling unhealthy air pollution events. The POT technique is employed to obtain a group of blocks containing data points that satisfy the extreme-event criterion of exceeding a particular threshold u; the selected groups are defined as POT blocks. In parallel, a declustering technique is used to overcome the dependency behavior that occurs among adjacent POT blocks. Finally, the BM concept is integrated to determine the maximum data point for each POT block. Results show that the extreme data points determined by the mixed POT-BM approach satisfy the independence properties of extreme events, with satisfactory precision of the fitted models. Overall, this study concludes that the mixed POT-BM approach provides a balanced trade-off between bias and variance in the statistical modeling of extreme-value events. A case study was conducted by modeling an extreme event based on unhealthy air pollution events with a threshold u > 100 in Klang, Malaysia.
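The mixed POT-BM selection can be sketched as follows. The threshold u = 100 follows the paper's air pollution case study, but the synthetic series, the run-based declustering rule, and the gap parameter are illustrative assumptions rather than the authors' exact procedure:

```python
import numpy as np

def pot_bm(series, u=100.0, gap=3):
    """Mixed POT-BM selection: group threshold exceedances into clusters
    (runs separated by more than `gap` positions), then keep one block
    maximum per cluster."""
    idx = np.flatnonzero(series > u)              # POT step: indices above u
    if idx.size == 0:
        return np.array([]), np.array([], dtype=int)
    # Declustering: start a new cluster when consecutive exceedances
    # are more than `gap` positions apart.
    breaks = np.flatnonzero(np.diff(idx) > gap) + 1
    clusters = np.split(idx, breaks)
    # BM step: one maximum per POT block.
    peaks_i = np.array([c[np.argmax(series[c])] for c in clusters])
    return series[peaks_i], peaks_i

rng = np.random.default_rng(7)
api = rng.gamma(shape=4.0, scale=20.0, size=2000)  # stand-in for a pollution index series
maxima, where = pot_bm(api, u=100.0, gap=3)
print(maxima.size)
```

Because each cluster contributes exactly one peak and clusters are separated by at least `gap` non-exceeding positions, the selected maxima are spaced apart, which is how the approach promotes the independence property the paper verifies.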


2021 ◽  
Author(s):  
Philomène Le Gall ◽  
Pauline Rivoire ◽  
Anne-Catherine Favre ◽  
Philippe Naveau ◽  
Olivia Romppainen-Martius

Extreme precipitation often causes floods and leads to significant societal and economic damage. Rainfall is subject to local orographic features, and its intensities can be highly variable. In this context, identifying climatically coherent regions for extremes is paramount to understanding and analyzing rainfall at the correct spatial scale. We assume that the region of interest can be partitioned into homogeneous regions, i.e., sub-regions with a common marginal distribution up to a scale factor. For example, when extremes are taken as block maxima or excesses over a threshold, a sub-region corresponds to a constant shape parameter. We develop a non-parametric clustering algorithm based on a ratio of probability weighted moments to identify these homogeneous regions and group weather stations. By construction, this ratio does not depend on the location and scale parameters of the Generalized Extreme Value and Generalized Pareto distributions. Our method has the advantage of relying only on raw precipitation data and not on station covariates.

A simulation study is performed based on the extended GPD distribution, which appears to capture low, moderate and heavy rainfall intensities well. Sensitivity to the number of clusters is analyzed. The simulation results reveal that the method detects homogeneous regions. We apply our clustering algorithm to ERA5 precipitation over Europe and obtain coherent homogeneous regions consistent with the local orography. The marginal precipitation behaviour is analyzed through regional fitting of an extended GPD.
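The key property used above, a ratio of probability weighted moments (PWMs) that is unchanged by shifting and rescaling the data, can be illustrated with sample L-moments. The specific ratio the authors use is not given in the abstract, so the L-skewness ratio below is an illustrative stand-in with the same location-scale invariance:

```python
import numpy as np

def pwm(x, r):
    """Unbiased sample probability weighted moment b_r (Landwehr-type estimator)."""
    x = np.sort(x)
    n = x.size
    i = np.arange(1, n + 1)
    w = np.ones(n)
    for k in range(1, r + 1):
        w *= (i - k) / (n - k)
    return np.mean(w * x)

def l_skewness(x):
    """L-skewness t3 = l3 / l2: a PWM ratio free of location and scale."""
    b0, b1, b2 = pwm(x, 0), pwm(x, 1), pwm(x, 2)
    l2 = 2 * b1 - b0
    l3 = 6 * b2 - 6 * b1 + b0
    return l3 / l2

rng = np.random.default_rng(1)
x = rng.gumbel(loc=10.0, scale=2.0, size=500)   # stand-in for station precipitation maxima
t3 = l_skewness(x)
t3_scaled = l_skewness(3.0 * x + 7.0)           # same ratio after a location-scale change
print(t3, t3_scaled)
```

Because stations in one homogeneous region differ only by a scale factor, such a ratio takes (approximately) the same value at every station in the region, which is what makes it usable as a clustering feature computed from raw data alone.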


Author(s):  
Guilherme Isaias Debom Machado ◽  
Fabian Luis Vargas ◽  
Celso Maciel da Costa

Execution time is a requirement as important as the computed result when designing real-time systems for critical applications. It is imperative to know the possible execution times, especially when a system delay may result in equipment damage or even crew injuries. With that in mind, the present work analyzes different techniques for defining the probabilistic worst-case execution time (pWCET) using extreme value theory (EVT). Since probabilistic methodologies have been widely explored, this study aims to assess how accurate pWCET estimations are when EVT is applied. The analysis compares pWCET estimations with the real system behavior, predicting the upper execution-time bounds of two algorithms on a MIPS processor. Further, this work considers the block maxima technique, which selects the highest measured values to define a probabilistic distribution representing the analyzed system. The outcomes point to some limitations of the block maxima technique, such as requiring a large number of samples for a reliable analysis. The obtained results show that EVT is a useful and trustworthy technique for pWCET estimation.
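The block maxima route to a pWCET estimate can be sketched as follows. The synthetic timing data, the block size of 50 runs, and the 10⁻⁶ exceedance target are illustrative assumptions; real measurements from the MIPS processor are not reproduced here:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(3)
# Stand-in for measured execution times (cycles) of one algorithm run 5000 times.
times = 1000.0 + rng.gamma(shape=3.0, scale=15.0, size=5000)

# Block maxima: keep the highest observation in each block of 50 runs.
bm = times.reshape(-1, 50).max(axis=1)
c, loc, scale = genextreme.fit(bm)

# pWCET at a target exceedance probability of 1e-6 per block.
pwcet = genextreme.isf(1e-6, c, loc, scale)
print(pwcet)
```

Note that 5000 runs leave only 100 block maxima to fit, so the tail estimate carries substantial uncertainty, which mirrors the paper's observation that block maxima requires a large number of samples for a reliable analysis.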


2021 ◽  
Author(s):  
Matthew Sjaarda ◽  
Alain Nussbaumer ◽  
Dimitrios Papastergiou

The Eurocode LM1 for traffic loads on bridges features side-by-side tandem axles as well as uniformly distributed lane loads. This load model is mirrored in the Swiss codes SIA 261, for new structures, and SIA 269, for existing structures, where updating based on existing traffic is permitted in the form of updated alpha factors, αQ1 and αQ2. The research herein uses an extensive WIM database to update the alpha factors for Swiss traffic. For the first (slow) lane, this is done using simple block maxima of tandem-axle statistics (daily, weekly, and yearly block maxima results are compared) with log-normal fitting of the extreme value statistic. For the second lane, a novel approach reconstructs real multiple-presence scenarios from the WIM data to predict the total joint load across both lanes. The single-lane and joint analyses yield recommended updated alpha factors about one third lower than those mandated for new construction.
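The slow-lane procedure, block maxima of tandem-axle loads with a log-normal fit to the maxima, can be sketched as follows. The synthetic axle loads, the weekly block size, and the 1000-year return period are illustrative assumptions; the WIM database and the actual alpha-factor calibration are not reproduced:

```python
import numpy as np
from scipy.stats import lognorm

rng = np.random.default_rng(5)
# Stand-in for per-passage tandem-axle loads (kN) from ~2 years of WIM records.
loads = rng.lognormal(mean=4.5, sigma=0.35, size=104 * 2800)

# Weekly block maxima: the highest tandem-axle load recorded each week.
weekly_max = loads.reshape(104, -1).max(axis=1)

# Log-normal fit to the extreme value statistic (floc=0 fixes the support at zero).
s, _, m = lognorm.fit(weekly_max, floc=0.0)

# Characteristic value: the weekly maximum exceeded once in 1000 years on average.
q = lognorm.isf(1.0 / (1000 * 52), s, 0.0, m)
print(q)
```

Comparing the daily, weekly, and yearly variants amounts to changing the reshape block size and the number of blocks per year in the return-period calculation; an updated alpha factor would then be the ratio of such a characteristic value to the corresponding code axle load.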

