Dependence modelling in ultra high dimensions with vine copulas and the Graphical Lasso

2019 · Vol 137 · pp. 211-232
Author(s): Dominik Müller, Claudia Czado

2018 · Vol 29 (2) · pp. 269-287
Author(s): Dominik Müller, Claudia Czado

Water · 2021 · Vol 13 (7) · pp. 964
Author(s): Wafaa El Hannoun, Salah-Eddine El Adlouni, Abdelhak Zoglat

This paper features an application of Regular Vine (R-vine) copulas, a recently developed statistical tool for assessing composite risk. Copula-based dependence modelling is a popular tool in conditional risk assessment but is usually applied to pairs of variables. Vine copulas, by contrast, offer greater flexibility: they model complex dependency patterns with a wide variety of bivariate copulas that can be arranged and analysed in a tree structure to explore multiple dependencies. This study uses R-vine copulas to analyse the co-dependencies of five reservoirs in the cascade of the Saint John River basin in Eastern Canada. The fitted R-vine copulas yield joint and conditional return periods of maximum volumes, which support hydrologic design and cascade reservoir management in the basin. The main attraction of this approach to risk modelling is its flexibility in the choice of distributions used to model heavy-tailed marginals and co-dependencies.
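
As a rough illustration of the workflow the abstract describes (rank-transforming the margins to pseudo-observations, fitting an R-vine over the five sites, and deriving a joint return period), a short Python sketch is given below. It is not the authors' code: the data are synthetic placeholders for the Saint John River records, the R-vine fit assumes the pyvinecopulib package (constructor and argument names may differ between versions), and the 0.95 threshold is an arbitrary 20-year marginal level chosen only for illustration.

```python
# Minimal sketch, not the authors' method: placeholder data, assumed
# pyvinecopulib interface, arbitrary 20-year marginal level (0.95).
import numpy as np
import pyvinecopulib as pv

rng = np.random.default_rng(42)
# placeholder: 60 years of annual maximum volumes at the 5 reservoirs
volumes = rng.gamma(shape=3.0, scale=50.0, size=(60, 5))

# rank-transform each margin to the copula scale (pseudo-observations)
u = pv.to_pseudo_obs(volumes)

# fit an R-vine, letting each pair copula be chosen from a candidate set
controls = pv.FitControlsVinecop(
    family_set=[pv.BicopFamily.gaussian, pv.BicopFamily.gumbel,
                pv.BicopFamily.clayton, pv.BicopFamily.frank])
cop = pv.Vinecop(data=u, controls=controls)  # argument names may vary by version

# joint "OR" return period for the 20-year level at every site:
# T_or = 1 / P(at least one U_j exceeds 0.95), estimated by simulation
sims = cop.simulate(200_000)
p_or = np.mean((sims > 0.95).any(axis=1))
print(f"joint 'OR' return period: {1.0 / p_or:.1f} years")
```

Here the joint "OR" return period is the mean recurrence interval of the event that at least one reservoir exceeds its marginal 20-year level; an "AND" version would instead condition on all sites exceeding their levels simultaneously.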


2017 · Vol 28 (2) · pp. 323-341
Author(s): Matthias Killiches, Daniel Kraus, Claudia Czado

2011 · Vol 11 (3) · pp. 272
Author(s): Ivan Gavrilyuk, Boris Khoromskij, Eugene Tyrtyshnikov

Abstract In recent years, multidimensional numerical simulation with tensor-structured data formats has been recognized as a basic concept for breaking the "curse of dimensionality". Modern applications of tensor methods include challenging high-dimensional problems in materials science, bioscience, stochastic modeling, signal processing, machine learning, data mining, financial mathematics, and beyond. The guiding principle of tensor methods is the approximation of multivariate functions and operators through some separation of variables, which keeps the computational process on a low-parametric tensor-structured manifold. Tensor structures had been widely used as models of data and discussed in the contexts of differential geometry, mechanics, algebraic geometry, and data analysis long before tensor methods penetrated numerical computation. On the one hand, the existing tensor representation formats remained of limited use in many high-dimensional problems because of the lack of sufficiently reliable and fast software. On the other hand, for moderate-dimensional problems (e.g. in "ab initio" quantum chemistry), as well as for selected model problems of very high dimension, the application of traditional canonical and Tucker formats combined with ideas from multilevel methods has led to new efficient algorithms. The recent progress in tensor numerical methods has been achieved with new representation formats now known as "tensor-train representations" and "hierarchical Tucker representations". Notably, the formats themselves had appeared earlier in the literature on the modeling of quantum systems; until 2009 they lived in the closed world of quantum-theory publications and never crossed into numerical analysis. The tremendous progress of the last few years is visible both in the application of the new tensor tools in various fields and in the development of these tools and the study of their approximation and algebraic properties. This special issue treats tensors as a basis for efficient numerical algorithms in various modern applications, with special emphasis on the new representation formats.
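
As a concrete illustration of the "tensor-train representations" mentioned in the abstract, the short numpy sketch below factors a small 4-way array into a chain of 3-way cores by successive truncated SVDs (the standard TT-SVD construction). The tolerance, array shapes, and toy test tensor are illustrative assumptions, not taken from any paper in the issue.

```python
# Illustrative TT-SVD sketch: shapes, tolerance, and test tensor are assumptions.
import numpy as np

def tt_svd(tensor, eps=1e-10):
    """Factor `tensor` into tensor-train cores via sequential truncated SVDs."""
    dims = tensor.shape
    cores = []
    rank = 1
    unfolding = tensor.reshape(rank * dims[0], -1)
    for k in range(len(dims) - 1):
        u, s, vt = np.linalg.svd(unfolding, full_matrices=False)
        # keep singular values above a relative tolerance
        new_rank = max(1, int(np.sum(s > eps * s[0])))
        cores.append(u[:, :new_rank].reshape(rank, dims[k], new_rank))
        # carry the remaining factor to the next mode
        unfolding = (s[:new_rank, None] * vt[:new_rank]).reshape(
            new_rank * dims[k + 1], -1)
        rank = new_rank
    cores.append(unfolding.reshape(rank, dims[-1], 1))
    return cores

def tt_full(cores):
    """Contract TT cores back into a dense array (for checking only)."""
    full = cores[0]
    for core in cores[1:]:
        full = np.tensordot(full, core, axes=(-1, 0))
    return full.squeeze(axis=(0, -1))

# toy check: a 4 x 5 x 6 x 7 tensor of CP rank 2, so all TT ranks are <= 2
rng = np.random.default_rng(0)
a, b, c, e = (rng.standard_normal((2, n)) for n in (4, 5, 6, 7))
full = np.einsum('ri,rj,rk,rl->ijkl', a, b, c, e)
cores = tt_svd(full)
print([core.shape for core in cores])     # (1,4,2), (2,5,2), (2,6,2), (2,7,1)
print(np.allclose(tt_full(cores), full))  # True: the low-rank tensor is recovered
```

For a tensor whose TT ranks stay bounded, the number of stored parameters grows linearly in the number of dimensions rather than exponentially, which is the sense in which such formats break the "curse of dimensionality".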

