A copula-based multivariate drought indicator to design and monitor nature-based solutions

Author(s):  
Sisay Debele ◽  
Jeetendra Sahani ◽  
Federico Porcù ◽  
Leonardo Aragão ◽  
Christos Spyrou ◽  
...  

<p><strong>Abstract</strong></p><p>Droughts are complex, naturally occurring hazards in every climatic region of the world and often result in loss of life and severe ecosystem damage. Drought monitoring is usually based on single variables that may not represent the corresponding risk appropriately, given its multiple causes and impact characteristics under current and future climate scenarios. To address this issue, multidimensional copula functions, a flexible statistical tool, can be applied to develop multivariate drought indicators and capture complicated, nonlinear associations. The aim of this paper is to develop reliable design, monitoring and prediction indicators for the proper assessment of, and intervention in, drought risk through nature-based solutions (NBS). A copula-based multivariate drought indicator (CMDI) that considers all relevant variables related to meteorological, agricultural and hydrological droughts is essential for better drought risk assessment and intervention. The CMDI was developed by integrating the univariate marginal cumulative distribution functions of meteorological (precipitation), agricultural (soil moisture) and hydrological (streamflow) variables into their joint cumulative distribution function. The CMDI was then applied to the selected study catchments (Po Valley, Italy, and Spercheios River, Greece) using hydro-meteorological data from gauging stations and ERA5 gridded data for the period 1979-2017. The CMDI revealed moderate, severe and extreme drought frequencies in the two catchments and captured more severe-to-extreme drought occurrences than the single drought indicators considered, showing that the CMDI can appropriately represent complex and interrelated natural variables.
An uncertainty analysis based on Monte Carlo experiments confirmed that the CMDI is a robust and reliable approach for assessing, planning and designing nature-based interventions for drought risk. The findings of this research provide a reliable way to develop approaches for assessing and predicting nonlinearly related variables, or any risks that may occur simultaneously or cumulatively over time.</p><p>Keywords: drought risk; multidimensional copulas; multivariate indicators; uncertainty analysis; frequency</p><p><strong>Acknowledgements</strong>: This work was carried out under the framework of the OPERANDUM (OPEn-air laboRAtories for Nature baseD solUtions to Manage hydro-meteo risks) project, funded by Horizon 2020 under Grant Agreement No. 776848.</p>
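As a rough illustration of the construction described above (not the authors' implementation; all series and distribution choices here are invented), a Gaussian copula can couple three empirical marginal CDFs into a joint non-exceedance probability, which is then standardized the way the SPI standardizes a univariate CDF:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 468  # monthly records, 1979-2017

# Hypothetical monthly series standing in for the three drought-related variables.
precip = rng.gamma(2.0, 40.0, n)    # precipitation (mm)
soil = rng.beta(4.0, 2.0, n)        # soil moisture (fraction)
flow = rng.lognormal(3.0, 0.5, n)   # streamflow (m^3/s)

def empirical_cdf(x):
    """Marginal CDF via Weibull plotting positions, rank / (n + 1)."""
    return stats.rankdata(x) / (len(x) + 1)

# Step 1: transform each variable to uniform margins.
u = np.column_stack([empirical_cdf(v) for v in (precip, soil, flow)])

# Step 2: couple the margins with a Gaussian copula fitted on normal scores.
z = stats.norm.ppf(u)
corr = np.corrcoef(z, rowvar=False)
mvn = stats.multivariate_normal(mean=np.zeros(3), cov=corr)

# Step 3: joint non-exceedance probability -> standardized drought index.
joint_p = mvn.cdf(z)
cmdi = stats.norm.ppf(np.clip(joint_p, 1e-6, 1 - 1e-6))

print("share of months below -1 (moderate drought or worse):",
      np.mean(cmdi < -1).round(3))
```

Because the joint probability of three variables is smaller than any single margin, such an index flags compound dry states that a univariate indicator would miss.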

2021 ◽  
Author(s):  
Stephanie Thiesen ◽  
Uwe Ehret

<p>Uncertainty analysis is a critical subject for many environmental studies. We previously combined statistical learning and information theory in a geostatistical framework to overcome the function-based parameterization and uncertainty trade-offs present in many traditional interpolators (Thiesen et al. 2020). The so-called histogram via entropy reduction (HER) method relaxes normality assumptions, avoiding the risk of adding information not available in the data. The authors showed that, by construction, the method provides a proper framework for uncertainty estimation that accounts for both spatial configuration and data values, while allowing one to introduce or infer properties of the field through the aggregation method. In this study, we explore the HER method in the light of uncertainty analysis. Uncertainty at any particular unsampled location (local uncertainty) is frequently assessed with nonlinear interpolators such as indicator and multi-Gaussian kriging. HER has been shown to be a unique approach for handling uncertainty estimation at a fine resolution without the need to model multiple indicator semivariograms, deal with order-relation violations, interpolate/extrapolate conditional cumulative distribution functions, or adopt stronger hypotheses about the data distribution. In this work, this nonparametric geostatistical framework is adapted to address local and spatial uncertainty in the context of risk mapping. We investigate HER for estimating threshold-exceedance probabilities to map the risk of soil contamination by lead in the well-known Swiss Jura dataset. Finally, the HER method is extended to assess spatial uncertainty (uncertainty when several locations are considered together) through sequential simulation.
Its results are compared with indicator kriging and with benchmark models available in the literature for this dataset.</p><p>Thiesen S, Vieira DM, Mälicke M, Loritz R, Wellmann JF, Ehret U (2020) Histogram via entropy reduction (HER): an information-theoretic alternative for geostatistics. Hydrol Earth Syst Sci 24:4523–4540. https://doi.org/10.5194/hess-24-4523-2020</p>
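A minimal sketch of the threshold-exceedance step, not the HER implementation itself: given the kind of binned conditional distribution HER produces at an unsampled location, the exceedance probability is read straight off the histogram (the conditional probabilities and the 50 ppm threshold below are placeholders):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical conditional histogram of lead concentration (ppm) at one
# unsampled location -- the nonparametric distribution a HER-like method yields.
bin_edges = np.linspace(0, 300, 31)
pmf = rng.dirichlet(np.linspace(5, 0.5, 30))  # toy probability mass per bin

def exceedance_probability(pmf, bin_edges, threshold):
    """P(Z > threshold) from a binned conditional distribution.

    The bin containing the threshold contributes proportionally
    (uniform-within-bin assumption)."""
    p = 0.0
    for prob, lo, hi in zip(pmf, bin_edges[:-1], bin_edges[1:]):
        if threshold <= lo:
            p += prob
        elif lo < threshold < hi:
            p += prob * (hi - threshold) / (hi - lo)
    return p

# Jura-style studies often map risk against a regulatory threshold near 50 ppm.
p_exceed = exceedance_probability(pmf, bin_edges, threshold=50.0)
print(f"P(lead > 50 ppm) = {p_exceed:.3f}")
```

Mapping this probability at every grid node produces the risk map; no indicator semivariograms or order-relation corrections are involved, since the full conditional distribution is available directly.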


Author(s):  
Xiaoyu Zheng ◽  
Hiroto Itoh ◽  
Hitoshi Tamaki ◽  
Yu Maruyama

The quantitative evaluation of fission product release to the environment during a severe accident is of great importance. In the present analysis, the integral severe accident code MELCOR 1.8.5 was applied to estimate the uncertainty in the source term for the accident at Unit 2 of the Fukushima Daiichi nuclear power plant (NPP), taken as an example, and to discuss the models and parameters most influential on the source term. Forty-two parameters associated with models for the transport of radioactive materials were chosen and narrowed down to 18 through a set of screening analyses. These 18 parameters, together with 9 parameters relevant to in-vessel melt progression obtained in a preceding uncertainty study, were input to a subsequent sensitivity analysis using the Morris method. This one-factor-at-a-time approach can preliminarily identify inputs that have important effects on an output, and 17 important parameters were selected from the total of 27 through this approach. The selected parameters were then propagated through an uncertainty analysis by means of the Latin hypercube sampling technique and the Iman-Conover method, taking correlations between parameters into account. Cumulative distribution functions of representative source terms were obtained through the present uncertainty analysis assuming the failure of the suppression chamber. Correlation coefficients between the outputs and the uncertain input parameters were calculated to identify parameters of great influence on the source terms; these include parameters related to models of core component failure, aerosol dynamics and pool scrubbing.
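The sampling step can be sketched as follows; this is a generic Latin hypercube draw paired by the Iman-Conover method, with invented marginal distributions and a hypothetical target correlation rather than the study's actual MELCOR inputs:

```python
import numpy as np
from scipy import stats
from scipy.stats import qmc

rng = np.random.default_rng(7)
n, d = 200, 3  # sample size, number of uncertain parameters

# Step 1: Latin hypercube sample of the unit cube (one stratum per sample).
u = qmc.LatinHypercube(d=d, seed=7).random(n)

# Hypothetical target correlation between the uncertain parameters.
target = np.array([[1.0, 0.6, 0.0],
                   [0.6, 1.0, 0.3],
                   [0.0, 0.3, 1.0]])

def iman_conover(u, target, rng):
    """Reorder each LHS column so the sample's rank correlation approximates
    `target`, leaving every marginal stratification intact."""
    n, d = u.shape
    # van der Waerden (normal) scores, independently shuffled per column
    z = stats.norm.ppf(np.arange(1, n + 1) / (n + 1))
    scores = np.column_stack([rng.permutation(z) for _ in range(d)])
    e = np.linalg.cholesky(np.corrcoef(scores, rowvar=False))
    f = np.linalg.cholesky(target)
    t = scores @ np.linalg.inv(e).T @ f.T  # scores with ~target correlation
    out = np.empty_like(u)
    for j in range(d):
        ranks = np.argsort(np.argsort(t[:, j]))
        out[:, j] = np.sort(u[:, j])[ranks]
    return out

u_corr = iman_conover(u, target, rng)

# Step 2: push the correlated uniforms through each parameter's marginal
# distribution (placeholders here, not MELCOR's actual input distributions).
params = np.column_stack([
    stats.lognorm(s=0.5, scale=1.0).ppf(u_corr[:, 0]),
    stats.uniform(0.1, 0.9).ppf(u_corr[:, 1]),
    stats.triang(c=0.5, loc=0.5, scale=1.0).ppf(u_corr[:, 2]),
])

rho = stats.spearmanr(params).statistic
print("achieved rank correlation (params 1-2):", round(rho[0, 1], 2))
```

Each row of `params` would then drive one code run, and the resulting outputs form the empirical CDFs of the source terms.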


2020 ◽  
Vol 501 (1) ◽  
pp. 994-1001
Author(s):  
Suman Sarkar ◽  
Biswajit Pandey ◽  
Snehasish Bhattacharjee

ABSTRACT We use an information theoretic framework to analyse data from the Galaxy Zoo 2 project and study whether there are any statistically significant correlations between the presence of bars in spiral galaxies and their environment. We measure the mutual information between the barredness of galaxies and their environments in a volume-limited sample (Mr ≤ −21) and compare it with the same quantity in data sets where (i) the bar/unbar classifications are randomized and (ii) the spatial distribution of galaxies is shuffled on different length scales. We assess the statistical significance of the differences in the mutual information using a t-test and find that neither randomization of the morphological classifications nor shuffling of the spatial distribution alters the mutual information in a statistically significant way. The non-zero mutual information between barredness and environment arises from the finite and discrete nature of the data set and can be entirely explained by mock Poisson distributions. We also separately compare the cumulative distribution functions of the barred and unbarred galaxies as a function of their local density. Using a Kolmogorov–Smirnov test, we find that the null hypothesis cannot be rejected even at the 75 per cent confidence level. Our analysis indicates that environments do not play a significant role in the formation of a bar, which is largely determined by the internal processes of the host galaxy.
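The core quantity can be sketched directly from an empirical joint distribution. The bar flags and density bins below are independent random stand-ins, so the true mutual information is zero; the small positive estimate illustrates exactly the finite-sample effect the abstract describes:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000

# Hypothetical stand-ins: bar/unbar flags and a binned local-density class.
barred = rng.random(n) < 0.3
density_bin = rng.integers(0, 10, n)

def mutual_information(x, y):
    """I(X;Y) in bits from the empirical joint distribution."""
    xv, xi = np.unique(x, return_inverse=True)
    yv, yi = np.unique(y, return_inverse=True)
    joint = np.zeros((len(xv), len(yv)))
    np.add.at(joint, (xi, yi), 1)          # contingency table of counts
    p = joint / joint.sum()
    px, py = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log2(p[nz] / (px @ py)[nz])).sum())

mi = mutual_information(barred, density_bin)
# Independent inputs: the positive value below is pure finite-sample bias,
# of order (|X|-1)(|Y|-1) / (2 N ln 2) bits.
print(f"MI = {mi:.4f} bits")
```

Comparing this estimate against the distribution of MI values from many such randomized mocks is what turns the raw number into a significance statement.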


2021 ◽  
Vol 13 (6) ◽  
pp. 1096
Author(s):  
Soi Ahn ◽  
Sung-Rae Chung ◽  
Hyun-Jong Oh ◽  
Chu-Yong Chung

This study aimed to generate a near-real-time composite of aerosol optical depth (AOD) to improve predictive model ability and to provide current conditions of aerosol spatial distribution and transport across Northeast Asia. AOD, a proxy for aerosol loading, is estimated remotely by various spaceborne imaging sensors capturing visible and infrared spectra. Nevertheless, differences in satellite-based retrieval algorithms, spatiotemporal resolution, sampling, radiometric calibration, and cloud-screening procedures create significant variability among AOD products. Satellite products can, however, be complementary in terms of their accuracy and spatiotemporal comprehensiveness. Thus, composite AOD products were derived for Northeast Asia based on data from four sensors: the Advanced Himawari Imager (AHI), Geostationary Ocean Color Imager (GOCI), Moderate Resolution Imaging Spectroradiometer (MODIS), and Visible Infrared Imaging Radiometer Suite (VIIRS). Cumulative distribution functions were employed to estimate error statistics against measurements from the Aerosol Robotic Network (AERONET). To apply the AERONET point-specific error, coefficients for each satellite were calculated using inverse distance weighting. Finally, the root mean square error (RMSE) of each satellite AOD product was calculated based on inverse composite weighting (ICW). Hourly AOD composites were generated (00:00–09:00 UTC, 2017) using the regression equation derived from comparing the composite AOD error statistics with AERONET measurements, and the results showed that the correlation coefficient and RMSE of the composite were close to those of the low-Earth-orbit satellite products (MODIS and VIIRS). The methodology and the resulting dataset demonstrate the successful merging of multi-sensor retrievals to produce long-term satellite-based climate data records.
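The inverse-weighting idea behind such a composite can be sketched in a few lines; the retrieval values and per-sensor error statistics below are invented, not the study's estimates:

```python
import numpy as np

# Hypothetical collocated AOD retrievals from four sensors at one grid cell,
# with per-sensor RMSEs estimated against AERONET (values invented here).
aod = np.array([0.42, 0.39, 0.47, 0.35])    # AHI, GOCI, MODIS, VIIRS
rmse = np.array([0.12, 0.10, 0.08, 0.09])

def inverse_error_composite(values, errors):
    """Weight each retrieval by the inverse of its error statistic,
    so the more accurate sensors dominate the composite."""
    w = 1.0 / errors
    return np.sum(w * values) / np.sum(w)

composite = inverse_error_composite(aod, rmse)
print(f"composite AOD = {composite:.3f}")
```

Repeating this per pixel and per hour, with the error statistics interpolated from the AERONET sites by inverse distance weighting, yields the gridded hourly composite.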


Author(s):  
Rama Subba Reddy Gorla

Heat transfer from a nuclear fuel rod bumper support was computationally simulated by a finite element method and probabilistically evaluated in view of the several uncertainties in the performance parameters. Cumulative distribution functions and sensitivity factors were computed for the overall heat transfer rates due to the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design and make it cost-effective. The analysis leads to the selection of the appropriate measurements to be used in heat transfer and to the identification of the most critical measurements and parameters.
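A minimal Monte Carlo version of this workflow, with the finite element model replaced by the closed-form rate Q = hAΔT and all distributions invented, might look like:

```python
import numpy as np

rng = np.random.default_rng(11)
n = 20000

# Hypothetical thermodynamic random variables (means and spreads invented):
h = rng.normal(250.0, 25.0, n)        # film coefficient, W/(m^2 K)
area = rng.normal(0.015, 0.0005, n)   # contact area, m^2
dT = rng.normal(80.0, 8.0, n)         # temperature difference, K

q = h * area * dT  # overall heat transfer rate Q = h A dT, in W

# Empirical CDF of the response:
q_sorted = np.sort(q)
cdf = np.arange(1, n + 1) / n

# Sensitivity factors via correlation of each input with the response:
for name, v in [("h", h), ("A", area), ("dT", dT)]:
    r = np.corrcoef(v, q)[0, 1]
    print(f"sensitivity of Q to {name}: {r:.2f}")
```

With these invented spreads, h and ΔT (both 10% coefficient of variation) dominate the response while A (about 3%) barely matters, which is the kind of ranking used to prioritize design variables and measurements.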


2020 ◽  
Vol 2020 (1) ◽  
Author(s):  
Thabet Abdeljawad ◽  
Saima Rashid ◽  
Zakia Hammouch ◽  
İmdat İşcan ◽  
Yu-Ming Chu

Abstract The present article addresses the concept of p-convex functions on fractal sets. We prove a novel auxiliary result. On the application side, the local fractional integral is used to establish generalized Simpson-type inequalities for the class of functions whose local fractional derivatives, in absolute value at certain powers, are p-convex. The method we present is an alternative way of deriving the classical variants associated with generalized p-convex functions. Some of our results cover classical convex functions and classical harmonically convex functions. Some novel applications to random variables, cumulative distribution functions and generalized bivariate means are given to confirm the correctness of the present results. The present approach is efficient and reliable, and it can be used as an alternative for establishing new solutions for different types of fractals in computer graphics.
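For reference, the usual definition of p-convexity reads as follows (this is the standard real-line form; the article's generalization transfers it to fractal sets via local fractional calculus):

```latex
% A function f : I \subseteq (0,\infty) \to \mathbb{R} is p-convex (p \neq 0) if
f\!\left( \left[\, t\,x^{p} + (1-t)\,y^{p} \,\right]^{1/p} \right)
  \;\le\; t\,f(x) + (1-t)\,f(y),
\qquad x, y \in I,\; t \in [0,1].
% p = 1 recovers ordinary convexity; p = -1 recovers harmonic convexity.
```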


2011 ◽  
Vol 18 (2) ◽  
pp. 223-234 ◽  
Author(s):  
R. Haas ◽  
K. Born

Abstract. In this study, a two-step probabilistic downscaling approach is introduced and evaluated. The method is applied, as an example, to precipitation observations in the subtropical mountain environment of the High Atlas in Morocco. The challenge is to deal with complex terrain, heavily skewed precipitation distributions and a sparse amount of data, in both space and time. In the first step of the approach, a transfer function between the distributions of large-scale predictors and of local observations is derived. The aim is to forecast cumulative distribution functions whose parameters are estimated from known data. In order to interpolate between sites, the second step applies multiple linear regression to the distribution parameters of the observed data using local topographic information. By combining both steps, a prediction at every point of the investigation area is achieved. Both steps and their combination are assessed by cross-validation and by splitting the available dataset into a training subset and a validation subset. Owing to its estimates of quantiles and of the probability of zero daily precipitation, this approach is found to be adequate for application even in areas with difficult topographic circumstances and low data availability.
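The first step, a CDF-to-CDF transfer function, is essentially quantile mapping. A sketch under invented precipitation series (using empirical CDFs rather than the paper's fitted distributions) is:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Hypothetical daily precipitation: a large-scale predictor series and sparse
# local observations with many dry days (heavily skewed, as in the High Atlas).
predictor = rng.gamma(0.4, 8.0, 3650)
local_obs = np.where(rng.random(730) < 0.7, 0.0, rng.gamma(0.5, 12.0, 730))

def cdf_transfer(x_new, predictor, local_obs):
    """Map a predictor value through its empirical CDF onto the local
    observed distribution (quantile mapping). The mass of zeros in the
    observations preserves the local probability of a dry day."""
    p = stats.percentileofscore(predictor, x_new) / 100.0
    return np.quantile(local_obs, p)

downscaled = np.array([cdf_transfer(x, predictor, local_obs)
                       for x in predictor[:365]])
print("fraction of dry days in downscaled year:",
      round(float(np.mean(downscaled == 0.0)), 2))
```

The second step would then regress the parameters of such local distributions on topographic covariates, so the transfer can be evaluated at unobserved sites.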


Author(s):  
Aniruddha Choudhary ◽  
Ian T. Voyles ◽  
Christopher J. Roy ◽  
William L. Oberkampf ◽  
Mayuresh Patil

Our approach to the Sandia Verification and Validation Challenge Problem is to use probability bounds analysis (PBA), based on a probabilistic representation of aleatory uncertainties and an interval representation of (most) epistemic uncertainties. The nondeterministic model predictions thus take the form of p-boxes, i.e. bounding cumulative distribution functions (CDFs) that contain all CDFs that could exist within the uncertainty bounds. The scarcity of experimental data provides little support for treating all uncertain inputs as purely aleatory and also precludes significant calibration of the models. We instead estimate the model form uncertainty at conditions where experimental data are available, then extrapolate this uncertainty to conditions where no data exist. The modified area validation metric (MAVM) is employed to estimate the model form uncertainty, which is important because the model involves significant simplifications (of both a geometric and a physical nature) of the true system. The results of the verification and validation processes are treated as additional interval-based uncertainties applied to the nondeterministic model predictions, on the basis of which the failure prediction is made. Based on this method, we estimate the probability of failure to be as large as 0.0034, concluding that the tanks are unsafe.
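A toy p-box, assuming a normal (location-family) response whose mean is only known to lie in an interval (all numbers invented, unrelated to the challenge problem's actual quantities), shows how bounds on the failure probability fall out of the bounding CDFs:

```python
import numpy as np
from scipy import stats

# Epistemic interval on the mean stress; fixed aleatory spread (values invented).
mu_lo, mu_hi, sigma = 90.0, 110.0, 8.0
x = np.linspace(50, 150, 501)

# For a location family, the CDF envelope over the mean-interval is attained at
# the interval endpoints: the upper bounding CDF uses mu_lo, the lower uses mu_hi.
cdf_upper = stats.norm.cdf(x, loc=mu_lo, scale=sigma)
cdf_lower = stats.norm.cdf(x, loc=mu_hi, scale=sigma)

threshold = 120.0  # hypothetical failure stress
i = np.searchsorted(x, threshold)
# Bounds on P(failure) = 1 - CDF(threshold):
p_fail_lo = 1.0 - cdf_upper[i]
p_fail_hi = 1.0 - cdf_lower[i]
print(f"P(failure) in [{p_fail_lo:.4f}, {p_fail_hi:.4f}]")
```

A conservative safety assessment, as in the abstract, reports the upper end of this interval; the MAVM-derived model form uncertainty would widen the box further before that bound is read off.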

