Statistical Tools for Choices between Probability Distributions for Hydrological Frequency Modelling

Author(s):  
Fahim Ashkar
2020 ◽  
Vol 12 (24) ◽  
pp. 10522
Author(s):  
Dariusz Młyński ◽  
Anna Młyńska ◽  
Krzysztof Chmielowski ◽  
Jan Pawełek

The paper presents modelling of wastewater treatment plant (WWTP) operation efficiency using a two-stage method based on selected probability distributions and the Monte Carlo method. Calculations were carried out in terms of sewage susceptibility to biodegradability. Pollutant indicators in raw sewage and in sewage after mechanical and biological treatment were analysed: BOD5, COD, total suspended solids (TSS), total nitrogen (TN) and total phosphorus (TP). The compatibility of theoretical and empirical distributions was assessed using the Anderson–Darling test, and the best-fitted statistical distributions were selected using the Akaike information criterion. The calculations showed that, of all the proposed methods, the Gaussian mixture model (GMM) provided the best-fitted distribution. The simulation results demonstrated that the statistical tools used in this paper describe the changes in pollutant indicators correctly, and that the proposed calculation method can be an effective tool for predicting the course of subsequent sewage treatment stages. Modelling results can be used to make a reliable assessment of sewage susceptibility to biodegradability expressed by the BOD5/COD, BOD5/TN and BOD5/TP ratios. Data generated this way can be helpful for assessing WWTP operation and for preparing different possible operating scenarios.
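
The two-stage workflow the abstract describes can be sketched in Python as follows: fit candidate distributions to one pollutant-indicator series, rank them by AIC, then Monte Carlo-sample the winner. The BOD5 series and the candidate set below are illustrative stand-ins, not the paper's data or its GMM model.

```python
# Minimal sketch of the two-stage method, assuming scipy/numpy.
# The data are synthetic and the candidate distributions are examples only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
bod5 = rng.lognormal(mean=5.8, sigma=0.35, size=200)  # synthetic raw-sewage BOD5, mg/L

# Goodness of fit: Anderson-Darling on the log-data as a lognormality check
ad = stats.anderson(np.log(bod5), dist="norm")

# Stage 1: fit candidates and select by the Akaike information criterion
candidates = {"lognorm": stats.lognorm, "gamma": stats.gamma, "norm": stats.norm}
aic = {}
for name, dist in candidates.items():
    params = dist.fit(bod5)
    loglik = np.sum(dist.logpdf(bod5, *params))
    aic[name] = 2 * len(params) - 2 * loglik  # lower AIC = better fit

best = min(aic, key=aic.get)

# Stage 2: Monte Carlo sampling from the best-fitted distribution
params = candidates[best].fit(bod5)
mc_sample = candidates[best].rvs(*params, size=10_000, random_state=rng)
```

The sampled series plays the role of the "new data generated this way" in the abstract: synthetic indicator values that can feed scenario analysis.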


2020 ◽  
Author(s):  
Ronan Le Bras ◽  
Ehsan Qorbani

The Comprehensive Nuclear-Test-Ban Treaty (CTBT) calls for a verification regime involving interactions between the International Data Centre (IDC) component of the Provisional Technical Secretariat (PTS), established in Vienna, Austria, and the National Data Centres (NDCs) of the Treaty's Member States. Location estimates of the same event by the two organizations are obtained using similar methods and software but potentially involve different seismo-acoustic networks; a direct comparison of distances and time differences is therefore not sufficient, and the different error estimates for the event should be taken into account. Most location methods use iterative linear inversions, and the resulting probability distributions are Gaussian, with the covariance matrix from the last step of the iterative inversion serving as the parameters of the Gaussian distribution. We explored the statistical tools available for comparing two multi-dimensional distributions and measuring a distance between them in an objective manner, including the Hellinger distance, the Bhattacharyya distance and the Mahalanobis distance, and we will show examples of their application to the seismo-acoustic location problem.
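
For two Gaussian location solutions these three distances have closed forms, which can be sketched as below. The means and covariances are invented 2-D stand-ins for an IDC and an NDC epicentre solution, not values from the work.

```python
# Closed-form distances between two multivariate Gaussians (sketch).
# The example solutions are hypothetical, in km from a common reference.
import numpy as np

def bhattacharyya(mu1, cov1, mu2, cov2):
    """Bhattacharyya distance between two multivariate Gaussians."""
    cov = (cov1 + cov2) / 2.0
    d = mu1 - mu2
    term1 = d @ np.linalg.solve(cov, d) / 8.0
    term2 = 0.5 * np.log(np.linalg.det(cov) /
                         np.sqrt(np.linalg.det(cov1) * np.linalg.det(cov2)))
    return term1 + term2

def hellinger(mu1, cov1, mu2, cov2):
    """Hellinger distance; for Gaussians H^2 = 1 - exp(-D_B)."""
    return np.sqrt(1.0 - np.exp(-bhattacharyya(mu1, cov1, mu2, cov2)))

def mahalanobis(mu1, cov1, mu2):
    """Mahalanobis distance of the second mean under the first covariance."""
    d = mu1 - mu2
    return np.sqrt(d @ np.linalg.solve(cov1, d))

# Hypothetical epicentre solutions from the two centres
mu_idc = np.array([0.0, 0.0])
cov_idc = np.array([[4.0, 1.0], [1.0, 9.0]])
mu_ndc = np.array([3.0, -2.0])
cov_ndc = np.array([[5.0, 0.5], [0.5, 6.0]])

db = bhattacharyya(mu_idc, cov_idc, mu_ndc, cov_ndc)
dh = hellinger(mu_idc, cov_idc, mu_ndc, cov_ndc)
dm = mahalanobis(mu_idc, cov_idc, mu_ndc)
```

The Hellinger distance is bounded in [0, 1], which makes it convenient for comparing event pairs across networks of very different precision.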


1997 ◽  
Vol 161 ◽  
pp. 197-201 ◽  
Author(s):  
Duncan Steel

Abstract Whilst lithopanspermia depends upon massive impacts occurring at a speed above some limit, the intact delivery of organic chemicals or other volatiles to a planet requires the impact speed to be below some other limit such that a significant fraction of that material escapes destruction. Thus the two opposite ends of the impact speed distributions are the regions of interest in the bioastronomical context, whereas much modelling work on impacts delivers, or makes use of, only the mean speed. Here the probability distributions of impact speeds upon Mars are calculated for (i) the orbital distribution of known asteroids; and (ii) the expected distribution of near-parabolic cometary orbits. It is found that cometary impacts are far more likely to eject rocks from Mars (over 99 percent of the cometary impacts are at speeds above 20 km/sec, but at most 5 percent of the asteroidal impacts); paradoxically, the objects impacting at speeds low enough to make organic/volatile survival possible (the asteroids) are those which are depleted in such species.
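
The tail comparison at the heart of the abstract, the fraction of impacts above an ejection threshold, is easy to illustrate once a speed distribution is in hand. The two speed samples below are invented Gaussian stand-ins chosen only to reproduce the qualitative contrast, not the asteroid and cometary distributions the paper computes.

```python
# Toy tail-fraction comparison for impact-speed distributions (sketch).
# The distributions are invented, not the paper's orbital calculations.
import numpy as np

rng = np.random.default_rng(1)
asteroid_speeds = rng.normal(loc=12.0, scale=3.0, size=100_000).clip(min=5.0)  # km/s
comet_speeds = rng.normal(loc=35.0, scale=8.0, size=100_000).clip(min=5.0)     # km/s

def ejection_fraction(speeds, threshold=20.0):
    """Fraction of impacts fast enough to eject rocks (> threshold km/s)."""
    return np.mean(speeds > threshold)

f_ast = ejection_fraction(asteroid_speeds)
f_com = ejection_fraction(comet_speeds)
```

With these stand-in parameters the cometary tail fraction is large and the asteroidal one small, matching the direction (though not the exact percentages) of the paper's result.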


2020 ◽  
Vol 3 (1) ◽  
pp. 10501-1-10501-9
Author(s):  
Christopher W. Tyler

Abstract For the visual world in which we operate, the core issue is to conceptualize how its three-dimensional structure is encoded through the neural computation of multiple depth cues and their integration to a unitary depth structure. One approach to this issue is the full Bayesian model of scene understanding, but this is shown to require selection from the implausibly large number of possible scenes. An alternative approach is to propagate the implied depth structure solution for the scene through the “belief propagation” algorithm on general probability distributions. However, a more efficient model of local slant propagation is developed as an alternative. The overall depth percept must be derived from the combination of all available depth cues, but a simple linear summation rule across, say, a dozen different depth cues, would massively overestimate the perceived depth in the scene in cases where each cue alone provides a close-to-veridical depth estimate. On the other hand, a Bayesian averaging or “modified weak fusion” model for depth cue combination does not provide for the observed enhancement of perceived depth from weak depth cues. Thus, the current models do not account for the empirical properties of perceived depth from multiple depth cues. The present analysis shows that these problems can be addressed by an asymptotic, or hyperbolic Minkowski, approach to cue combination. With appropriate parameters, this first-order rule gives strong summation for a few depth cues, but the effect of an increasing number of cues beyond that remains too weak to account for the available degree of perceived depth magnitude. Finally, an accelerated asymptotic rule is proposed to match the empirical strength of perceived depth as measured, with appropriate behavior for any number of depth cues.
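
The contrast between linear summation and a Minkowski-norm combination rule can be made concrete numerically. The sketch below uses a generic Minkowski m-norm as a stand-in for the paper's hyperbolic Minkowski rule; the exponent is illustrative, not the paper's fitted parameter.

```python
# Linear summation vs Minkowski-norm cue combination (sketch).
# The exponent m = 3 is an arbitrary illustration, not a fitted value.
import numpy as np

def linear_sum(cues):
    """Naive rule: perceived depth = sum of individual cue depths."""
    return float(np.sum(cues))

def minkowski_combine(cues, m=3.0):
    """Minkowski m-norm: strong summation for few cues, saturating for many."""
    cues = np.asarray(cues, dtype=float)
    return float(np.sum(cues ** m) ** (1.0 / m))

# Twelve cues, each alone giving a close-to-veridical depth of 1.0
twelve = [1.0] * 12
overestimate = linear_sum(twelve)       # 12x the veridical depth
saturated = minkowski_combine(twelve)   # grows only as 12^(1/m)
```

With each cue near-veridical, linear summation yields twelve times the true depth, while the Minkowski rule grows only as the number of cues to the power 1/m, the saturation behaviour the abstract describes.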


2017 ◽  
Vol 25 (2) ◽  
pp. 927-960
Author(s):  
Jarod Jacobs

In this article, I discuss three statistical tools that have proven pivotal in linguistic research, particularly those studies that seek to evaluate large datasets. These tools are the Gaussian Curve, significance tests, and hierarchical clustering. I present a brief description of these tools and their general uses. Then, I apply them to an analysis of the variations between the “biblical” DSS and our other witnesses, focusing upon variations involving particles. Finally, I engage the recent debate surrounding the diachronic study of Biblical Hebrew. This article serves a dual function. First, it presents statistical tools that are useful for many linguistic studies. Second, it develops an analysis of the he-locale, as it is used in the “biblical” Dead Sea Scrolls, Masoretic Text, and Samaritan Pentateuch. Through that analysis, this article highlights the value of inferential statistical tools as we attempt to better understand the Hebrew of our ancient witnesses.
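
The three tools the article names can be shown together on a tiny made-up dataset. The texts, particle categories, and rates below are invented for illustration and are not the Dead Sea Scrolls data.

```python
# Significance test + hierarchical clustering on invented particle rates
# (per 1000 words); texts and numbers are illustrative only.
import numpy as np
from scipy import stats
from scipy.cluster.hierarchy import linkage, fcluster

# Rows: hypothetical texts; columns: rates of two particle types
rates = np.array([
    [12.0, 3.1],   # "MT-like" text A
    [11.5, 2.8],   # "MT-like" text B
    [6.0, 7.9],    # "DSS-like" text C
    [5.5, 8.4],    # "DSS-like" text D
])

# Significance test: do the two pairs differ in the first particle's rate?
tstat, pval = stats.ttest_ind(rates[:2, 0], rates[2:, 0])

# Hierarchical clustering (Ward linkage), cut into two clusters
labels = fcluster(linkage(rates, method="ward"), t=2, criterion="maxclust")
```

On data this cleanly separated, the clustering recovers the two groups and the t-test flags the rate difference as significant; with real witnesses the interest lies in the borderline cases.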


Think India ◽  
2019 ◽  
Vol 22 (3) ◽  
pp. 553-562
Author(s):  
Dr. Devarajappa S

The main objective of the paper is to examine the current trends and progress of venture capital in India; the paper also highlights the concept and stages of venture capital financing. To meet this objective the researcher used secondary sources: the required information was collected from various articles, reports, magazines and websites. To examine the trends of venture capital in India, the IVCA (Indian Venture Capital Association) report is used. To analyse the data, statistical tools such as the mean, standard deviation, charts, ANOVA and the correlation coefficient were employed. The study concludes that venture capital investment has been increasing in India, which is a positive indication for the country for curbing unemployment and economically empowering people by maximizing startups in India.
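
A minimal version of the descriptive and correlational toolkit the paper lists might look as follows. The yearly investment figures are invented placeholders, not IVCA data.

```python
# Mean, standard deviation, and trend correlation on invented yearly
# venture-capital investment figures (not IVCA data).
import numpy as np
from scipy import stats

years = np.arange(2012, 2019)
investment = np.array([8.0, 9.5, 14.0, 17.5, 16.0, 21.0, 26.5])  # invented, e.g. USD bn

mean, sd = investment.mean(), investment.std(ddof=1)
r, p = stats.pearsonr(years, investment)  # is investment trending upward?
```

A strongly positive correlation with year, significant at conventional levels, is the numerical form of the "increasing trend" conclusion.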


2019 ◽  
Vol 8 (9) ◽  
pp. 22-30
Author(s):  
SONIA HOODA

The study examines resource use and economic efficiency of cucumber production under poly-house farming and open-field farming. Primary data were collected using a purposive sampling technique from selected districts; a sample of 50 farmers (25 poly-house farmers and 25 open-field farmers) was taken from each district on the basis of availability. Secondary data were collected from the Horticulture Department. For data analysis, statistical tools (averages and percentages) and a linear Cobb–Douglas production function were used. The study found that the yield of cucumber was higher under poly-house farming than under the open-field system, owing to the longer harvesting period and the greater number of fruits per plant under poly-house conditions. The data indicate higher net returns per acre of cucumber under poly-house farming than open-field farming, which implies that poly-house farming is not only highly profitable but also economically viable compared with open-field farming in the study area.
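
The Cobb–Douglas production function is linear in logs, so it is typically estimated by OLS on log-transformed inputs. The sketch below shows that estimation step on synthetic data with known elasticities; the input names and values are invented, not the survey figures.

```python
# Log-linear Cobb-Douglas fit: ln(Y) = ln(A) + b1*ln(X1) + b2*ln(X2) + b3*ln(X3).
# Data are synthetic, generated from known elasticities (0.2, 0.4, 0.3).
import numpy as np

rng = np.random.default_rng(2)
n = 25  # e.g. one group of sampled farmers
seed_kg = rng.uniform(0.5, 2.0, n)
fert_kg = rng.uniform(20, 80, n)
labour_d = rng.uniform(30, 90, n)
yield_q = 5.0 * seed_kg**0.2 * fert_kg**0.4 * labour_d**0.3 * rng.lognormal(0, 0.05, n)

# OLS on logs recovers the elasticities as regression coefficients
X = np.column_stack([np.ones(n), np.log(seed_kg), np.log(fert_kg), np.log(labour_d)])
beta, *_ = np.linalg.lstsq(X, np.log(yield_q), rcond=None)
elasticities = beta[1:]  # should land near (0.2, 0.4, 0.3)
```

The fitted coefficients are directly interpretable as output elasticities, which is why this functional form is standard in resource-use efficiency studies.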


2017 ◽  
Vol 3 (2) ◽  
pp. 30-36
Author(s):  
E. Amankwah ◽  
V. Hans-Jürgen

Agriculture in the Upper West region is primarily subsistence and rain-fed, and irrigation practice is largely furrow irrigation and the use of traditional watering cans. This historical approach to agriculture is predicted to suffer severe setbacks due to climate change. This research therefore explores farmers’ perception of climate change and its impact, and how farmers can cope with the changing climate. The primary data were gathered through field observation, interviews and the administration of questionnaires to about 400 irrigation farmers in three districts of the Upper West region. The data were analysed using the Statistical Package for Social Sciences (SPSS) and basic statistical tools. It was discovered that 62% of the farmers had no formal education, with the majority above 50 years of age. Over 80% have observed rising temperatures and declining rainfall over the last few decades. This has led to higher evaporation and siltation of irrigation dams, higher transpiration of crops and water stress, resulting in low crop yield, crop failure and food insecurity. The research also highlights anthropogenic activities that have influenced climate variability and food production in the region. The research concludes with suggested strategies to facilitate farmers’ adaptation to climate variability.


2015 ◽  
Vol 6 (01-02) ◽  
Author(s):  
P. Paramanandam ◽  
K. Sangeetha

Locus of control reflects the extent to which individuals believe that what happens to them is within their control, or beyond it. The objective of the present study was to examine locus of control and employee engagement among employees of the automobile industry. A convenience sample of ninety employees working in the automobile industry participated in the study. Locus of control and employee engagement were assessed by administering questionnaires. The collected data were analysed with statistical tools such as the mean, standard deviation, correlation, regression and ANOVA tests. A higher level of locus of control was observed in the above-50 age group and a higher level of employee engagement in the 41-50 age group. Higher levels of both locus of control and employee engagement were observed in the above-30000 income group, and there were significant differences in both measures among respondents of different income groups. There was a significant positive correlation between locus of control and employee engagement. Approximately 18% of the variance of employee engagement was explained by locus of control.
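
The final claim, that locus of control explains about 18% of engagement variance, is the squared correlation from a simple linear regression. The sketch below illustrates the relationship on simulated scores; the numbers are invented, not the survey data.

```python
# Variance explained (r^2) from a simple correlation, on simulated scores.
# Only the sample size (90) echoes the study; all values are invented.
import numpy as np

rng = np.random.default_rng(3)
locus = rng.normal(50, 10, 90)                    # 90 respondents
engagement = 0.45 * locus + rng.normal(0, 10, 90) # engagement partly driven by locus

r = np.corrcoef(locus, engagement)[0, 1]
variance_explained = r ** 2  # share of engagement variance explained by locus
```

A reported 18% of variance explained corresponds to a correlation of roughly r = 0.42, consistent with the "significant positive correlation" the abstract states.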


2019 ◽  
Vol 4 (1) ◽  
pp. 185
Author(s):  
Ya’ti Ikhwani Nasution

The purpose of this study is to find out whether Islamic business ethics, through the variables of unity, equilibrium, free will, responsibility and benevolence, influences the welfare of traders in the Pusat Pasar Medan. This is quantitative research and the analysis used is multiple regression analysis. The data collection technique used is a questionnaire obtained directly from the respondents, namely the Pusat Pasar Medan traders, and the data were analysed using SPSS Version 22. The results of data processing showed a significant influence, both partially and simultaneously, of unity, equilibrium, free will, responsibility and benevolence on the welfare of traders in the Pusat Pasar Medan. Unity, free will, responsibility and benevolence have a positive effect on traders’ welfare, while the equilibrium variable has a negative effect. The adjusted R-square value is 0.345, meaning that 34.5% of the increase in welfare can be explained by the independent variables, namely unity, equilibrium, free will, responsibility and benevolence, while 65.5% is explained by other factors.
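
The regression setup the study describes, welfare on five predictors with adjusted R-square as the fit measure, can be sketched as follows. The data are simulated (with a negative equilibrium coefficient built into the simulation to mirror the abstract's finding) and are not the questionnaire responses.

```python
# Multiple regression with adjusted R^2 (sketch); data are simulated.
import numpy as np

rng = np.random.default_rng(4)
n, k = 100, 5
X = rng.normal(size=(n, k))  # unity, equilibrium, free will, responsibility, benevolence
true_coeffs = np.array([0.4, -0.2, 0.3, 0.3, 0.25])  # equilibrium effect negative by design
welfare = X @ true_coeffs + rng.normal(0, 1.0, n)

# OLS with an intercept, then R^2 and its adjusted version
Xd = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(Xd, welfare, rcond=None)
resid = welfare - Xd @ beta
r2 = 1 - resid.var() / welfare.var()
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k - 1)
```

Adjusted R-square penalizes the plain R-square for the number of predictors, which is why studies with several independent variables, like this one, report it instead.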

