discrete probability distribution
Recently Published Documents

TOTAL DOCUMENTS: 30 (FIVE YEARS: 8)
H-INDEX: 9 (FIVE YEARS: 1)

2021, pp. 65-82
Author(s): Anindya Ghosh, Bapi Saha, Prithwiraj Mal

Author(s): Federico D’Ambrosio, Hans L. Bodlaender, Gerard T. Barkema

In this paper, we consider several efficient data structures for the problem of sampling from a dynamically changing discrete probability distribution, where some prior information is known on the distribution of the rates, in particular the maximum and minimum rate, and where the number of possible outcomes N is large. We consider three basic data structures: the Acceptance–Rejection method, the Complete Binary Tree, and the Alias method. These can be used as building blocks in a multi-level data structure, where at each level one of the basic data structures can be used, with the top level selecting a group of events and the bottom level selecting an element from a group. Depending on assumptions on the distribution of the rates of outcomes, different combinations of the basic structures can be used. We prove that for particular data structures the expected time of sampling and update is constant when the rate distribution satisfies certain conditions. We show that for any distribution, combining a tree structure with the Acceptance–Rejection method gives an expected time of sampling and update of $$O\left( \log \log \left( r_{max}/r_{min}\right) \right)$$, where $$r_{max}$$ is the maximum rate and $$r_{min}$$ the minimum rate. We also discuss an implementation of a two-level Acceptance–Rejection data structure that allows expected constant time for sampling and amortized constant time for updates, assuming that $$r_{max}$$ and $$r_{min}$$ are known and the number of events is sufficiently large. We also present an experimental verification, highlighting the limits imposed by the constraints of a real-life setting.
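As an illustration of the simplest of the three building blocks named above, the following is a minimal Python sketch of an Acceptance–Rejection sampler for a dynamic discrete distribution with a known upper bound on the rates; the class name and interface are illustrative, not the authors' implementation.

```python
import random

class AcceptanceRejectionSampler:
    """Minimal sketch of the Acceptance-Rejection building block: draw an
    index i with probability proportional to rates[i], assuming a known
    upper bound r_max on every rate (illustrative, not the paper's code)."""

    def __init__(self, rates, r_max):
        self.rates = list(rates)  # current rates, may change over time
        self.r_max = r_max        # known upper bound on any single rate

    def update(self, i, new_rate):
        # O(1) update of one rate; new_rate must stay within (0, r_max]
        self.rates[i] = new_rate

    def sample(self):
        n = len(self.rates)
        while True:
            i = random.randrange(n)                        # propose an index uniformly
            if random.random() * self.r_max < self.rates[i]:
                return i                                   # accept w.p. rates[i]/r_max
```

The expected number of proposals per sample is $$N \cdot r_{max} / \sum_i r_i \le r_{max}/r_{min}$$, which is why the ratio of maximum to minimum rate governs the performance guarantees discussed above.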


2020, Vol 11 (4), pp. 574-587
Author(s): Sicheng Zhao, Guiguang Ding, Yue Gao, Xin Zhao, Youbao Tang, ...

Author(s): Abiodun Tinuoye Oladipo

The close-to-convex analogue of starlike functions defined by means of a generalized discrete probability distribution and the Poisson distribution is considered. Some coefficient inequalities and their connection to the classical Fekete-Szegő theorem are obtained. Our results provide a strong connection between Geometric Function Theory and Statistics.
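For context, the classical Fekete-Szegő problem referred to here concerns bounding, over a given class of normalized analytic functions on the unit disk and for a real parameter $$\mu$$, the functional

$$\left| a_3 - \mu a_2^2 \right|, \qquad f(z) = z + \sum_{n=2}^{\infty} a_n z^n;$$

the specific generalized distribution and the exact coefficient bounds of the paper are not reproduced here.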


2020, Vol 25 (5), pp. 423-432
Author(s): Susumu Cato

This paper aims to consider the meaning of the dismal theorem, as presented by Martin Weitzman [(2009) On modeling and interpreting the economics of catastrophic climate change. Review of Economics and Statistics 91, 1–19]. The theorem states that a standard cost–benefit analysis breaks down if there is a possibility of catastrophes occurring. This result has had a significant influence on debates regarding the economics of climate change. In this study, we present an intuitive similarity between the dismal theorem and the St. Petersburg paradox using a simple discrete probability distribution.
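For readers unfamiliar with the paradox, the standard St. Petersburg setup already shows how a simple discrete probability distribution can defeat expected-value reasoning; the exact distribution used in the paper's argument is not reproduced here:

$$\Pr(X = 2^k) = 2^{-k}, \quad k = 1, 2, \ldots, \qquad \mathbb{E}[X] = \sum_{k=1}^{\infty} 2^k \cdot 2^{-k} = \sum_{k=1}^{\infty} 1 = \infty.$$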


2020, Vol 45 (5), pp. 515-533
Author(s): Richard A. Feinberg, Matthias von Davier

The literature showing that subscores fail to add value is vast; yet despite their typical redundancy and the frequent presence of substantial statistical errors, many stakeholders remain convinced of their necessity. This article describes a method for identifying and reporting unexpectedly high or low subscores by comparing each examinee’s observed subscore with a discrete probability distribution of subscores conditional on the examinee’s overall ability. The proposed approach turns out to be somewhat conservative due to the nature of subscores as finite sums of item scores associated with a subdomain. Thus, the method may be a compromise that satisfies score users by reporting subscore information as well as psychometricians by limiting misinterpretation, at most, to the rates of Type I and Type II error.
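As a rough sketch of how such a conditional subscore distribution can be formed and used for flagging, the Python code below assumes dichotomous items whose correct-response probabilities at the examinee's estimated overall ability are already available; the item model, the recursion, and the two-tailed flagging rule are illustrative, not the authors' exact procedure.

```python
def subscore_distribution(p_correct):
    """Distribution of a subscore (sum of dichotomous item scores), given each
    item's probability of a correct response conditional on the examinee's
    overall ability (illustrative assumption)."""
    dist = [1.0]                               # P(subscore = 0) with no items
    for p in p_correct:
        new = [0.0] * (len(dist) + 1)
        for s, prob in enumerate(dist):
            new[s] += prob * (1 - p)           # item answered incorrectly
            new[s + 1] += prob * p             # item answered correctly
        dist = new
    return dist

def flag_subscore(observed, p_correct, alpha=0.05):
    """Flag an observed subscore as unexpectedly low or high if it falls in
    either tail of the conditional distribution."""
    dist = subscore_distribution(p_correct)
    lower = sum(dist[:observed + 1])           # P(S <= observed)
    upper = sum(dist[observed:])               # P(S >= observed)
    if lower <= alpha / 2:
        return "unexpectedly low"
    if upper <= alpha / 2:
        return "unexpectedly high"
    return "not flagged"
```

For example, flag_subscore(1, [0.8, 0.7, 0.9, 0.85]) flags a subscore of 1 as unexpectedly low at the default alpha, since the conditional probability of scoring that low is about 0.02.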


2018, Vol 2018, pp. 1-5
Author(s): Saurabh Porwal

The purpose of the present paper is to introduce a generalized discrete probability distribution and obtain some results regarding the moments, mean, variance, and moment generating function of this distribution. Further, we show that for specific parameter values it reduces to various well-known distributions. Finally, we give a beautiful application of this distribution to certain analytic univalent functions.
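The specific generalized distribution introduced in the paper is not reproduced here; the quantities it derives are the standard ones for a discrete distribution $$p_n = \Pr(X = n)$$, $$n = 0, 1, 2, \ldots$$:

$$M_X(t) = \sum_{n=0}^{\infty} e^{tn} p_n, \qquad \mathbb{E}[X] = M_X'(0), \qquad \operatorname{Var}(X) = M_X''(0) - \left( M_X'(0) \right)^2.$$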


Author(s): Sicheng Zhao, Guiguang Ding, Yue Gao, Jungong Han

Existing works on image emotion recognition mainly assign the dominant emotion category or average dimension values to an image, based on the assumption that viewers can reach a consensus on the emotion of images. However, the image emotions perceived by viewers are subjective by nature and highly related to personal and situational factors. On the other hand, image emotions can be conveyed by different features, such as semantics and aesthetics. In this paper, we propose a novel machine learning approach that formulates the categorical image emotions as a discrete probability distribution (DPD). To associate emotions with the extracted visual features, we present a weighted multi-modal shared sparse learning method to learn the combination coefficients, with which the DPD of an unseen image can be predicted by linearly integrating the DPDs of the training images. The representation abilities of different modalities are jointly explored and the optimal weight of each modality is automatically learned. Extensive experiments on three datasets verify the superiority of the proposed method compared to the state of the art.
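The prediction step described above amounts to a convex combination of training DPDs. The following Python sketch shows only that step, with the combination coefficients assumed to have been produced by the weighted multi-modal shared sparse learning stage, which is not implemented here; function and variable names are illustrative.

```python
import numpy as np

def predict_dpd(train_dpds, coefficients):
    """Predict the discrete probability distribution (DPD) over emotion
    categories for an unseen image by linearly integrating the DPDs of the
    training images, given precomputed combination coefficients."""
    train_dpds = np.asarray(train_dpds)      # shape: (num_train, num_emotions)
    coefficients = np.asarray(coefficients)  # shape: (num_train,), typically sparse
    dpd = coefficients @ train_dpds          # weighted sum of training DPDs
    dpd = np.clip(dpd, 0.0, None)            # guard against small negative weights
    return dpd / dpd.sum()                   # renormalize to a valid distribution
```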

