Computer examination of some basic probability models

1974 ◽  
Vol 51 (11) ◽  
pp. 717 ◽  
Author(s):  
Michael E. Starzak

Author(s):  
Andrew Gelman ◽  
Deborah Nolan

This chapter contains many classroom activities and demonstrations to help students understand basic probability calculations, including conditional probability and Bayes' rule. Many of the activities alert students to misconceptions about randomness. They create dramatic settings: the instructor discerns real coin flips from fake ones; students modify dice and coins in order to load them; and students "accused" of lying by an inaccurate simulated lie detector face their classmates. Additionally, probability models of real outcomes are pedagogically valuable: students can first work through the probability calculations and then go back and discuss the potential flaws of the model.
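The lie-detector activity rests on a Bayes' rule calculation like the following sketch. All figures (base rate of lying, detector accuracy) are illustrative assumptions, not values from the chapter:

```python
def posterior_lying(prior_lie, sensitivity, false_positive_rate):
    """P(lying | detector flags 'lie') via Bayes' rule."""
    # Total probability that the detector raises a flag.
    p_flag = sensitivity * prior_lie + false_positive_rate * (1 - prior_lie)
    return sensitivity * prior_lie / p_flag

# Suppose only 5% of statements are lies, and the detector flags
# 80% of lies but also 10% of truthful statements.
p = posterior_lying(0.05, 0.80, 0.10)
print(round(p, 3))  # → 0.296: most flagged students are in fact truthful
```

The punchline students discover is that with a low base rate of lying, even a fairly accurate detector produces mostly false accusations.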


Author(s):  
M Pourmahdian ◽  
R Zoghifard

Abstract This paper provides a model-theoretic analysis of probability (modal) logic ($PL$). It is known that this logic does not enjoy the compactness property. However, it is shown that the sublogic of $PL$ known as basic probability logic ($BPL$) does satisfy compactness. Furthermore, attention is drawn to some essential model-theoretic properties of $PL$, and a version of the Lindström characterization theorem is investigated: it is verified that probability logic has maximal expressive power among those abstract logics that extend $PL$ and satisfy both the filtration and disjoint-unions properties. Finally, by altering the semantics to finitely additive probability models ($\mathcal{F}\mathcal{P}\mathcal{M}$) and introducing a positive sublogic of $PL$ that includes $BPL$, it is proved that this sublogic possesses the compactness property with respect to $\mathcal{F}\mathcal{P}\mathcal{M}$.


Data Mining ◽  
2011 ◽  
pp. 260-277
Author(s):  
Eitel J.M. Lauria ◽  
Giri Kumar Tayi

One of the major problems faced by data-mining technologies is how to deal with uncertainty. The prime characteristic of Bayesian methods is their explicit use of probability to quantify uncertainty. Bayesian methods provide a practical way to draw inferences from data, using probability models both for the values we observe and for the hypotheses we wish to evaluate. Bayes' Theorem provides the means of calculating the probability of a hypothesis (the posterior probability) from its prior probability, the probability of the observations, and the likelihood of the observed data under the hypothesis. The purpose of this chapter is twofold: to provide an overview of the theoretical framework of Bayesian methods and its application to data mining, with special emphasis on statistical modeling and machine-learning techniques; and to illustrate each theoretical concept with practical examples. We cover basic probability concepts, Bayes' Theorem and its implications, Bayesian classification, Bayesian belief networks, and an introduction to simulation techniques.
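The posterior calculation the abstract describes can be sketched in a few lines: posterior ∝ prior × likelihood, normalized by the probability of the observations. The hypotheses and numbers below are illustrative assumptions, not examples from the chapter:

```python
def posteriors(priors, likelihoods):
    """Posterior probabilities over competing hypotheses via Bayes' Theorem."""
    joint = [p * l for p, l in zip(priors, likelihoods)]
    evidence = sum(joint)  # P(data), the normalizing constant
    return [j / evidence for j in joint]

# Two hypotheses about a coin: fair (P(heads) = 0.5) vs. biased
# (P(heads) = 0.8), with equal priors, after observing one head.
post = posteriors([0.5, 0.5], [0.5, 0.8])
print([round(p, 3) for p in post])  # → [0.385, 0.615]
```

A single observation shifts belief only modestly here; repeating the update over many observations is what drives Bayesian classification toward a confident answer.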


1997 ◽  
Vol 48 (8) ◽  
pp. 848-848
Author(s):  
A C Borthakur ◽  
H Choudhury (Eds.)

Informatica ◽  
2016 ◽  
Vol 27 (2) ◽  
pp. 323-334 ◽  
Author(s):  
James M. Calvin
