probabilistic beliefs
Recently Published Documents

TOTAL DOCUMENTS: 28 (FIVE YEARS: 9)
H-INDEX: 6 (FIVE YEARS: 1)

Author(s): Jianli Wang, Yingrong Su, Jingyuan Li, Ho Yin Yick


2021
Author(s): Samuel J. Gershman, Mina Cikara

Why, when, and how do stereotypes change? This paper develops a computational account based on the principles of structure learning: stereotypes are governed by probabilistic beliefs about the assignment of individuals to groups. Two aspects of this account are particularly important. First, groups are flexibly constructed based on the distribution of traits across individuals. This allows the model to explain the phenomenon of subtyping, whereby deviant individuals are segregated from a group, thus protecting the group’s stereotype. Second, groups are hierarchically structured, such that groups can be nested. This allows the model to explain the phenomenon of subgrouping, whereby a collection of deviant individuals is organized into a refinement of the group. The structure learning account also sheds light on several factors that determine stereotype change, including perceived group variability, individual typicality, cognitive load, and sample size.
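The subtyping idea above can be illustrated with a minimal sketch (our illustration, not the authors' model): individuals are assigned to groups via a Chinese-restaurant-process prior combined with a Gaussian trait likelihood, so a sufficiently deviant individual is more likely to be placed in a new group than to dilute the existing group's stereotype. All parameter values here are hypothetical.

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of a Gaussian with mean mu and standard deviation sigma at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def assignment_probs(trait, groups, alpha=1.0, sigma=1.0, prior_mu=0.0, prior_sigma=3.0):
    """Posterior over (existing groups..., new group) for one observed trait value."""
    scores = []
    for members in groups:
        mu = sum(members) / len(members)  # group's empirical trait mean
        # CRP prior (proportional to group size) times trait likelihood
        scores.append(len(members) * normal_pdf(trait, mu, sigma))
    # Probability of opening a fresh group, governed by concentration alpha
    scores.append(alpha * normal_pdf(trait, prior_mu, prior_sigma))
    z = sum(scores)
    return [s / z for s in scores]

groups = [[0.1, -0.2, 0.0, 0.3]]          # one established group near trait = 0
typical = assignment_probs(0.2, groups)   # typical individual: joins the group
deviant = assignment_probs(5.0, groups)   # deviant individual: segregated ("subtyped")
print(typical, deviant)
```

Because the deviant observation is far from the group mean, nearly all posterior mass shifts to the "new group" option, leaving the original group's trait distribution (its stereotype) intact.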


2021, Vol. 118 (19), pp. e2010144118
Author(s): Nika Haghtalab, Matthew O. Jackson, Ariel D. Procaccia

We present two models of how people form beliefs that are based on machine learning theory. We illustrate how these models give insight into observed human phenomena by showing how polarized beliefs can arise even when people are exposed to almost identical sources of information. In our first model, people form beliefs that are deterministic functions that best fit their past data (training sets). In that model, their inability to form probabilistic beliefs can lead people to have opposing views even if their data are drawn from distributions that only slightly disagree. In the second model, people pay a cost that is increasing in the complexity of the function that represents their beliefs. In this second model, even with large training sets drawn from exactly the same distribution, agents can disagree substantially because they simplify the world along different dimensions. We discuss what these models of belief formation suggest for improving people’s accuracy and agreement.
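The first mechanism can be made concrete with a minimal sketch (our illustration, not the paper's formal model): agents who must adopt the single deterministic prediction that best fits their sample can end up maximally opposed even when the underlying distributions barely differ.

```python
def best_deterministic_belief(sample):
    # The deterministic rule minimizing training error on binary data is
    # simply "always predict the majority outcome" of the sample.
    return 1 if sum(sample) >= len(sample) / 2 else 0

# Two agents whose data come from nearly identical sources:
# empirical frequencies of 0.51 vs. 0.49.
agent_a = [1] * 51 + [0] * 49
agent_b = [1] * 49 + [0] * 51

belief_a = best_deterministic_belief(agent_a)  # 1: "it always happens"
belief_b = best_deterministic_belief(agent_b)  # 0: "it never happens"
print(belief_a, belief_b)
```

A probabilistic agent would hold beliefs of roughly 0.51 and 0.49, which nearly agree; the restriction to deterministic best-fit functions is what produces the polarization.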


2021, Vol. 1, pp. 13
Author(s): Nikitas Pittis, Phoebe Koundouri, Panagiotis Samartzis, Nikolaos Englezos, Andreas Papandreou

The central question of this paper is whether a rational agent under uncertainty can exhibit ambiguity aversion (AA). The answer depends on how the agent forms her probabilistic beliefs: classical Bayesianism (CB) versus modern Bayesianism (MB). We revisit Schmeidler's coin-based example and show that a rational MB agent operating in a "small world" cannot exhibit AA. Hence we argue that the motivation for AA based on Schmeidler's coin-based and Ellsberg's classic urn-based examples is weak, since both correspond to "small worlds". We also argue that MB not only avoids AA but is normatively superior to CB, because an MB agent (i) avoids logical inconsistencies in the relation between her subjective probability and objective chance, (ii) resolves the problem of "old evidence", and (iii) allows psychological detachment from actual evidence, thereby avoiding the problem of "cognitive dissonance". As for AA itself, we suggest it may be a (potential) property of large worlds, because in such worlds MB is likely to be infeasible.
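A worked version of the Ellsberg-style calculation (our illustration, not the paper's derivation) shows why a Bayesian agent in a small world has no room for ambiguity aversion: with a uniform prior over the unknown urn's compositions, the marginal probability of drawing red equals that of the known 50/50 urn, so subjective expected utility implies indifference between betting on either urn.

```python
from fractions import Fraction

# Urn with 100 balls, red + black in unknown proportion.
compositions = range(101)          # possible number of red balls: 0..100
prior = Fraction(1, 101)           # uniform prior over compositions

# Marginal probability of red: average of k/100 under the prior.
p_red = sum(prior * Fraction(k, 100) for k in compositions)
print(p_red)  # → 1/2
```

Any strict preference for the known urn over the unknown one is therefore not expressible through these beliefs alone, which is the sense in which the MB agent in a small world cannot exhibit AA.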


2021, Vol. 0 (0)
Author(s): Tomasz Sadzik

Bayesian game theory investigates the strategic interaction of players who have full awareness but incomplete information about their environment. We extend the analysis to players with incomplete awareness, who may be unable to reason about all contingencies in the first place. We develop three logical systems for knowledge, probabilistic beliefs, and awareness, and characterize their axiom systems. We extend Bayesian equilibrium to games with incomplete awareness and show that it is consistent with a common prior and speculative trade when common knowledge of rationality is violated.


Author(s): Andrew Mackenzie

For qualitative probability spaces, monotone continuity and third-order atom-swarming are together sufficient for a unique countably additive probability measure representation that may have atoms (Mackenzie, Theor Econ 14:709–778, 2019). We provide a new proof by appealing to a theorem of Luce (Ann Math Stat 38:780–786, 1967), highlighting the usefulness of extensive measurement theory (Krantz et al., Foundations of Measurement, Volume I: Additive and Polynomial Representations, Academic Press, New York, 1971) for economists.


2019, Vol. 23 (3), pp. 279-292
Author(s): J. Y. Tsao, C. L. Ting, C. M. Johnson

Two perspectives are used to reframe Simonton’s recent three-factor definition of creative outcome. The first perspective is functional: that creative ideas are those that add significantly to knowledge by providing both utility and learning. The second perspective is calculational: that learning can be estimated by the change in probabilistic beliefs about an idea’s utility before and after it has played out in its environment. The results of the reframing are proposed conceptual and mathematical definitions of (a) creative outcome as the product of two overarching factors (utility and learning) and (b) learning as a function of two subsidiary factors (blindness reduction and surprise). Learning will be shown to depend much more strongly on surprise than on blindness reduction, so creative outcome may then also be defined as “implausible utility.”
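The calculational perspective can be sketched numerically (our illustration; the paper's exact formulas are not reproduced here): measure "learning" as the divergence from prior to posterior beliefs about an idea's utility, and "surprise" as the negative log prior probability of what actually happened. An idea whose success was judged implausible beforehand then produces far more learning than one whose success was expected.

```python
import math

def learning_from_success(prior_success):
    """KL divergence from prior to posterior after observing the idea succeed.

    Observing success collapses beliefs onto "high utility", so the KL
    divergence reduces to -log(prior_success) -- i.e., learning coincides
    with surprise in this two-outcome sketch.
    """
    return -math.log(prior_success)

for p in (0.9, 0.1):
    surprise = -math.log(p)
    print(f"prior P(success)={p}: surprise={surprise:.3f}, "
          f"learning={learning_from_success(p):.3f}")
```

The expected success (prior 0.9) yields little learning, while the implausible success (prior 0.1) yields much more, matching the paper's slogan of creative outcome as "implausible utility".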


2019, Vol. 116 (21), pp. 10323-10328
Author(s): Rossella Argenziano, Itzhak Gilboa

Agents make predictions based on similar past cases, while also learning the relative importance of various attributes in judging similarity. We ask whether the resulting “empirically optimal similarity function” (EOSF) is unique and how easy it is to find it. We show that with many observations and few relevant variables, uniqueness holds. By contrast, when there are many variables relative to observations, nonuniqueness is the rule, and finding the EOSF is computationally hard. The results are interpreted as providing conditions under which rational agents who have access to the same observations are likely to converge on the same predictions and conditions under which they may entertain different probabilistic beliefs.
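A minimal sketch (our illustration, not the paper's construction) of an empirically optimal similarity function: predictions are similarity-weighted averages of past outcomes, with similarity decaying in attribute distances under a weight vector w, and w is chosen to minimize leave-one-out squared error on the database. The data and weight grid below are hypothetical.

```python
import math

def similarity(x, y, w):
    """exp(-weighted L1 distance) between attribute vectors x and y."""
    return math.exp(-sum(wj * abs(a - b) for wj, a, b in zip(w, x, y)))

def predict(x, cases, w):
    """Similarity-weighted average of past outcomes."""
    num = sum(similarity(x, c, w) * y for c, y in cases)
    den = sum(similarity(x, c, w) for c, _ in cases)
    return num / den

def loo_error(cases, w):
    """Leave-one-out squared prediction error of weight vector w."""
    err = 0.0
    for i, (x, y) in enumerate(cases):
        rest = cases[:i] + cases[i + 1:]
        err += (predict(x, rest, w) - y) ** 2
    return err

# Outcome depends only on the first attribute; the second is noise.
cases = [((0, 1), 0.0), ((1, 0), 1.0), ((0, 0), 0.0), ((1, 1), 1.0)]
grid = [(w1, w2) for w1 in (0.0, 1.0, 5.0) for w2 in (0.0, 1.0, 5.0)]
best = min(grid, key=lambda w: loo_error(cases, w))
print(best)  # → (5.0, 0.0): all weight on the relevant attribute
```

With few observations relative to attributes, several weight vectors can fit the data equally well, which is the nonuniqueness the paper identifies; here the data are rich enough along the relevant dimension that a single optimum emerges.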


2019, Vol. 14 (2), pp. 709-778
Author(s): Andrew Mackenzie

We propose two novel axioms for qualitative probability spaces: (i) unlikely atoms, which requires that there is an event containing no atoms that is at least as likely as its complement; and (ii) third‐order atom‐swarming, which requires that for each atom, there is a countable pairwise‐disjoint collection of less‐likely events that can be partitioned into three groups, each with union at least as likely as the given atom. We prove that under monotone continuity, each of these axioms is sufficient to guarantee a unique countably‐additive probability measure representation, generalizing work by Villegas to allow atoms. Unlike previous contributions that allow atoms, we impose no cancellation or solvability axiom.

