Bayesian or biased? Analytic thinking and political belief updating

2019 ◽  
Author(s):  
Ben M Tappin ◽  
Gordon Pennycook ◽  
David Gertler Rand

A surprising finding from U.S. opinion surveys is that political disagreements tend to be greatest among the most cognitively sophisticated opposing partisans. Recent experiments suggest a hypothesis that could explain this pattern: cognitive sophistication magnifies politically biased processing of new information. However, the designs of these experiments tend to contain several limitations that complicate their support for this hypothesis. In particular, they tend to (i) focus on people’s worldviews and political identities, at the expense of their other, more specific prior beliefs, (ii) lack direct comparison with a politically unbiased benchmark, and (iii) focus on people’s judgments of new information, rather than on their posterior beliefs following exposure to the information. We report two studies designed to address these limitations. In our design, U.S. subjects received noisy but informative signals about the truth or falsity of partisan political questions, and we measured their prior and posterior beliefs, and cognitive sophistication, operationalized as analytic thinking inferred via performance on the Cognitive Reflection Test. We compared subjects’ posterior beliefs to an unbiased Bayesian benchmark. We found little evidence that analytic thinking magnified politically biased deviations from the benchmark. In contrast, we found consistent evidence that greater analytic thinking was associated with posterior beliefs closer to the benchmark. Together, these results are inconsistent with the hypothesis that cognitive sophistication magnifies politically biased processing. We discuss differences between our design and prior work that can inform future tests of this hypothesis.
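As a rough illustration of the unbiased Bayesian benchmark described here, the sketch below applies Bayes' rule to a single noisy signal about a statement's truth. The function name, the 75% reliability figure, and the example prior are assumptions chosen for illustration, not values taken from the studies.

```python
# Minimal sketch of an unbiased Bayesian benchmark for belief updating,
# assuming the signal is correct with a known reliability (illustrative only;
# the variable names and reliability value are not taken from the paper).

def bayesian_posterior(prior: float, signal_says_true: bool, reliability: float) -> float:
    """Posterior probability that the statement is true after one noisy signal.

    prior            -- prior probability the statement is true (0..1)
    signal_says_true -- whether the signal asserts the statement is true
    reliability      -- P(signal is correct), e.g. 0.75 for a noisy but informative cue
    """
    # Likelihood of observing this signal if the statement is true / false
    if signal_says_true:
        p_signal_given_true = reliability
        p_signal_given_false = 1 - reliability
    else:
        p_signal_given_true = 1 - reliability
        p_signal_given_false = reliability

    numerator = p_signal_given_true * prior
    evidence = numerator + p_signal_given_false * (1 - prior)
    return numerator / evidence


# Example: a subject with a 0.40 prior sees a 75%-reliable signal saying "true".
# The benchmark posterior is 0.40*0.75 / (0.40*0.75 + 0.60*0.25) = 0.667.
print(round(bayesian_posterior(0.40, True, 0.75), 3))  # 0.667
```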

2021 ◽  
Author(s):  
Tobias Kube ◽  
Lukas Kirchner ◽  
Gunnar Lemmer ◽  
Julia Glombiewski

Aberrant belief updating has been linked to psychopathology, e.g., depressive symptoms. Whereas previous research has typically treated belief-confirming vs. belief-disconfirming information as a binary distinction, the present research varied the extent to which new information deviates from prior beliefs and examined its influence on belief updating. In a false-feedback task (Study 1; N = 379) and a social interaction task (Study 2; N = 292), participants received slightly positive, moderately positive, or extremely positive information relative to their prior beliefs. In both studies, new information was deemed most reliable when it was moderately positive. Yet differences in the positivity of new information had only small effects on belief updating. In Study 1, depressive symptoms were related to difficulties in generalizing positive new learning experiences. Contrary to traditional learning models, the findings suggest that the larger the difference between prior beliefs and new information, the less beliefs are updated.


2022 ◽  
Author(s):  
Tobias Kube

When updating beliefs in light of new information, people preferentially integrate information that is consistent with their prior beliefs and helps them construe a coherent view of the world. Such selective integration of new information likely contributes to belief polarisation and compromises public discourse. It is therefore crucial to understand the factors that underlie biased belief updating. To this end, I conducted three pre-registered experiments covering different controversial political issues (Experiment 1: climate change; Experiment 2: speed limits on highways; Experiment 3: immigration in relation to violent crime). The main hypothesis was that negative reappraisal of new information (referred to as “cognitive immunisation”) hinders belief updating. Support for this hypothesis was found only in Experiment 2. In all experiments, the magnitude of the prediction error (i.e., the discrepancy between prior beliefs and new information) was strongly related to belief updating. Across experiments, participants’ general attitudes regarding the respective issue influenced the strength of their beliefs, but not how much those beliefs were updated. The present findings provide some indication that engaging in cognitive immunisation can lead to the maintenance of beliefs despite disconfirming information; however, the magnitude of the prediction error showed by far the strongest association with belief updating.
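As a rough illustration of the prediction-error account described above, the following sketch applies a simple error-driven updating rule in which the belief change is proportional to the discrepancy between prior belief and new information. The learning rate and the 0-100 belief scale are illustrative assumptions, not parameters estimated in these experiments.

```python
# Minimal sketch of error-driven belief updating, in the spirit of classic
# learning models: the update is proportional to the prediction error
# (new information minus prior belief). The learning rate is illustrative.

def update_belief(prior_belief: float, new_information: float, learning_rate: float = 0.5) -> float:
    """Return the posterior belief after integrating one piece of new information."""
    prediction_error = new_information - prior_belief
    return prior_belief + learning_rate * prediction_error


# Example: a prior belief of 30 (on a 0-100 scale) meeting information of 70
# yields a prediction error of 40 and, with a learning rate of 0.5, a posterior of 50.
print(update_belief(30, 70))  # 50.0
```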


2020 ◽  
pp. 008124632095371
Author(s):  
Casper JJ van Zyl

Thinking dispositions are considered important predictors of analytic thinking. While several thinking dispositions have been found to predict responses on a range of analytic thinking tasks, this field is arguably underdeveloped, and many relevant dispositional variables associated with analytic thinking likely remain to be explored. This study examines one such dispositional variable, namely attitude to ambiguity. This disposition is implied in the literature, given that internal conflict – likely accompanied by ambiguity – is typically experienced in the cognitive tasks used to study thinking and reasoning. In this article, the association between attitude to ambiguity and analytic thinking is examined empirically using Bayesian methods. A total of 313 adults (mean age = 29.31, SD = 12.19) completed the Multidimensional Attitude Toward Ambiguity Scale (MAAS), along with the Cognitive Reflection Test and a syllogism-based measure of belief bias. One component of the MAAS, Moral Absolutism, emerged as a robust predictor of scores on both the Cognitive Reflection Test and the measure of belief bias.
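For readers unfamiliar with the Bayesian approach mentioned here, the sketch below shows one minimal form such an analysis could take: a closed-form posterior over the slope relating a standardized MAAS subscale score to CRT performance, under a normal prior and known noise variance. The simulated data, prior settings, and effect size are assumptions for illustration and do not reproduce the article's actual analysis.

```python
# Minimal sketch of a Bayesian simple linear regression (not the author's analysis):
# posterior over the slope relating a MAAS subscale score to CRT performance,
# assuming standardized variables, known noise variance, and a normal prior on
# the slope. All names and numbers here are illustrative assumptions.

import numpy as np

def posterior_slope(x, y, noise_var=1.0, prior_var=1.0):
    """Closed-form posterior mean and variance of the slope b in y = b*x + noise."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    posterior_precision = 1.0 / prior_var + np.dot(x, x) / noise_var
    posterior_variance = 1.0 / posterior_precision
    posterior_mean = posterior_variance * np.dot(x, y) / noise_var
    return posterior_mean, posterior_variance


# Illustrative use with simulated standardized scores (N = 313, as in the study):
rng = np.random.default_rng(0)
maas_moral_absolutism = rng.standard_normal(313)
crt_score = -0.3 * maas_moral_absolutism + rng.standard_normal(313)
mean, var = posterior_slope(maas_moral_absolutism, crt_score)
print(f"posterior slope ~ N({mean:.2f}, sd={var**0.5:.2f})")
```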


2020 ◽  
Author(s):  
Tobias Kube ◽  
Julia Glombiewski

People update their beliefs selectively: they integrate good news and disregard bad news, a pattern referred to as the optimism bias. Yet the precise cognitive mechanisms underlying this asymmetry in belief updating are largely unknown. In three experiments, we tested the hypothesis that cognitive immunisation against new information (e.g., questioning its reliability) contributes to optimistic belief updating. In each study, participants received new information in relation to their prior beliefs, and we examined the influence of cognitive immunisation on belief updating using a three-group modulation protocol: in one group, cognitive immunisation against new information was promoted; in another group, cognitive immunisation was inhibited; and a control group received no manipulation. This protocol was applied to beliefs about the self, i.e., performance expectations (Experiments 1 and 2; N = 99 and N = 93), and to beliefs about climate change (Experiment 3; N = 227) as an example of factual beliefs. The results of Experiments 1 and 2 showed that the cognitive immunisation manipulation had no influence on the updating of performance-related expectations. In Experiment 3, we did find significant group differences in belief updating, and this effect interacted with participants’ general attitudes towards climate change: people who were sceptical about man-made climate change lowered their estimates of the projected temperature rise particularly if they perceived scientific information on climate change as fraught with uncertainty. These findings suggest that the importance of cognitive immunisation in belief updating may depend on the content of beliefs (self-related vs. factual) and on participants’ attitudes towards the subject in question.


2021 ◽  
Author(s):  
Emily Hird

Computational cognitive theory proposes that our experiences represent an optimisation between new information and prior beliefs, which are updated to best reflect reality. However, experiences can be biased. One example is the placebo response (PR), where a persistent belief in the effectiveness of a treatment relieves symptoms even though the ‘treatment’ is an inert sugar pill. Another example is psychosis, which is characterised by unusual percepts and beliefs in the form of hallucinations and delusions. Antipsychotic medication, the primary treatment for psychosis, is often ineffective and accompanied by severe side effects, yet no effective alternative has been identified. This is likely because the large and heterogeneous placebo response in psychosis creates noise in trials and so disrupts attempts to identify new treatments. This well-recognised issue could be addressed if we could predict how an individual is likely to respond to placebo treatment and account for that response. Importantly, biomarkers predicting the placebo response have been identified chiefly in pain and depression, but not in psychosis. Quantifying individual belief updating, and the tendency to rely on prior beliefs versus new information, would provide a sensitive method for predicting the PR in psychosis.
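One common way to quantify the trade-off between prior beliefs and new information is precision weighting, in which the more reliable source receives more weight. The sketch below illustrates this idea under Gaussian assumptions; it is an illustrative formalisation, not a model specified in the abstract.

```python
# Minimal sketch of precision-weighted belief updating, one common way of
# quantifying reliance on prior beliefs versus new information (an assumption
# here, not a model specified in the abstract). Beliefs and evidence are
# Gaussian; lower variance means higher precision, i.e. more weight.

def precision_weighted_update(prior_mean, prior_var, evidence_mean, evidence_var):
    """Combine a Gaussian prior with Gaussian evidence; return posterior mean and variance."""
    k = prior_var / (prior_var + evidence_var)   # weight given to the new evidence
    posterior_mean = prior_mean + k * (evidence_mean - prior_mean)
    posterior_var = (1 - k) * prior_var
    return posterior_mean, posterior_var


# A strong (precise) prior barely moves toward the evidence ...
print(precision_weighted_update(0.0, prior_var=0.1, evidence_mean=10.0, evidence_var=1.0))
# ... while a weak (imprecise) prior moves most of the way.
print(precision_weighted_update(0.0, prior_var=10.0, evidence_mean=10.0, evidence_var=1.0))
```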


2016 ◽  
Vol 7 (2) ◽  
pp. e97-103 ◽  
Author(s):  
Shu Wen Tay ◽  
Paul Macdara Ryan ◽  
C Anthony Ryan

Background: Diagnostic decision-making involves a combination of System 1 (intuitive, pattern-recognition) and System 2 (analytic) thinking. The purpose of this study was to use the Cognitive Reflection Test (CRT) to evaluate and compare the levels of System 1 and System 2 thinking among medical students in pre-clinical and clinical programs. Methods: The CRT is a three-question test designed to measure the ability of respondents to activate metacognitive processes and switch to System 2 (analytic) thinking where System 1 (intuitive) thinking would lead them astray. Each CRT question has a correct analytical (System 2) answer and an incorrect intuitive (System 1) answer. A group of medical students in Years 2 and 3 (pre-clinical) and Year 4 (in clinical practice) of a 5-year medical degree were studied. Results: Ten percent (13/128) of students gave the intuitive answers to all three questions (suggesting they generally relied on System 1 thinking), while almost half (44%) answered all three correctly (indicating full analytical, System 2 thinking). Only 3-13% gave incorrect answers that were neither the analytical nor the intuitive responses. Non-native English-speaking students (n = 11) had a lower mean number of correct answers than native English speakers (n = 117; 1.0 vs. 2.12, respectively; p < 0.01). As students progressed through questions 1 to 3, the percentage of correct System 2 answers increased and the percentage of intuitive answers decreased in both the pre-clinical and clinical students. Conclusions: Up to half of the medical students demonstrated full or partial reliance on System 1 (intuitive) thinking in response to these analytical questions. While their CRT performance says nothing about their future expertise as clinicians, the test may help students understand the importance of awareness and regulation of their thinking processes in clinical practice.
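The scoring logic described here (analytic vs. intuitive vs. other responses) can be made concrete with a short sketch using the answers to the three original CRT items; the item labels, and any correspondence to the exact items administered in this study, are assumptions.

```python
# Minimal sketch of CRT scoring into analytic (System 2), intuitive (System 1),
# or other responses, using the answers to the original three CRT items
# (the study's exact item wording and scoring rules are assumptions here).

CRT_KEY = {
    "bat_and_ball": {"analytic": 5,  "intuitive": 10},   # answer in cents
    "widgets":      {"analytic": 5,  "intuitive": 100},  # answer in minutes
    "lily_pads":    {"analytic": 47, "intuitive": 24},   # answer in days
}

def score_crt(responses: dict) -> dict:
    """Count analytic, intuitive, and other responses for one participant."""
    counts = {"analytic": 0, "intuitive": 0, "other": 0}
    for item, answer in responses.items():
        key = CRT_KEY[item]
        if answer == key["analytic"]:
            counts["analytic"] += 1
        elif answer == key["intuitive"]:
            counts["intuitive"] += 1
        else:
            counts["other"] += 1
    return counts


# A participant who gives the intuitive lure on every item:
print(score_crt({"bat_and_ball": 10, "widgets": 100, "lily_pads": 24}))
# {'analytic': 0, 'intuitive': 3, 'other': 0}
```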


2021 ◽  
Author(s):  
Aaron Erlich ◽  
Calvin Garner ◽  
Gordon Pennycook ◽  
David Gertler Rand

Ukraine has been the target of a long-running Russian disinformation campaign. We investigate susceptibility to this pro-Kremlin disinformation from a cognitive science perspective. Is greater analytic thinking associated with less belief in disinformation, as per classical theories of reasoning? Or does analytic thinking amplify motivated reasoning, such that analytic thinking is associated with more polarized beliefs (and thus more belief in pro-Kremlin disinformation among pro-Russia Ukrainians)? In online (N=1,974) and face-to-face representative (N=9,474) samples of Ukrainians, we find support for the classical reasoning account. Analytic thinking, as measured using the Cognitive Reflection Test, was associated with greater ability to discern truth from disinformation – even for Ukrainians who are strongly oriented towards Russia. We found similar, albeit somewhat weaker, results when operationalizing analytic thinking using the self-report Active Open-minded Thinking scale. These results demonstrate a similar pattern to prior work using American participants. Thus, the positive association between analytic thinking and the ability to discern truth versus falsehood generalizes to the qualitatively different information environment of post-communist Ukraine. Despite low trust in government and media, weak journalistic standards, and years of exposure to Russian disinformation, Ukrainians who engage in more analytic thinking are better able to tell truth from falsehood.
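Truth discernment of the kind reported here is commonly computed as belief in true items minus belief in false (disinformation) items; the sketch below illustrates that calculation. Whether it matches the authors' exact operationalisation is an assumption.

```python
# Minimal sketch of a truth-discernment score: mean belief in true headlines
# minus mean belief in false (disinformation) headlines. Whether this matches
# the authors' exact operationalisation is an assumption.

from statistics import mean

def discernment(belief_in_true, belief_in_false):
    """Higher scores indicate better ability to tell truth from falsehood."""
    return mean(belief_in_true) - mean(belief_in_false)


# Beliefs rated on a 1-4 scale for illustrative true and false items:
print(discernment(belief_in_true=[3, 4, 3, 4], belief_in_false=[2, 1, 2, 1]))  # 2.0
```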


2021 ◽  
Author(s):  
Christie Newton ◽  
Justin Feeney ◽  
Gordon Pennycook

A common claim is that people vary not just in what they think, but in how they think. Indeed, a large number of scales have been developed to ostensibly measure thinking styles. These measures share substantial conceptual overlap and, in particular, most purport to index some aspect of the disposition to think more analytically and effortfully rather than relying on intuitions and gut feelings. To address this redundancy, we gave a sample of 774 participants a subset of 90 items from 15 scales and narrowed the list down to 50 by isolating items that were meaningfully correlated with the Cognitive Reflection Test, a behavioral measure of individual differences in analytic thinking. Then, across six studies with 1,149 participants, we systematically narrowed down the items and tested the underlying factor structure. A four-factor correlated structure fit best: Actively Open-minded Thinking, Close-Minded Thinking, Preference for Intuitive Thinking, and Preference for Effortful Thinking. Predictive validity for the resulting 24-item (6 items per subscale) Comprehensive Thinking Style Questionnaire (CTSQ) was established using a set of cognitive ability measures as well as several outcome measures (e.g., epistemically suspect beliefs, bullshit receptivity, empathy, and moral judgments), with some subscales showing stronger predictive validity for some outcomes than for others. The CTSQ helps alleviate the jangle fallacy in thinking-styles research and allows for the assessment of separate aspects of thinking styles in a single comprehensive measure.
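Scoring an instrument like the 24-item CTSQ typically amounts to averaging the six items of each subscale; the sketch below illustrates this, with the caveat that the item-to-subscale assignment shown is invented for illustration and is not the published scoring key.

```python
# Minimal sketch of scoring the 24-item CTSQ into its four 6-item subscales
# by averaging item responses. The item-to-subscale assignment used here is
# purely illustrative, not the published key.

SUBSCALES = {
    "actively_open_minded":     range(0, 6),
    "close_minded":             range(6, 12),
    "preference_for_intuition": range(12, 18),
    "preference_for_effort":    range(18, 24),
}

def score_ctsq(item_responses):
    """item_responses: list of 24 Likert ratings; returns the mean score per subscale."""
    assert len(item_responses) == 24, "expected 24 CTSQ items"
    return {
        name: sum(item_responses[i] for i in idx) / len(idx)
        for name, idx in SUBSCALES.items()
    }


# Example with a flat response pattern of 4s except stronger effortful-thinking items:
responses = [4] * 18 + [6] * 6
print(score_ctsq(responses))
```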


2018 ◽  
Vol 39 (2) ◽  
pp. 99-106 ◽  
Author(s):  
Michał Białek ◽  
Przemysław Sawicki

Abstract. In this work, we investigated individual differences in the effects of cognitive reflection on delay discounting – the preference for a smaller, sooner payoff over a larger, later one. People are claimed to prefer the alternatives they consider first – the so-called reference point – over the alternatives they consider later. Cognitive reflection affects the way individuals process information: less reflective individuals rely predominantly on the first information they consider and are thus more susceptible to reference points than more reflective individuals. In Experiment 1, we confirmed that individuals who scored high on the Cognitive Reflection Test discount less steeply than less reflective individuals, and we also showed that such individuals are less susceptible to imposed reference points. Experiment 2 replicated these findings, additionally providing evidence that cognitive reflection predicts discounting strength and (in)dependence from reference points over and above individual differences in numeracy.
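Delay discounting of the sort measured here is commonly modelled with a hyperbolic function, V = A / (1 + kD); the sketch below illustrates how a payoff's subjective value falls with delay, using illustrative k values rather than estimates from this paper.

```python
# Minimal sketch of hyperbolic delay discounting, V = A / (1 + k * D),
# a standard model of how a delayed payoff loses subjective value.
# The k values below are illustrative, not estimates from the paper.

def discounted_value(amount: float, delay_days: float, k: float) -> float:
    """Subjective present value of `amount` received after `delay_days`."""
    return amount / (1 + k * delay_days)


# A weak discounter (low k, e.g. a highly reflective individual) still values
# a delayed $100 highly; a steep discounter (high k) does not.
print(round(discounted_value(100, delay_days=30, k=0.01), 1))  # 76.9
print(round(discounted_value(100, delay_days=30, k=0.10), 1))  # 25.0
```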

