Money Illusion: Reconsidered in the Light of Cognitive Science

2019 ◽  
Vol 69 (2) ◽  
pp. 191-215
Author(s):  
János Vincze

A basic principle of economics is that people always prefer a larger set of opportunities. Money illusion can be understood as the phenomenon whereby people fail to perceive their budget constraints correctly and may act in ways that run counter to this preference. In this interpretation, money illusion is a cognitive bias worth overcoming. Herein I argue that, taking a view of human decision-making based on certain strands of cognitive psychology, one can reinterpret the evidence for money illusion in two ways. First, I claim that money illusion is to some extent inescapable, and that saying we suffer from it is similar to alleging that we experience optical illusions only because we are unable to see, say, individual atoms. Second, adopting a view of “preferences” different from the traditional one, I contend that there may be little benefit in ridding ourselves of money illusion even in the cases where it is possible to do so. To pursue the visual analogy: even if we can improve our eyesight, doing so is not obviously desirable. These arguments seem to lead to a Candidean disposition: there is no possible improvement on the state of affairs as far as “money illusion” is concerned. Nonetheless, I make some positive proposals concerning economic policy and economics research.
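A hypothetical numerical sketch (not taken from the paper) shows how a mis-perceived budget constraint can arise: under money illusion, a nominal raise can feel preferable even when it is the larger loss in real purchasing power. The function `real_change` below is illustrative only.

```python
def real_change(nominal_change, inflation):
    """Real (inflation-adjusted) change: (1 + nominal) / (1 + inflation) - 1."""
    return (1 + nominal_change) / (1 + inflation) - 1

raise_a = real_change(0.02, 0.04)   # a 2% raise under 4% inflation
cut_b = real_change(-0.01, 0.00)    # a 1% pay cut under 0% inflation
# Under money illusion the raise "feels" better, yet it is the larger real loss:
print(raise_a < cut_b)  # -> True
```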

1974 ◽  
Vol 125 (584) ◽  
pp. 60-64 ◽  
Author(s):  
Clive Payne ◽  
Sarah McCabe ◽  
Nigel Walker

In 1963–4 the Oxford University Penal Research Unit managed to collect information about 90 per cent of the offender-patients who were admitted to N.H.S. and Special Hospitals under hospital orders made by criminal courts: this cohort has been described in Crime and Insanity, Vol. II, by Walker and McCabe (1973). One of the by-products of the follow-up of these offender-patients was a rough-and-ready scoring system for predicting reconvictions (within a two-year follow-up) of 456 offender-patients who were allowed to leave hospital within a year of admission. A prediction system can be used to assist human decision-making (though it should not be a substitute for it) and can also be used to assess the efficacy of measures—such as after-care—by seeing whether individuals with equal predicted reconviction-rates do better with than without the measure (Mannheim and Wilkins, 1955).
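As a hypothetical illustration of what a rough-and-ready additive scoring system might look like (the factors and weights below are invented for this sketch, not the study's actual items):

```python
def reconviction_score(previous_convictions, age_at_admission, property_offence):
    """Return a crude points total; higher totals imply higher predicted risk.
    Factors and weights are illustrative, not the Oxford study's actual items."""
    score = 0
    score += min(previous_convictions, 5)   # cap the contribution of prior convictions
    if age_at_admission < 25:
        score += 2                          # youth adds risk points
    if property_offence:
        score += 1                          # offence type adds a point
    return score

print(reconviction_score(3, 22, True))  # -> 6
```

Individuals with equal scores can then be compared with and without a measure such as after-care to assess its efficacy, as the abstract describes.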


2020 ◽  
Vol 2 (4) ◽  
Author(s):  
Bradley J Langford ◽  
Nick Daneman ◽  
Valerie Leung ◽  
Dale J Langford

Abstract The way clinicians think about decision-making is evolving. Human decision-making shifts between two modes of thinking: fast/intuitive (Type 1) and slow/deliberate (Type 2). In the healthcare setting, where thousands of decisions are made daily, Type 1 thinking can reduce cognitive load and help ensure decision-making is efficient and timely, but it can come at the expense of accuracy, leading to systematic errors, also called cognitive biases. This review provides an introduction to cognitive bias and, through patient vignettes, explains how cognitive biases contribute to suboptimal antibiotic prescribing. We describe common cognitive biases in antibiotic prescribing from both the clinician and patient perspectives, including hyperbolic discounting (the tendency to favour small immediate benefits over larger, more distant benefits) and commission bias (the tendency towards action over inaction). Management of cognitive bias includes encouraging more mindful decision-making (e.g., time-outs, checklists), improving awareness of one’s own biases (i.e., meta-cognition), and designing an environment that facilitates safe and accurate decision-making (e.g., decision support tools, nudges). A basic understanding of cognitive biases can help explain why certain stewardship interventions are more effective than others and may inspire more creative strategies to ensure antibiotics are used more safely and effectively in our patients.
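Hyperbolic discounting is commonly modelled as V = A / (1 + kD), where A is the benefit's magnitude, D its delay, and k an individual discounting rate. A minimal sketch (illustrative parameter values, not drawn from the review):

```python
def discounted_value(amount, delay, k=0.1):
    """Subjective present value of a benefit `amount` received after `delay`,
    under the standard hyperbolic model V = A / (1 + k * D)."""
    return amount / (1 + k * delay)

now = discounted_value(amount=10, delay=0)     # small immediate benefit: 10.0
later = discounted_value(amount=30, delay=30)  # larger distant benefit: 7.5
# The small immediate benefit subjectively outweighs the larger distant one:
print(now > later)  # -> True
```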


2019 ◽  
Vol 49 (2) ◽  
pp. 165-176 ◽  
Author(s):  
W Neil Gowensmith ◽  
Kate E McCallum

When psychological evaluators are asked to provide their expert opinions in legal proceedings, they are expected to do so in an objective and unbiased way. The statutory requirements regarding the admissibility of expert testimony in many countries often cite objectivity and reliability as standards. However, as is true in many realms of human decision-making, the field of forensic psychological assessment is fraught with bias. In this article, we discuss several lines of research that have investigated bias in forensic psychological evaluations. We also discuss emerging lines of research involving methods to measure and reduce bias. We conclude with a call for structured self-monitoring as an important strategy for forensic evaluators to mitigate bias in their work.


2020 ◽  
Vol 2 (4) ◽  
pp. 382-389
Author(s):  
Vilert A Loving ◽  
Elizabeth M Valencia ◽  
Bhavika Patel ◽  
Brian S Johnston

Abstract Cognitive bias is an unavoidable aspect of human decision-making. In breast radiology, these biases contribute to missed or erroneous diagnoses and mistaken judgments. This article introduces breast radiologists to eight cognitive biases commonly encountered in breast radiology: anchoring, availability, commission, confirmation, gambler’s fallacy, omission, satisfaction of search, and outcome. In addition to illustrative cases, this article offers suggestions for radiologists to better recognize and counteract these biases at the individual level and at the organizational level.


2005 ◽  
Vol 07 (04) ◽  
pp. 619-650 ◽  
Author(s):  
TILMANN RAVE

While the reform of environmentally harmful subsidies has often been identified as a potential means to realise environmental, economic and fiscal benefits simultaneously, little guidance is available on designing possible paths for subsidy reform. This paper aims to better conceptualise a reform process for Germany. It argues that there is room for designing a broader framework for reform that moves beyond isolated and sometimes inefficient steps towards an environmentally oriented subsidy reform. To do so, the broader policy context is described, characteristics and underlying problem structures are identified, and obstacles to policy reform are mentioned. As a result, a number of critical requirements for a potentially successful reform process can be formulated. Using available impact analyses as a "tool box", we draw on experiences with Strategic Environmental Assessment (SEA) as a useful and sufficiently flexible organisational and procedural framework for subsidy reform. Based on SEA concepts, the paper treats various important linkages, steps and actor constellations that the reform process is likely to encounter. Finally, the critical link between assessment and decision-making is addressed and some suggestions for a follow-up to the assessment process are made.


2020 ◽  
Author(s):  
Thomas F. Frotvedt ◽  
Øystein Bondevik ◽  
Vanessa T. Seeligmann ◽  
Bjørn Sætrevik

Some heuristics and biases are assumed to be universal in human decision-making, and may thus be expected to appear consistently and to need consideration when planning for real-life decision-making. Yet results are mixed when exploring these biases in applied settings, and few studies have attempted to robustly measure the combined impact of various biases during a decision-making process. We performed three pre-registered classroom experiments in which trained medical students read case descriptions and explored follow-up information in order to reach and adjust mental health diagnoses (∑N = 224). We tested whether the order of presenting the symptoms led to a primacy effect, whether there was a congruence bias in selecting follow-up questions, and whether confidence increased during the decision process. Our results showed increased confidence for participants who did not change their decision or who sought disconfirming information. There was some indication of a weak congruence bias in selecting follow-up questions. There was no indication of a stronger congruence bias when confidence was high, and no support for a primacy effect of the order of symptom presentation. We conclude that these biases are difficult to demonstrate in pre-registered analyses of complex decision-making processes in applied settings.


2017 ◽  
Author(s):  
J. E. Korteling ◽  
Anne-Marie Brouwer ◽  
Alexander Toet

Human decision-making shows systematic simplifications and deviations from the tenets of rationality (‘heuristics’) that may lead to suboptimal decisional outcomes (‘cognitive biases’). There are currently three prevailing theoretical perspectives on the origin of heuristics and cognitive biases: a cognitive-psychological, an ecological and an evolutionary perspective. However, these perspectives are mainly descriptive, and none of them provides an overall explanatory framework for the underlying mechanisms of cognitive biases. To enhance our understanding of cognitive heuristics and biases, we propose a neural network framework for cognitive biases, which explains why our brain systematically tends to default to heuristic (‘Type 1’) decision making. We argue that many cognitive biases arise from intrinsic brain mechanisms that are fundamental to the working of biological neural networks. To substantiate our viewpoint, we discern and explain four basic neural network principles: (1) Association, (2) Compatibility, (3) Retainment, and (4) Focus. These principles are inherent to (all) neural networks, which were originally optimized to perform concrete biological, perceptual, and motor functions. They form the basis for our inclinations to associate and combine (unrelated) information, to prioritize information that is compatible with our present state (such as knowledge, opinions and expectations), to retain given information that sometimes could better be ignored, and to focus on dominant information while ignoring relevant information that is not directly activated. The supposed mechanisms are complementary, not mutually exclusive, and for different cognitive biases they may all contribute in varying degrees to the distortion of information. The present viewpoint not only complements the three earlier viewpoints, but also provides a unifying and binding framework for many cognitive bias phenomena.
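A minimal, hypothetical sketch of the Association principle: a Hebbian learning rule strengthens connections between units that are active together, so co-occurring (even unrelated) inputs become linked. The code below illustrates the general mechanism only; it is not the authors' model.

```python
def hebbian_update(weights, activity, lr=0.1):
    """Hebb's outer-product rule: strengthen weights between co-active units."""
    n = len(activity)
    return [[weights[i][j] + lr * activity[i] * activity[j] for j in range(n)]
            for i in range(n)]

w = [[0.0] * 3 for _ in range(3)]
pattern = [1.0, 1.0, 0.0]  # units 0 and 1 fire together; unit 2 stays silent
for _ in range(5):
    w = hebbian_update(w, pattern)

# Repeated co-activation has associated units 0 and 1, while connections
# involving the silent unit remain at zero:
print(round(w[0][1], 2))
```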


2011 ◽  
Vol 20 (4) ◽  
pp. 121-123
Author(s):  
Jeri A. Logemann

Evidence-based practice requires astute clinicians to blend their best clinical judgment with the best available external evidence and the patient's own values and expectations. Sometimes we value one of these more than another during clinical decision-making, though it is never wise to do so, and sometimes other factors that we are unaware of produce unanticipated clinical outcomes. Sometimes we feel very strongly about one clinical method or another, and hopefully that belief is founded in evidence. Some beliefs, however, are not founded in evidence. The sound use of evidence is the best way to navigate the debates within our field of practice.


2013 ◽  
Author(s):  
Scott D. Brown ◽  
Pete Cassey ◽  
Andrew Heathcote ◽  
Roger Ratcliff
