The automatic nature of motivated belief updating

2018
Vol. 3(1), pp. 87-103
Author(s):  
Andreas Kappes
Tali Sharot

Abstract: People's risk estimates often do not align with the evidence available to them. In particular, people tend to discount bad news (such as evidence suggesting their risk of being involved in a car accident is higher than they thought) compared to good news (evidence suggesting it is lower) – this is known as the belief update bias. It has been assumed that individuals use motivated reasoning to rationalise away unwanted evidence (e.g., “I am a safe driver, thus these statistics do not apply to me”). However, whether reasoning is required to discount bad news has not been tested directly. Here, we restrict cognitive resources using a cognitive load manipulation (Experiment 1) and a time restriction manipulation (Experiment 3) and find that while these manipulations diminish learning in general, they do not diminish the bias. Furthermore, we show that the relative neglect of bad news happens the moment new evidence is presented, not when participants are subsequently prompted to state their belief (Experiment 2). Our findings suggest that reasoning is not required for bad news to be discounted compared to good news.
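To make the asymmetry concrete, below is a minimal illustrative sketch in Python of how such an update bias might be quantified from participants' first and second risk estimates. The trial format, variable names, and scoring scheme are assumptions for illustration, not the authors' actual analysis.

# Illustrative sketch only: a simplified way to quantify the belief update bias
# described above. The trial format and averaging scheme are assumptions,
# not the authors' analysis pipeline.

def update_bias(trials):
    """Mean belief update after good news minus mean update after bad news.

    Each trial is (first_estimate, base_rate, second_estimate), all in percent.
    Good news: the presented base rate is lower than the first estimate;
    bad news: the presented base rate is higher.
    """
    good_updates, bad_updates = [], []
    for first, base_rate, second in trials:
        update = abs(second - first)      # how far the belief moved
        if base_rate < first:             # good news trial
            good_updates.append(update)
        elif base_rate > first:           # bad news trial
            bad_updates.append(update)
    mean = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return mean(good_updates) - mean(bad_updates)

# Example: larger updates after good news than after bad news -> positive bias.
trials = [
    (40, 30, 32),   # good news: estimate drops by 8 points
    (40, 50, 42),   # bad news: estimate rises by only 2 points
]
print(update_bias(trials))  # 6.0

A positive score indicates the pattern described in the abstract: beliefs move further toward good news than toward bad news.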

2018
Author(s):
Cass R. Sunstein
Sebastian Bobadilla-Suarez
Stephanie C. Lazzaro
Tali Sharot

102 Cornell L. Rev. 1431 (2017)

People are frequently exposed to competing evidence about climate change. We examined how new information alters people’s beliefs. We find that people who are not sure that man-made climate change is occurring, and who do not favor an international agreement to reduce greenhouse gas emissions, show a form of asymmetrical updating: they change their beliefs in response to unexpected good news (suggesting that average temperature rise is likely to be less than previously thought) and fail to change their beliefs in response to unexpected bad news (suggesting that average temperature rise is likely to be greater than previously thought). By contrast, people who strongly believe that man-made climate change is occurring, and who favor an international agreement, show the opposite asymmetry: they change their beliefs far more in response to unexpected bad news (suggesting that average temperature rise is likely to be greater than previously thought) than in response to unexpected good news (suggesting that average temperature rise is likely to be smaller than previously thought). The results suggest that exposure to varied scientific evidence about climate change may increase polarization within a population due to asymmetrical updating. We explore the implications of our findings for how people will update their beliefs upon receiving new evidence about climate change, and also for other beliefs relevant to politics and law.
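To illustrate how asymmetrical updating of this kind can widen disagreement, here is a minimal Python sketch under simple, assumed parameters; the learning rates, starting beliefs, and evidence stream are invented for illustration and are not the paper's model or data.

# Illustrative sketch only: two agents with opposite update asymmetries drift
# further apart when shown the same mixed stream of evidence. All numbers here
# are arbitrary assumptions, not estimates from the study.

def run(belief, lr_up, lr_down, evidence):
    """Nudge a projected temperature rise (deg C) toward each piece of evidence,
    using different learning rates for upward vs. downward revisions."""
    for signal in evidence:
        error = signal - belief
        lr = lr_up if error > 0 else lr_down
        belief += lr * error
    return belief

evidence = [2.0, 4.0] * 10  # alternating "good" and "bad" news about warming

# A skeptic updates more readily toward good news; a believer toward bad news.
skeptic = run(belief=2.5, lr_up=0.1, lr_down=0.5, evidence=evidence)
believer = run(belief=3.5, lr_up=0.5, lr_down=0.1, evidence=evidence)

print(round(skeptic, 2), round(believer, 2))  # the gap widens despite shared evidence

Under these assumed numbers, the two beliefs end up further apart than they started even though both agents saw identical evidence, mirroring the polarization mechanism described in the abstract.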


Author(s):
Clara Xiaoling Chen
Ryan Hudgins
William F. Wright

We use an experiment to examine how advice valence (i.e., whether the advice suggests good news or bad news) affects the perceived source credibility of data analytics relative to human experts, as a result of motivated reasoning. We predict that individuals will perceive data analytics as less credible than human experts, but only when the advice suggests bad news. Using a forecasting task in which individuals seek advice from either a human expert or data analytics, we find evidence consistent with our prediction. Furthermore, we find that this effect is mediated by the perceived competence of the advice source. We contribute to the nascent accounting literature on data analytics by providing evidence on a potential impediment to successfully transitioning to the use of analytics for decision-making in organizations.


2011
Author(s):
Angela Legg
Kate Sweeny
Keyword(s):
Bad News
