A confirmation bias in perceptual decision-making due to hierarchical approximate inference

2021 ◽  
Vol 17 (11) ◽  
pp. e1009517
Author(s):  
Richard D. Lange ◽  
Ankani Chattoraj ◽  
Jeffrey M. Beck ◽  
Jacob L. Yates ◽  
Ralf M. Haefner

Making good decisions requires updating beliefs according to new evidence. This is a dynamical process that is prone to biases: in some cases, beliefs become entrenched and resistant to new evidence (leading to primacy effects), while in other cases, beliefs fade over time and rely primarily on later evidence (leading to recency effects). How and why either type of bias dominates in a given context is an important open question. Here, we study this question in classic perceptual decision-making tasks, where, puzzlingly, previous empirical studies differ in the kinds of biases they observe, ranging from primacy to recency, despite seemingly equivalent tasks. We present a new model, based on hierarchical approximate inference and derived from normative principles, that not only explains both primacy and recency effects in existing studies, but also predicts how the type of bias should depend on the statistics of stimuli in a given task. We verify this prediction in a novel visual discrimination task with human observers, finding that each observer’s temporal bias changed as the result of changing the key stimulus statistics identified by our model. The key dynamic that leads to a primacy bias in our model is an overweighting of new sensory information that agrees with the observer’s existing belief—a type of ‘confirmation bias’. By fitting an extended drift-diffusion model to our data we rule out an alternative explanation for primacy effects due to bounded integration. Taken together, our results resolve a major discrepancy among existing perceptual decision-making studies, and suggest that a key source of bias in human decision-making is approximate hierarchical inference.
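
The key dynamic described in this abstract can be caricatured in a few lines of code. The sketch below is not the authors' hierarchical sampling model; it is a minimal toy accumulator in which samples agreeing with the sign of the current belief are up-weighted (a confirmation bias), with a leaky variant for comparison, and the temporal weighting is read out with a reverse-correlation-style kernel. All parameter values are illustrative.

```python
# Toy accumulator, not the authors' hierarchical-inference model: evidence that
# agrees with the sign of the current belief is up-weighted ("confirmation
# bias"), producing primacy; a leak instead produces recency.
import numpy as np

rng = np.random.default_rng(0)

def accumulate(evidence, gain=0.5, leak=0.0):
    """Accumulate evidence with confirmatory up-weighting and an optional leak."""
    belief, trace = 0.0, []
    for e in evidence:
        w = 1.0 + gain * float(np.sign(e) == np.sign(belief))  # confirmatory gain
        belief = (1.0 - leak) * belief + w * e
        trace.append(belief)
    return np.array(trace)

# Temporal weighting ("psychophysical kernel"): correlate each frame's evidence
# with the final choice across trials.
n_trials, n_frames = 5000, 10
evidence = rng.normal(0.0, 1.0, size=(n_trials, n_frames))
for label, kwargs in [("confirmation bias", dict(gain=0.5, leak=0.0)),
                      ("leaky integration", dict(gain=0.0, leak=0.3))]:
    choices = np.array([np.sign(accumulate(e, **kwargs)[-1]) for e in evidence])
    kernel = [round(np.corrcoef(evidence[:, t], choices)[0, 1], 3)
              for t in range(n_frames)]
    print(label, kernel)  # decreasing kernel = primacy, increasing = recency
```

With the confirmatory gain, early frames set the belief's sign and are then amplified, so the kernel falls over frames (primacy); the leaky variant weights late frames most heavily (recency).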

Sensors ◽  
2021 ◽  
Vol 21 (7) ◽  
pp. 2461
Author(s):  
Alexander Kuc ◽  
Vadim V. Grubov ◽  
Vladimir A. Maksimenko ◽  
Natalia Shusharina ◽  
Alexander N. Pisarchik ◽  
...  

Perceptual decision-making requires transforming sensory information into decisions. Ambiguity of the sensory input affects perceptual decisions, inducing specific time-frequency patterns in EEG (electroencephalogram) signals. This paper uses a wavelet-based method to analyze how ambiguity affects EEG features during a perceptual decision-making task. We observe that parietal and temporal beta-band wavelet power monotonically increases throughout the perceptual process. Ambiguity induces high frontal beta-band power at 0.3–0.6 s post-stimulus onset, which may reflect increasing reliance on top-down mechanisms that facilitate the accumulation of decision-relevant sensory features. Finally, this study analyzes the perceptual process using a mixed within-trial and within-subject design: we first identified significant percept-related changes in each subject and then tested their significance at the group level. Thus, the observed beta-band biomarkers are pronounced in single EEG trials and may serve as control commands for a brain-computer interface (BCI).
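
As an illustration of the kind of analysis described, here is a minimal Morlet-wavelet power computation on a synthetic single trial; the sampling rate, wavelet parameters, and injected beta burst are assumptions for the example, not details of the authors' pipeline.

```python
# Minimal sketch (not the authors' pipeline): estimate beta-band (15-30 Hz)
# power of one synthetic EEG trial with a complex Morlet wavelet and compare
# pre-stimulus versus 0.3-0.6 s post-stimulus power.
import numpy as np

fs = 250.0                                   # sampling rate (Hz), assumed
t = np.arange(-0.5, 1.0, 1.0 / fs)           # one trial, stimulus onset at t = 0
rng = np.random.default_rng(1)
eeg = rng.normal(0, 1, t.size)               # placeholder signal instead of real EEG
eeg[t > 0.3] += 0.8 * np.sin(2 * np.pi * 22 * t[t > 0.3])   # injected beta burst

def morlet_power(x, freq, fs, n_cycles=7):
    """Power time course at `freq` via convolution with a complex Morlet wavelet."""
    sigma_t = n_cycles / (2 * np.pi * freq)                   # temporal width
    wt = np.arange(-4 * sigma_t, 4 * sigma_t, 1.0 / fs)
    wavelet = np.exp(2j * np.pi * freq * wt) * np.exp(-wt**2 / (2 * sigma_t**2))
    wavelet /= np.sqrt(np.sum(np.abs(wavelet) ** 2))          # energy-normalize
    return np.abs(np.convolve(x, wavelet, mode="same")) ** 2

beta_power = np.mean([morlet_power(eeg, f, fs) for f in np.arange(15, 31, 3)], axis=0)
print("pre-stimulus beta power :", beta_power[t < 0].mean().round(3))
print("0.3-0.6 s beta power    :", beta_power[(t > 0.3) & (t < 0.6)].mean().round(3))
```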


2016 ◽  
Vol 115 (2) ◽  
pp. 915-930 ◽  
Author(s):  
Matthew A. Carland ◽  
Encarni Marcos ◽  
David Thura ◽  
Paul Cisek

Perceptual decision making is often modeled as perfect integration of sequential sensory samples until the accumulated total reaches a fixed decision bound. In that view, the buildup of neural activity during perceptual decision making is attributed to temporal integration. However, an alternative explanation is that sensory estimates are computed quickly with a low-pass filter and combined with a growing signal reflecting the urgency to respond, and that it is the latter that is primarily responsible for the buildup of neural activity. These models are difficult to distinguish empirically because they make similar predictions for tasks in which sensory information is constant within a trial, as in most previous studies. Here, we presented subjects with a variant of the classic constant-coherence motion discrimination (CMD) task in which we inserted brief motion pulses. We examined the effect of these pulses on reaction times (RTs) in two conditions: 1) when the CMD trials were blocked and subjects responded quickly and 2) when the same CMD trials were interleaved among trials of a variable-motion coherence task that motivated slower decisions. In the blocked condition, early pulses had a strong effect on RTs but late pulses did not, consistent with both models. However, when subjects slowed their decision policy in the interleaved condition, later pulses now became effective while early pulses lost their efficacy. This last result contradicts models based on perfect integration of sensory evidence and implies that motion signals are processed with a strong leak, equivalent to a low-pass filter with a short time constant.
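
The two model classes being contrasted can be sketched as follows. This is an illustrative simulation under assumed parameters (time constant, urgency slope, pulse timing), not the authors' fitted models: with perfect integration the effect of a brief motion pulse persists to the end of the trial, whereas a low-pass filter with a short time constant forgets it unless the decision is made soon after the pulse.

```python
# Illustrative contrast between perfect integration of motion evidence and a
# leaky (low-pass-filtered) evidence estimate multiplied by a growing urgency
# signal. Parameter values are assumptions for the example.
import numpy as np

dt, T = 0.01, 2.0
t = np.arange(0, T, dt)
coherence = np.full(t.size, 0.2)             # constant-coherence motion
pulse = np.zeros(t.size)
pulse[(t > 1.2) & (t < 1.3)] = 0.4           # a brief "late" motion pulse

def perfect_integrator(evidence):
    return np.cumsum(evidence) * dt           # decision variable = running total

def urgency_gating(evidence, tau=0.1):
    low_pass = np.zeros(evidence.size)        # leaky estimate, short time constant
    for i in range(1, evidence.size):
        low_pass[i] = low_pass[i - 1] + dt / tau * (evidence[i] - low_pass[i - 1])
    urgency = t                               # linearly growing urgency signal
    return low_pass * urgency

for name, model in [("perfect integrator", perfect_integrator),
                    ("urgency gating", urgency_gating)]:
    dv_base = model(coherence)
    dv_pulse = model(coherence + pulse)
    # How much of the pulse survives in the decision variable at the end of the trial:
    print(name, "late-pulse effect at t = 2 s:", round(dv_pulse[-1] - dv_base[-1], 3))
```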


2020 ◽  
Vol 30 (10) ◽  
pp. 5471-5483
Author(s):  
Y Yau ◽  
M Dadar ◽  
M Taylor ◽  
Y Zeighami ◽  
L K Fellows ◽  
...  

Current models of decision-making assume that the brain gradually accumulates evidence and drifts toward a threshold that, once crossed, results in a choice selection. These models have been especially successful in primate research; however, transposing them to human fMRI paradigms has proved challenging. Here, we exploit the face-selective visual system and test whether decoded emotional facial features from multivariate fMRI signals during a dynamic perceptual decision-making task are related to the parameters of computational models of decision-making. We show that trial-by-trial variations in the pattern of neural activity in the fusiform gyrus reflect facial emotional information and modulate drift rates during deliberation. We also observed an inverse-urgency signal based in the caudate nucleus that was independent of sensory information but appeared to slow decisions, particularly when information in the task was ambiguous. Taken together, our results characterize how decision parameters from a computational model (i.e., drift rate and urgency signal) are involved in perceptual decision-making and reflected in the activity of the human brain.
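
A minimal simulation of the scheme described, under assumptions of my own (parameter values, the mapping from decoded signal to drift rate, and the treatment of ambiguity as weak decoded evidence), might look like this:

```python
# Toy drift-diffusion simulation (not the authors' fitted model): the drift
# rate on each trial is scaled by a hypothetical decoded neural signal, and an
# "inverse urgency" term raises the effective bound on ambiguous trials,
# slowing those decisions.
import numpy as np

rng = np.random.default_rng(2)
dt, sigma, bound = 0.001, 1.0, 1.0

def simulate_trial(drift, inverse_urgency=0.0, max_t=3.0):
    """Drift-diffusion to a bound that is raised by the inverse-urgency term."""
    x, t = 0.0, 0.0
    while t < max_t:
        x += drift * dt + sigma * np.sqrt(dt) * rng.normal()
        t += dt
        b = bound * (1.0 + inverse_urgency)      # higher effective bound = slower choices
        if abs(x) >= b:
            return np.sign(x), t
    return 0.0, max_t                             # no decision within the trial

# Hypothetical trial-by-trial "decoded emotion evidence" from the fusiform gyrus
decoded = rng.normal(0.5, 0.3, size=200)
ambiguous = np.abs(decoded) < 0.3                 # treat weak decoded evidence as ambiguous
rts = np.array([simulate_trial(drift=1.5 * d,
                               inverse_urgency=0.5 if a else 0.0)[1]
                for d, a in zip(decoded, ambiguous)])
print("mean RT, ambiguous trials   :", rts[ambiguous].mean().round(3))
print("mean RT, unambiguous trials :", rts[~ambiguous].mean().round(3))
```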


2021 ◽  
Author(s):  
Jennifer Laura Lee ◽  
Rachel N. Denison ◽  
Wei Ji Ma

Perceptual decision-making is often conceptualized as the process of comparing an internal decision variable to a categorical boundary, or criterion. How the mind sets such a criterion has been studied from at least two perspectives. First, researchers interested in consciousness have proposed that criterion-crossing determines whether a stimulus is consciously perceived. Second, researchers interested in decision-making have studied how the criterion depends on a range of stimulus and task variables. Both communities have considered the question of how the criterion behaves when sensory information is weak or uncertain. Interestingly, however, they have arrived at different conclusions. Consciousness researchers investigating a phenomenon called "subjective inflation" – a form of metacognitive mismatch in which observers overestimate the quality of their sensory representations in the periphery or at an unattended location – have proposed that the criterion governing subjective visibility is fixed. That is, it does not adjust to changes in sensory uncertainty. Decision-making researchers, on the other hand, have concluded that the criterion does adjust to account for sensory uncertainty, including under inattention. Here, we mathematically demonstrate that previous empirical findings supporting subjective inflation are consistent with either a fixed or a flexible decision criterion. We further show that specific experimental task requirements are necessary to make inferences about the flexibility of the criterion: 1) a clear mapping from decision variable space to stimulus feature space, and 2) a task incentive for observers to adjust their decision criterion as response variance increases. We conclude that the fixed-criterion model of subjective inflation requires re-thinking in light of new evidence from the probabilistic reasoning literature that decision criteria flexibly adjust according to response variance.
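
For concreteness, the standard signal-detection framing of the two criterion policies can be sketched as below; the means, noise levels, and the proportional rescaling rule for the "flexible" criterion are assumptions for illustration. The sketch reproduces the textbook intuition that a criterion fixed in internal-response units yields more "seen" reports as sensory noise grows, which is exactly the inference the authors argue cannot be drawn without the two task requirements listed above.

```python
# Signal-detection sketch (my own illustration, not the paper's derivation):
# with Gaussian internal responses, a criterion that stays fixed while sensory
# noise grows produces more "seen" reports at the unattended location
# (subjective inflation), whereas a criterion rescaled with the noise does not.
import numpy as np
from scipy.stats import norm

signal_mean = 1.0
sigmas = {"attended": 1.0, "unattended": 2.0}      # unattended = higher uncertainty
c_fixed = 1.5                                       # criterion in internal-response units

for cond, sigma in sigmas.items():
    p_seen_fixed = norm.sf(c_fixed, loc=signal_mean, scale=sigma)
    c_flex = 1.5 * sigma                            # criterion rescaled with uncertainty (assumed rule)
    p_seen_flex = norm.sf(c_flex, loc=signal_mean, scale=sigma)
    print(f"{cond:10s}  P(seen | fixed criterion) = {p_seen_fixed:.2f}   "
          f"P(seen | flexible criterion) = {p_seen_flex:.2f}")
```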


Author(s):  
Ana Gómez-Granados ◽  
Deborah A Barany ◽  
Margaret Schrayer ◽  
Isaac L. Kurtzer ◽  
Cédrick T Bonnet ◽  
...  

Many goal-directed actions that require rapid visuomotor planning and perceptual decision-making are affected in older adults, causing difficulties in the execution of many functional activities of daily living. Visuomotor planning and perceptual decision-making are mediated by the dorsal and ventral visual streams, respectively, but it is unclear how age-induced changes in sensory processing in these streams contribute to declines in goal-directed actions. We have previously shown that, in healthy adults, task demands influence movement strategies during visuomotor decision-making, reflecting differential integration of sensory information between the two streams. Here, we asked whether older adults would exhibit larger declines in interactions between the two streams during demanding motor tasks. Older adults (n=15) and young controls (n=26) performed reaching or interception movements towards virtual objects. In some blocks of trials, participants also had to select an appropriate movement goal based on the shape of the object. Our results showed that older adults corrected fewer initial decision errors during both reaching and interception movements. During the interception decision task, older adults made more decision- and execution-related errors than young adults, which were related to early initiation of their movements. Together, these results suggest that older adults have a reduced ability to integrate new perceptual information to guide online action, which may reflect impaired ventral-dorsal stream interactions.


2010 ◽  
Vol 103 (3) ◽  
pp. 1179-1194 ◽  
Author(s):  
Andrew S. Kayser ◽  
Bradley R. Buchsbaum ◽  
Drew T. Erickson ◽  
Mark D'Esposito

Our ability to make rapid decisions based on sensory information belies the complexity of the underlying computations. Recently, “accumulator” models of decision making have been shown to explain the activity of parietal neurons as macaques make judgments concerning visual motion. Unraveling the operation of a decision-making circuit, however, involves understanding both the responses of individual components in the neural circuitry and the relationships between them. In this functional magnetic resonance imaging study of the decision process in humans, we demonstrate that an accumulator model predicts responses to visual motion in the intraparietal sulcus (IPS). Significantly, the metrics used to define responses within the IPS also reveal distinct but interacting nodes in a circuit, including early sensory detectors in visual cortex, the visuomotor integration system of the IPS, and centers of cognitive control in the prefrontal cortex, all of which collectively define a perceptual decision-making network.
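
One common way such a model-to-fMRI comparison is set up, sketched here under assumed parameters rather than those of the study, is to convolve the model's predicted accumulation time course with a canonical hemodynamic response function and use the result as a BOLD regressor:

```python
# Hedged sketch of an accumulator-to-BOLD prediction: build an
# accumulation-to-bound time course, convolve it with a canonical double-gamma
# HRF, and downsample to the TR grid for regression against measured responses.
# All parameter values are illustrative.
import numpy as np
from scipy.stats import gamma

tr, dt = 2.0, 0.1                                   # scan TR and model resolution (s)
t = np.arange(0, 30, dt)

def accumulator_timecourse(drift, bound=1.0):
    dv = np.minimum(np.cumsum(np.full(t.size, drift) * dt), bound)
    return dv                                       # ramp to the bound, then hold

def double_gamma_hrf(times):
    h = gamma.pdf(times, 6) - gamma.pdf(times, 16) / 6.0
    return h / h.max()

hrf = double_gamma_hrf(t)
for label, drift in [("low coherence", 0.05), ("high coherence", 0.2)]:
    neural = accumulator_timecourse(drift)
    bold = np.convolve(neural, hrf)[: t.size] * dt   # predicted BOLD time course
    regressor = bold[:: int(round(tr / dt))]         # downsample to the TR grid
    print(label, "peak predicted BOLD:", regressor.max().round(3))
```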


eLife ◽  
2015 ◽  
Vol 4 ◽  
Author(s):  
Nela Cicmil ◽  
Bruce G Cumming ◽  
Andrew J Parker ◽  
Kristine Krug

Effective perceptual decisions rely upon combining sensory information with knowledge of the rewards available for different choices. However, it is not known where reward signals interact with the multiple stages of the perceptual decision-making pathway and by what mechanisms this may occur. We combined electrical microstimulation of functionally specific groups of neurons in visual area V5/MT with performance-contingent reward manipulation, while monkeys performed a visual discrimination task. Microstimulation was less effective in shifting perceptual choices towards the stimulus preferences of the stimulated neurons when available reward was larger. Psychophysical control experiments showed this result was not explained by a selective change in response strategy on microstimulated trials. A bounded accumulation decision model, applied to analyse behavioural performance, revealed that the interaction of expected reward with microstimulation can be explained if expected reward modulates a sensory representation stage of perceptual decision-making, in addition to the better-known effects at the integration stage.
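
A toy bounded-accumulation sketch of this interpretation, with illustrative parameters of my own choosing: microstimulation is modeled as a fixed increment to the momentary evidence, and expected reward either scales the gain of the sensory representation or shifts the starting point at the integration stage. The sensory-stage modulation shrinks the microstimulation-induced choice shift, whereas the integration-stage bias leaves it largely intact.

```python
# Toy bounded-accumulation model (not the authors' fitted model). Microstimulation
# adds a fixed increment to the momentary evidence; "reward" acts either at the
# sensory stage (gain) or at the integration stage (starting-point bias).
import numpy as np

rng = np.random.default_rng(3)
n_trials, n_steps, bound = 4000, 200, 15.0

def p_preferred_choice(sensory_gain=1.0, start_bias=0.0, microstim=0.0):
    """Fraction of choices toward the stimulated neurons' preferred direction."""
    stim = rng.normal(0.0, 1.0, size=(n_trials, n_steps))       # zero-signal stimulus
    momentary = sensory_gain * stim + microstim                  # sensory stage
    dv = start_bias + np.cumsum(momentary, axis=1)               # integration stage
    crossed = np.abs(dv).max(axis=1) >= bound
    first_idx = np.argmax(np.abs(dv) >= bound, axis=1)           # first bound crossing
    final = np.where(crossed, dv[np.arange(n_trials), first_idx], dv[:, -1])
    return np.mean(final > 0)

for label, kwargs in [("small reward", {}),
                      ("large reward, sensory gain", {"sensory_gain": 2.0}),
                      ("large reward, integration bias", {"start_bias": 2.0})]:
    shift = p_preferred_choice(microstim=0.1, **kwargs) - p_preferred_choice(**kwargs)
    print(f"{label:32s} microstimulation-induced choice shift = {shift:.3f}")
```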


2018 ◽  
Vol 41 ◽  
Author(s):  
Alan A. Stocker

Optimal or suboptimal, Rahnev & Denison (R&D) rightly argue that this ill-defined distinction is not useful when comparing models of perceptual decision making. However, what they miss is how valuable the focus on optimality has been in deriving these models in the first place. Rather than prematurely abandon the optimality assumption, we should refine this successful normative hypothesis with additional constraints that capture specific limitations of (sensory) information processing in the brain.


2021 ◽  
Author(s):  
Matthijs N Oude Lohuis ◽  
Jean L Pie ◽  
Pietro Marchesi ◽  
Jorrit S Montijn ◽  
Christiaan P J de Kock ◽  
...  

The transformation of sensory inputs into behavioral outputs is characterized by an interplay between feedforward and feedback operations in cortical hierarchies. Even in simple sensorimotor transformations, recurrent processing is often expressed in primary cortices in a late phase of the cortical response to sensory stimuli. This late phase is engaged by attention and stimulus complexity, and also encodes sensory-independent factors, including movement and report-related variables. However, despite its pervasiveness, the nature and function of late activity in perceptual decision-making remain unclear. We tested whether the function of late activity depends on the complexity of a sensory change-detection task. Complexity was based on increasing processing requirements for the same sensory stimuli. We found that the temporal window in which V1 is necessary for perceptual decision-making was extended when we increased task complexity, independently of the presented visual stimulus. This window overlapped with the emergence of report-related activity and decreased noise correlations in V1. The onset of these co-occurring activity patterns was time-locked to and preceded reaction time, and predicted the reduction in behavioral performance obtained by optogenetically silencing late V1 activity (>200 ms after stimulus onset), a result confirmed by a second multisensory task with different requirements. Thus, although early visual response components encode all sensory information necessary to solve the task, V1 is not simply relaying information to higher-order areas transforming it into behavioral responses. Rather, task complexity determines the temporal extension of a loop of recurrent activity, which overlaps with report-related activity and determines how perceptual decisions are built.
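
The noise-correlation measure mentioned above can be illustrated with a short sketch (synthetic data and window definitions are assumptions, not the authors' analysis): spike counts are z-scored within each stimulus condition and the residuals are correlated across neuron pairs, separately for an early and a late response window.

```python
# Minimal noise-correlation sketch on synthetic V1-like spike counts: z-score
# counts within each stimulus condition, then average the pairwise correlations
# of the residuals. A shared gain fluctuation that weakens in the late window
# mimics the reported drop in correlated variability.
import numpy as np

rng = np.random.default_rng(4)
n_trials, n_neurons = 400, 30
conditions = rng.integers(0, 2, n_trials)           # two visual stimuli

def noise_correlation(counts, conditions):
    """Mean pairwise correlation of condition-z-scored spike counts."""
    z = np.empty_like(counts, dtype=float)
    for c in np.unique(conditions):
        idx = conditions == c
        z[idx] = (counts[idx] - counts[idx].mean(axis=0)) / counts[idx].std(axis=0)
    corr = np.corrcoef(z.T)                           # neurons x neurons
    return corr[np.triu_indices(counts.shape[1], k=1)].mean()

shared_early = rng.normal(0, 1, (n_trials, 1))        # shared gain fluctuation
shared_late = 0.3 * shared_early                      # weaker in the late window
tuning = 5 + 3 * conditions[:, None]                  # condition-dependent mean rate
early = rng.poisson(np.clip(tuning + 2 * shared_early, 0.1, None), size=(n_trials, n_neurons))
late = rng.poisson(np.clip(tuning + 2 * shared_late, 0.1, None), size=(n_trials, n_neurons))
print("early-window noise correlation:", noise_correlation(early, conditions).round(3))
print("late-window noise correlation :", noise_correlation(late, conditions).round(3))
```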

