sensory uncertainty
Recently Published Documents

TOTAL DOCUMENTS: 49 (five years: 21)
H-INDEX: 9 (five years: 1)

2022 · Author(s): Jonathan S Tsay, Steven Tan, Marlena Chu, Richard B Ivry, Emily A Cooper

Successful goal-directed actions require constant fine-tuning in response to errors introduced by changes in the body and environment. This implicit adaptive process has been assumed to operate in a statistically optimal fashion, reducing its sensitivity to errors when sensory uncertainty is high. However, recent work has shown that visual uncertainty attenuates implicit adaptation for small errors but not for large errors, a result at odds with the optimal integration hypothesis. This error-size interaction has motivated a new hypothesis: sensory uncertainty impacts the distribution of perceived error locations but not the system's sensitivity to errors. To examine these competing hypotheses, previous studies have experimentally manipulated uncertainty, but it remains unknown which hypothesis best describes motor adaptation under the sensory uncertainty experienced in daily life. To address this question, we recruited individuals with low vision due to diverse clinical conditions that elevate visual uncertainty, along with matched controls. The groups were tested on visuomotor tasks designed to isolate implicit adaptation and maintain tight control over error size. In two experiments, low vision was associated with attenuated implicit adaptation for small errors but not for large errors. Taken together with prior work in which visual uncertainty was experimentally manipulated, these results support the notion that increasing sensory uncertainty increases the likelihood that errors are mis-localized but does not affect error sensitivity, offering a novel account of the motor learning deficits seen in low vision.
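The mislocalization account can be illustrated with a toy simulation (a hypothetical sketch, not the authors' model; the saturated, sign-driven update and all parameter values are assumptions). When sensory noise is large relative to a small error, the perceived error often lands on the wrong side of the target, cancelling adaptation on average, whereas a large error almost never flips sign:

```python
import math
import random

random.seed(0)

def mean_adaptation(error, sigma, n=100_000, eta=1.0):
    """Average implicit update when the system corrects the *perceived*
    error (true error + sensory noise).  Implicit adaptation saturates,
    so the update here depends only on the perceived error's sign
    (a simplifying assumption)."""
    total = 0.0
    for _ in range(n):
        perceived = error + random.gauss(0.0, sigma)
        total += eta * math.copysign(1.0, perceived)
    return total / n

# Small error: high uncertainty frequently flips the perceived sign,
# attenuating the average update.
small_lo = mean_adaptation(error=1.0, sigma=0.5)
small_hi = mean_adaptation(error=1.0, sigma=3.0)

# Large error: sign flips are rare even under high uncertainty,
# so adaptation is essentially unaffected.
large_lo = mean_adaptation(error=10.0, sigma=0.5)
large_hi = mean_adaptation(error=10.0, sigma=3.0)
```

Note that the error sensitivity (eta) is identical across conditions; only the distribution of perceived error locations changes, which is enough to reproduce the small-error-only attenuation.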


2021 · Author(s): Shannon M Locke, Michael S Landy, Pascal Mamassian

Perceptual confidence is an important internal signal about the certainty of our decisions, and there is substantial debate over how it is computed. We highlight three types of confidence metric from the literature: observers either use 1) the full probability distribution to compute the probability of being correct (Probability metrics), 2) point estimates from the perceptual decision process to estimate uncertainty (Evidence-Strength metrics), or 3) heuristic confidence from stimulus-based cues to uncertainty (Heuristic metrics). These metrics are rarely tested against one another, so we examined models of all three types on a suprathreshold spatial discrimination task. Observers were shown a cloud of dots sampled from a dot-generating distribution and judged whether the mean of the distribution was left or right of centre. In addition to varying the horizontal position of the mean, there were two sensory uncertainty manipulations: the number of dots sampled and the spread of the generating distribution. After every two perceptual decisions, observers made a forced-choice judgement of whether they were more confident in the first or second decision. Model results showed that observers were on average best fit by a Heuristic model that used dot-cloud position, spread, and number of dots as cues. However, almost half of the observers were best fit by an Evidence-Strength model that uses the distance between the discrimination criterion and a point estimate, scaled according to sensory uncertainty, to compute confidence. This signal-to-noise ratio model outperformed the standard unscaled distance-from-criterion model favoured by many researchers, suggesting that this simpler model may not be suitable for mixed-difficulty designs. An accidental repetition of some sessions also allowed for the measurement of confidence agreement for identical pairs of stimuli. This N-pass analysis revealed that human observers were more consistent than their best-fitting model would predict, indicating that there are still aspects of confidence not captured by our models. As such, we propose confidence agreement as a useful technique for computational studies of confidence. Taken together, these findings highlight the idiosyncratic nature of confidence computations for complex decision contexts and the need to consider different potential metrics and transformations in the confidence computation.
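The contrast between the unscaled distance-from-criterion metric and the scaled, signal-to-noise variant can be sketched in a few lines (function names and numerical values here are illustrative, not the paper's implementation):

```python
def confidence_unscaled(estimate, criterion=0.0):
    # Classic Evidence-Strength metric: distance of the point estimate
    # from the discrimination criterion, ignoring sensory uncertainty.
    return abs(estimate - criterion)

def confidence_snr(estimate, sigma, criterion=0.0):
    # Scaled variant: the same distance divided by the sensory noise,
    # i.e. a signal-to-noise ratio.
    return abs(estimate - criterion) / sigma

# The same point estimate under different uncertainty levels
# (e.g. many vs few dots, narrow vs wide generating distribution):
easy = confidence_snr(2.0, sigma=1.0)   # low uncertainty -> high confidence
hard = confidence_snr(2.0, sigma=4.0)   # high uncertainty -> low confidence

# The unscaled metric cannot tell the two conditions apart,
# which is why it struggles in mixed-difficulty designs.
same = confidence_unscaled(2.0) == confidence_unscaled(2.0)
```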


2021 · Vol 11 (1) · Author(s): Kathryn Bonnen, Jonathan S. Matthis, Agostino Gibaldi, Martin S. Banks, Dennis M. Levi, ...

Coordination between visual and motor processes is critical for the selection of stable footholds when walking over uneven terrain. While recent work (Matthis et al. in Curr Biol 28(8):1224–1233, 2018) demonstrates a tight link between gaze (visual) and gait (motor), it remains unclear which aspects of visual information play a role in this visuomotor control loop, and how the loss of this information affects that relationship. Here we examine the role of binocular information in the visuomotor control of walking over complex terrain. We recorded eye and body movements while normally sighted participants walked over terrains of varying difficulty, with intact vision or with vision in one eye blurred to disrupt binocular vision. Gaze strategy was highly sensitive to the complexity of the terrain, with more fixations dedicated to foothold selection as the terrain became more difficult. The primary effect of increased sensory uncertainty due to disrupted binocular vision was a small bias in gaze towards closer footholds, indicating greater pressure on the visuomotor control process. Participants with binocular vision loss due to developmental disorders (i.e., amblyopia, strabismus), who have had the opportunity to develop alternative strategies, also biased their gaze towards closer footholds. Across all participants, we observed a relationship between an individual’s typical level of binocular visual function and the degree to which gaze was shifted toward the body. Thus the gaze–gait relationship is sensitive to the level of sensory uncertainty, and deficits in binocular visual function (whether transient or long-standing) have systematic effects on gaze strategy in complex terrain. We conclude that binocular vision provides useful information for locating footholds during locomotion. Furthermore, we have demonstrated that combined eye and body tracking in natural environments can provide a more detailed understanding of how a given type of vision loss affects the visuomotor control of walking, a vital everyday task.


2021 · Author(s): Ling-Qi Zhang, Alan A Stocker

Bayesian inference provides an elegant theoretical framework for understanding the characteristic biases and discrimination thresholds in visual speed perception. However, the framework has been difficult to validate because of its flexibility and because suitable constraints on the structure of the sensory uncertainty have been missing. Here, we demonstrate that a Bayesian observer model constrained by efficient coding not only provides a good fit to extensive psychophysical data on human visual speed perception but also gives an accurate quantitative account of the tuning characteristics of neurons known to represent visual speed. Specifically, we found that the population coding accuracy for visual speed in area MT ("neural prior") is precisely predicted by the power-law, slow-speed prior extracted from fitting the Bayesian model to the psychophysical data ("behavioral prior"), to the point that the two are indistinguishable in a model cross-validation comparison. Our results constitute a quantitative validation of the Bayesian observer model constrained by efficient coding at both the behavioral and neural levels.
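The core computation behind such an observer can be sketched with a grid-based Bayesian estimator (a hypothetical illustration only; the power-law exponent, grid, and noise values are assumptions, not the paper's fitted parameters). The slow-speed prior pulls the estimate below the measurement, and the pull grows with sensory uncertainty:

```python
import math

def posterior_mean_speed(v_meas, sigma, p=1.0):
    # Power-law slow-speed prior ~ v^(-p) combined with a Gaussian
    # likelihood around the measured speed, evaluated on a speed grid.
    grid = [0.1 + 0.1 * i for i in range(300)]   # candidate speeds
    weights = [v ** (-p) * math.exp(-(v_meas - v) ** 2 / (2 * sigma ** 2))
               for v in grid]
    z = sum(weights)
    return sum(v * w for v, w in zip(grid, weights)) / z

# The slow-speed bias grows with sensory uncertainty
# (e.g. lower stimulus contrast):
est_lo = posterior_mean_speed(10.0, sigma=0.5)   # reliable measurement
est_hi = posterior_mean_speed(10.0, sigma=3.0)   # uncertain measurement
```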


2021 · Author(s): Jennifer Laura Lee, Rachel N. Denison, Wei Ji Ma

Perceptual decision-making is often conceptualized as the process of comparing an internal decision variable to a categorical boundary, or criterion. How the mind sets such a criterion has been studied from at least two perspectives. First, researchers interested in consciousness have proposed that criterion-crossing determines whether a stimulus is consciously perceived. Second, researchers interested in decision-making have studied how the criterion depends on a range of stimulus and task variables. Both communities have considered the question of how the criterion behaves when sensory information is weak or uncertain. Interestingly, however, they have arrived at different conclusions. Consciousness researchers investigating a phenomenon called "subjective inflation" – a form of metacognitive mismatch in which observers overestimate the quality of their sensory representations in the periphery or at an unattended location – have proposed that the criterion governing subjective visibility is fixed. That is, it does not adjust to changes in sensory uncertainty. Decision-making researchers, on the other hand, have concluded that the criterion does adjust to account for sensory uncertainty, including under inattention. Here, we mathematically demonstrate that previous empirical findings supporting subjective inflation are consistent with either a fixed or a flexible decision criterion. We further show that specific experimental task requirements are necessary to make inferences about the flexibility of the criterion: 1) a clear mapping from decision variable space to stimulus feature space, and 2) a task incentive for observers to adjust their decision criterion as response variance increases. We conclude that the fixed-criterion model of subjective inflation requires re-thinking in light of new evidence from the probabilistic reasoning literature that decision criteria flexibly adjust according to response variance.
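The fixed-versus-flexible distinction can be illustrated with a noise-only simulation (a hypothetical sketch; the criterion values and noise levels are assumptions). A criterion fixed in internal-response units is crossed more often as response variance grows, inflating "seen" reports, whereas a criterion that scales with the noise keeps the report rate constant:

```python
import random

random.seed(1)

def report_rate(sigma, criterion, n=100_000):
    # Noise-only trials: report "seen" whenever the internal
    # response exceeds the visibility criterion.
    seen = sum(1 for _ in range(n) if random.gauss(0.0, sigma) > criterion)
    return seen / n

# Fixed criterion: higher response variance (e.g. periphery,
# inattention) inflates visibility reports.
fixed_lo = report_rate(sigma=1.0, criterion=2.0)
fixed_hi = report_rate(sigma=2.0, criterion=2.0)

# Flexible criterion scaled to the noise holds the rate constant.
flex_lo = report_rate(sigma=1.0, criterion=2.0 * 1.0)
flex_hi = report_rate(sigma=2.0, criterion=2.0 * 2.0)
```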


eLife · 2020 · Vol 9 · Author(s): Ulrik Beierholm, Tim Rohe, Ambra Ferrari, Oliver Stegle, Uta Noppeney

To form a more reliable percept of the environment, the brain needs to estimate its own sensory uncertainty. Current theories of perceptual inference assume that the brain computes sensory uncertainty instantaneously and independently for each stimulus. We evaluated this assumption in four psychophysical experiments in which human observers localized auditory signals that were presented synchronously with spatially disparate visual signals. Critically, the visual noise changed dynamically over time, either continuously or with intermittent jumps. Our results show that observers integrate audiovisual inputs weighted by sensory uncertainty estimates that combine information from past and current signals, consistent with an optimal Bayesian learner that can be approximated by exponential discounting. Our results challenge leading models of perceptual inference in which sensory uncertainty estimates depend only on the current stimulus. They demonstrate that the brain capitalizes on the temporal dynamics of the external world and estimates sensory uncertainty by combining past experience with new incoming sensory signals.
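The two ingredients described above, reliability-weighted integration and an exponentially discounted estimate of the visual noise, can be sketched as follows (a minimal illustration; the discount rate and all numerical values are assumptions, not the study's fitted parameters):

```python
def discounted_variance(noise_samples, lam=0.3):
    # Exponential discounting: the current uncertainty estimate blends
    # past evidence about the visual noise with the newest sample.
    est = noise_samples[0]
    for s in noise_samples[1:]:
        est = (1 - lam) * est + lam * s
    return est

def fuse(x_aud, var_aud, x_vis, var_vis):
    # Reliability-weighted (inverse-variance) audiovisual integration.
    w_aud = (1.0 / var_aud) / (1.0 / var_aud + 1.0 / var_vis)
    return w_aud * x_aud + (1.0 - w_aud) * x_vis

# After an abrupt jump in visual noise, the discounted estimate lags
# the instantaneous value, so recent history still shapes the weights.
var_vis = discounted_variance([1.0, 1.0, 1.0, 9.0, 9.0])
loc = fuse(x_aud=0.0, var_aud=1.0, x_vis=10.0, var_vis=var_vis)
```

With equal variances the fused location is simply the midpoint; after the noise jump, the lagging variance estimate keeps the visual weight higher than an instantaneous estimate would.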


2020 · Vol 10 (1) · Author(s): Sabrina Trapp, Iris Vilares

A rich body of empirical work has addressed the question of how stress changes the way we memorize, learn, and make high-level decisions in complex scenarios. There is evidence that stress also changes the way we perceive the world, indicating influences on decision-making at lower levels. Surprisingly, little research has been conducted in this domain so far. A few studies suggest that under stress, humans tend to eschew existing knowledge and instead focus on novel, bottom-up input. Decision-making in the perceptual domain has been modeled with Bayesian frameworks, in which existing knowledge about the structures and statistics of our environment is referred to as the prior, whereas the sensory data are termed the likelihood. In this study, we directly assessed whether stress, as induced by the socially evaluated cold pressor task (SECPT), would modulate low-level decisions, specifically the weight given to sensory information, and how people react to changes in prior and sensory uncertainty. We found that while the stress-inducing procedure successfully elicited subjective stress ratings as well as stress-relevant physiological parameters, it did not change participants’ average reliance on sensory information. Furthermore, it did not affect participants’ sensitivity to changes in prior and sensory uncertainty: both groups were able to detect these changes and modulate their behavior accordingly, in the way predicted by Bayesian statistics. Our results suggest that, contrary to our predictions, stress may not directly affect lower-level sensorimotor decisions. We discuss these findings in the context of the time scales of the stress reaction, which are linked to different neural and functional consequences.
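The quantity at stake here, reliance on sensory information, has a standard Bayesian form (a generic textbook sketch, not the study's fitted model; all values are illustrative). The weight on the sensory data is its relative reliability, so increasing sensory uncertainty shifts weight toward the prior:

```python
def posterior_mean(prior_mean, var_prior, sensory, var_sensory):
    # Weight on the sensory data is its relative reliability
    # (inverse variance); the remaining weight goes to the prior.
    w_sensory = var_prior / (var_prior + var_sensory)
    return w_sensory * sensory + (1.0 - w_sensory) * prior_mean, w_sensory

# Increasing sensory uncertainty should pull the estimate
# toward the prior, and the sensory weight should drop:
est_sharp, w_sharp = posterior_mean(0.0, 1.0, sensory=4.0, var_sensory=0.5)
est_noisy, w_noisy = posterior_mean(0.0, 1.0, sensory=4.0, var_sensory=4.0)
```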


2020 · Vol 16 (11) · pp. e1006308 · Author(s): Yanli Zhou, Luigi Acerbi, Wei Ji Ma

Perceptual organization is the process of grouping scene elements into whole entities. A classic example is contour integration, in which separate line segments are perceived as continuous contours. Uncertainty in such grouping arises from scene ambiguity and sensory noise. Some classic Gestalt principles of contour integration, and more broadly of perceptual organization, have been reframed in terms of Bayesian inference, whereby the observer computes the probability that the whole entity is present. Previous studies proposing a Bayesian interpretation of perceptual organization, however, have ignored sensory uncertainty, despite the fact that accounting for the current level of perceptual uncertainty is one of the main signatures of Bayesian decision making. Crucially, trial-by-trial manipulation of sensory uncertainty is a key test of whether humans perform near-optimal Bayesian inference in contour integration, as opposed to using some manifestly non-Bayesian heuristic. We distinguish between these hypotheses in a simplified form of contour integration: judging whether two line segments separated by an occluder are collinear. We manipulate sensory uncertainty by varying retinal eccentricity. A Bayes-optimal observer would take the level of sensory uncertainty into account, in a very specific way, in deciding whether a measured offset between the line segments is due to non-collinearity or to sensory noise. We find that people deviate slightly but systematically from Bayesian optimality, while still performing “probabilistic computation” in the sense that they take sensory uncertainty into account via a heuristic rule. Our work contributes to an understanding of the role of sensory uncertainty in higher-order perception.
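The "very specific way" a Bayes-optimal observer uses uncertainty can be sketched as a likelihood-ratio rule (a hypothetical illustration; the offset spread, prior, and noise levels are assumed values, not the paper's fitted ones). The same measured offset is strong evidence against collinearity when measurement noise is low, but is plausibly noise when it is high:

```python
import math

def normpdf(x, var):
    # Gaussian density with mean 0 and variance var.
    return math.exp(-x * x / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def report_collinear(offset, sigma_meas, sigma_offset=2.0, prior_coll=0.5):
    # Bayes-optimal rule: the measured offset is explained either by
    # sensory noise alone (collinear) or by noise plus a true offset.
    var_m = sigma_meas ** 2
    like_coll = normpdf(offset, var_m)                     # true offset = 0
    like_non = normpdf(offset, var_m + sigma_offset ** 2)  # offset ~ N(0, sigma_offset^2)
    post = prior_coll * like_coll / (
        prior_coll * like_coll + (1.0 - prior_coll) * like_non)
    return post > 0.5

# The same measured offset is judged non-collinear at low noise
# (e.g. near the fovea) but collinear at high noise (periphery):
fovea = report_collinear(1.5, sigma_meas=0.5)
periphery = report_collinear(1.5, sigma_meas=3.0)
```

The implied decision boundary on the measured offset widens with sigma_meas, which is exactly the uncertainty-dependence the experiments test for.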

