Visual Motion Perception
Recently Published Documents

Total documents: 202 (five years: 26)
H-index: 32 (five years: 1)

2021
Author(s): Johannes Bill, Samuel J. Gershman, Jan Drugowitsch

Identifying the structure of motion relations in the environment is critical for navigation, tracking, prediction, and pursuit. Yet, little is known about the mental and neural computations that allow the visual system to infer this structure online from a volatile stream of visual information. We propose online hierarchical Bayesian inference as a principled solution for how the brain might solve this complex perceptual task. We derive an online Expectation-Maximization algorithm that explains human percepts qualitatively and quantitatively for a diverse set of stimuli, covering classical psychophysics experiments, ambiguous motion scenes, and illusory motion displays. We thereby identify normative explanations for the origin of human motion structure perception and make testable predictions for new psychophysics experiments. The algorithm furthermore affords a neural network implementation which shares properties with motion-sensitive cortical areas and motivates a novel class of experiments to reveal the neural representations of latent structure.
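The abstract's central idea, online EM over latent motion structure, can be illustrated in miniature. The following toy sketch is not the authors' algorithm: it assumes two dots, a zero-mean "independent" model with broader noise, and a fixed learning rate, all of which are illustrative choices. It streams velocity observations, accumulates evidence for a "shared" versus "independent" structure (E-step), and updates a running estimate of the shared velocity (M-step).

```python
import math

def online_em_motion(velocity_pairs, sigma=0.5, lr=0.1):
    """Toy online EM for motion structure inference (illustrative only):
    stream velocity observations of two dots and decide whether they
    share a common motion source."""
    mu = 0.0        # running estimate of the shared-motion component
    log_odds = 0.0  # accumulated evidence for the shared structure
    for v1, v2 in velocity_pairs:
        # E-step: log-likelihood of the pair under each candidate structure
        ll_shared = -((v1 - mu) ** 2 + (v2 - mu) ** 2) / (2 * sigma ** 2)
        ll_indep = -(v1 ** 2 + v2 ** 2) / (2 * (2 * sigma) ** 2)
        log_odds += ll_shared - ll_indep
        # M-step: move the shared-velocity estimate toward the observed mean
        mu += lr * ((v1 + v2) / 2 - mu)
    clamped = max(min(log_odds, 500.0), -500.0)  # avoid math.exp overflow
    p_shared = 1.0 / (1.0 + math.exp(-clamped))
    return mu, p_shared
```

Fed a stream of identical velocity pairs, the posterior for the shared structure approaches 1 and the shared-velocity estimate converges to the common speed, mirroring how evidence for latent structure can be accumulated online rather than in batch.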


2021, Vol 15
Author(s): Ymie J. van der Zee, Peter L. J. Stiers, Lieven Lagae, Heleen M. Evenhuis

Aim: In this study, we examined (1) the presence of abnormally low scores (below the 10th percentile) on various aspects of visual motion perception in children with brain damage, while controlling for their cognitive developmental delay; and (2) whether the risk is increased in comparison with the observed and expected rates in a healthy control group and the healthy population.

Methods: Performance levels of 46 children with indications of brain damage (mean age = 7y4m, SD = 2y4m) on three aspects of visual motion perception (global motion, motion speed, motion-defined form) were evaluated. We used developmental age as the entry of a preliminary reference table to classify each patient's performance level. We then compared the percentages of abnormally low scores with the percentages expected in the healthy population, using estimated base rates, and with the observed percentages in the control sample (n = 119).

Results: When using developmental age as the reference level, the percentage of low scores on at least one of the three tasks was significantly higher than expected in the healthy population [19/46, 41% (95% CI: 28-56%), p = 0.03]. In 15/19 patients (79% [95% CI: 61-97%]) only one aspect of motion perception was affected. Four patients performed abnormally low on two of the three tasks, which is also higher than expected (4/46, 8.7%, 95% CI: 2.4-20.8% vs. 2.1%; z = 2.61, p < 0.01). The observed percentages in the patient group were also higher than those found in the control group.

Interpretation: There is some evidence that children with early brain damage have an increased risk of isolated and combined motion perception problems, independent of their performance IQ.


2021, Vol 11 (1)
Author(s): L. A. M. H. Kirkels, W. Zhang, Z. Rezvani, R. J. A. van Wezel, M. M. van Wanrooij

Visual motion perception depends on the readout of direction-selective sensors. We investigated in mice whether the response to bidirectional transparent motion, which activates oppositely tuned sensors, reflects integration (averaging) or winner-take-all (mutual inhibition) mechanisms. We measured whole-body opto-locomotor reflexes (OLRs) to bidirectional, oppositely moving random-dot patterns (leftward and rightward) and compared the response to predictions based on responses to unidirectional motion (leftward or rightward). In addition, responses were compared to stimulation with stationary patterns. When comparing OLRs in the bidirectional and unidirectional conditions, we found that the OLR to bidirectional motion best fits an averaging model. These results reflect integration mechanisms in neural responses to contradicting sensory evidence, as has been documented for other sensory and motor domains.
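The two candidate readout mechanisms named in the abstract can be written as one-line predictors. This is a hypothetical sketch, not the authors' analysis code; it assumes OLR responses are encoded as signed amplitudes (positive = rightward).

```python
def predict_bidirectional(olr_left, olr_right, mode):
    """Predict the opto-locomotor reflex to bidirectional transparent
    motion from the two unidirectional responses (signed amplitudes)."""
    if mode == "averaging":           # integration: opposing sensors average out
        return (olr_left + olr_right) / 2
    if mode == "winner_take_all":     # mutual inhibition: strongest response wins
        return max((olr_left, olr_right), key=abs)
    raise ValueError(f"unknown mode: {mode}")
```

With roughly symmetric unidirectional responses, averaging predicts a near-zero reflex to bidirectional motion, while winner-take-all predicts a full-strength reflex in one direction; the mouse data reported here fit the former.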


2021, Vol 11 (1)
Author(s): Sichao Yang, Johannes Bill, Jan Drugowitsch, Samuel J. Gershman

Motion relations in visual scenes carry an abundance of behaviorally relevant information, but little is known about how humans identify the structure underlying a scene's motion in the first place. We studied the computations governing human motion structure identification in two psychophysics experiments and found that perception of motion relations showed hallmarks of Bayesian structural inference. At the heart of our research lies a tractable task design that enabled us to reveal the signatures of probabilistic reasoning about latent structure. We found that a choice model based on the task's Bayesian ideal observer accurately matched many facets of human structural inference, including task performance, perceptual error patterns, single-trial responses, participant-specific differences, and subjective decision confidence, especially when motion scenes were ambiguous and when object motion was hierarchically nested within other moving reference frames. Our work can guide future neuroscience experiments to reveal the neural mechanisms underlying higher-level visual motion perception.
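The core of such an ideal observer is Bayes' rule over a discrete set of candidate structures. A minimal, self-contained sketch follows; the structure set, likelihood values, and uniform prior are illustrative assumptions, not the paper's fitted model.

```python
import math

def structure_posterior(log_likelihoods, log_priors):
    """Posterior probability of each candidate motion structure
    (e.g. independent, global, hierarchical), computed from
    per-structure log-likelihoods via a numerically stable softmax."""
    scores = [ll + lp for ll, lp in zip(log_likelihoods, log_priors)]
    m = max(scores)                          # subtract max before exponentiating
    weights = [math.exp(s - m) for s in scores]
    total = sum(weights)
    return [w / total for w in weights]
```

Under a flat prior, the structure with the highest likelihood receives the highest posterior; posteriors spread across several candidates would mark exactly the ambiguous scenes for which, per the abstract, the ideal-observer model matched human behavior best.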


PLoS ONE, 2020, Vol 15 (12), pp. e0243430
Author(s): Takeshi Miyamoto, Kenichiro Miura, Tomohiro Kizuka, Seiji Ono

A large number of psychophysical and neurophysiological studies have demonstrated that smooth pursuit eye movements are tightly related to visual motion perception. This could be because visual-motion-sensitive cortical areas such as the middle temporal (MT) and medial superior temporal (MST) areas are involved in motion perception as well as pursuit initiation. Although directional-discrimination and perceived-target-velocity tasks are used to evaluate visual motion perception, it is still uncertain whether the speed of visual motion perception, as determined by visuomotor reaction time (RT) to a small target, is related to pursuit initiation. Therefore, we attempted to determine the relationship between pursuit latency/acceleration and visual motion RT, measured in response to visual motion stimuli that moved leftward or rightward. Participants were instructed to fixate on a stationary target and to press one of the buttons corresponding to the direction of target motion as soon as possible once the target started to move. We applied five different visual motion stimuli, including first- and second-order motion, in the smooth pursuit and visual motion RT tasks. It is well known that second-order motion induces weaker retinal image motion, which elicits weaker responses in MT and MST compared to first-order motion stimuli. Our results showed that pursuit initiation, including latency and initial eye acceleration, was suppressed by second-order motion. In addition, second-order motion caused a delay in visual motion RT. Better performance in both pursuit initiation and visual motion RT was observed for first-order motion, whereas second-order (theta) motion induced remarkable deficits in both variables. Furthermore, significant Pearson's and within-subjects correlation coefficients were obtained between visual motion RT and pursuit latency/acceleration. Our findings support the suggestion that a common neuronal pathway is involved in both pursuit initiation and the speed of visual motion perception.
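The across-participants part of such an analysis reduces to an ordinary Pearson correlation between, for example, per-subject mean visual motion RT and pursuit latency. A self-contained sketch is below; note that the within-subjects (repeated-measures) correlation also reported in the abstract is a separate method not shown here.

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)
```

A positive r between visual motion RT and pursuit latency, as reported here, means subjects who respond more slowly in the perceptual task also initiate pursuit later, consistent with shared upstream motion processing.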


2020
Author(s): Sichao Yang, Johannes Bill, Jan Drugowitsch, Samuel J. Gershman

Motion relations in visual scenes carry an abundance of behaviorally relevant information, but little is known about the computations underlying the identification of visual motion structure by humans. We addressed this gap in two psychophysics experiments and found that participants identified hierarchically organized motion relations in close correspondence with Bayesian structural inference. We demonstrate that, for our tasks, a choice model based on the Bayesian ideal observer can accurately match many facets of human structural inference, including task performance, perceptual error patterns, single-trial responses, participant-specific differences, and subjective decision confidence, particularly when motion scenes are ambiguous. Our work can guide future neuroscience experiments to reveal the neural mechanisms underlying higher-level visual motion perception.


2020
Author(s): Xiuyun Wu, Austin C. Rothwell, Miriam Spering, Anna Montagnini

Smooth pursuit eye movements and visual motion perception rely on the integration of current sensory signals with past experience. Experience shapes our expectation of current visual events and can drive eye movement responses made in anticipation of a target, such as anticipatory pursuit. Previous research revealed consistent effects of expectation on anticipatory pursuit (eye movements follow the expected target direction or speed) and contrasting effects on motion perception, but most studies considered either eye movement or perceptual responses. The current study directly compared the effects of direction expectation on perception and anticipatory pursuit within the same direction discrimination task, to investigate whether both types of responses are affected similarly or differently. Observers (n = 10) viewed high-coherence random-dot kinematograms (RDKs) moving rightward or leftward with a probability of 50, 70, or 90% in a given block of trials to build up an expectation of motion direction. They were asked to judge the motion direction of interleaved low-coherence RDKs (0-15%). Perceptual judgments were compared to changes in anticipatory pursuit eye movements as a function of probability. Results show that anticipatory pursuit velocity scaled with probability and followed the expected direction (attraction bias), whereas perceptual judgments were biased opposite to the expected direction (repulsion bias). Control experiments suggest that the repulsion bias in perception was not caused by retinal slip induced by anticipatory pursuit, or by motion adaptation. We conclude that direction expectation can be processed differently for perception and anticipatory pursuit.
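The dissociation reported above can be summarized in two toy functions. This is a hypothetical linear model, not a fit to the study's data: the gain parameter, units, and sign conventions are all assumptions chosen only to make the opposite-signed biases concrete.

```python
def anticipatory_velocity(p_right, gain=2.0):
    """Attraction bias: anticipatory pursuit velocity (deg/s, + = rightward)
    scales with the signed direction expectation 2*p - 1 (toy linear model;
    the gain value is hypothetical)."""
    return gain * (2 * p_right - 1)

def perceptual_bias_direction(p_right):
    """Repulsion bias: perceptual judgments of weak motion shift opposite
    to the expected direction (-1 = leftward shift when rightward expected)."""
    if p_right == 0.5:
        return 0
    return -1 if p_right > 0.5 else 1
```

In this sketch a 90% rightward block drives faster rightward anticipatory pursuit than a 70% block, while the perceptual report shifts leftward in both, capturing the attraction/repulsion dissociation in sign only.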


2020, Vol 117 (39), pp. 24581-24589
Author(s): Johannes Bill, Hrag Pailian, Samuel J. Gershman, Jan Drugowitsch

In the real world, complex dynamic scenes often arise from the composition of simpler parts. The visual system exploits this structure by hierarchically decomposing dynamic scenes: When we see a person walking on a train or an animal running in a herd, we recognize the individual’s movement as nested within a reference frame that is, itself, moving. Despite its ubiquity, surprisingly little is understood about the computations underlying hierarchical motion perception. To address this gap, we developed a class of stimuli that grant tight control over statistical relations among object velocities in dynamic scenes. We first demonstrate that structured motion stimuli benefit human multiple object tracking performance. Computational analysis revealed that the performance gain is best explained by human participants making use of motion relations during tracking. A second experiment, using a motion prediction task, reinforced this conclusion and provided fine-grained information about how the visual system flexibly exploits motion structure.
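The generative idea behind such nested scenes is that velocities compose additively across nesting levels. A minimal sketch, assuming pure translation (no rotation) and with all names illustrative:

```python
import numpy as np

def world_velocity(frame_velocity, relative_velocity):
    """World-frame velocity of an object nested in a moving reference
    frame: the frame's own motion plus the object's motion within it."""
    return np.asarray(frame_velocity, float) + np.asarray(relative_velocity, float)

# the abstract's example: a person walking at 1 m/s along a train
# that itself moves at 20 m/s in the same direction
v = world_velocity([20.0, 0.0], [1.0, 0.0])
```

Recovering the nested component requires subtracting the inferred frame motion from the observed world-frame velocity, which is precisely the decomposition the tracking and prediction experiments probe.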


2020
Author(s): Orly Halperin, Roie Karni, Simon Israeli-Korn, Sharon Hassin-Baer, Adam Zaidel

Background: Increased dependence on visual cues in Parkinson's disease (PD) can unbalance the perception-action loop, impair multisensory integration, and affect the everyday function of PD patients. It is currently unknown why PD patients seem to be more reliant on their visual cues.

Objectives: We hypothesized that PD patients may be overconfident in the reliability (precision) of their visual cues. In this study we tested coherent visual motion perception in PD, and probed subjective (self-reported) confidence in their visual motion perception.

Methods: 20 patients with idiopathic PD, 21 healthy age-matched controls and 20 healthy young adult participants were presented with visual stimuli of moving dots (random-dot kinematograms). They were asked to report: (1) whether the aggregate motion of the dots was to the left or to the right, and (2) how confident they were that their perceptual discrimination was correct.

Results: Visual motion discrimination thresholds were similar (unimpaired) in PD compared to the other groups. By contrast, PD patients were significantly overconfident in their visual perceptual decisions (p = 0.002 and p < 0.001 vs. the age-matched and young adult groups, respectively).

Conclusions: These results suggest intact visual motion perception, but overestimation of visual cue reliability, in PD. Overconfidence in visual (vs. other, e.g., somatosensory) cues could underlie accounts of increased visual dependence and impaired multisensory integration in PD, and could contribute to gait and balance impairments. Future work should investigate PD patients' confidence in somatosensory function. A better understanding of altered sensory reliance in PD might open up new avenues to treat debilitating symptoms.
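Overconfidence in a design like this can be illustrated as a simple calibration gap. The sketch below assumes confidence ratings on a 0-1 scale and is not the study's exact statistic:

```python
def calibration_gap(confidences, correct):
    """Mean subjective confidence (0-1) minus the proportion of correct
    discriminations; a positive gap indicates overconfidence."""
    n = len(correct)
    return sum(confidences) / n - sum(correct) / n
```

A group whose discrimination thresholds are normal but whose mean confidence exceeds its accuracy, as reported for the PD patients here, would show a positive gap while the control groups sit near zero.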

