Congruent audio-visual stimulation during adaptation modulates the subsequently experienced visual motion aftereffect

2019 ◽  
Vol 9 (1) ◽  
Author(s):  
Minsun Park ◽  
Randolph Blake ◽  
Yeseul Kim ◽  
Chai-Youn Kim

Abstract
Sensory information registered in one modality can influence perception associated with sensory information registered in another modality. The current work focuses on one particularly salient form of such multisensory interaction: audio-visual motion perception. Previous studies have shown that watching visual motion and listening to auditory motion influence each other, but results from those studies are mixed with regard to the nature of the interactions promoting that influence and where within the sequence of information processing those interactions transpire. To address these issues, we investigated (i) whether concurrent audio-visual motion stimulation during an adaptation phase impacts the strength of the visual motion aftereffect (MAE) during a subsequent test phase, and (ii) whether the magnitude of that impact depends on the congruence between auditory and visual motion experienced during adaptation. Results show that a congruent direction of audio-visual motion during adaptation induced a stronger initial impression and a slower decay of the MAE than did an incongruent direction, a difference not attributable to differential patterns of eye movements during adaptation. The audio-visual congruency effects measured here imply that visual motion perception emerges from integration of audio-visual motion information at a sensory neural stage of processing.

PLoS ONE ◽  
2011 ◽  
Vol 6 (3) ◽  
pp. e17499 ◽  
Author(s):  
Souta Hidaka ◽  
Wataru Teramoto ◽  
Yoichi Sugita ◽  
Yuko Manaka ◽  
Shuichi Sakamoto ◽  
...  

i-Perception ◽  
10.1068/ic890 ◽  
2011 ◽  
Vol 2 (8) ◽  
pp. 890-890 ◽
Author(s):  
Souta Hidaka ◽  
Wataru Teramoto ◽  
Yoichi Sugita ◽  
Yuko Manaka ◽  
Shuichi Sakamoto ◽  
...  

Perception ◽  
1994 ◽  
Vol 23 (10) ◽  
pp. 1257-1264 ◽  
Author(s):  
Michael T Swanston

Evidence concerning the origin of the motion aftereffect (MAE) is assessed in terms of a model of levels of representation in visual motion perception proposed by Wade and Swanston. Very few experiments have been designed so as to permit unambiguous conclusions to be drawn. The requirements for such experiments are identified. Whereas retinocentric motion could in principle give rise to the MAE, data are not available which would enable a conclusion to be drawn. There is good evidence for a patterncentric origin, indicating that the MAE is primarily the result of adaptation in the systems responsible for detecting relative visual motion. There is evidence for a further contribution from the process that compensates retinocentric motion for eye movements, in the form of nonveridical information for eye movements. There may also be an effect at the level at which perceived distance and self-movement information are combined with egocentric motion to give a geocentric representation which provides the basis for reports of phenomenal experience. It is concluded that the MAE can be caused by changes in activity at more than one level of representation, and cannot be ascribed to a single underlying process.


2003 ◽  
Vol 14 (4) ◽  
pp. 357-361 ◽  
Author(s):  
Jean Vroomen ◽  
Beatrice de Gelder

In this study, we show that the contingent auditory motion aftereffect is strongly influenced by visual motion information. During an induction phase, participants listened to rightward-moving sounds with falling pitch alternated with leftward-moving sounds with rising pitch (or vice versa). Auditory aftereffects (i.e., a shift in the psychometric function for unimodal auditory motion perception) were bigger when a visual stimulus moved in the same direction as the sound than when no visual stimulus was presented. When the visual stimulus moved in the opposite direction, aftereffects were reversed and thus became contingent upon visual motion. When visual motion was combined with a stationary sound, no aftereffect was observed. These findings indicate that there are strong perceptual links between the visual and auditory motion-processing systems.
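The aftereffect reported here is quantified as a shift in the psychometric function for auditory motion perception. As an illustration only (not the authors' analysis), the sketch below fits a logistic psychometric function to synthetic pre- and post-adaptation response proportions by grid search and reads the aftereffect off as the shift in the point of subjective equality (PSE); all stimulus levels and parameter values are hypothetical.

```python
import numpy as np

def logistic(x, pse, slope):
    # Probability of a "rightward" response at stimulus level x.
    return 1.0 / (1.0 + np.exp(-(x - pse) / slope))

def fit_pse(levels, p_right):
    # Grid-search least-squares fit of a logistic psychometric function;
    # returns the PSE (level yielding 50% "rightward" responses).
    best_err, best_pse = np.inf, None
    for pse in np.linspace(-2.0, 2.0, 401):
        for slope in np.linspace(0.1, 2.0, 39):
            err = np.sum((logistic(levels, pse, slope) - p_right) ** 2)
            if err < best_err:
                best_err, best_pse = err, pse
    return best_pse

# Synthetic data: adaptation shifts the PSE by +0.5 (hypothetical units).
levels = np.linspace(-2.0, 2.0, 9)
pre = logistic(levels, 0.0, 0.5)
post = logistic(levels, 0.5, 0.5)

shift = fit_pse(levels, post) - fit_pse(levels, pre)  # estimated aftereffect
```

In practice the fit would be applied to binomial response counts per level rather than noise-free proportions, but the PSE-shift logic is the same.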


1967 ◽  
Vol 24 (3_suppl) ◽  
pp. 1263-1270 ◽  
Author(s):  
Thomas R. Scott ◽  
Robert A. Bragg ◽  
Augustus E. Jordan

Eysenck's claim that sodium amytal shortens and dexedrine lengthens the duration of the spiral aftereffect was not borne out in any of four experiments designed to demonstrate it, including a replication of his study. A further replication, differing only in the stimulus used, yielded no effect of amytal or dexedrine. Direct measurement of aftereffect rate immediately following the eliciting stimulus and after selected delays showed an exponential decay function for aftereffect rate but did not demonstrate any effect of the two drugs. This repeated failure to demonstrate a change in the aftereffect following administration of drugs known to affect neuronal firing thresholds has implications for understanding the neurophysiology of visual motion perception. It was proposed that the motion aftereffect is based on a comparison of the states of two neural systems, both of which are equally affected by the drugs.
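The abstract describes an exponential decay function for aftereffect rate measured at selected delays. A minimal sketch of how such a decay could be fit by log-linear regression, assuming noise-free synthetic measurements and hypothetical parameter values (this is not the original analysis):

```python
import numpy as np

# Aftereffect rate sampled immediately after adaptation and at delays (s).
delays = np.array([0.0, 2.0, 5.0, 10.0, 20.0])
r0_true, tau_true = 3.0, 6.0  # initial rate (deg/s) and time constant (s), hypothetical
rate = r0_true * np.exp(-delays / tau_true)

# Exponential decay linearizes under a log transform:
#   log(rate) = log(r0) - t / tau, so a line fit recovers both parameters.
slope, intercept = np.polyfit(delays, np.log(rate), 1)
r0_est = np.exp(intercept)
tau_est = -1.0 / slope
```

With noisy real measurements one would fit the exponential directly (e.g. nonlinear least squares) rather than in log space, which distorts the error weighting.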


2019 ◽  
Vol 5 (1) ◽  
pp. 247-268 ◽  
Author(s):  
Peter Thier ◽  
Akshay Markanday

The cerebellar cortex is a crystal-like structure consisting of an almost endless repetition of a canonical microcircuit that applies the same computational principle to different inputs. The output of this transformation is broadcasted to extracerebellar structures by way of the deep cerebellar nuclei. Visually guided eye movements are accommodated by different parts of the cerebellum. This review primarily discusses the role of the oculomotor part of the vermal cerebellum [the oculomotor vermis (OMV)] in the control of visually guided saccades and smooth-pursuit eye movements. Both types of eye movements require the mapping of retinal information onto motor vectors, a transformation that is optimized by the OMV, considering information on past performance. Unlike the role of the OMV in the guidance of eye movements, the contribution of the adjoining vermal cortex to visual motion perception is nonmotor and involves a cerebellar influence on information processing in the cerebral cortex.


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Sichao Yang ◽  
Johannes Bill ◽  
Jan Drugowitsch ◽  
Samuel J. Gershman

Abstract
Motion relations in visual scenes carry an abundance of behaviorally relevant information, but little is known about how humans identify the structure underlying a scene’s motion in the first place. We studied the computations governing human motion structure identification in two psychophysics experiments and found that perception of motion relations showed hallmarks of Bayesian structural inference. At the heart of our research lies a tractable task design that enabled us to reveal the signatures of probabilistic reasoning about latent structure. We found that a choice model based on the task’s Bayesian ideal observer accurately matched many facets of human structural inference, including task performance, perceptual error patterns, single-trial responses, participant-specific differences, and subjective decision confidence, especially when motion scenes were ambiguous and when object motion was hierarchically nested within other moving reference frames. Our work can guide future neuroscience experiments to reveal the neural mechanisms underlying higher-level visual motion perception.
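A Bayesian ideal observer for structure identification scores candidate motion structures by their marginal likelihood given noisy velocity observations. The toy sketch below, with hypothetical variances and only two candidate structures ("global" shared motion vs. "independent" motion of two objects), illustrates the computation; it is a deliberately simplified stand-in for the paper's task and model.

```python
import numpy as np

def mvn_logpdf(y, cov):
    # Log density of a zero-mean multivariate normal with covariance cov.
    k = len(y)
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (k * np.log(2 * np.pi) + logdet + y @ np.linalg.solve(cov, y))

sigma_m, sigma_n = 2.0, 0.5   # prior motion spread, observation noise (hypothetical)
y = np.array([1.2, 1.1])      # observed velocities of two objects (hypothetical)

# "Global" structure: both objects share one latent velocity, so the
# observations covary; "independent": each object moves on its own.
cov_global = sigma_m**2 * np.ones((2, 2)) + sigma_n**2 * np.eye(2)
cov_indep = (sigma_m**2 + sigma_n**2) * np.eye(2)

log_like = np.array([mvn_logpdf(y, cov_global), mvn_logpdf(y, cov_indep)])
log_post = log_like + np.log([0.5, 0.5])      # flat prior over structures
posterior = np.exp(log_post - log_post.max())
posterior /= posterior.sum()                  # [P(global), P(independent)]
```

Because the two observed velocities are nearly identical, the correlated "global" covariance explains them better and receives the higher posterior; with very different velocities the posterior would favor "independent".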

