Auditory Motion Does Not Modulate Spiking Activity in the Middle Temporal and Medial Superior Temporal Visual Areas

2017 ◽  
Author(s):  
Tristan A. Chaplin ◽  
Benjamin J. Allitt ◽  
Maureen A. Hagan ◽  
Marcello G.P. Rosa ◽  
Ramesh Rajan ◽  
...  

Abstract: The integration of multiple sensory modalities is a key aspect of brain function, allowing animals to take advantage of concurrent sources of information to make more accurate perceptual judgments. For many years, multisensory integration in the cerebral cortex was deemed to occur only in high-level “polysensory” association areas. However, more recent studies have suggested that cross-modal stimulation can also influence neural activity in areas traditionally considered unimodal. In particular, several human neuroimaging studies have reported that extrastriate areas involved in visual motion perception are also activated by auditory motion, and may integrate audio-visual motion cues. However, the exact nature and extent of the effects of auditory motion on the visual cortex have not been studied at the single-neuron level. We recorded the spiking activity of neurons in the middle temporal (MT) and medial superior temporal (MST) areas of anesthetized marmoset monkeys upon presentation of unimodal stimuli (moving auditory or visual patterns) as well as bimodal stimuli (concurrent audio-visual motion). Despite robust, direction-selective responses to visual motion, none of the sampled neurons responded to auditory motion stimuli. Moreover, concurrent moving auditory stimuli had no significant effect on the ability of single MT and MST neurons, or populations of simultaneously recorded neurons, to discriminate the direction of motion of visual stimuli (moving random-dot patterns with varying levels of motion noise). Our findings do not support the hypothesis that direct interactions between MT, MST, and early auditory areas underlie audio-visual motion integration.

2018 ◽  
Vol 48 (4) ◽  
pp. 2013-2029 ◽  
Author(s):  
Tristan A. Chaplin ◽  
Benjamin J. Allitt ◽  
Maureen A. Hagan ◽  
Marcello G. P. Rosa ◽  
Ramesh Rajan ◽  
...  

2008 ◽  
Vol 20 (6) ◽  
pp. 1094-1106 ◽  
Author(s):  
Maria Concetta Morrone ◽  
Andrea Guzzetta ◽  
Francesca Tinelli ◽  
Michela Tosetti ◽  
Michela Del Viva ◽  
...  

We report here the cases of two young diplegic patients with cystic periventricular leukomalacia who systematically, and with high sensitivity, perceive translational motion of a random-dot display in the opposite direction. The apparent inversion was specific to translational motion: rotation and expansion motion were perceived correctly, with normal sensitivity. It was also specific to random-dot patterns, not occurring with gratings. For the one patient whom we were able to test extensively, contrast sensitivity for static stimuli was normal, but it was very low for direction discrimination at high spatial frequencies and all temporal frequencies. His optokinetic nystagmus movements were normal, but he was unable to track a single translating target, indicating a perceptual origin of the tracking deficit. The severe deficit in motion perception was also evident in the semi-natural situation of a driving-simulation video game. The perceptual deficit for translational motion was reinforced by functional magnetic resonance imaging studies: translational motion elicited no response in the MT complex, although it did produce a strong response in many visual areas when contrasted with blank stimuli. However, radial and rotational motion produced a normal pattern of activation in a subregion of the MT complex. These data reinforce the existing evidence for independent cortical processing of translational versus circular or radial flow motion, and further suggest that the two systems differ in their vulnerability and plasticity in response to prenatal damage. They also highlight the complexity of visual motion perception, and how the delicate balance of neural activity can lead to paradoxical effects such as consistent misperception of the direction of motion. We advance a possible explanation based on reduced spatial sampling of the motion stimuli and report a simple model that simulates the experimental results well.


PLoS ONE ◽  
2011 ◽  
Vol 6 (3) ◽  
pp. e17499 ◽  
Author(s):  
Souta Hidaka ◽  
Wataru Teramoto ◽  
Yoichi Sugita ◽  
Yuko Manaka ◽  
Shuichi Sakamoto ◽  
...  

i-Perception ◽  
10.1068/ic890 ◽  
2011 ◽  
Vol 2 (8) ◽  
pp. 890-890
Author(s):  
Souta Hidaka ◽  
Wataru Teramoto ◽  
Yoichi Sugita ◽  
Yuko Manaka ◽  
Shuichi Sakamoto ◽  
...  

PLoS ONE ◽  
2021 ◽  
Vol 16 (6) ◽  
pp. e0253067
Author(s):  
Benedict Wild ◽  
Stefan Treue

Modern accounts of visual motion processing in the primate brain emphasize a hierarchy of different regions within the dorsal visual pathway, especially primary visual cortex (V1) and the middle temporal area (MT). However, recent studies have called into doubt the idea of a processing pipeline with fixed contributions to motion perception from each area. Instead, the role that each area plays appears to depend on properties of the stimulus as well as on perceptual history. We propose to test this hypothesis in human subjects by comparing motion perception for two commonly used stimulus types: drifting sinusoidal gratings (DSGs) and random dot patterns (RDPs). To avoid potential biases in our approach, we are pre-registering our study. We will compare the effects of size and contrast levels on the perception of the direction of motion for DSGs and RDPs. In addition, based on intriguing results in a pilot study, we will also explore the effects of a post-stimulus mask. Our approach will offer valuable insights into how motion is processed by the visual system and guide further behavioral and neurophysiological research.


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
L. A. M. H. Kirkels ◽  
W. Zhang ◽  
Z. Rezvani ◽  
R. J. A. van Wezel ◽  
M. M. van Wanrooij

Abstract: Visual motion perception depends on readout of direction-selective sensors. We investigated in mice whether the response to bidirectional transparent motion, which activates oppositely tuned sensors, reflects integration (averaging) or winner-take-all (mutual inhibition) mechanisms. We measured whole-body opto-locomotor reflexes (OLRs) to bidirectional, oppositely moving random-dot patterns (leftward and rightward) and compared the responses to predictions based on responses to unidirectional motion (leftward or rightward). In addition, responses were compared to stimulation with stationary patterns. When comparing OLRs in the bidirectional and unidirectional conditions, we found that the OLR to bidirectional motion best fits an averaging model. These results reflect integration mechanisms in neural responses to conflicting sensory evidence, as has been documented for other sensory and motor domains.
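The two candidate readout mechanisms contrasted in this abstract make different predictions for the bidirectional condition, which can be illustrated with a minimal sketch. The response values below are hypothetical (arbitrary units, not taken from the study): negative for a leftward reflex, positive for a rightward one.

```python
# Hypothetical unidirectional OLR magnitudes (arbitrary units);
# sign encodes direction: negative = leftward, positive = rightward.
olr_left = -1.0
olr_right = 1.0

def averaging_model(left, right):
    # Integration: the bidirectional response is the mean of the
    # two unidirectional responses.
    return (left + right) / 2

def winner_take_all_model(left, right):
    # Mutual inhibition: the unidirectional response with the larger
    # magnitude dominates; ties go to the rightward response here.
    return left if abs(left) > abs(right) else right

print(averaging_model(olr_left, olr_right))        # 0.0: opposing drives cancel
print(winner_take_all_model(olr_left, olr_right))  # 1.0: one direction wins outright
```

For equal and opposite drives, averaging predicts essentially no net reflex to bidirectional motion, whereas winner-take-all predicts a full-strength reflex in one direction; the study's finding that the bidirectional OLR best fits the averaging model corresponds to the near-cancellation case.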


2019 ◽  
Vol 9 (1) ◽  
Author(s):  
Minsun Park ◽  
Randolph Blake ◽  
Yeseul Kim ◽  
Chai-Youn Kim

Abstract: Sensory information registered in one modality can influence perception associated with sensory information registered in another modality. The current work focuses on one particularly salient form of such multisensory interaction: audio-visual motion perception. Previous studies have shown that watching visual motion and listening to auditory motion influence each other, but results from those studies are mixed with regard to the nature of the interactions promoting that influence and where, within the sequence of information processing, those interactions transpire. To address these issues, we investigated (i) whether concurrent audio-visual motion stimulation during an adaptation phase impacts the strength of the visual motion aftereffect (MAE) during a subsequent test phase, and (ii) whether the magnitude of that impact depends on the congruence between the auditory and visual motion experienced during adaptation. Results show that a congruent direction of audio-visual motion during adaptation induced a stronger initial impression and a slower decay of the MAE than did an incongruent direction, an effect not attributable to differential patterns of eye movements during adaptation. The audio-visual congruency effects measured here imply that visual motion perception emerges from integration of audio-visual motion information at a sensory, neural stage of processing.

