Manual Gestures Modulate Early Neural Responses in Loudness Perception

2021 ◽  
Vol 15 ◽  
Author(s):  
Jiaqiu Sun ◽  
Ziqing Wang ◽  
Xing Tian

How different sensory modalities interact to shape perception is a fundamental question in cognitive neuroscience. Previous studies of audiovisual interaction have focused on abstract levels such as categorical representation (e.g., the McGurk effect). It is unclear whether cross-modal modulation can extend to low-level perceptual attributes. This study used moving manual gestures to test whether and how loudness perception can be modulated by visual-motion information. Specifically, we implemented a novel paradigm in which participants compared the loudness of two consecutive sounds whose intensity differed by around the just-noticeable difference (JND), with manual gestures presented concurrently with the second sound. In two behavioral experiments and two EEG experiments, we tested the hypothesis that the visual-motor information in gestures would modulate loudness perception. Behavioral results showed that the gestural information biased the judgment of loudness. More importantly, the EEG results demonstrated that early auditory responses around 100 ms after sound onset (N100) were modulated by the gestures. These consistent results across four behavioral and EEG experiments suggest that visual-motor processing can integrate with auditory processing at an early perceptual stage to shape the perception of a low-level attribute such as loudness, at least under challenging listening conditions.
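As a rough illustration of the paradigm's logic (not the authors' analysis), the following Python sketch simulates the two-interval loudness comparison with a hypothetical gesture-induced bias added to the internal loudness estimate; the intensity steps, sensory-noise level, and bias magnitude are all assumed values.

```python
# Minimal sketch (not the authors' code): simulating the two-interval loudness
# comparison with a hypothetical gesture-induced bias on the decision variable.
# All parameter values (intensity step, noise SD, bias) are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def simulate_block(delta_db, gesture_bias_db, n_trials=200, noise_sd_db=1.0):
    """Return proportion of 'second sound louder' responses.

    delta_db        : true intensity change of the second sound (around the JND)
    gesture_bias_db : shift added to the internal loudness estimate by the gesture
    """
    # Internal estimate of the loudness difference, corrupted by sensory noise
    internal = delta_db + gesture_bias_db + rng.normal(0.0, noise_sd_db, n_trials)
    return np.mean(internal > 0.0)

for delta in (-0.5, 0.0, +0.5):            # intensity steps near the JND (assumed)
    for bias, label in ((-0.3, "'softer' gesture"), (+0.3, "'louder' gesture")):
        p = simulate_block(delta, bias)
        print(f"delta={delta:+.1f} dB, {label}: P(second louder) = {p:.2f}")
```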

2021 ◽  
Vol 13 (1) ◽  
Author(s):  
Jannath Begum-Ali ◽  
Anna Kolesnik-Taylor ◽  
Isabel Quiroz ◽  
Luke Mason ◽  
...  

Background: Sensory modulation difficulties are common in children with conditions such as Autism Spectrum Disorder (ASD) and could contribute to other social and non-social symptoms. Positing a causal role for sensory processing differences requires observing atypical sensory reactivity prior to the emergence of other symptoms, which can be achieved through prospective studies.
Methods: In this longitudinal study, we examined auditory repetition suppression and change detection at 5 and 10 months in infants with and without Neurofibromatosis Type 1 (NF1), a condition associated with a higher likelihood of developing ASD.
Results: In typically developing infants, suppression to vowel repetition and enhanced responses to vowel/pitch change decreased with age over posterior regions, becoming more frontally specific; this age-related change was diminished in the NF1 group. Whilst both groups detected changes in vowel and pitch, the NF1 group were largely slower to show a differentiated neural response. Auditory responses did not relate to later language, but were related to later ASD traits.
Conclusions: These findings represent the first demonstration of atypical brain responses to sounds in infants with NF1 and suggest they may relate to the likelihood of later ASD.
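For readers unfamiliar with how repetition suppression and change detection are typically quantified, here is a minimal sketch on synthetic single-channel epochs (not the study's pipeline); the analysis window, sampling rate, and amplitude values are assumptions.

```python
# Minimal sketch (synthetic data, not the study's pipeline): quantifying
# repetition suppression and change detection from epoched EEG amplitudes.
# Epoch shapes, channel picks, and the 200-400 ms window are assumptions.
import numpy as np

rng = np.random.default_rng(1)
n_trials, n_times = 60, 250                  # 1-s epochs at 250 Hz (assumed)
times = np.linspace(0.0, 1.0, n_times)
win = (times >= 0.2) & (times <= 0.4)        # analysis window (assumed)

# Synthetic single-channel epochs: first vowel, repeated vowel, vowel/pitch change
first   = rng.normal(2.0, 1.0, (n_trials, n_times))   # response to first vowel
repeat  = rng.normal(1.2, 1.0, (n_trials, n_times))   # suppressed repetition response
deviant = rng.normal(2.8, 1.0, (n_trials, n_times))   # enhanced change response

suppression = first[:, win].mean() - repeat[:, win].mean()
change_effect = deviant[:, win].mean() - repeat[:, win].mean()
print(f"repetition suppression: {suppression:.2f} µV")
print(f"change-detection effect: {change_effect:.2f} µV")
```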


2020 ◽  
Vol 10 (12) ◽  
pp. 993
Author(s):  
Sara Mascheretti ◽  
Valentina Riva ◽  
Bei Feng ◽  
Vittoria Trezzi ◽  
Chiara Andreola ◽  
...  

Although substantial heritability has been reported and candidate genes have been identified, we are far from understanding the etiopathogenetic pathways underlying developmental dyslexia (DD). Reading-related endophenotypes (EPs) have been established, but until now it was unknown whether they mediate the pathway from gene to reading (dis)ability. Thus, in a sample of 223 siblings from nuclear families with DD and 79 unrelated typical readers, we tested four EPs (i.e., rapid auditory processing, rapid automatized naming, multisensory nonspatial attention and visual motion processing) and 20 markers spanning five DD-candidate genes (i.e., DYX1C1, DCDC2, KIAA0319, ROBO1 and GRIN2B) using a multiple-predictor/multiple-mediator framework. Our results show that rapid auditory and visual motion processing are mediators in the pathway from ROBO1-rs9853895 to reading: the T/T genotype predicts impairments in rapid auditory and visual motion processing which, in turn, predict poorer reading skills. These results suggest that ROBO1 is related to reading via multisensory temporal processing. These findings support the use of EPs as an effective approach to disentangling the complex pathways between candidate genes and behavior.
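The multiple-predictor/multiple-mediator framework boils down to estimating indirect (a × b) effects. A minimal sketch with one predictor and one mediator, using synthetic data and a percentile bootstrap, is shown below; the genotype coding and effect sizes are illustrative assumptions, not the study's estimates.

```python
# Minimal sketch of the core mediation logic (genotype -> endophenotype -> reading),
# reduced to one predictor and one mediator; the study used a multiple-predictor/
# multiple-mediator framework. Data here are synthetic and all effect sizes assumed.
import numpy as np

rng = np.random.default_rng(2)
n = 300
genotype = rng.integers(0, 2, n)                       # e.g. T/T vs. other (assumed coding)
endophenotype = 0.5 * genotype + rng.normal(0, 1, n)   # rapid auditory processing (synthetic)
reading = -0.6 * endophenotype + rng.normal(0, 1, n)   # reading skill (synthetic)

def indirect_effect(x, m, y):
    a = np.polyfit(x, m, 1)[0]          # path a: predictor -> mediator
    # path b: mediator -> outcome, controlling for the predictor
    X = np.column_stack([np.ones_like(x, dtype=float), x, m])
    b = np.linalg.lstsq(X, y, rcond=None)[0][2]
    return a * b

# Percentile bootstrap for the indirect (a*b) effect
boot = np.array([
    indirect_effect(genotype[idx], endophenotype[idx], reading[idx])
    for idx in (rng.integers(0, n, n) for _ in range(2000))
])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect a*b = {indirect_effect(genotype, endophenotype, reading):.3f}, "
      f"95% bootstrap CI [{lo:.3f}, {hi:.3f}]")
```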


2019 ◽  
Vol 30 (3) ◽  
pp. 942-951 ◽  
Author(s):  
Lanfang Liu ◽  
Yuxuan Zhang ◽  
Qi Zhou ◽  
Douglas D Garrett ◽  
Chunming Lu ◽  
...  

Whether auditory processing of speech relies on reference to the speaker's articulatory motor information remains elusive. Here, we addressed this issue under a two-brain framework. Functional magnetic resonance imaging was applied to record the brain activity of speakers while telling real-life stories and, later, of listeners while listening to the audio recordings of these stories. Based on between-brain seed-to-voxel correlation analyses, we revealed that neural dynamics in listeners' auditory temporal cortex are temporally coupled with the dynamics in the speaker's larynx/phonation area. Moreover, the coupling response in the listener's left auditory temporal cortex follows the hierarchical organization for speech processing, with response lags in A1+, STG/STS, and MTG increasing linearly. Further, listeners showing greater coupling responses understand the speech better. When comprehension fails, such interbrain auditory-articulation coupling vanishes substantially. These findings suggest that a listener's auditory system and a speaker's articulatory system are inherently aligned during naturalistic verbal interaction, and that such alignment is associated with high-level information transfer from the speaker to the listener. Our study provides reliable evidence that reference to the speaker's articulatory motor information facilitates speech comprehension in naturalistic settings.
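A between-brain seed-to-voxel coupling analysis with response lags can be sketched as a lagged correlation between the speaker's seed time course and listener time courses. The toy example below uses synthetic signals; the TR, lag range, and region labels are assumptions borrowed from the abstract for illustration only.

```python
# Minimal sketch (synthetic signals, not the fMRI pipeline): lagged between-brain
# coupling between a speaker 'seed' time course and listener region time courses.
# TR, lag range, and region names are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
n_vols, tr = 300, 2.0                                   # fMRI volumes, TR in s (assumed)
speaker_seed = rng.normal(size=n_vols)                  # speaker larynx/phonation area

def lagged_signal(src, lag_vols, noise=1.0):
    """Listener signal that follows the speaker with a given lag (in volumes)."""
    out = np.roll(src, lag_vols) + rng.normal(0, noise, src.size)
    out[:lag_vols] = rng.normal(0, noise, lag_vols)     # no coupling before speech arrives
    return out

listener = {"A1+": lagged_signal(speaker_seed, 1),
            "STG/STS": lagged_signal(speaker_seed, 2),
            "MTG": lagged_signal(speaker_seed, 3)}

lags = range(0, 6)
for region, sig in listener.items():
    r = [np.corrcoef(speaker_seed[:n_vols - l], sig[l:])[0, 1] for l in lags]
    best = int(np.argmax(r))
    print(f"{region}: peak coupling r = {max(r):.2f} at lag {best * tr:.0f} s")
```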


2021 ◽  
Author(s):  
Sudha Sharma ◽  
Hemant Kumar Srivastava ◽  
Sharba Bandyopadhyay

So far, our understanding of the role of the auditory cortex (ACX) in processing visual information has been limited to the infragranular layers of the ACX, which have been shown to respond to visual stimulation. Here, we investigate neurons in the supragranular layers of the mouse ACX using 2-photon calcium imaging. Contrary to previous reports, we show that more than 20% of responding neurons in layer 2/3 of the ACX respond to full-field visual stimulation. These responses take the form of both excitation and hyperpolarization. The primary ACX (A1) has a greater proportion of visual responses by hyperpolarization than by excitation, likely driven by inhibitory neurons of the infragranular layers of the ACX rather than by local layer 2/3 inhibitory neurons. Further, we found that more than 60% of neurons in layer 2/3 of A1 are multisensory in nature. We also show the presence of multisensory neurons in close proximity to exclusively auditory neurons, and that there is a reduction in the noise correlations of the recorded neurons during multisensory presentation. This is evidence in favour of a deep and intricate visual influence over auditory processing. The results have strong implications for decoding visual influences over the early auditory cortical regions.
Significance statement: To understand what features of our visual world are processed in the auditory cortex (ACX), it is important to characterize the responses of auditory cortical neurons to visual stimuli. Here, we show the presence of visual and multisensory responses in the supragranular layers of the ACX. Hyperpolarization to visual stimulation is more commonly observed in the primary ACX. Multisensory stimulation results in suppression of responses compared to unisensory stimulation and an overall decrease in noise correlation in the primary ACX. The close-knit architecture of these neurons with auditory-specific neurons suggests an influence of non-auditory stimuli on auditory processing.
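Noise correlations of the kind compared here are typically computed as pairwise correlations of trial-by-trial response fluctuations. The following sketch uses synthetic data with an assumed reduction in shared variability under audiovisual stimulation; it is not the authors' imaging pipeline.

```python
# Minimal sketch (synthetic data): pairwise noise correlations of trial-by-trial
# responses, compared between auditory-only and audiovisual presentations.
# Trial counts, neuron counts, and correlation levels are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(4)
n_neurons, n_trials = 30, 80

def responses(shared_sd):
    """Single-trial responses with a shared (correlated) fluctuation of given SD."""
    shared = rng.normal(0, shared_sd, n_trials)              # common trial-to-trial drive
    private = rng.normal(0, 1.0, (n_neurons, n_trials))      # independent variability
    return shared + private

def mean_noise_corr(resp):
    """Average off-diagonal pairwise correlation across trials."""
    c = np.corrcoef(resp)
    return c[np.triu_indices(n_neurons, k=1)].mean()

aud_only = responses(shared_sd=0.8)       # stronger shared variability (assumed)
audio_visual = responses(shared_sd=0.4)   # reduced shared variability (assumed)
print(f"noise correlation, auditory only: {mean_noise_corr(aud_only):.3f}")
print(f"noise correlation, audiovisual  : {mean_noise_corr(audio_visual):.3f}")
```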


2008 ◽  
Vol 123 (5) ◽  
pp. 3564-3564
Author(s):  
Anna C. Bonnel ◽  
Stephen McAdams ◽  
Bennett K. Smith ◽  
Armando Bertone ◽  
Jake A. Burack ◽  
...  

Author(s):  
Stuart Anstis

Visual motion stimuli can be defined as changes in luminance over space and time. Both kinds of change can alter the perceived speed and direction of motion. This chapter covers crossover motion; reverse phi, in which apparent motion between a positive image and its contrast-reversed (negative) counterpart appears to go backward; the bicycle spokes illusion; the footsteps effect, in which smooth movement looks jerky if the background is striped; zigzag motion, whose direction appears to change with viewing distance; and the furrow illusion of motion, whose direction appears to change when viewed by the fovea versus the periphery. Other concepts covered include the chopstick illusion and the footsteps illusion.
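Reverse phi can be demonstrated in a few lines: shifting a random-luminance pattern while reversing its contrast turns the correlation at the physical displacement negative, which a correlation-type motion detector reads as motion in the opposite direction. The sketch below is a schematic illustration, not stimulus code from the chapter.

```python
# Minimal sketch: a reverse-phi frame pair built from a 1-D random-luminance pattern.
# Pattern size and shift step are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(5)
pattern = rng.uniform(-1.0, 1.0, 256)          # random luminance about mean grey

def next_frame(frame, shift=1, reverse_phi=False):
    """Shift the pattern to the right; optionally reverse its contrast (reverse phi)."""
    shifted = np.roll(frame, shift)
    return -shifted if reverse_phi else shifted

def corr_at(f0, f1, shift):
    """Correlation between frame 0 and frame 1 realigned by the given shift."""
    return np.corrcoef(f0, np.roll(f1, -shift))[0, 1]

phi = next_frame(pattern)                      # ordinary apparent motion
rphi = next_frame(pattern, reverse_phi=True)   # reverse phi: shift plus contrast reversal

print(f"phi:         correlation at the physical shift = {corr_at(pattern, phi, 1):+.2f}")
print(f"reverse phi: correlation at the physical shift = {corr_at(pattern, rphi, 1):+.2f}")
# A correlation-based motion detector interprets the negative value as motion in the
# opposite direction, which is the reverse-phi percept described in the chapter.
```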


1982 ◽  
Vol 54 (3) ◽  
pp. 723-750 ◽  
Author(s):  
Joseph M. Notterman ◽  
Daniel R. Tufano ◽  
Jeffrey Scott Hrapsky

The research described in this monograph uses control theory's pursuit-tracking paradigm of voluntary movement to identify several elementary psychomotor tasks. They are simple to administer and tap increasingly complex, nonverbal cognitive or perceptual attributes. Two series of experiments are reported. Study 1 examined the hypothesis that dissimilar arrays of individual differences, as determined through test-retest correlations, may exist among the same subjects: first, across various static and dynamic visual and motor “tasks” selected from the terms of control theory's tracking equations and, second, in the organization of these tasks as represented by pursuit tracking. The hypothesis could not be rejected. Study 2 determined that test-retest individual differences in visual-motor organization not only persisted in the absence of practice, but that they also withstood active intervention by practice. This study also showed that subjects differ reliably in their ability to plan, i.e., to take advantage of coherence in visual-motor information.
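Test-retest reliability of the kind examined in Studies 1 and 2 is simply the across-subject correlation between two sessions of the same task. A minimal sketch on synthetic scores follows; the sample size and noise levels are assumed for illustration.

```python
# Minimal sketch (synthetic scores): test-retest reliability of a tracking task,
# i.e. the across-subject correlation between session 1 and session 2 performance.
import numpy as np

rng = np.random.default_rng(6)
n_subjects = 40
trait = rng.normal(0, 1, n_subjects)                    # stable individual skill
session1 = trait + rng.normal(0, 0.5, n_subjects)       # measurement noise, session 1
session2 = trait + rng.normal(0, 0.5, n_subjects)       # measurement noise, session 2

r = np.corrcoef(session1, session2)[0, 1]
print(f"test-retest reliability r = {r:.2f} (n = {n_subjects})")
```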


2020 ◽  
Vol 32 (9) ◽  
pp. 1654-1671
Author(s):  
Melisa Menceloglu ◽  
Marcia Grabowecky ◽  
Satoru Suzuki

Sensory systems utilize temporal structure in the environment to build expectations about the timing of forthcoming events. We investigated the effects of rhythm-based temporal expectation on auditory responses measured with EEG recorded from the frontocentral sites implicated in auditory processing. By manipulating temporal expectation and the interonset interval (IOI) of tones, we examined how neural responses adapted to auditory rhythm and reacted to stimuli that violated the rhythm. Participants passively listened to the tones while watching a silent nature video. In Experiment 1 (n = 22), in the long-IOI block, tones were frequently presented (80%) with 1.7-sec IOI and infrequently presented (20%) with 1.2-sec IOI, generating unexpectedly early tones that violated temporal expectation. Conversely, in the short-IOI block, tones were frequently presented with 1.2-sec IOI and infrequently presented with 1.7-sec IOI, generating late tones. We analyzed the tone-evoked N1–P2 amplitude of ERPs and intertrial phase clustering in the theta–alpha band. The results provided evidence of strong delay-dependent adaptation effects (short-term, sensitive to IOI), weak cumulative adaptation effects (long-term, driven by tone repetition over time), and robust temporal-expectation violation effects over and above the adaptation effects. Experiment 2 (n = 22) repeated Experiment 1 with shorter IOIs of 1.2 and 0.7 sec. Overall, we found evidence of strong delay-dependent adaptation effects, weak cumulative adaptation effects (which may most efficiently accumulate at the tone presentation rate of ∼1 Hz), and robust temporal-expectation violation effects that substantially boost auditory responses to the extent of overriding the delay-dependent adaptation effects, likely through mechanisms involved in exogenous attention.
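Intertrial phase clustering (ITPC) is the magnitude of the mean resultant of single-trial phases. The sketch below computes it from one Fourier phase per trial at an assumed 5 Hz theta frequency, rather than with the full wavelet-based time-frequency analysis typically used for such data; the sampling rate, trial count, and signal-to-noise level are illustrative assumptions.

```python
# Minimal sketch (synthetic epochs): intertrial phase clustering (ITPC) at a single
# theta-band frequency, using one FFT phase per trial instead of a wavelet transform.
import numpy as np

rng = np.random.default_rng(7)
fs, n_trials, n_times = 250, 100, 250                   # 1-s epochs at 250 Hz (assumed)
t = np.arange(n_times) / fs
freq = 5.0                                              # theta-band target frequency (Hz)

def itpc(epochs, f):
    """|mean resultant| of the Fourier phase at frequency f across trials."""
    spectra = np.fft.rfft(epochs, axis=1)
    bin_idx = int(round(f * n_times / fs))
    phases = np.angle(spectra[:, bin_idx])
    return np.abs(np.mean(np.exp(1j * phases)))

# Phase-locked condition (tones at expected times) vs. phase-jittered condition
locked   = np.sin(2 * np.pi * freq * t) + rng.normal(0, 1.0, (n_trials, n_times))
jittered = np.sin(2 * np.pi * freq * t + rng.uniform(0, 2 * np.pi, (n_trials, 1))) \
           + rng.normal(0, 1.0, (n_trials, n_times))

print(f"ITPC, phase-locked  : {itpc(locked, freq):.2f}")
print(f"ITPC, phase-jittered: {itpc(jittered, freq):.2f}")
```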


1982 ◽  
Vol 1 (3) ◽  
pp. 97-108 ◽  
Author(s):  
S. Benton ◽  
H.G. Leventhall

The role assigned to loudness in the assessment of annoyance reflects the intensity-dominated thinking current in noise assessment practice. Such dominance is not supported by the complex processing performed by the auditory system. The individual is placed within a context that requires the auditory system to align the person to external stimuli whilst maintaining the production of appropriate behaviours. Development of the concepts associated with audition is a prerequisite to establishing viable noise assessment criteria. The limitations of present-day criteria, with their accepted assumption of intensity as the key parameter, are accentuated when assessments are made of low-level, low-frequency noise. Once the individual is viewed as an active processor, bodily parameters may also serve to provide indices derived from the amount of 'processor work'.
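As a concrete example of the intensity-dominated assumption criticized here, the standard A-weighting curve (IEC 61672) strongly discounts low-frequency energy, which is one reason conventional criteria can understate low-level, low-frequency noise. The sketch below simply evaluates that curve; it is generic background, not an analysis from the article.

```python
# Minimal sketch: the IEC 61672 A-weighting curve, evaluated at a few frequencies
# to show how an intensity-based criterion discounts low-frequency energy.
import numpy as np

def a_weighting_db(f):
    """A-weighting in dB relative to 1 kHz (IEC 61672 analytic form)."""
    f = np.asarray(f, dtype=float)
    ra = (12194.0**2 * f**4) / (
        (f**2 + 20.6**2)
        * np.sqrt((f**2 + 107.7**2) * (f**2 + 737.9**2))
        * (f**2 + 12194.0**2)
    )
    return 20.0 * np.log10(ra) + 2.00

for f in (31.5, 63, 125, 1000):
    print(f"{f:6.1f} Hz: A-weighting = {a_weighting_db(f):+.1f} dB")
```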

