Life-Stage Dependent Plasticity in the Auditory System of a Songbird Is Signal and Emitter-Specific

2020 · Vol 14
Author(s): Nicolas M. Adreani, Pietro B. D’Amelio, Manfred Gahr, Andries ter Maat

Social animals flexibly use a variety of vocalizations to communicate in complex and dynamic environments. However, it remains unknown whether the auditory perception of different vocalizations changes according to the ecological context. By using miniature wireless devices to synchronously record vocal interactions and local neural activity in freely-behaving zebra finches in combination with playback experiments, we investigate whether the auditory processing of vocalizations changes across life-history stages. We show that during breeding, females (but not males) increase their estrogen levels and reply faster to their mates when interacting vocally. These changes are associated with an increase in the amplitude of the female’s neural auditory responses. Furthermore, the changes in auditory response are not general, but specific to a subset of functionally distinct vocalizations and dependent on the emitter’s identity. These results provide novel insights into auditory plasticity of communication systems, showing that the perception of specific signals can shift according to ecologically-determined physiological states.

2021 · Vol 13 (1)
Author(s): Jannath Begum-Ali, Anna Kolesnik-Taylor, Isabel Quiroz, Luke Mason, ...

Abstract

Background: Sensory modulation difficulties are common in children with conditions such as Autism Spectrum Disorder (ASD) and could contribute to other social and non-social symptoms. Positing a causal role for sensory processing differences requires observing atypical sensory reactivity prior to the emergence of other symptoms, which can be achieved through prospective studies.

Methods: In this longitudinal study, we examined auditory repetition suppression and change detection at 5 and 10 months in infants with and without Neurofibromatosis Type 1 (NF1), a condition associated with a higher likelihood of developing ASD.

Results: In typically developing infants, suppression to vowel repetition and enhanced responses to vowel/pitch change decreased with age over posterior regions, becoming more frontally specific; this age-related change was diminished in the NF1 group. Whilst both groups detected changes in vowel and pitch, the NF1 group was largely slower to show a differentiated neural response. Auditory responses did not relate to later language, but were related to later ASD traits.

Conclusions: These findings represent the first demonstration of atypical brain responses to sounds in infants with NF1 and suggest they may relate to the likelihood of later ASD.


Author(s): Josef P. Rauschecker

When one talks about hearing, some may first imagine the auricle (or external ear), which is the only visible part of the auditory system in humans and other mammals. Its shape and size vary among people, but it does not tell us much about a person’s abilities to hear (except perhaps their ability to localize sounds in space, where the shape of the auricle plays a certain role). Most of what is used for hearing is inside the head, particularly in the brain. The inner ear transforms mechanical vibrations into electrical signals; then the auditory nerve sends these signals into the brainstem, where intricate preprocessing occurs. Although auditory brainstem mechanisms are an important part of central auditory processing, it is the processing taking place in the cerebral cortex (with the thalamus as the mediator) that enables auditory perception and cognition. Human speech and the appreciation of music can hardly be imagined without a complex cortical network of specialized regions, each contributing different aspects of auditory cognitive abilities. During the evolution of these abilities in higher vertebrates, especially birds and mammals, the cortex played a crucial role, so a great deal of what is referred to as central auditory processing happens there. Whether it is the recognition of one’s mother’s voice, listening to Pavarotti singing or Yo-Yo Ma playing the cello, or hearing or reading Shakespeare’s sonnets, each will evoke electrical vibrations in the auditory cortex, but it does not end there. Large parts of frontal and parietal cortex receive auditory signals originating in auditory cortex, forming processing streams for auditory object recognition and auditory-motor control, before being channeled into other parts of the brain for comprehension and enjoyment.


2004 · Vol 92 (6) · pp. 3522-3531
Author(s): Kai-Ming G. Fu, Ankoor S. Shah, Monica N. O'Connell, Tammy McGinnis, Haftan Eckholdt, ...

We examined the effects of eye position on auditory cortical responses in macaques. Laminar current-source density (CSD) and multiunit activity (MUA) profiles were sampled with linear-array multielectrodes. Eye position significantly modulated auditory-evoked CSD amplitude in 24/29 penetrations (83%), across A1 and belt regions; 4/24 of these cases also showed significant modulation of MUA amplitude. Eye-position effects occurred mainly in the supragranular laminae and lagged the co-located auditory response by, on average, 38 ms. Effects in A1 and belt regions were indistinguishable in amplitude, laminar profile, and latency. The timing and laminar profile of the eye-position effects suggest that they are not combined with auditory signals at a subcortical stage of the lemniscal auditory pathways and simply “fed forward” into cortex. Rather, these effects may be conveyed to auditory cortex by feedback projections from parietal or frontal cortices, or alternatively, by nonclassical feedforward projections through auditory koniocellular (calbindin-positive) neurons.
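The laminar CSD profile referred to above is conventionally estimated as the negative second spatial derivative of the field potential along the electrode depth axis. A minimal sketch of that standard estimate (the contact spacing here is an illustrative assumption, not a value from the study):

```python
import numpy as np

def csd(lfp, spacing_mm=0.15):
    """One-dimensional current-source density from laminar LFPs.

    lfp: (channels, time) array of field potentials recorded along a
    linear multielectrode array; spacing_mm is the inter-contact distance.
    The standard estimate is the negative second spatial derivative of
    voltage along the depth axis, which localizes current sinks and sources.
    """
    # Discrete second difference across adjacent contacts.
    second_diff = lfp[:-2, :] - 2.0 * lfp[1:-1, :] + lfp[2:, :]
    return -second_diff / (spacing_mm ** 2)
```

Note that the estimate loses one channel at each end of the array, which is why laminar analyses need contacts spanning the full cortical depth.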


2013 · Vol PP (99) · pp. 1-18

In recent years, a number of feature extraction procedures for automatic speech recognition (ASR) systems have been based on models of human auditory processing, and one often hears arguments in favor of incorporating knowledge of human auditory perception and cognition into machines for ASR. This paper takes the reverse route, and argues that the engineering techniques for automatic speech recognition that are already in widespread use are often consistent with well-known properties of the human auditory system.
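One classic point of contact between ASR front ends and auditory physiology is the mel-scale filterbank used in MFCC-style features, whose roughly logarithmic frequency spacing mirrors the cochlea's frequency resolution. A minimal sketch (filter count, FFT size, and sample rate are illustrative assumptions, not parameters from the paper):

```python
import numpy as np

def hz_to_mel(f):
    # Mel scale: roughly linear below 1 kHz, logarithmic above,
    # echoing the ear's decreasing resolution at high frequencies.
    return 2595.0 * np.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mel_filterbank(n_filters=26, n_fft=512, sample_rate=16000):
    """Triangular filters spaced evenly on the mel scale."""
    low, high = hz_to_mel(0.0), hz_to_mel(sample_rate / 2.0)
    mel_points = np.linspace(low, high, n_filters + 2)
    hz_points = mel_to_hz(mel_points)
    bins = np.floor((n_fft + 1) * hz_points / sample_rate).astype(int)
    fbank = np.zeros((n_filters, n_fft // 2 + 1))
    for i in range(1, n_filters + 1):
        left, center, right = bins[i - 1], bins[i], bins[i + 1]
        for k in range(left, center):          # rising slope
            fbank[i - 1, k] = (k - left) / (center - left)
        for k in range(center, right):         # falling slope
            fbank[i - 1, k] = (right - k) / (right - center)
    return fbank

fbank = mel_filterbank()
print(fbank.shape)  # (26, 257)
```

Applying these filters to a power spectrum and taking logs yields the kind of perceptually spaced energies the paper's argument turns on: an engineering choice that happens to agree with auditory physiology.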


2021
Author(s): Sudha Sharma, Hemant Kumar Srivastava, Sharba Bandyopadhyay

Abstract

So far, our understanding of the role of the auditory cortex (ACX) in processing visual information has been limited to the infragranular layers of the ACX, which have been shown to respond to visual stimulation. Here, we investigate neurons in the supragranular layers of the mouse ACX using 2-photon calcium imaging. Contrary to previous reports, we show that more than 20% of responding neurons in layer 2/3 of the ACX respond to full-field visual stimulation. These responses involve both excitation and hyperpolarization. The primary ACX (A1) has a greater proportion of visual responses by hyperpolarization than by excitation, likely driven by inhibitory neurons in the infragranular layers of the ACX rather than by local layer 2/3 inhibitory neurons. Further, we found that more than 60% of neurons in layer 2/3 of A1 are multisensory in nature. We also show the presence of multisensory neurons in close proximity to exclusively auditory neurons, and that noise correlations among the recorded neurons decrease during multisensory presentation. This is evidence in favour of a deep and intricate visual influence over auditory processing. The results have strong implications for decoding visual influences over early auditory cortical regions.

Significance statement

To understand what features of our visual world are processed in the auditory cortex (ACX), it is important to characterize the responses of auditory cortical neurons to visual stimuli. Here, we show the presence of visual and multisensory responses in the supragranular layers of the ACX. Hyperpolarization in response to visual stimulation is more commonly observed in the primary ACX. Multisensory stimulation results in suppression of responses compared to unisensory stimulation and an overall decrease in noise correlation in the primary ACX. The close-knit architecture of these neurons with auditory-specific neurons suggests an influence of non-auditory stimuli on auditory processing.
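The noise-correlation measure mentioned above quantifies trial-to-trial co-fluctuations between neurons after the stimulus-driven (signal) component is removed. A minimal sketch of the standard definition (not the authors' exact pipeline):

```python
import numpy as np

def noise_correlations(responses):
    """Mean pairwise noise correlation from a (trials x neurons) array
    of responses to repeated presentations of the same stimulus.

    Subtracting each neuron's mean response removes the stimulus-driven
    component; correlations of the residuals capture shared
    trial-to-trial variability ("noise").
    """
    residuals = responses - responses.mean(axis=0)
    corr = np.corrcoef(residuals, rowvar=False)
    n = corr.shape[0]
    # Average over distinct neuron pairs (upper triangle, no diagonal).
    return corr[np.triu_indices(n, k=1)].mean()
```

In an analysis like the one described, this value would be computed separately for unisensory and multisensory trials and compared; the abstract reports that it is lower under multisensory stimulation.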


eLife · 2015 · Vol 4
Author(s): Lisa F Gill, Wolfgang Goymann, Andries Ter Maat, Manfred Gahr

Vocal signals such as calls play a crucial role for survival and successful reproduction, especially in group-living animals. However, call interactions and call dynamics within groups remain largely unexplored because their relation to relevant contexts or life-history stages could not be studied with individual-level resolution. Using on-bird microphone transmitters, we recorded the vocalisations of individual zebra finches (Taeniopygia guttata) behaving freely in social groups, while females and males previously unknown to each other passed through different stages of the breeding cycle. As birds formed pairs and shifted their reproductive status, their call repertoire composition changed. The recordings revealed that calls occurred non-randomly in fine-tuned vocal interactions and decreased within groups while pair-specific patterns emerged. Call-type combinations of vocal interactions changed within pairs and were associated with successful egg-laying, highlighting a potential fitness relevance of calling dynamics in communication systems.


2021 · Vol 15
Author(s): Mujda Nooristani, Thomas Augereau, Karina Moïn-Darbari, Benoit-Antoine Bacon, François Champoux

The effects of transcranial electrical stimulation (tES) approaches have been widely studied for many decades in the motor field, and are well known to have a significant and consistent impact on the rehabilitation of people with motor deficits. Consequently, it can be asked whether tES could also be an effective tool for targeting and modulating plasticity in the sensory field for therapeutic purposes. Specifically, could potentiating sensitivity at the central level with tES help to compensate for sensory loss? The present review examines evidence of the impact of tES on cortical auditory excitability and its corresponding influence on auditory processing, and in particular on hearing rehabilitation. Overall, data strongly suggest that tES approaches can be an effective tool for modulating auditory plasticity. However, its specific impact on auditory processing requires further investigation before it can be considered for therapeutic purposes. Indeed, while it is clear that electrical stimulation has an effect on cortical excitability and overall auditory abilities, the directionality of these effects is puzzling. The knowledge gaps that will need to be filled are discussed.


2021
Author(s): Peter Lush, Zoltan Dienes, Anil Seth, Ryan Bradley Scott

Up to 40% of people report visually evoked auditory responses (vEARs; for example, ‘hearing’ sounds in response to watching silent videos). We investigate the degree to which vEAR experiences may arise from phenomenological control, i.e. from the way people can control their experience to meet expectancies arising from imaginative suggestion. In the experimental situation, expectancies arise from demand characteristics (cues which communicate beliefs about experimental aims to participants). Trait phenomenological control has been shown to substantially predict experimental measures of changes in ‘embodiment’ experience in which demand characteristics are not controlled (e.g., mirror touch and pain, and experiences of ownership of a fake hand). Here we report a substantial relationship between scores on the Phenomenological Control Scale (PCS; a test of direct imaginative suggestion) and vEAR scores (reports of auditory experience for silent videos), which indicates that vEAR experience may be an implicit imaginative suggestion effect. This study demonstrates that relationships of trait phenomenological control with subjective reports about experience are not limited to embodiment and may confound a wide range of measures in psychological science.


2009 · Vol 101 (6) · pp. 2924-2933
Author(s): Joseph F. Bergan, Eric I. Knudsen

The barn owl's central auditory system creates a map of auditory space in the external nucleus of the inferior colliculus (ICX). Although the crucial role visual experience plays in the formation and maintenance of this auditory space map is well established, the mechanism by which vision influences ICX responses remains unclear. Surprisingly, previous experiments have found that in the absence of extensive pharmacological manipulation, visual stimuli do not drive neural responses in the ICX. Here we investigated the influence of dynamic visual stimuli on auditory responses in the ICX. We show that a salient visual stimulus, when coincident with an auditory stimulus, can modulate auditory responses in the ICX even though the same visual stimulus may elicit no neural responses when presented alone. For each ICX neuron, the most effective auditory and visual stimuli were located in the same region of space. In addition, the magnitude of the visual modulation of auditory responses depended on the context of stimulus presentation, with novel visual stimuli eliciting consistently larger response modulations than frequently presented visual stimuli. Thus the visual modulation of ICX responses depends on the characteristics of the visual stimulus as well as on the spatial and temporal correspondence of the auditory and visual stimuli. These results demonstrate moment-to-moment visual enhancements of auditory responsiveness that, in the short term, increase auditory responses to salient bimodal stimuli and, in the long term, could serve to instruct the adaptive auditory plasticity necessary to maintain accurate auditory orienting behavior.


2013 · Vol 2013 · pp. 1-18
Author(s): Youssef Tawk, Aleksandar Jovanovic, Phillip Tomé, Jérôme Leclère, Cyril Botteron, ...

Nowadays, in aeronautical environments, the use of mobile communication and other wireless technologies is restricted. More specifically, the Federal Communications Commission (FCC) and the Federal Aviation Administration (FAA) prohibit the use of cellular phones and other wireless devices on airborne aircraft because of potential interference with wireless networks on the ground and with the aircraft's navigation and communication systems. Within this context, we propose in this paper a movement recognition algorithm that switches off a module including a GSM (Global System for Mobile Communications) device, or any other mobile cellular technology, as soon as it senses movement, thereby preventing any forbidden transmissions that could occur in a moving airplane. The algorithm is based solely on measurements from a low-cost accelerometer and is easy to implement with a high degree of reliability.
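The abstract does not give the algorithm's internals; the core idea, detecting movement from accelerometer measurements alone, can be sketched as a check for sustained deviation of the acceleration magnitude from gravity (the threshold, window size, and function names below are illustrative assumptions, not the paper's design):

```python
import numpy as np

GRAVITY = 9.81  # m/s^2

def detect_movement(samples, threshold=0.5, window=32):
    """Return True if the accelerometer trace indicates movement.

    samples: (n, 3) array of raw accelerations in m/s^2.
    A stationary device measures only gravity, so the magnitude of each
    sample stays near 9.81 m/s^2 regardless of orientation. A sustained
    deviation of the windowed mean magnitude from gravity is taken as
    movement.
    """
    mag = np.linalg.norm(samples, axis=1)
    if len(mag) < window:
        return False
    kernel = np.ones(window) / window
    smoothed = np.convolve(mag, kernel, mode="valid")  # moving average
    return bool(np.any(np.abs(smoothed - GRAVITY) > threshold))

def radio_should_be_on(samples):
    # A module controller could poll this and power down the GSM radio
    # whenever movement is detected.
    return not detect_movement(samples)
```

Using the magnitude rather than individual axes makes the check orientation-independent, which matters for a device whose mounting attitude is unknown.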

